16 results for Unconstrained and convex optimization
in Aston University Research Archive
Abstract:
We present the optimization of the power and spectral performance of a random DFB fiber laser using the balance equation set. The numerical results are in good agreement with experiments. © 2012 OSA.
Abstract:
This paper investigates a cross-layer design approach for minimizing energy consumption and maximizing the network lifetime (NL) of a multiple-source and single-sink (MSSS) WSN with energy constraints. The optimization problem for the MSSS WSN can be formulated as a mixed-integer convex optimization problem with the adoption of time division multiple access (TDMA) in the medium access control (MAC) layer, and it becomes a convex problem by relaxing the integer constraint on time slots. The impacts of data rate, link access and routing are jointly taken into account in the optimization problem formulation. Both linear and planar network topologies are considered for NL maximization (NLM). For the linear MSSS and planar single-source and single-sink (SSSS) topologies, we use the Karush-Kuhn-Tucker (KKT) optimality conditions to derive analytical expressions for the optimal NL when all nodes are exhausted simultaneously. The problem for the planar MSSS topology is more complicated, and a decomposition and combination (D&C) approach is proposed to compute suboptimal solutions. An analytical expression for the suboptimal NL is derived for a small-scale planar network. To deal with larger-scale planar networks, an iterative algorithm is proposed for the D&C approach. Numerical results show that the upper bounds on the network lifetime obtained by our proposed optimization models are tight. Important insights into the NL and the benefits of cross-layer design for WSN NLM are obtained.
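Purely to make the flavour of such a formulation concrete (this is not the paper's MSSS model), the sketch below casts lifetime maximization for a toy four-node network as a convex, in fact linear, program in cvxpy; the node count, source rates and energy figures are all assumed for illustration.

```python
# A minimal sketch (not the paper's model): network-lifetime maximization for a
# small WSN as a convex (linear) program.  Decision variables are the lifetime T
# and the total bits F[i, j] carried from node i to node j over that lifetime.
import cvxpy as cp
import numpy as np

n = 4                                        # sensor nodes 0..3; node n is the sink
E = np.array([1.0, 1.0, 1.0, 1.0])           # initial battery energy [J] (assumed)
r = np.array([1e3, 1e3, 1e3, 1e3])           # source rates [bit/s] (assumed)
e_tx = 1e-6                                  # energy per transmitted bit [J/bit]
e_rx = 0.5e-6                                # energy per received bit [J/bit]

T = cp.Variable(nonneg=True)                 # network lifetime [s]
F = cp.Variable((n, n + 1), nonneg=True)     # total bits sent i -> j over the lifetime

constraints = []
for i in range(n):
    inflow = cp.sum(F[:, i])                 # bits received by node i
    outflow = cp.sum(F[i, :])                # bits transmitted by node i
    constraints += [outflow - inflow == r[i] * T,             # flow conservation
                    e_tx * outflow + e_rx * inflow <= E[i],   # energy budget
                    F[i, i] == 0]                             # no self-loops

prob = cp.Problem(cp.Maximize(T), constraints)
prob.solve()
print("max lifetime [s]:", T.value)
```

Working with total bits over the whole lifetime, rather than instantaneous rates, is one common way to keep such lifetime problems linear.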
Abstract:
In the last few years, significant advances have been made in understanding how a yeast cell responds to the stress of producing a recombinant protein, and how this information can be used to engineer improved host strains. The molecular biology of the expression vector, through the choice of promoter, tag and codon optimization of the target gene, is also a key determinant of a high-yielding protein production experiment. Recombinant Protein Production in Yeast: Methods and Protocols examines the process of preparation of expression vectors, transformation to generate high-yielding clones, optimization of experimental conditions to maximize yields, scale-up to bioreactor formats and disruption of yeast cells to enable the isolation of the recombinant protein prior to purification. Written in the highly successful Methods in Molecular Biology™ series format, chapters include introductions to their respective topics, lists of the necessary materials and reagents, step-by-step, readily reproducible laboratory protocols, and key tips on troubleshooting and avoiding known pitfalls.
Abstract:
Estimation of economic relationships often requires the imposition of constraints, such as positivity or monotonicity, on each observation. Methods to impose such constraints, however, vary depending upon the estimation technique employed. We describe a general methodology for imposing observation-specific constraints for the class of linear regression estimators using a method known as constraint weighted bootstrapping. While this method has received attention in the nonparametric regression literature, we show how it can be applied to both parametric and nonparametric estimators. A benefit of this method is that numerous constraints can be imposed simultaneously and seamlessly. We apply the method to Norwegian dairy farm data to estimate both unconstrained and constrained parametric and nonparametric models.
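As a hedged illustration of what observation-specific constraints look like in practice (this is plain constrained least squares, not the constraint weighted bootstrapping procedure itself), the following cvxpy sketch forces every fitted value of a linear regression to be non-negative; the data are synthetic.

```python
# Illustration only: a linear regression with a positivity constraint imposed
# on every fitted value (one constraint per observation).
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.normal(size=(50, 2))])   # design matrix
y = X @ np.array([0.5, 1.0, -0.8]) + rng.normal(scale=0.3, size=50)

beta = cp.Variable(X.shape[1])
fit = X @ beta
problem = cp.Problem(cp.Minimize(cp.sum_squares(y - fit)),
                     [fit >= 0])             # positivity imposed per observation
problem.solve()
print("constrained coefficients:", beta.value)
```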
Abstract:
A 10 cm diameter four-stage Scheibel column with dispersed-phase wetted packing sections has been constructed to study hydrodynamics and mass transfer using the system toluene-acetone-water. The literature pertaining to this extractor has been examined, and the important phenomena such as droplet break-up and coalescence, mass transfer and backmixing have been reviewed. A critical analysis of the backmixing or axial mixing models and the corresponding parameter estimation techniques was carried out, and an optimization technique based on Marquardt's algorithm was implemented. A single-phase sampling technique was developed to estimate the acetone concentration profile in both phases along the column. Column flooding characteristics were investigated under various operating conditions, and it was found that, when the impellers were located at about DI/5 cm from the upper surface of the pads, the limiting flow rates increased with impeller speed. This unusual behaviour was explained in terms of the pumping effect created by the turbine impellers. Correlations were developed to predict Sauter mean drop diameters. A five-cell-with-backflow model was used to estimate the column performance (stage efficiency) and phase non-ideality (backflow parameters). Overall mass transfer coefficients were computed using this model and compared with those calculated using correlations based on the single-drop mechanism.
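For readers unfamiliar with Marquardt's algorithm, the sketch below fits the parameters of an assumed exponential concentration profile by Levenberg-Marquardt nonlinear least squares with scipy; the model form and the data are invented and merely stand in for the five-cell-with-backflow model used in the thesis.

```python
# Illustrative only: Levenberg-Marquardt parameter estimation against a
# measured concentration profile (synthetic data, assumed exponential model).
import numpy as np
from scipy.optimize import least_squares

z = np.linspace(0.0, 1.0, 10)                      # dimensionless column height
c_meas = 0.30 * np.exp(-2.0 * z) + 0.01 * np.random.default_rng(1).normal(size=z.size)

def residuals(p, z, c):
    c0, k = p
    return c0 * np.exp(-k * z) - c                 # model minus measurement

fit = least_squares(residuals, x0=[0.2, 1.0], args=(z, c_meas), method="lm")
print("estimated parameters:", fit.x)
```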
An efficient, approximate path-following algorithm for elastic net based nonlinear spike enhancement
Abstract:
Unwanted spike noise in a digital signal is a common problem in digital filtering. Sometimes, however, the spikes are wanted and other, superimposed signals are unwanted; linear, time-invariant (LTI) filtering is then ineffective because the spikes are wideband, overlapping with independent noise in the frequency domain, so no LTI filter can separate them and nonlinear filtering becomes necessary. However, there are applications in which the noise includes drift or smooth signals, for which LTI filters are ideal. We describe a nonlinear filter, formulated as the solution to an elastic net regularization problem, which attenuates band-limited signals and independent noise while enhancing superimposed spikes. Making use of known analytic solutions, a novel, approximate path-following algorithm is given that provides a good filtered output with reduced computational effort compared with standard convex optimization methods. Accurate performance is shown on real, noisy electrophysiological recordings of neural spikes.
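One example of an analytic solution in the elastic-net setting is the pointwise shrinkage (proximal) operator shown below; this is only a minimal illustration of that kind of closed form on a toy spiky signal, not the paper's path-following algorithm.

```python
# The pointwise elastic-net shrinkage operator has the closed form
#   sign(v) * max(|v| - lam1, 0) / (1 + lam2),
# i.e. the minimiser of 0.5*(x - v)**2 + lam1*|x| + 0.5*lam2*x**2.
import numpy as np

def elastic_net_shrink(v, lam1, lam2):
    return np.sign(v) * np.maximum(np.abs(v) - lam1, 0.0) / (1.0 + lam2)

# Toy signal: sparse large spikes plus dense low-amplitude noise.
rng = np.random.default_rng(0)
x = 0.1 * rng.normal(size=200)
x[[20, 75, 140]] += np.array([3.0, -2.5, 4.0])

# Small-amplitude noise is shrunk to (near) zero; the spikes survive.
print(elastic_net_shrink(x, lam1=0.5, lam2=0.1)[[20, 75, 140]])
```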
Abstract:
Background: The production of high yields of recombinant proteins is an enduring bottleneck in the post-genomic sciences that has yet to be addressed in a truly rational manner. Typically, eukaryotic protein production experiments have relied on varying expression construct cassettes such as promoters and tags, or culture process parameters such as pH, temperature and aeration, to enhance yields. These approaches require repeated rounds of trial-and-error optimization and cannot provide a mechanistic insight into the biology of recombinant protein production. We published an early transcriptome analysis that identified genes implicated in successful membrane protein production experiments in yeast. While there has been a subsequent explosion in such analyses in a range of production organisms, no one has yet exploited the genes identified. The aim of this study was to use the results of our previous comparative transcriptome analysis to engineer improved yeast strains and thereby gain an understanding of the mechanisms involved in high-yielding protein production hosts.
Results: We show that tuning BMS1 transcript levels in a doxycycline-dependent manner resulted in optimized yields of functional membrane and soluble protein targets. Online flow microcalorimetry demonstrated that there had been a substantial metabolic change to cells cultured under high-yielding conditions, and in particular that high-yielding cells were more metabolically efficient. Polysome profiling showed that the key molecular event contributing to this metabolically efficient, high-yielding phenotype is a perturbation of the ratio of 60S to 40S ribosomal subunits from approximately 1:1 to 2:1, and correspondingly of 25S:18S ratios from 2:1 to 3:1. This result is consistent with the role of the gene product of BMS1 in ribosome biogenesis.
Conclusion: This work demonstrates the power of a rational approach to recombinant protein production by using the results of transcriptome analysis to engineer improved strains, thereby revealing the underlying biological events involved.
Abstract:
Understanding the structures and functions of membrane proteins is an active area of research within bioscience. Membrane proteins are key players in essential cellular processes such as the uptake of nutrients, the export of waste products, and the way in which cells communicate with their environment. It is therefore not surprising that membrane proteins are targeted by over half of all prescription drugs. Since most membrane proteins are not abundant in their native membranes, it is necessary to produce them in recombinant host cells to enable further structural and functional studies. Unfortunately, achieving the required yields of functional recombinant membrane proteins is still a bottleneck in contemporary bioscience. This has highlighted the need for defined and rational optimization strategies based upon experimental observation rather than relying on trial and error. We have published a transcriptome and subsequent genetic analysis that has identified genes implicated in high-yielding yeast cells. These results have highlighted a role for alterations to a cell's protein synthetic capacity in the production of high yields of recombinant membrane protein: paradoxically, reduced protein synthesis favors higher yields. These results highlight a potential bottleneck at the protein folding or translocation stage of protein production.
Abstract:
The slowdown in the drug discovery pipeline is, in part, owing to a lack of structural and functional information available for new drug targets. Membrane proteins, the targets of well over 50% of marketed pharmaceuticals, present a particular challenge. As they are not naturally abundant, they must be produced recombinantly for the structural biology that is a prerequisite to structure-based drug design. Unfortunately, however, obtaining high yields of functional, recombinant membrane proteins remains a major bottleneck in contemporary bioscience. While repeated rounds of trial-and-error optimization have not revealed (and cannot reveal) mechanistic details of the biology of recombinant protein production, examination of the host response has provided new insights. To this end, we published an early transcriptome analysis that identified genes implicated in high-yielding yeast cell factories, which has enabled the engineering of improved production strains. These advances offer hope that the bottleneck of membrane protein production can be relieved rationally.
Abstract:
Projects exposed to an uncertain environment must be adapted to deal with the effective integration of various planning elements and the optimization of project parameters. Time, cost, and quality are the prime objectives of a project that need to be optimized to fulfill the owner's goal. In an uncertain environment there are many other conflicting objectives that may also need to be optimized, and these objectives are characterized by varying degrees of conflict. Moreover, an uncertain environment also causes several changes in the project plan throughout its life, demanding that the project plan be fully flexible. Goal programming (GP), a multiple-criteria decision-making technique, offers a good solution to this project planning problem. Here the planning problem is considered from the owner's perspective, which leads to classifying the project up to the activity level. GP is applied separately at each level, and the formulated models are integrated through information flow. The flexibility and adaptability of the models lie in the ease of updating the model parameters at the required level, by changing priorities and/or constraints, and transmitting the information to other levels. The hierarchical model automatically provides integration among the various elements of planning. The proposed methodology is applied in this paper to plan a petroleum pipeline construction project, and its effectiveness is demonstrated.
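To make the goal programming idea concrete, here is a hand-sized weighted goal program solved as a linear program with scipy; the two goals, their coefficients and the weights are invented for illustration and do not come from the pipeline project described above.

```python
# Weighted goal programming sketch: each goal gets under/over deviation
# variables, and only the unwanted deviations are penalised in the objective.
import numpy as np
from scipy.optimize import linprog

# variables: [x1, x2, d1_minus, d1_plus, d2_minus, d2_plus]
# goal 1 (cost target 40, overrun d1_plus penalised):    2*x1 + 1*x2 + d1m - d1p = 40
# goal 2 (scope target 30, shortfall d2_minus penalised): 1*x1 + 3*x2 + d2m - d2p = 30
A_eq = np.array([[2.0, 1.0, 1.0, -1.0, 0.0,  0.0],
                 [1.0, 3.0, 0.0,  0.0, 1.0, -1.0]])
b_eq = np.array([40.0, 30.0])

c = np.array([0.0, 0.0, 0.0, 3.0, 1.0, 0.0])   # weights on d1_plus and d2_minus

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 6, method="highs")
print("decision variables:", res.x[:2], "weighted deviation:", res.fun)
```

A real project model would carry one such goal constraint per planning objective at each level of the hierarchy, with priorities updated as the plan changes.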
Abstract:
Energy consumption has been a key concern of data gathering in wireless sensor networks. Previous research shows that modulation scaling is an efficient technique for reducing energy consumption. However, such a technique also affects both packet delivery latency and packet loss, and may therefore have adverse effects on application quality. In this paper, we study the problem of modulation scaling and energy optimization. A mathematical model is proposed to analyze the impact of modulation scaling on the overall energy consumption, end-to-end mean delivery latency and mean packet loss rate. A centralized optimal management mechanism is developed based on the model, which adaptively adjusts the modulation levels to minimize energy consumption while ensuring the QoS for data gathering. Experimental results show that the management mechanism saves significant energy in all the investigated scenarios. Some valuable observations also emerge from the experiments. © 2004 IEEE.
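A toy sketch of the decision the management mechanism makes is given below: for a single link, pick the modulation level that minimizes energy subject to latency and loss targets. The per-level figures are purely illustrative, and the paper's centralized mechanism operates network-wide rather than per link.

```python
# Choose the lowest-energy modulation level that still meets the QoS targets.
# The energy/latency/loss figures per level are invented for illustration.
candidate_levels = {
    # level: (energy per packet [mJ], latency per packet [ms], loss rate)
    2: (0.8, 4.0, 0.010),
    4: (1.3, 2.0, 0.020),
    6: (2.4, 1.0, 0.045),
}

def choose_level(latency_budget_ms, loss_budget):
    feasible = [(energy, level)
                for level, (energy, lat, loss) in candidate_levels.items()
                if lat <= latency_budget_ms and loss <= loss_budget]
    if not feasible:
        raise ValueError("no modulation level meets the QoS targets")
    return min(feasible)[1]          # lowest-energy feasible level

print(choose_level(latency_budget_ms=2.5, loss_budget=0.03))   # -> 4
```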
Abstract:
A generalized Drucker–Prager (GD–P) viscoplastic yield surface model was developed and validated for asphalt concrete. The GD–P model was formulated on the basis of fabric-tensor-modified stresses to account for the material's inherent anisotropy. A smooth and convex octahedral yield surface function was developed in the GD–P model to characterize the full range of internal friction angles from 0° to 90°. In contrast, the existing Extended Drucker–Prager (ED–P) model was shown to be applicable only to materials with an internal friction angle of less than 22°. Laboratory tests were performed to evaluate the anisotropic effect and to validate the GD–P model. Results indicated that (1) the yield stresses of an isotropic yield surface model are greater in compression and lower in extension than those of an anisotropic model, which can result in an under-prediction of the viscoplastic deformation; and (2) the yield stresses predicted by the GD–P model matched well with the experimental results of the octahedral shear strength tests at different normal and confining stresses. By contrast, the ED–P model over-predicted the octahedral yield stresses, which can lead to an under-prediction of the permanent deformation. In summary, the rutting depth of an asphalt pavement would be underestimated without considering the anisotropy and convexity of the yield surface for asphalt concrete. The proposed GD–P model was demonstrated to be capable of overcoming these limitations of existing yield surface models for asphalt concrete.
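For orientation, the classical Drucker–Prager yield function, which the ED–P and GD–P models generalize, can be written (in one common sign convention) as follows; the GD–P model replaces the stress invariants with fabric-tensor-modified quantities and a convex octahedral function, which is not reproduced here.

```latex
% Classical Drucker-Prager yield surface (the baseline that ED-P / GD-P extend).
% I_1 is the first stress invariant, J_2 the second deviatoric stress invariant,
% and alpha, kappa are material constants tied to the internal friction angle
% and cohesion.
f(\sigma) = \sqrt{J_2} + \alpha\, I_1 - \kappa = 0
```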
Abstract:
Evaluating the reliability of the manufacturing process is a crucial task in product development. Process reliability is a measure of the production ability of a reconfigurable manufacturing system (RMS), serving as an integrated performance indicator of the production process under specified technical constraints, including time, cost and quality. An integrated framework for evaluating manufacturing process reliability within the product development process is presented. A mathematical model and an algorithm based on the universal generating function (UGF) are developed for calculating the reliability of the manufacturing process with respect to task intensity and process capacity, both of which are independent random variables. Rework strategies of the RMS under different task intensities are analyzed on the basis of process reliability, and the optimization of rework strategies based on process reliability is then discussed.
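The universal generating function technique itself is compact enough to sketch: each unit's random performance is a small probability mass function over capacities, and units are composed with a structure operator. The sketch below uses invented capacities and a parallel (capacity-adding) operator; the paper's RMS model, with task intensity as a second random variable, is richer.

```python
# Minimal UGF sketch: u(z) = sum_i p_i * z**g_i is stored as {capacity: prob},
# and two units are composed by applying a structure operator to capacities
# while multiplying probabilities.
from itertools import product

def compose(u1, u2, op=lambda a, b: a + b):
    out = {}
    for (g1, p1), (g2, p2) in product(u1.items(), u2.items()):
        g = op(g1, g2)
        out[g] = out.get(g, 0.0) + p1 * p2
    return out

machine_a = {0: 0.05, 10: 0.95}          # capacity 10 with prob 0.95, failed otherwise
machine_b = {0: 0.10, 8: 0.90}

system = compose(machine_a, machine_b)   # parallel machines: capacities add
demand = 12
reliability = sum(p for g, p in system.items() if g >= demand)
print(system, "P(capacity >= demand) =", reliability)
```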
Abstract:
Principal component analysis (PCA) is well recognized for dimensionality reduction, and kernel PCA (KPCA) has also been proposed for statistical data analysis. However, KPCA fails to detect the nonlinear structure of data well when outliers exist. To alleviate this problem, this paper presents a novel algorithm, named iterative robust KPCA (IRKPCA). IRKPCA deals well with outliers and can be carried out in an iterative manner, which makes it suitable for processing incremental input data. As in traditional robust PCA (RPCA), a binary field is employed to characterize the outlier process, and the optimization problem is formulated as maximizing the marginal distribution of a Gibbs distribution. In this paper, this optimization problem is solved by stochastic gradient descent techniques. In IRKPCA, the outlier process lies in a high-dimensional feature space, and therefore the kernel trick is used. IRKPCA can be regarded as a kernelized version of RPCA and as a robust form of the kernel Hebbian algorithm. Experimental results on synthetic data demonstrate the effectiveness of IRKPCA. © 2010 Taylor & Francis.
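For readers who want the non-robust baseline that IRKPCA builds on, the sketch below is a plain kernel PCA in numpy (RBF kernel, centred Gram matrix, leading eigenvectors); the robust outlier field and stochastic-gradient steps described in the abstract are not implemented here.

```python
# Plain (non-robust) kernel PCA as a baseline: build an RBF Gram matrix,
# centre it in feature space, and project onto the leading eigenvectors.
import numpy as np

def rbf_kernel(X, gamma=1.0):
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_pca(X, n_components=2, gamma=1.0):
    K = rbf_kernel(X, gamma)
    n = K.shape[0]
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one       # centre in feature space
    vals, vecs = np.linalg.eigh(Kc)                  # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:n_components]      # pick the largest ones
    alphas = vecs[:, idx] / np.sqrt(np.maximum(vals[idx], 1e-12))
    return Kc @ alphas                               # projections of the data

X = np.random.default_rng(0).normal(size=(100, 3))
print(kernel_pca(X, n_components=2, gamma=0.5).shape)   # (100, 2)
```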
Abstract:
Product quality planning is a fundamental part of quality assurance in manufacturing. It comprises the distribution of quality aims over each phase of product development and the deployment of quality operations and resources to accomplish these aims. This paper proposes a quality planning methodology based on risk assessment, in which the planning tasks of product development are translated into the evaluation of risk priorities. Firstly, a comprehensive model for quality planning is developed to address the deficiencies of traditional quality function deployment (QFD) based quality planning. Secondly, a novel failure knowledge base (FKB) based method is discussed. Then a mathematical method and algorithm for risk assessment are presented for target decomposition, measure selection, and sequence optimization. Finally, the proposed methodology has been implemented in a web-based prototype software system, QQ-Planning, to solve the problem of quality planning regarding the distribution of quality targets and the deployment of quality resources, in such a way that the product requirements are satisfied and the enterprise resources are highly utilized. © Springer-Verlag Berlin Heidelberg 2010.