11 results for line : identification

in CentAUR: Central Archive University of Reading - UK


Relevance:

60.00%

Publisher:

Abstract:

In industrial practice, constrained steady-state optimisation and predictive control are separate, albeit closely related, functions within the control hierarchy. This paper presents a method which integrates predictive control with on-line optimisation with economic objectives. A receding horizon optimal control problem is formulated using linear state-space models. This optimal control problem is very similar to the one presented in many predictive control formulations, but the main difference is that its formulation includes a general steady-state objective depending on the magnitudes of manipulated and measured output variables. This steady-state objective may include the standard quadratic regulatory objective together with economic objectives, which are often linear. Assuming that the system settles to a steady-state operating point under receding horizon control, conditions are given for the satisfaction of the necessary optimality conditions of the steady-state optimisation problem. The method is based on adaptive linear state-space models, obtained using on-line identification techniques. The use of model adaptation is justified from a theoretical standpoint and its beneficial effects are shown in simulations. The method is tested in simulations of an industrial distillation column and a system of chemical reactors.
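The receding-horizon formulation summarised above can be sketched, for the unconstrained case, as a quadratic programme whose cost adds a linear economic term to the usual quadratic regulatory one. The scalar model, weights, horizon and economic price below are invented for illustration only; the paper's constrained, adaptive formulation is substantially richer.

```python
import numpy as np

# Toy unconstrained receding-horizon problem: quadratic regulatory cost
# plus a linear economic term on the inputs (all numbers assumed).
A, B = np.array([[0.9]]), np.array([[0.5]])
Q, R = 1.0, 0.1        # regulatory weights (assumed)
c_econ = 0.2           # linear economic price per unit input (assumed)
N = 10                 # prediction horizon

# Stacked prediction over the horizon: X = Phi x0 + Gamma U
Phi = np.vstack([np.linalg.matrix_power(A, k + 1) for k in range(N)])
Gamma = np.zeros((N, N))
for i in range(N):
    for j in range(i + 1):
        Gamma[i, j] = (np.linalg.matrix_power(A, i - j) @ B)[0, 0]

x0 = np.array([[1.0]])
# min_U 0.5 U'HU + f'U  =>  solve H U = -f
H = Q * Gamma.T @ Gamma + R * np.eye(N)
f = Q * Gamma.T @ (Phi @ x0) + c_econ * np.ones((N, 1))
U = np.linalg.solve(H, -f)
u0 = U[0, 0]           # receding horizon: only the first input is applied
```

Re-solving this problem at every sample with the latest state, and refreshing A and B from an on-line identifier, gives the kind of adaptive receding-horizon loop the abstract describes.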

Relevance:

60.00%

Publisher:

Abstract:

The aim of this study was to convert existing faba bean (Vicia faba L.) single nucleotide polymorphism (SNP) markers from the cleaved amplified polymorphic sequence (CAPS) and SNaPshot® formats, which are expensive and time-consuming, to the more convenient KBiosciences competitive allele-specific PCR (KASP) assay format. Of the 80 assays designed, 75 were validated, though a core set of the 67 most robust markers is recommended for further use. These 67 best KASP SNP assays were used across two generations of single-seed descent to detect unintended outcrossing and to track and quantify loss of heterozygosity, a capability that will significantly increase the efficiency and performance of pure-line production and maintenance. The same set of assays was also used to examine genetic relationships between the 67 members of the partly inbred panel, and should prove useful for line identification and diversity studies in the future.

Relevance:

30.00%

Publisher:

Abstract:

The utility of plant secondary cell wall biomass for industrial and biofuel purposes depends upon improving cellulose amount, availability and extractability. The possibility of engineering such biomass requires much more knowledge of the genes and proteins involved in the synthesis, modification and assembly of cellulose, lignin and xylans. Proteomic data are essential to aid gene annotation and the understanding of polymer biosynthesis. Comparative proteomes were determined for secondary walls of stem xylem and transgenic xylogenic cells of tobacco; these detected peroxidase, cellulase, chitinase, pectinesterase and a number of defence/cell-death-related proteins, but not marker proteins of primary walls such as xyloglucan endotransglycosidase and expansins. Only the corresponding detergent-soluble proteome of secretory microsomes from the xylogenic cultured cells, subjected to ion-exchange chromatography, could be determined accurately, since xylem-specific membrane yields from stem tissue were of poor quality. Among the 109 proteins analysed, many of the protein markers of the ER, such as BiP, HSP70, calreticulin and calnexin, were identified, together with some of the biosynthetic enzymes and associated polypeptides involved in polymer synthesis. However, 53% of these endomembrane proteins could not be identified despite the use of two different MS methods, leaving considerable scope for future identification of novel proteins involved in secondary wall polymer synthesis once full genomic data are available.

Relevance:

30.00%

Publisher:

Abstract:

We model the large-scale fading of wireless THz communications links deployed in a metropolitan area, taking into account reception through direct line of sight, ground or wall reflection, and diffraction. The movement of the receiver in three dimensions is modelled by an autonomous dynamic linear system in state space, whereas the geometric relations involved in the attenuation and multi-path propagation of the electric field are described by a static non-linear mapping. A subspace algorithm in conjunction with polynomial regression is used to identify a Wiener model from time-domain measurements of the field intensity.
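A Wiener model of the kind described, an autonomous linear state-space system followed by a static non-linear output map, can be sketched as below. The dynamics matrix and the nonlinearity are invented stand-ins for the receiver trajectory and the geometric attenuation map, not the paper's fitted model.

```python
import numpy as np

# Wiener structure: autonomous linear dynamics -> static nonlinearity.
A = np.array([[0.99, 0.06],
              [-0.06, 0.99]])     # slowly rotating trajectory (assumed)
x = np.array([1.0, 0.0])          # initial receiver state (assumed)

def static_nonlinearity(z):
    # Hypothetical stand-in for the attenuation/multipath map.
    return 1.0 / (1.0 + z ** 2)

ys = []
for _ in range(200):
    x = A @ x                               # linear state update
    ys.append(static_nonlinearity(x[0]))    # intensity-like output
```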

Relevance:

30.00%

Publisher:

Abstract:

We discuss the feasibility of wireless terahertz communications links deployed in a metropolitan area and model the large-scale fading of such channels. The model takes into account reception through direct line of sight, ground and wall reflection, as well as diffraction around a corner. The movement of the receiver is modeled by an autonomous dynamic linear system in state space, whereas the geometric relations involved in the attenuation and multipath propagation of the electric field are described by a static nonlinear mapping. A subspace algorithm in conjunction with polynomial regression is used to identify a single-output Wiener model from time-domain measurements of the field intensity when the receiver motion is simulated using a constant angular speed and an exponentially decaying radius. The identification procedure is validated by using the model to perform q-step-ahead predictions. The sensitivity of the algorithm to small-scale fading, detector noise, and atmospheric changes is discussed. The performance of the algorithm is tested in the diffraction zone assuming a range of emitter frequencies (2, 38, 60, 100, 140, and 400 GHz). Extensions of the simulation results to situations where a more complicated trajectory describes the motion of the receiver are also implemented, providing information on the performance of the algorithm under a worst-case scenario. Finally, a sensitivity analysis to model parameters for the identified Wiener system is proposed.
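The polynomial-regression half of the identification can be sketched on simulated data. Here the linear part (a rotating, exponentially decaying state, mirroring the constant angular speed and decaying radius mentioned above) is taken as known, whereas the paper recovers it with a subspace algorithm; the "true" polynomial map and all numbers below are invented.

```python
import numpy as np

# Fit the static polynomial map of a Wiener model by least squares,
# then make a q-step-ahead prediction through the known linear part.
rng = np.random.default_rng(0)
A = np.array([[0.95, 0.10],
              [-0.10, 0.95]])    # decaying rotation (assumed dynamics)
x = np.array([1.0, 0.0])

def true_map(z):                 # hypothetical "unknown" nonlinearity
    return 0.5 + 0.3 * z - 0.2 * z ** 2

Z, Y = [], []
for _ in range(500):
    x = A @ x
    Z.append(x[0])
    Y.append(true_map(x[0]) + 0.01 * rng.standard_normal())

coeffs = np.polyfit(Z, Y, deg=2)     # polynomial regression step

q = 5                                # q-step-ahead prediction
xq = np.linalg.matrix_power(A, q) @ x
y_pred = np.polyval(coeffs, xq[0])
```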

Relevance:

30.00%

Publisher:

Abstract:

The identification of non-linear systems using only observed finite datasets has become a mature research area over the last two decades. A class of linear-in-the-parameters models with universal approximation capabilities has been intensively studied and widely used, due to the availability of many linear learning algorithms and their inherent convergence conditions. This article presents a systematic overview of basic research on model selection approaches for linear-in-the-parameters models. One of the fundamental problems in non-linear system identification is to find the minimal model with the best generalisation performance from observational data only. The important concepts used in various non-linear system-identification algorithms to achieve good model generalisation are first reviewed, including Bayesian parameter regularisation and model selection criteria based on cross-validation and experimental design. A significant advance in machine learning has been the development of the support vector machine as a means of identifying kernel models based on the structural risk minimisation principle. Developments in convex-optimisation-based model construction algorithms, including support vector regression algorithms, are outlined. Input selection algorithms and on-line system identification algorithms are also included in this review. Finally, some industrial applications of non-linear models are discussed.
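For linear-in-the-parameters models, the selection problem this review surveys can be illustrated with a minimal greedy forward selection: repeatedly add the candidate regressor most correlated with the current residual. This is a simplified stand-in for the OLS/error-reduction-ratio style algorithms the article covers; the data are synthetic.

```python
import numpy as np

# Greedy forward selection for a linear-in-the-parameters model.
rng = np.random.default_rng(1)
n, p = 200, 10
X = rng.standard_normal((n, p))               # candidate regressors
y = 2.0 * X[:, 3] - 1.5 * X[:, 7] + 0.1 * rng.standard_normal(n)

selected, residual = [], y.copy()
for _ in range(2):                            # grow a two-term model
    scores = [abs(X[:, j] @ residual) / np.linalg.norm(X[:, j])
              for j in range(p)]
    selected.append(int(np.argmax(scores)))   # best regressor this round
    cols = X[:, selected]
    theta, *_ = np.linalg.lstsq(cols, y, rcond=None)
    residual = y - cols @ theta               # refit and update residual
```

In practice the stopping point (fixed here at two terms) is exactly what cross-validation, experimental-design or Bayesian criteria are used to decide.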

Relevance:

30.00%

Publisher:

Abstract:

With the rapid development of proteomics, a number of different methods have appeared for the basic task of protein identification. We made a simple comparison between a common liquid chromatography-tandem mass spectrometry (LC-MS/MS) workflow using an ion trap mass spectrometer and a combined LC-MS and LC-MS/MS method using Fourier transform ion cyclotron resonance (FTICR) mass spectrometry and accurate peptide masses. To compare the two methods for protein identification, we grew E. coli and extracted its proteins using established protocols. Cystines were reduced and alkylated, and the proteins were digested with trypsin. The resulting peptide mixtures were separated by reversed-phase liquid chromatography using a 4 h gradient from 0 to 50% acetonitrile on a C18 reversed-phase column. The LC separation was coupled on-line to either a Bruker Esquire HCT ion trap or a Bruker 7 tesla APEX-Qe Qh-FTICR hybrid mass spectrometer. Data-dependent Qh-FTICR-MS/MS spectra were acquired using the quadrupole mass filter and collisionally induced dissociation in the external hexapole trap. In both schemes, proteins were identified by Mascot MS/MS ion searches, and the peptides identified from these proteins in the FTICR MS/MS data were used, together with ambient polydimethylcyclosiloxane ions, for automatic internal calibration of the FTICR-MS data.

Relevance:

30.00%

Publisher:

Abstract:

The recursive least-squares algorithm with a forgetting factor has been extensively applied and studied for the on-line parameter estimation of linear dynamic systems. This paper explores the use of genetic algorithms to improve the performance of the recursive least-squares algorithm in the parameter estimation of time-varying systems. Simulation results show that the hybrid recursive algorithm (GARLS), combining recursive least-squares with genetic algorithms, can achieve better results than the standard recursive least-squares algorithm using only a forgetting factor.
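The baseline that GARLS builds on, recursive least squares with a forgetting factor, is standard and can be sketched as follows on invented data; the genetic-algorithm layer of the paper's hybrid is not shown.

```python
import numpy as np

# Recursive least squares with forgetting factor lam.
rng = np.random.default_rng(2)
lam = 0.98                          # forgetting factor (assumed value)
theta = np.zeros(2)                 # parameter estimate
P = 1e3 * np.eye(2)                 # inverse-correlation matrix

true_theta = np.array([0.7, -0.3])  # parameters to be tracked
for _ in range(300):
    phi = rng.standard_normal(2)                 # regressor
    y = phi @ true_theta + 0.01 * rng.standard_normal()
    k = P @ phi / (lam + phi @ P @ phi)          # gain vector
    theta = theta + k * (y - phi @ theta)        # innovation update
    P = (P - np.outer(k, phi) @ P) / lam         # covariance update
```

A forgetting factor below 1 keeps P from collapsing, so the estimator can follow time-varying parameters; the paper's contribution is to let a genetic algorithm improve on this fixed-λ behaviour.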

Relevance:

30.00%

Publisher:

Abstract:

By using a deterministic approach, an exact form of the synchronously detected video signal under a ghosted condition is presented. Information regarding the phase-quadrature-induced ghost component, which arises from the quadrature-forming nature of the vestigial sideband (VSB) filter, is obtained by cross-correlating the detected video with the ghost cancellation reference (GCR) signal. As a result, the minimum number of taps required to correctly remove all the ghost components is derived. The results are applied to both National Television System Committee (NTSC) and phase alternating line (PAL) television.
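The cross-correlation step at the heart of GCR-based ghost detection can be sketched as follows: correlating the received signal with the known reference produces peaks at the direct path and at each echo delay. The reference, delay and gain below are invented; broadcast GCR waveforms are specially designed training signals.

```python
import numpy as np

# Locate a ghost delay by cross-correlating with a known reference.
rng = np.random.default_rng(3)
gcr = rng.standard_normal(256)        # stand-in reference signal
delay, gain = 40, 0.5                 # one ghost path (assumed)
rx = gcr.copy()
rx[delay:] += gain * gcr[:-delay]     # direct path + delayed echo

corr = np.correlate(rx, gcr, mode="full")
lags = np.arange(-len(gcr) + 1, len(gcr))
peaks = lags[np.argsort(corr)[-2:]]   # two strongest correlation peaks
```

The recovered delays (here lags 0 and 40 samples) fix where canceller taps must sit, which is how a minimum tap count of the kind derived in the abstract arises.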

Relevance:

30.00%

Publisher:

Abstract:

A method for estimating both the Alfvén speed and the field-aligned flow of the magnetosheath at the magnetopause reconnection site is presented. The method employs low-altitude cusp ion observations and requires the identification of a feature in the cusp ion spectra near the low-energy cutoff which will often be present for a low-latitude dayside reconnection site. The appearance of these features in data of limited temporal, energy, and pitch angle resolution is illustrated by using model calculations of cusp ion distribution functions. These are based on the theory of ion acceleration at the dayside magnetopause and allow for the effects on the spectrum of flight times of ions precipitating down newly opened field lines. In addition, the variation of the reconnection rate can be evaluated, and comparison with ground-based observations of the corresponding sequence of transient events allows the field-aligned distance from the ionosphere to the reconnection site to be estimated.
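The flight-time effect the abstract relies on gives a simple relation for the low-energy cutoff: an ion observed at time t after its field line was opened, having travelled a field-aligned distance d, must have speed at least d/t. The distance and elapsed time below are purely illustrative numbers, not values from the paper.

```python
# Time-of-flight estimate of the cusp ion low-energy cutoff.
M_PROTON = 1.673e-27   # proton mass, kg
EV = 1.602e-19         # joules per electronvolt

d = 15 * 6.371e6       # field-aligned distance, ~15 Earth radii (assumed)
t = 300.0              # seconds since the field line opened (assumed)

v_min = d / t                                    # slowest ion to arrive
E_cut_eV = 0.5 * M_PROTON * v_min ** 2 / EV      # low-energy cutoff, eV
```

Inverting this relation, a measured cutoff energy and an estimate of d constrain the elapsed time since reconnection, which is the quantity the cusp-spectra method exploits.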

Relevance:

30.00%

Publisher:

Abstract:

There has been ongoing concern about the lack of reliable data on disabled children in schools. To date there has been no consistent way of identifying and categorising disabilities. Schools in England are currently required to collect data on children with Special Educational Needs (SEN), but this does not capture information about all disabled children. The lack of this information may seriously restrict capacity at all levels of policy and practice to understand and respond to the needs of disabled children and their families, in line with the Disability Discrimination Act (2005) and the Equality Act (2010). The aim of the project was to test the draft tools for identifying disability, and the accompanying guidance, in a sample of all types of maintained schools, in order to assess their usability and reliability and whether they generated robust and consistent data that could reliably inform school returns for the annual School Census.