23 results for Unconstrained minimization

in CentAUR: Central Archive, University of Reading - UK


Relevance:

20.00%

Publisher:

Abstract:

By using a deterministic approach, an exact form for the synchronously detected video signal under ghosted conditions is presented. Information regarding the phase-quadrature-induced ghost component, which arises from the quadrature-forming nature of the vestigial sideband (VSB) filter, is obtained by cross-correlating the detected video with the ghost cancellation reference (GCR) signal. As a result, the minimum number of taps required to remove all the ghost components correctly is then presented. The results are applied to both National Television System Committee (NTSC) and phase alternate line (PAL) television.
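The cross-correlation step can be illustrated with a short sketch. The signals, the single-ghost channel and the 0.1 detection threshold below are hypothetical stand-ins, not the broadcast GCR waveform, and the quadrature component is not resolved here:

```python
import numpy as np

# Hypothetical sketch: detect ghost delays and amplitudes by cross-correlating
# the received (ghosted) signal with the known GCR reference signal.
rng = np.random.default_rng(0)
gcr = rng.standard_normal(256)                                # stand-in training signal
channel = np.zeros(64); channel[0] = 1.0; channel[40] = 0.3   # direct path + one ghost
received = np.convolve(gcr, channel)[:len(gcr)]

xcorr = np.correlate(received, gcr, mode="full")[len(gcr) - 1:]
xcorr /= gcr @ gcr                                  # normalise so the main path is ~1.0
ghost_lags = np.where(np.abs(xcorr[1:]) > 0.1)[0] + 1
print("estimated ghost lags:", ghost_lags, "amplitudes:", np.round(xcorr[ghost_lags], 2))
```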

Relevance:

20.00%

Publisher:

Abstract:

The paper analyzes the performance of the unconstrained filtered-x LMS (FxLMS) algorithm for active noise control (ANC), in which the constraints that the controller must be causal and have a finite impulse response are removed. It is shown that the unconstrained FxLMS algorithm, if stable, always converges to the true optimum filter even when the estimate of the secondary path is not perfect, and that its final mean square error is independent of the secondary path. Moreover, we show that the necessary and sufficient stability condition for the feedforward unconstrained FxLMS is that the maximum phase error of the secondary-path estimate be within 90°, and that this is only a necessary condition for the feedback unconstrained FxLMS. The significance of the analysis for a practical system is also discussed. Finally, we show how the obtained results can guide the design of a robust feedback ANC headset.
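For orientation, a minimal single-channel FxLMS loop is sketched below. This is the conventional causal FIR form (the unconstrained filter analysed in the paper is a theoretical construct); all paths, lengths and the step size are made up, and the secondary-path estimate is deliberately imperfect while keeping its phase error below 90 degrees:

```python
import numpy as np

# Sketch of a single-channel (causal, FIR) FxLMS loop for illustration only.
rng = np.random.default_rng(1)
N, L = 5000, 32
x = rng.standard_normal(N)                  # reference signal (noise to cancel)
p = rng.standard_normal(16) * 0.2           # primary path (unknown to the controller)
s = np.array([0.0, 1.0, 0.5])               # true secondary path
s_hat = np.array([0.0, 0.9, 0.6])           # imperfect secondary-path estimate
d = np.convolve(x, p)[:N]                   # disturbance at the error microphone
xf = np.convolve(x, s_hat)[:N]              # filtered-x: reference through s_hat

w = np.zeros(L); mu = 0.005
x_buf = np.zeros(L); y_buf = np.zeros(len(s)); fx_buf = np.zeros(L)
err = np.zeros(N)
for n in range(N):
    x_buf = np.roll(x_buf, 1); x_buf[0] = x[n]
    y = w @ x_buf                           # controller (anti-noise) output
    y_buf = np.roll(y_buf, 1); y_buf[0] = y
    err[n] = d[n] + s @ y_buf               # residual at the error microphone
    fx_buf = np.roll(fx_buf, 1); fx_buf[0] = xf[n]
    w -= mu * err[n] * fx_buf               # LMS update with the filtered reference
print("MSE first 500:", np.mean(err[:500]**2), " last 500:", np.mean(err[-500:]**2))
```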

Relevance:

10.00%

Publisher:

Abstract:

An algorithm is presented for the generation of molecular models of defective graphene fragments containing a majority of 6-membered rings with a small number of 5- and 7-membered rings as defects. The structures are generated from an initial random array of points in 2D space, which is then subjected to Delaunay triangulation. The dual of the triangulation forms a Voronoi tessellation of polygons with a range of ring sizes. An iterative cycle of refinement, involving deletion and addition of points followed by further triangulation, is performed until the user-defined criteria for the number of defects are met. The array of points and connectivities is then converted to a molecular structure and subjected to geometry optimization using a standard molecular modeling package to generate the final atomic coordinates. On the basis of molecular mechanics with minimization, this automated method can generate structures that conform to user-supplied criteria while avoiding the potential bias associated with manual building of structures. One application of the algorithm is the generation of structures for evaluating the reactivity of different defect sites. Ab initio electronic structure calculations on a representative structure indicate preferential fluorination close to 5-ring defects.
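The first two stages (random points, Delaunay triangulation and its Voronoi dual) can be sketched with scipy.spatial; the point density is arbitrary, and the iterative refinement and geometry-optimization stages are omitted:

```python
import numpy as np
from scipy.spatial import Delaunay, Voronoi

# Sketch of the initial stages only: random 2D points -> Delaunay triangulation,
# whose dual Voronoi tessellation gives polygons (rings) of varying size.
rng = np.random.default_rng(42)
points = rng.uniform(0.0, 30.0, size=(200, 2))       # initial random array of points

tri = Delaunay(points)                                # triangulation of the point set
vor = Voronoi(points)                                 # its dual tessellation

# Ring size of each bounded Voronoi cell = number of vertices of its region.
ring_sizes = [len(vor.regions[r]) for r in vor.point_region
              if len(vor.regions[r]) > 0 and -1 not in vor.regions[r]]
distribution = {k: ring_sizes.count(k) for k in sorted(set(ring_sizes))}
print("ring-size distribution:", distribution)        # centred on 6-membered rings
```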

Relevance:

10.00%

Publisher:

Abstract:

The frequency responses of two 50 Hz and one 400 Hz induction machines have been measured experimentally over a frequency range of 1 kHz to 400 kHz. This study has shown that the stator impedances of the machines behave in a manner similar to a parallel resonant circuit, and hence have a resonant point at which the input impedance of the machine is at a maximum. This maximum impedance point was found experimentally to be as low as 33 kHz, which is well within the switching-frequency range of modern inverter drives. This paper investigates the possibility of exploiting the maximum impedance point of the machine by taking it into consideration when designing an inverter, in order to minimize ripple currents due to the switching frequency. Minimization of the ripple currents would reduce torque pulsation and losses, increasing overall performance. A modified machine model was developed to take the resonant point into account, and this model was then simulated with an inverter to demonstrate the possible advantages of matching the inverter switching frequency to the resonant point. Finally, in order to verify the simulated results experimentally, a real inverter with a variable switching frequency was used to drive an induction machine, and experimental results are presented.
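The parallel-resonance behaviour can be illustrated with a simple lumped model; the L, C and R values below are invented so that the resonance lands near the 33 kHz figure quoted above, and are not measured machine parameters:

```python
import numpy as np

# Illustrative parallel R-L-C model of the stator impedance (values invented).
L = 10e-3                                   # equivalent inductance [H]
C = 2.3e-9                                  # equivalent parasitic capacitance [F]
R = 50e3                                    # loss resistance across the tank [ohm]

f_res = 1.0 / (2.0 * np.pi * np.sqrt(L * C))
print(f"resonant frequency ~ {f_res/1e3:.1f} kHz")    # ~33 kHz

f = np.logspace(3, np.log10(400e3), 400)              # 1 kHz .. 400 kHz sweep
w = 2.0 * np.pi * f
Z = 1.0 / (1.0/R + 1j*w*C + 1.0/(1j*w*L))             # parallel combination
print(f"|Z| peaks at {f[np.argmax(np.abs(Z))]/1e3:.1f} kHz"
      f" with |Z| ~ {np.abs(Z).max()/1e3:.0f} kOhm")
```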

Relevance:

10.00%

Publisher:

Abstract:

Here we make an initial step toward the development of an ocean assimilation system that can constrain the modelled Atlantic Meridional Overturning Circulation (AMOC) to support climate predictions. A detailed comparison is presented of 1° and 1/4° resolution global model simulations, with and without sequential data assimilation, against the observations and transport estimates from the RAPID mooring array across 26.5° N in the Atlantic. Comparison of modelled water properties with the observations from the merged RAPID boundary arrays demonstrates the ability of in situ data assimilation to constrain accurately the east-west density gradient between these mooring arrays. However, the presence of an unconstrained "western boundary wedge" between Abaco Island and the RAPID mooring site WB2 (16 km offshore) leads to the intensification of an erroneous southward flow in this region when in situ data are assimilated. The result is an overly intense southward upper mid-ocean transport (0-1100 m) compared with the estimates derived from the RAPID array. Correction of upper-layer zonal density gradients is found mostly to compensate for a weak subtropical gyre circulation in the free model run (i.e. with no assimilation). Despite the important changes to the density structure and transports imposed by the assimilation in the upper layer, very little change is found in the amplitude and sub-seasonal variability of the AMOC. This shows that assimilation of upper-layer density information projects mainly onto the gyre circulation, with little effect on the AMOC at 26° N, owing to the absence of corrections to density gradients below 2000 m (the maximum depth of Argo). The sensitivity to initial conditions was explored through two additional experiments using a climatological initial condition. These experiments showed that the weak bias in gyre intensity in the control simulation (without data assimilation) develops over a period of about 6 months, but does so independently of the overturning, with no change to the AMOC. However, differences in the properties and volume transport of North Atlantic Deep Water (NADW) persisted throughout the 3-year simulations, resulting in a difference of 3 Sv in AMOC intensity. The persistence of these dense-water anomalies and their influence on the AMOC is promising for the development of decadal forecasting capabilities. The results suggest that the deeper waters must be accurately reproduced in order to constrain the AMOC.

Relevance:

10.00%

Publisher:

Abstract:

A method is discussed for imposing any desired constraint on the force field obtained in a force-constant refinement calculation. The application of this method to force-constant refinement calculations for the methyl halide molecules is reported. All available data on the vibration frequencies, Coriolis interaction constants and centrifugal stretching constants of CH3X and CD3X molecules were used in the refinements, but despite this apparent abundance of data it was found that constraints were necessary in order to obtain a unique solution for the force field. The results of unconstrained calculations, and of three different constrained calculations, are reported in this paper. The constrained models reported are a Urey-Bradley force field, a modified valence force field, and a constraint based on the orbital-following bond-hybridization arguments developed in the following paper. The results are discussed and compared with previous results for these molecules. The third of the above models is found to reproduce the observed data better than either of the first two, and additional reasons are given for preferring this solution for the force field of the methyl halide molecules.
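As a generic illustration of the idea only (not the vibrational secular-equation machinery used in the paper), a constrained least-squares refinement can be sketched as follows, with entirely artificial parameters, observables and constraint:

```python
import numpy as np
from scipy.optimize import minimize

# Artificial example: refine three "force constants" f against pretend
# observables, subject to an imposed linear constraint between them.
obs = np.array([3.0, 1.2, 0.4])                      # pretend observed quantities

def predicted(f):
    # pretend (linear) relation between parameters and observables
    return np.array([f[0] + f[1], f[1] + 2.0 * f[2], f[2]])

def chi2(f):
    return np.sum((predicted(f) - obs) ** 2)

constraint = {"type": "eq", "fun": lambda f: f[0] - 4.0 * f[2]}   # imposed relation
result = minimize(chi2, x0=np.ones(3), constraints=[constraint])
print("refined parameters:", np.round(result.x, 3), "residual:", round(result.fun, 4))
```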

Relevance:

10.00%

Publisher:

Abstract:

The article considers screening human populations with two screening tests. If either of the two tests is positive, then full evaluation of the disease status is undertaken; however, if both diagnostic tests are negative, then the disease status remains unknown. This procedure leads to a data constellation in which, for each disease status, the 2 × 2 table associated with the two diagnostic tests used in screening has exactly one empty, unknown cell. To estimate the unobserved cell counts, previous approaches assume independence of the two diagnostic tests and use specific models, including the special mixture model of Walter or unconstrained capture-recapture estimates. Often, as is also demonstrated in this article by means of a simple test, the independence of the two screening tests is not supported by the data. Two new estimators are suggested that allow association between the screening tests, although the form of association must be assumed to be homogeneous over disease status. These estimators are modifications of the simple capture-recapture estimator and are easy to construct. The estimators are investigated for several screening studies with fully evaluated disease status, in which the new estimators can be shown to behave better than the previous conventional ones. Finally, the performance of the new estimators is compared with that of maximum likelihood estimators, which are more difficult to obtain in these models. The results indicate that the loss of efficiency is minor.
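For reference, the conventional fill-in for the unobserved cell under the independence assumption (the assumption the article relaxes) is the simple capture-recapture estimate. The counts below are invented, and the article's modified estimators are not reproduced here:

```python
# Conventional capture-recapture estimate of the unobserved (negative, negative)
# cell for one disease stratum, assuming the two screening tests are independent.
n11 = 40   # positive on both tests            (counts are invented)
n10 = 25   # positive on test 1 only
n01 = 15   # positive on test 2 only

n00_hat = n10 * n01 / n11                 # estimated count negative on both tests
N_hat = n11 + n10 + n01 + n00_hat         # estimated stratum size
print(f"estimated missing cell: {n00_hat:.1f}, estimated total: {N_hat:.1f}")
```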

Relevance:

10.00%

Publisher:

Abstract:

Quasi-Newton-Raphson minimization and conjugate gradient minimization have been used to solve the crystal structures of famotidine form B and capsaicin from X-ray powder diffraction data and to characterize the χ² agreement surfaces. One million quasi-Newton-Raphson minimizations found the famotidine global minimum with a frequency of ca 1 in 5000 and the capsaicin global minimum with a frequency of ca 1 in 10 000. These results, which are corroborated by conjugate gradient minimization, demonstrate the existence of numerous pathways from some of the highest points on these χ² agreement surfaces to the respective global minima that are passable using only downhill moves. This important observation has significant ramifications for the development of improved structure determination algorithms.
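The multistart strategy can be illustrated on a toy rugged surface (not a powder-diffraction χ² surface): many local quasi-Newton (BFGS) minimizations from random starting points, counting how often the global minimum's basin is reached:

```python
import numpy as np
from scipy.optimize import minimize

# Toy illustration of multistart local minimization; the surface below has many
# local minima and a single global minimum (value 0) at the origin.
def surface(x):
    return np.sum(x**2) + 2.0 * np.sum(1.0 - np.cos(3.0 * x))

rng = np.random.default_rng(7)
trials, hits = 2000, 0
for _ in range(trials):
    x0 = rng.uniform(-5.0, 5.0, size=2)
    res = minimize(surface, x0, method="BFGS")        # local quasi-Newton descent
    if res.fun < 1e-6:                                # landed in the global basin
        hits += 1
print(f"global minimum reached in {hits} of {trials} random starts")
```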

Relevance:

10.00%

Publisher:

Abstract:

One of the largest contributions to biologically available nitrogen comes from the reduction of N2 to ammonia by rhizobia in symbiosis with legumes. Plants supply dicarboxylic acids as a carbon source to bacteroids, and in return they receive ammonia. However, metabolic exchange must be more complex, because effective N2 fixation by Rhizobium leguminosarum bv viciae bacteroids requires either of two broad-specificity amino acid ABC transporters (Aap and Bra). It was proposed that amino acids cycle between plant and bacteroids, but the model was unconstrained because of the broad solute specificity of Aap and Bra. Here, we constrain the specificity of Bra and ectopically express heterologous transporters to demonstrate that branched-chain amino acid (LIV) transport is essential for effective N2 fixation. This dependence of bacteroids on the plant for LIV is not due to their known down-regulation of glutamate synthesis, because ectopic expression of glutamate dehydrogenase did not rescue effective N2 fixation. Instead, the effect is specific to LIV and is accompanied by a major reduction in the transcription and activity of LIV biosynthetic enzymes. Bacteroids become symbiotic auxotrophs for LIV and depend on the plant for their supply. Bacteroids with aap bra null mutations are reduced in number, are smaller, and have a lower DNA content than the wild type. Plants control the LIV supply to bacteroids, regulating their development and persistence. This makes it a critical control point for regulation of the symbiosis.

Relevance:

10.00%

Publisher:

Abstract:

Under low-latitude conditions, minimization of solar radiation within the urban environment may often be a desirable criterion in urban design. The dominance of the direct component of global solar irradiance under clear, high-sun conditions requires that street solar access be kept small. It is well known that the size and proportion of open spaces have a great influence on the urban microclimate. This paper is directed towards finding the interaction between urban canyon geometry and incident solar radiation. The effect of building height and street width on the shading of the street surfaces and ground has been examined and evaluated for different orientations. The aim is to explore the extent to which these parameters affect the temperature in the street. This work is based on air and surface temperature measurements taken in different urban street canyons in El-Oued City (hot and arid climate), Algeria. In general, the results show smaller variations in air temperature than in surface temperature, which depends strongly on the street geometry and sky view factor. In other words, there is a strong correlation between street geometry, sky view factor and surface temperature.
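The dependence of sky view factor on canyon geometry can be sketched with the usual textbook approximation for the centre of the floor of an infinitely long, symmetric canyon; this generic formula is an assumption here, not the method used in the study:

```python
import numpy as np

# Sky view factor at the centre of the floor of an infinitely long, symmetric
# street canyon of height H and width W (standard idealised approximation).
def svf_canyon_centre(h_over_w):
    theta = np.arctan(2.0 * h_over_w)   # elevation angle of the wall tops
    return np.cos(theta)

for hw in (0.5, 1.0, 2.0, 4.0):
    print(f"H/W = {hw:3.1f}:  SVF ~ {svf_canyon_centre(hw):.2f}")
```

Deeper, narrower canyons give smaller sky view factors and hence less solar access to the street surfaces, which is the geometric effect the measurements above relate to surface temperature.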

Relevance:

10.00%

Publisher:

Abstract:

A fundamental principle in practical nonlinear data modeling is the parsimonious principle of constructing the minimal model that explains the training data well. Leave-one-out (LOO) cross-validation is often used to estimate generalization errors when choosing among different network architectures (M. Stone, "Cross-validatory choice and assessment of statistical predictions", J. R. Statist. Soc., Ser. B, 36, pp. 117-147, 1974). Based upon minimization of a LOO criterion, either the mean square of the LOO errors (for regression) or the LOO misclassification rate (for classification), we present two backward elimination algorithms as model post-processing procedures for regression and classification problems. The proposed backward elimination procedures exploit an orthogonalization procedure to maintain orthogonality between the subspace spanned by the pruned model and the deleted regressor. It is then shown that the LOO criteria used in both algorithms can be calculated via analytic recursive formulae, derived in this contribution, without actually splitting the estimation data set, thereby reducing the computational expense. Compared with most other model construction methods, the proposed algorithms are advantageous in several respects: (i) there are no tuning parameters to be optimized through an extra validation data set; (ii) the procedure is fully automatic, without an additional stopping criterion; and (iii) the model structure selection is based directly on model generalization performance. Illustrative examples on regression and classification demonstrate that the proposed algorithms are viable post-processing methods for pruning a model to gain extra sparsity and improved generalization.
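The core trick, computing LOO errors analytically without refitting, can be shown for a generic linear-in-the-parameters model via the leverage formula e_i^(-i) = e_i / (1 - h_ii); the synthetic data below are only for illustration, and the paper's orthogonalised recursive formulation is not reproduced:

```python
import numpy as np

# Analytic leave-one-out residuals for a linear-in-the-parameters model:
# fit once on all data, then divide each residual by (1 - leverage).
rng = np.random.default_rng(3)
n, p = 100, 5
X = rng.standard_normal((n, p))                          # regressor matrix (synthetic)
y = X @ rng.standard_normal(p) + 0.1 * rng.standard_normal(n)

beta = np.linalg.lstsq(X, y, rcond=None)[0]
residuals = y - X @ beta
leverage = np.diag(X @ np.linalg.solve(X.T @ X, X.T))    # h_ii of the hat matrix
loo_errors = residuals / (1.0 - leverage)                 # LOO errors without refitting
print("LOO mean squared error:", np.mean(loo_errors**2))
```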

Relevance:

10.00%

Publisher:

Abstract:

Implementations of incremental variational data assimilation require the iterative minimization of a series of linear least-squares cost functions. The accuracy and speed with which these linear minimization problems can be solved are determined by the condition number of the Hessian of the problem. In this study, we examine how different components of the assimilation system influence this condition number. Theoretical bounds on the condition number for a single-parameter system are presented and used to predict how the condition number is affected by the observation distribution and accuracy, and by the lengthscales specified in the background error covariance matrix. The theoretical results are verified in the Met Office variational data assimilation system, using both pseudo-observations and real data.
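The quantity being studied can be sketched for a toy one-dimensional system: the Hessian of the linearised cost function is S = B^-1 + H^T R^-1 H, and its condition number varies with the background-error lengthscale and the observation density and accuracy. The grid, correlation model and values below are illustrative, not the Met Office configuration:

```python
import numpy as np

# Condition number of the incremental variational Hessian S = B^-1 + H^T R^-1 H
# for a toy 1D grid, showing its sensitivity to the background-error lengthscale.
n, sigma_b, sigma_o = 50, 1.0, 0.5
grid = np.arange(n)
obs_idx = np.arange(0, n, 5)                              # one observation every 5 points
H = np.zeros((len(obs_idx), n)); H[np.arange(len(obs_idx)), obs_idx] = 1.0
R_inv = np.eye(len(obs_idx)) / sigma_o**2

for ell in (1.0, 5.0):
    dist = np.abs(grid[:, None] - grid[None, :])
    B = sigma_b**2 * np.exp(-dist / ell)                  # exponential correlation model
    S = np.linalg.inv(B) + H.T @ R_inv @ H                # Hessian of the cost function
    print(f"lengthscale {ell}: condition number = {np.linalg.cond(S):.1f}")
```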

Relevance:

10.00%

Publisher:

Abstract:

This study investigates the determinants of commercial and retail airport revenues as well as revenues from real estate operations. Cross-sectional OLS, 2SLS and robust regression models of European airports identify a number of significant drivers of airport revenues. Aviation revenues per passenger are mainly determined by the per-capita national income of the country in which the airport is located, the percentage of leisure travelers and the size of the airport, proxied by total aviation revenues. The main drivers of commercial revenues per passenger include the total number of passengers passing through the airport, the ratio of commercial to total revenues, the national income, the share of domestic and leisure travelers and the total number of flights. These results are in line with previous findings of a negative influence of business travelers on commercial revenues per passenger. We also find that a large amount of retail space per passenger is generally associated with lower commercial revenues per square meter, confirming decreasing marginal revenue effects. Real estate revenues per passenger are positively associated with national income per capita at the airport location, the share of intra-EU passengers and the percentage of delayed flights. Overall, aviation and non-aviation revenues appear to be strongly interlinked, underlining the potential for a comprehensive airport management strategy above and beyond mere cost minimization of the aviation sector.
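A cross-sectional OLS specification of the kind described can be sketched with statsmodels; the variable names, coefficients and data below are entirely synthetic, not the study's European airport sample, and the 2SLS and robust variants are omitted:

```python
import numpy as np
import statsmodels.api as sm

# Synthetic illustration of a cross-sectional OLS model of commercial revenue
# per passenger on a few of the drivers named above.
rng = np.random.default_rng(11)
n = 60
passengers    = rng.uniform(1.0, 60.0, n)     # annual passengers (millions)
income        = rng.uniform(15.0, 60.0, n)    # national income per capita (k EUR)
leisure_share = rng.uniform(0.2, 0.8, n)      # share of leisure travellers
commercial_rev = (2.0 + 0.05 * passengers + 0.03 * income
                  + 1.5 * leisure_share + rng.normal(0.0, 0.5, n))

X = sm.add_constant(np.column_stack([passengers, income, leisure_share]))
fit = sm.OLS(commercial_rev, X).fit()
print(fit.params)          # intercept and the three estimated coefficients
```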

Relevance:

10.00%

Publisher:

Abstract:

Markowitz showed that assets can be combined to produce an 'Efficient' portfolio that gives the highest level of portfolio return for any level of portfolio risk, as measured by the variance or standard deviation. These portfolios can then be connected to generate what is termed an 'Efficient Frontier' (EF). In this paper we discuss the calculation of the Efficient Frontier for combinations of assets, again using the spreadsheet Optimiser. To illustrate the derivation of the Efficient Frontier, we use the data from the Investment Property Databank Long Term Index of Investment Returns for the period 1971 to 1993. Many investors might require a specific level of holding, or a restriction on holdings, in at least some of the assets. Such additional constraints may be readily incorporated into the model to generate a constrained EF with upper and/or lower bounds. This can then be compared with the unconstrained EF to see whether the reduction in return is acceptable. To see the effect that these additional constraints may have, we adopt a fairly typical pension fund profile, with no more than 20% of the total held in Property. The paper shows that it is now relatively easy to use the Optimiser available in at least one spreadsheet (Excel) to calculate efficient portfolios for various levels of risk and return, both constrained and unconstrained, and so to generate any number of Efficient Frontiers.
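The same exercise can be reproduced outside a spreadsheet. The sketch below traces frontier points by minimising portfolio variance for a grid of target returns, with and without a 20% cap on Property; the expected returns and covariance matrix are invented, not the IPD 1971-1993 data:

```python
import numpy as np
from scipy.optimize import minimize

# Efficient-frontier points by minimising portfolio variance subject to full
# investment and a target return, optionally capping the Property weight.
mu = np.array([0.10, 0.08, 0.12])                    # equities, bonds, property (invented)
cov = np.array([[0.040, 0.006, 0.010],
                [0.006, 0.010, 0.004],
                [0.010, 0.004, 0.030]])

def frontier_sd(target, property_cap=1.0):
    cons = [{"type": "eq", "fun": lambda w: np.sum(w) - 1.0},
            {"type": "eq", "fun": lambda w: w @ mu - target}]
    bounds = [(0.0, 1.0), (0.0, 1.0), (0.0, property_cap)]
    res = minimize(lambda w: w @ cov @ w, x0=np.array([0.5, 0.4, 0.1]),
                   bounds=bounds, constraints=cons)
    return np.sqrt(res.fun)                          # portfolio standard deviation

for target in (0.085, 0.090, 0.095, 0.100):
    print(f"return {target:.3f}: unconstrained sd = {frontier_sd(target):.4f}, "
          f"property<=20% sd = {frontier_sd(target, property_cap=0.20):.4f}")
```

Comparing the two columns for each target return shows directly how much extra risk (or foregone return) the Property cap imposes relative to the unconstrained frontier.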