907 results for Unconstrained minimization
Abstract:
Society's growing concern about and awareness of the environment, and consequently the legislation and regulations this generates, are driving the modification of existing production processes in the chemical industry. Initial configurations must be modified to achieve greater process integration, and various methodologies have been created and developed to ease this task for those responsible for the redesign. The development of such a methodology and complementary tools is the main objective of the research presented here, centred in particular on the development and application of a process optimization methodology. This methodology is applied to existing process configurations and aims to find new configurations that are viable according to the optimization objectives set. It has two distinct parts: the first is based on a commercial process simulator and the second is the optimization technique itself. The methodology begins with the construction of a suitably validated simulation that reproduces the existing process, in this case a non-integrated paper mill producing quality coated printing paper. The optimization technique then searches the domain of possible solutions for the results that best satisfy the stated objectives. It uses genetic algorithms as its search tool, together with a subprogram based on mathematical programming techniques to compute the results. A small number of results are finally selected and used to modify the existing simulation by fixing the redistribution of the process flows; the process simulation results ultimately determine the technical feasibility of each proposed reconfiguration. In the optimization process, the objectives are defined in an objective function within the optimization technique, and this function governs the search. The objective function may comprise a single objective or a combination of objectives; in the present case it seeks to minimize both water consumption and the loss of raw material. The optimization is performed under constraints so as to reach this combined objective in the form of a compromise solution. Applying this methodology has yielded promising results, achieving greater closure of the water circuits and savings in raw material without compromising process operability or paper quality.
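The compromise search described above lends itself to a short illustration. Below is a minimal sketch of a genetic algorithm driving a stand-in process evaluation with a weighted-sum objective over water consumption and raw-material loss; the population size, weights, mutation-only scheme and the toy evaluate() function are all illustrative assumptions, not the thesis's actual simulator or encoding.

```python
# Minimal sketch of the two-part loop: a genetic algorithm proposes flow
# redistributions, a stand-in "simulator" evaluates them, and a weighted-sum
# objective combines the two minimization goals into a compromise solution.
import numpy as np

rng = np.random.default_rng(0)

def evaluate(x):
    # Stand-in for the process simulation: maps a vector of flow fractions
    # to (water consumption, raw-material loss). Purely illustrative.
    water = np.sum((x - 0.3) ** 2)
    fibre_loss = np.sum((x - 0.7) ** 2)
    return water, fibre_loss

def fitness(x, w_water=0.5, w_fibre=0.5):
    water, fibre = evaluate(x)
    return w_water * water + w_fibre * fibre  # weighted-sum compromise objective

pop = rng.random((40, 6))                      # 40 candidate flow redistributions
for gen in range(100):
    scores = np.array([fitness(x) for x in pop])
    parents = pop[np.argsort(scores)[:20]]     # truncation selection
    # Selection plus Gaussian mutation only; crossover omitted for brevity.
    children = parents[rng.integers(0, 20, 40)] + rng.normal(0, 0.05, (40, 6))
    pop = np.clip(children, 0.0, 1.0)          # keep flow fractions feasible

best = pop[np.argmin([fitness(x) for x in pop])]
print(best, fitness(best))
```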
Abstract:
The thesis provides a comprehensive description of a series of ruthenium complexes bearing polypyridylic ligands and auxiliary ligands of phosphine, dmso, nitrile or aquo type. Isomerization studies (cis/trans or meridional/facial coordination) of the mononuclear complexes are described on the basis of spectroscopic techniques, and the experimental results are corroborated by DFT calculations. The catalytic activity of the Ru-phosphine complexes in hydrogen transfer has also been studied. Dinuclear ruthenium complexes with the tetradentate ligand Hbpp have likewise been synthesized and their activity in water oxidation catalysis evaluated, establishing the importance of the correct relative orientation of the Ru=O active sites. Heterogenization of the complexes on conducting supports allows their activity to be evaluated in the heterogeneous phase, improving on the corresponding process in solution. Co-polymerization of the catalysts with metallacarborane species, which dilutes the catalyst and minimizes overoxidation, improves the results markedly and allows several reuses.
Abstract:
The aim of this thesis is the study of the different techniques for aligning three-dimensional views. This study allowed us to identify the main shortcomings of existing techniques and to contribute a novel solution that addresses some of them, particularly for real-time view alignment. In order to acquire these views, a hand-held 3D sensor was designed that permits three-dimensional acquisitions with complete freedom of movement. Global minimization techniques were also studied in order to reduce the effects of error propagation.
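As an illustration of the core step such alignment techniques rely on, here is a minimal sketch of rigid registration between two corresponded 3D point sets, the closed-form Kabsch/Procrustes solution used, for instance, as the inner step of ICP. This is only an assumed baseline, not the thesis's real-time method.

```python
# Minimal sketch of closed-form rigid alignment of two corresponded 3D point
# sets (Kabsch algorithm). P and Q are 3xN arrays with matched columns.
import numpy as np

def rigid_align(P, Q):
    """Return rotation R and translation t minimizing ||R @ P + t - Q||."""
    cP, cQ = P.mean(axis=1, keepdims=True), Q.mean(axis=1, keepdims=True)
    H = (P - cP) @ (Q - cQ).T                     # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
    R = Vt.T @ D @ U.T
    t = cQ - R @ cP
    return R, t
```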
Abstract:
The work presented here stems from the Intervention Project carried out within the 2nd Cycle Course in Special Education (cognitive and motor domain) at Universidade Lusófona de Humanidades e Tecnologias. The intervention addresses the minimization of the difficulties shown by a girl in reading, writing and socialization, from an inclusive perspective. M. is the fictitious name of the student targeted by the intervention. She currently attends the 3rd year of schooling at a public school in Lisbon, the area where she lives. The literature review supports and facilitates a clear and concise understanding of the intervention carried out and of the positions defended on this matter. It covers social and school exclusion, the inclusive school and the obstacles still found in schools, prejudice, students with special educational needs, curricular adaptations, cooperative learning and pedagogical differentiation, learning difficulties and, finally, communication and oral and written language. To gather information about M., the context of the intervention and her entire school inclusion process, the methodological tools used were documentary research, semi-directive interviews with the class teacher and the special education teacher, naturalistic observation, sociometry and field notes to complement the information. The overall intervention plan was drawn up by relating and cross-checking the data resulting from the analysis of the information collected. To ground the intervention, we first characterized M.'s school and family context and then M. herself. The guiding principles of the intervention rest on an action-research perspective and kept in view the objectives defined for M. The activities were carried out within a highly structured learning approach, continually reflected upon and evaluated throughout the process, involving all participants. This intervention led us to foster differentiated and inclusive educational practices in the class, together with the class teacher, the special education teacher and M.'s classmates.
Abstract:
An algorithm is presented for the generation of molecular models of defective graphene fragments, containing a majority of 6-membered rings with a small number of 5- and 7-membered rings as defects. The structures are generated from an initial random array of points in 2D space, which is then subjected to Delaunay triangulation. The dual of the triangulation forms a Voronoi tessellation of polygons with a range of ring sizes. An iterative cycle of refinement, involving deletion and addition of points followed by further triangulation, is performed until the user-defined criteria for the number of defects are met. The array of points and connectivities is then converted to a molecular structure and subjected to geometry optimization using a standard molecular modeling package to generate final atomic coordinates. On the basis of molecular mechanics with minimization, this automated method can generate structures that conform to user-supplied criteria and avoid the potential bias associated with the manual building of structures. One application of the algorithm is the generation of structures for the evaluation of the reactivity of different defect sites. Ab initio electronic structure calculations on a representative structure indicate preferential fluorination close to 5-ring defects.
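The tessellation step is straightforward to reproduce with standard tools. The sketch below generates random 2D points, takes the Voronoi diagram (the dual of the Delaunay triangulation) with scipy.spatial, and counts the 5-, 6- and 7-sided cells; the iterative refinement and the final geometry optimization are omitted, and the point density is an arbitrary choice.

```python
# Sketch of the tessellation step: random 2D points -> Voronoi diagram (dual
# of the Delaunay triangulation) -> ring-size statistics of the finite cells.
import numpy as np
from scipy.spatial import Voronoi

rng = np.random.default_rng(1)
points = rng.random((200, 2)) * 20.0          # initial random array of points

vor = Voronoi(points)                         # dual of the Delaunay triangulation
# Keep only finite cells (index -1 marks a vertex at infinity).
ring_sizes = [len(r) for r in vor.regions if r and -1 not in r]

n5 = sum(s == 5 for s in ring_sizes)
n6 = sum(s == 6 for s in ring_sizes)
n7 = sum(s == 7 for s in ring_sizes)
print(f"5-rings: {n5}, 6-rings: {n6}, 7-rings: {n7}")
```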
Abstract:
The frequency responses of two 50 Hz and one 400 Hz induction machines have been measured experimentally over a frequency range of 1 kHz to 400 kHz. This study has shown that the stator impedances of the machines behave in a similar manner to a parallel resonant circuit, and hence have a resonant point at which the input impedance of the machine is at a maximum. This maximum impedance point was found experimentally to be as low as 33 kHz, which is well within the switching frequency ranges of modern inverter drives. This paper investigates the possibility of exploiting the maximum impedance point of the machine, by taking it into consideration when designing an inverter, in order to minimize ripple currents due to the switching frequency. Minimization of the ripple currents would reduce torque pulsation and losses, increasing overall performance. A modified machine model was developed to take into account the resonant point, and this model was then simulated with an inverter to demonstrate the possible advantages of matching the inverter switching frequency to the resonant point. Finally, in order to experimentally verify the simulated results, a real inverter with a variable switching frequency was used to drive an induction machine. Experimental results are presented.
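The resonance behaviour can be illustrated with a minimal lumped-element sketch: an inductive stator branch in parallel with a winding capacitance, swept over the measured 1 kHz to 400 kHz range to locate the impedance maximum. The component values below are illustrative assumptions chosen to place the resonance near the 33 kHz figure quoted above, not measured machine parameters.

```python
# Minimal parallel-resonance sketch: series R-L branch in parallel with a
# winding capacitance; the frequency of maximum input impedance is found
# numerically over the measured sweep range.
import numpy as np

R, L, C = 5.0, 30e-3, 0.8e-9          # assumed resistance, inductance, capacitance
f = np.linspace(1e3, 400e3, 4000)     # 1 kHz to 400 kHz, as in the measurements
w = 2 * np.pi * f

Z_rl = R + 1j * w * L                 # inductive branch
Z_c = 1 / (1j * w * C)                # capacitive branch
Z_in = Z_rl * Z_c / (Z_rl + Z_c)      # parallel combination

f_res = f[np.argmax(np.abs(Z_in))]
print(f"maximum input impedance near {f_res/1e3:.1f} kHz")  # ~32.5 kHz here
```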
Impact of hydrographic data assimilation on the modelled Atlantic meridional overturning circulation
Abstract:
Here we make an initial step toward the development of an ocean assimilation system that can constrain the modelled Atlantic Meridional Overturning Circulation (AMOC) to support climate predictions. A detailed comparison is presented of 1° and 1/4° resolution global model simulations with and without sequential data assimilation, to the observations and transport estimates from the RAPID mooring array across 26.5° N in the Atlantic. Comparisons of modelled water properties with the observations from the merged RAPID boundary arrays demonstrate the ability of in situ data assimilation to accurately constrain the east-west density gradient between these mooring arrays. However, the presence of an unconstrained "western boundary wedge" between Abaco Island and the RAPID mooring site WB2 (16 km offshore) leads to the intensification of an erroneous southwards flow in this region when in situ data are assimilated. The result is an overly intense southward upper mid-ocean transport (0–1100 m) as compared to the estimates derived from the RAPID array. Correction of upper layer zonal density gradients is found to compensate mostly for a weak subtropical gyre circulation in the free model run (i.e. with no assimilation). Despite the important changes to the density structure and transports in the upper layer imposed by the assimilation, very little change is found in the amplitude and sub-seasonal variability of the AMOC. This shows that assimilation of upper layer density information projects mainly on the gyre circulation with little effect on the AMOC at 26° N due to the absence of corrections to density gradients below 2000 m (the maximum depth of Argo). The sensitivity to initial conditions was explored through two additional experiments using a climatological initial condition. These experiments showed that the weak bias in gyre intensity in the control simulation (without data assimilation) develops over a period of about 6 months, but does so independently from the overturning, with no change to the AMOC. However, differences in the properties and volume transport of North Atlantic Deep Water (NADW) persisted throughout the 3 year simulations resulting in a difference of 3 Sv in AMOC intensity. The persistence of these dense water anomalies and their influence on the AMOC is promising for the development of decadal forecasting capabilities. The results suggest that the deeper waters must be accurately reproduced in order to constrain the AMOC.
Abstract:
A method is discussed for imposing any desired constraint on the force field obtained in a force constant refinement calculation. The application of this method to force constant refinement calculations for the methyl halide molecules is reported. All available data on the vibration frequencies, Coriolis interaction constants and centrifugal stretching constants of CH3X and CD3X molecules were used in the refinements, but despite this apparent abundance of data it was found that constraints were necessary in order to obtain a unique solution to the force field. The results of unconstrained calculations, and of three different constrained calculations, are reported in this paper. The constrained models reported are a Urey-Bradley force field, a modified valence force field, and a constraint based on orbital-following bond-hybridization arguments developed in the following paper. The results are discussed, and compared with previous results for these molecules. The third of the above models is found to reproduce the observed data better than either of the first two, and additional reasons are given for preferring this solution to the force field for the methyl halide molecules.
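As a rough illustration of how such a constraint can be imposed, the sketch below refines a toy residual under a linear constraint by reparameterization: the force constants are written as x = x0 + N y, with the columns of N spanning the constraint's null space, so only the reduced vector y is refined. The residual function and the constraint are invented stand-ins, not the methyl halide force field.

```python
# Constrained least-squares refinement by reparameterization: only directions
# allowed by the constraint (columns of N) are refined.
import numpy as np
from scipy.optimize import least_squares

def residuals(x, data):
    # Stand-in for (calculated - observed) frequencies, Coriolis and
    # centrifugal constants as functions of the force constants x.
    return np.array([x[0] + x[1] - data[0], x[0] * x[1] - data[1], x[2] - data[2]])

data = np.array([3.0, 2.0, 0.5])
# Constraint: x[0] - x[1] = 0 (e.g. two interaction constants tied together).
x0 = np.array([1.0, 1.0, 0.0])                       # satisfies the constraint
N = np.array([[1.0, 0.0], [1.0, 0.0], [0.0, 1.0]])   # null-space basis of the constraint

fit = least_squares(lambda y: residuals(x0 + N @ y, data), x0=np.zeros(2))
x_constrained = x0 + N @ fit.x                       # refined, constraint still holds
print(x_constrained)
```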
Abstract:
The article considers screening human populations with two screening tests. If either of the two tests is positive, then full evaluation of the disease status is undertaken; however, if both diagnostic tests are negative, then disease status remains unknown. This procedure leads to a data constellation in which, for each disease status, the 2 × 2 table associated with the two diagnostic tests used in screening has exactly one empty, unknown cell. To estimate the unobserved cell counts, previous approaches assume independence of the two diagnostic tests and use specific models, including the special mixture model of Walter or unconstrained capture–recapture estimates. Often, as is also demonstrated in this article by means of a simple test, the independence of the two screening tests is not supported by the data. Two new estimators are suggested that allow for association between the screening tests, although the form of association must be assumed to be homogeneous over disease status. These estimators are modifications of the simple capture–recapture estimator and are easy to construct. The estimators are investigated for several screening studies with fully evaluated disease status, in which the new estimators can be shown to outperform the previous conventional ones. Finally, the performance of the new estimators is compared with maximum likelihood estimators, which are more difficult to obtain in these models. The results indicate that the loss of efficiency is minor.
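For concreteness, the baseline the new estimators modify can be written down directly: within one disease stratum, the table has the (negative, negative) cell empty, and under independence of the two tests it is filled in as n10·n01/n11, the simple capture–recapture estimate. The sketch below shows this with invented counts; the article's association-allowing modifications are not reproduced here.

```python
# Baseline (independence) capture-recapture fill-in for a 2x2 screening table
# with the (both-tests-negative) cell unobserved.
def capture_recapture_missing_cell(n11, n10, n01):
    """Estimate the unobserved both-negative count assuming test independence."""
    return n10 * n01 / n11

# n11: both tests positive; n10: only test 1 positive; n01: only test 2 positive.
n00_hat = capture_recapture_missing_cell(n11=50, n10=20, n01=30)
total_hat = 50 + 20 + 30 + n00_hat   # estimated stratum size including missed cases
print(n00_hat, total_hat)            # 12.0, 112.0
```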
Abstract:
Quasi-Newton-Raphson minimization and conjugate gradient minimization have been used to solve the crystal structures of famotidine form B and capsaicin from X-ray powder diffraction data and to characterize the χ² agreement surfaces. One million quasi-Newton-Raphson minimizations found the famotidine global minimum with a frequency of ca. 1 in 5000 and the capsaicin global minimum with a frequency of ca. 1 in 10 000. These results, which are corroborated by conjugate gradient minimization, demonstrate the existence of numerous pathways from some of the highest points on these χ² agreement surfaces to the respective global minima, which are passable using only downhill moves. This important observation has significant ramifications for the development of improved structure determination algorithms.
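The multi-start protocol is easy to mimic on a toy surface. The sketch below launches many BFGS (or conjugate gradient) minimizations from random starting points with SciPy and counts how often the global minimum is reached; the objective is an invented multi-minimum function, not a powder-diffraction χ² agreement surface.

```python
# Multi-start local minimization: estimate how often downhill-only moves from
# random starting points reach the global minimum.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)

def chi2(x):
    # Toy agreement surface with many local minima; global minimum at x = 0.
    return np.sum(x**2) + 2.0 * np.sum(1.0 - np.cos(3.0 * x))

hits = 0
n_starts = 1000
for _ in range(n_starts):
    x0 = rng.uniform(-4, 4, size=3)
    res = minimize(chi2, x0, method="BFGS")      # or method="CG"
    if np.linalg.norm(res.x) < 1e-3:             # reached the global minimum
        hits += 1
print(f"global-minimum frequency: {hits}/{n_starts}")
```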
Abstract:
One of the largest contributions to biologically available nitrogen comes from the reduction of N₂ to ammonia by rhizobia in symbiosis with legumes. Plants supply dicarboxylic acids as a carbon source to bacteroids, and in return they receive ammonia. However, metabolic exchange must be more complex, because effective N₂ fixation by Rhizobium leguminosarum bv viciae bacteroids requires either one of two broad-specificity amino acid ABC transporters (Aap and Bra). It was proposed that amino acids cycle between plant and bacteroids, but the model was unconstrained because of the broad solute specificity of Aap and Bra. Here, we constrain the specificity of Bra and ectopically express heterologous transporters to demonstrate that branched-chain amino acid (LIV) transport is essential for effective N₂ fixation. This dependence of bacteroids on the plant for LIV is not due to their known down-regulation of glutamate synthesis, because ectopic expression of glutamate dehydrogenase did not rescue effective N₂ fixation. Instead, the effect is specific to LIV and is accompanied by a major reduction in transcription and activity of LIV biosynthetic enzymes. Bacteroids become symbiotic auxotrophs for LIV and depend on the plant for their supply. Bacteroids with aap bra null mutations are reduced in number, smaller, and have a lower DNA content than wild type. Plants control LIV supply to bacteroids, regulating their development and persistence. This makes it a critical control point for regulation of symbiosis.
Abstract:
Under low-latitude conditions, minimization of solar radiation within the urban environment may often be a desirable criterion in urban design. The dominance of the direct component of the global solar irradiance under clear, high-sun conditions requires that street solar access be small. It is well known that the size and proportion of open spaces have a great influence on the urban microclimate. This paper is directed towards finding the interaction between urban canyon geometry and incident solar radiation. The effect of building height and street width on the shading of the street surfaces and ground has been examined and evaluated for different orientations, with the aim of exploring the extent to which these parameters affect the temperature in the street. This work is based on air and surface temperature measurements taken in different urban street canyons in EL-Oued City (hot and arid climate), Algeria. In general, the results show smaller variations in air temperature than in surface temperature, which depends strongly on the street geometry and sky view factor; in other words, there is a strong correlation between street geometry, sky view factor and surface temperatures.
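A worked illustration of the geometry involved: for an infinitely long symmetric canyon, a standard approximation gives the sky view factor at the centre of the floor as SVF = cos(arctan(H/(W/2))), so deeper canyons (larger H/W) see less sky. The sketch below evaluates this for a few invented geometries; it does not reproduce the measured canyons of EL-Oued.

```python
# Sky view factor at the mid-point of an infinitely long symmetric street
# canyon, from the standard infinite-canyon approximation.
import math

def canyon_svf(height, width):
    """SVF at the centre of the canyon floor: cos(arctan(H / (W/2)))."""
    return math.cos(math.atan(height / (width / 2.0)))

for H, W in [(6.0, 12.0), (9.0, 6.0), (12.0, 4.0)]:   # illustrative geometries
    print(f"H/W = {H/W:.1f}  SVF = {canyon_svf(H, W):.2f}")
# Deeper canyons (larger H/W) yield smaller SVF, hence less sky and less sun.
```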
Abstract:
A fundamental principle in practical nonlinear data modeling is the parsimonious principle of constructing the minimal model that explains the training data well. Leave-one-out (LOO) cross validation is often used to estimate generalization errors when choosing amongst different network architectures (M. Stone, "Cross validatory choice and assessment of statistical predictions", J. R. Statist. Soc., Ser. B, 36, pp. 117-147, 1974). Based upon the minimization of an LOO criterion, the mean square of the LOO errors for regression or the LOO misclassification rate for classification, we present two backward elimination algorithms as model post-processing procedures for regression and classification problems. The proposed backward elimination procedures exploit an orthogonalization procedure to enforce orthogonality between the subspace spanned by the pruned model and the deleted regressor. Subsequently, it is shown that the LOO criteria used in both algorithms can be calculated via an analytic recursive formula, as derived in this contribution, without actually splitting the estimation data set, thereby reducing computational expense. Compared to most other model construction methods, the proposed algorithms are advantageous in several aspects: (i) there are no tuning parameters to be optimized through an extra validation data set; (ii) the procedure is fully automatic, without an additional stopping criterion; and (iii) the model structure selection is directly based on model generalization performance. Illustrative examples on regression and classification demonstrate that the proposed algorithms are viable post-processing methods to prune a model, gaining extra sparsity and improved generalization.
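The key computational trick, evaluating the LOO criterion without actually splitting the data, can be illustrated for a plain linear-in-the-parameters model: the LOO residuals follow analytically from the ordinary residuals and the leverages, e_loo,i = e_i / (1 - h_ii). The sketch below shows this identity in use on synthetic data; the paper's orthogonalized backward-elimination recursion itself is not reproduced.

```python
# Analytic leave-one-out errors for a linear-in-the-parameters model:
# e_loo_i = e_i / (1 - h_ii), where h_ii are the diagonals of the hat matrix.
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(100, 5))                    # regressor matrix
y = X @ np.array([1.0, -2.0, 0.0, 0.5, 0.0]) + 0.1 * rng.normal(size=100)

beta = np.linalg.lstsq(X, y, rcond=None)[0]
e = y - X @ beta                                 # ordinary residuals
H = X @ np.linalg.solve(X.T @ X, X.T)            # hat matrix X (X'X)^{-1} X'
loo_errors = e / (1.0 - np.diag(H))              # LOO errors with no refitting
loo_mse = np.mean(loo_errors**2)                 # LOO criterion in one pass
print(loo_mse)
```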
Conditioning of incremental variational data assimilation, with application to the Met Office system
Abstract:
Implementations of incremental variational data assimilation require the iterative minimization of a series of linear least-squares cost functions. The accuracy and speed with which these linear minimization problems can be solved is determined by the condition number of the Hessian of the problem. In this study, we examine how different components of the assimilation system influence this condition number. Theoretical bounds on the condition number for a single parameter system are presented and used to predict how the condition number is affected by the observation distribution and accuracy and by the specified lengthscales in the background error covariance matrix. The theoretical results are verified in the Met Office variational data assimilation system, using both pseudo-observations and real data.
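The object of study can be made concrete with a small numerical sketch: for a one-dimensional grid with a first-order auto-regressive background error correlation of lengthscale L and direct observations of error variance σo², the Hessian of the linear cost function is S = B⁻¹ + HᵀR⁻¹H, and its condition number can be computed directly. Grid size, lengthscale, observation spacing and variances below are illustrative assumptions, not Met Office settings.

```python
# Condition number of the variational Hessian S = B^{-1} + H^T R^{-1} H for a
# toy 1D single-parameter system with correlated background errors.
import numpy as np

n, L, sigma_b, sigma_o = 100, 5.0, 1.0, 0.5
idx = np.arange(n)
# First-order auto-regressive (Markov) background error correlations.
B = sigma_b**2 * np.exp(-np.abs(idx[:, None] - idx[None, :]) / L)

H = np.eye(n)[::4]                       # direct observations every 4th grid point
R = sigma_o**2 * np.eye(H.shape[0])      # uncorrelated observation errors

S = np.linalg.inv(B) + H.T @ np.linalg.inv(R) @ H   # Hessian of the linear cost
print(f"condition number: {np.linalg.cond(S):.3e}")
# Increasing L or decreasing sigma_o raises the condition number and slows the
# iterative minimization, consistent with the bounds discussed above.
```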