951 results for Kuhn-Tucker type necessary optimality conditions
Abstract:
Doctoral thesis in Industrial and Systems Engineering (PDEIS)
Abstract:
Moral values influence individual behavior and social interactions. An especially significant instance is the case of moral values concerning work effort. Individuals determine what they take to be proper behavior and judge others, and themselves, accordingly. They increase their esteem, and self-esteem, for those who perform in excess of the standard and decrease their esteem for those who work less. These changes in self-esteem result from the self-regulatory emotions of guilt or pride extensively studied in Social Psychology. We examine the interactions between sentiments, individual behavior and the social contract in a model of rational voting over redistribution where individual self-esteem and relative esteem for others are endogenously determined. Individuals differ in their productivities. The desired extent of redistribution depends both on individual income and on individual attitudes toward others. We characterize the politico-economic equilibria in which sentiments, labor supply and redistribution are simultaneously determined. The model has two types of equilibria. In "cohesive" equilibria, all individuals conform to the standard of proper behavior, income inequality is low and social esteem is not biased toward any particular type. Under these conditions equilibrium redistribution increases in response to larger inequality. In a "clustered" equilibrium, skilled workers work above the mean while unskilled workers work below it. In such an equilibrium, income inequality is large and sentiments are biased in favor of the industrious. As inequality increases, this bias may eventually overtake the egoistic demand for greater taxation, and equilibrium redistribution decreases. The type of equilibrium that emerges depends crucially on inequality. We contrast the predictions of the model with data on inequality, redistribution, work values and attitudes toward work and toward the poor for a set of OECD countries.
Abstract:
We study the equidistribution of Fekete points in a compact complex manifold. These are extremal point configurations defined through sections of powers of a positive line bundle. Their equidistribution is a known result. The novelty of our approach is that we relate them to the problem of sampling and interpolation on line bundles, which allows us to estimate the equidistribution of the Fekete points quantitatively. In particular, we estimate the Kantorovich-Wasserstein distance of the Fekete points to their limiting measure. The sampling and interpolation arrays on line bundles are a subject of independent interest, and we provide necessary density conditions through the classical approach of Landau, which in this context measures the local dimension of the space of sections of the line bundle. We obtain a complete geometric characterization of sampling and interpolation arrays in the case of compact manifolds of dimension one, and we prove that there are no arrays of both sampling and interpolation in the more general setting of semipositive line bundles.
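For reference, and with generic notation rather than the exact normalization used in the work, the order-one Kantorovich-Wasserstein distance between two probability measures \(\mu\) and \(\nu\) on the manifold \(M\) is
\[
W_1(\mu,\nu) \;=\; \inf_{\gamma \in \Pi(\mu,\nu)} \int_{M \times M} d(x,y)\, d\gamma(x,y),
\]
where \(\Pi(\mu,\nu)\) is the set of couplings of \(\mu\) and \(\nu\) and \(d\) is the distance on \(M\). A quantitative equidistribution result of the kind described above bounds this distance between the empirical measure of the Fekete points and their limiting measure.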
Abstract:
A key issue in the implementation of the Water Framework Directive is the classification of streams and rivers using biological quality parameters and type-specific reference conditions. Four groups of stream types were defined in NE Spain on the basis of 152 diatom samples by means of detrended correspondence analysis and classification techniques. Diatom analysis was restricted to epilithic taxa, and the sites included gradients ranging from near-natural streams to sites with poor ecological quality. The main gradient shows a clear separation of sites in relation to the degree of human influence: polluted streams (mainly located in the lowlands) differ from streams in mountainous areas and in the Pyrenees. A second gradient is related to physiographical features. Headwater streams can be distinguished by their catchment geology. The type-specific diatom taxa for the stream types studied were determined by using indicator species analysis (IndVal). The type-specific taxa from near-natural streams coincide with the indicator taxa for high ecological status. Human impact reduced the typological heterogeneity of the diatom community composition. Overall, the diatom communities in NE Spain exhibit a regional distribution pattern that closely corresponds with that observed in river systems elsewhere. Physiographical differences are only evident in undisturbed sites, while nutrient enrichment and other human disturbances may mask the regional differences in the distribution of diatom communities.
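The indicator species analysis mentioned above (IndVal, in the sense of Dufrêne and Legendre) combines a specificity term and a fidelity term. A minimal sketch of that computation is given below, assuming a site-by-taxon abundance matrix and a vector of stream-type labels; the variable names are illustrative and not taken from the paper.

    import numpy as np

    def indval(abundance, groups):
        """Indicator value per (group, taxon): specificity * fidelity * 100."""
        abundance = np.asarray(abundance, dtype=float)   # sites x taxa
        groups = np.asarray(groups)
        labels = np.unique(groups)
        group_means = np.array([abundance[groups == g].mean(axis=0) for g in labels])
        scores = np.zeros_like(group_means)
        total_means = group_means.sum(axis=0)
        for gi, g in enumerate(labels):
            in_g = groups == g
            # specificity A: mean abundance in this group relative to all groups
            A = np.divide(group_means[gi], total_means,
                          out=np.zeros_like(total_means), where=total_means > 0)
            # fidelity B: fraction of sites in this group where the taxon occurs
            B = (abundance[in_g] > 0).mean(axis=0)
            scores[gi] = A * B * 100.0
        return labels, scores   # taxa with high scores for a group are its type-specific taxa

Taxa scoring highest for the near-natural stream types would then correspond to the indicator taxa for high ecological status discussed in the abstract.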
Abstract:
We present a new unifying framework for investigating throughput-WIP (Work-in-Process) optimal control problems in queueing systems, based on reformulating them as linear programming (LP) problems with special structure: we show that if a throughput-WIP performance pair in a stochastic system satisfies the Threshold Property we introduce in this paper, then we can reformulate the problem of optimizing a linear objective of throughput-WIP performance as a (semi-infinite) LP problem over a polygon with special structure (a threshold polygon). The strong structural properties of such polygons explain the optimality of threshold policies for optimizing linear performance objectives: their vertices correspond to the performance pairs of threshold policies. We analyze in this framework the versatile input-output queueing intensity control model introduced by Chen and Yao (1990), obtaining a variety of new results, including (a) an exact reformulation of the control problem as an LP problem over a threshold polygon; (b) an analytical characterization of the Min WIP function (giving the minimum WIP level required to attain a target throughput level); (c) an LP Value Decomposition Theorem that relates the objective value under an arbitrary policy with that of a given threshold policy (thus revealing the LP interpretation of Chen and Yao's optimality conditions); (d) diminishing returns and invariance properties of throughput-WIP performance, which underlie threshold optimality; (e) a unified treatment of the time-discounted and time-average cases.
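In schematic form, with notation introduced here rather than taken from the paper, the reformulation optimizes a linear throughput-WIP objective over the achievable region:
\[
\max \; \{\, r\,\theta - h\,w \;:\; (\theta, w) \in P \,\},
\]
where \(\theta\) is the throughput, \(w\) the time-average WIP, \(r\) and \(h\) reward and holding-cost rates, and \(P\) the threshold polygon whose vertices \((\theta_k, w_k)\) are the performance pairs of the threshold policies. Since a linear objective over such a polygon attains its optimum at a vertex, some threshold policy is optimal, which is the structural explanation the abstract refers to.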
Abstract:
Rat hindlimb muscles constitutively express the inducible heat shock protein 72 (Hsp70), apparently in proportion to the slow myosin content. Since it remains controversial whether chronic Hsp70 expression reflects the overimposed stress, we investigated Hsp70 cellular distribution in fast muscles of the posterior rat hindlimb after (1) mild exercise training (up to 30 m/min treadmill running for 1 h/day), which induces a remodeling of fast fiber composition, or (2) prolonged exposure to normobaric hypoxia (10% O(2)), which does not affect fiber-type composition. Both conditions significantly increased Hsp70 protein levels in the skeletal muscle. Immunohistochemistry showed labeling for Hsp70 in subsets of both slow/type 1 and fast/type 2A myofibers of control, sedentary, and normoxic rats. Endurance training increased the percentage of Hsp70-positive myofibers about threefold (P < 0.001) and changed the distribution of Hsp70 immunoreactivity, which involved a larger subset of both type 2A and intermediate type 2A/2X myofibers (P < 0.001) and vascular smooth muscle cells. Hypoxia induced Hsp70 immunoreactivity in smooth muscle cells of veins and did not increase the percentage of Hsp70-positive myofibers; however, sustained exposure to hypoxia affected the distribution of Hsp70 immunoreactivity, which appeared detectable in a very small subset of type 2A fibers, whereas it concentrated in type 1 myofibers (P < 0.05) together with labeling for heme oxygenase isoform 1, a marker of oxidative stress. Therefore, the chronic induction of Hsp70 expression in rat skeletal muscles is not obligatorily related to the slow fiber phenotype but reveals the occurrence of a stress response.
Abstract:
Floods and the risk of dam overtopping, particularly for earth embankment dams, during heavy precipitation have long been a concern for the authorities and the population. Studies carried out in recent years have shown that global warming has been accompanied by an increase in the frequency of heavy precipitation and floods in Switzerland and in many regions of the globe during the 20th century. Global and regional climate models predict that the frequency of heavy precipitation should continue to increase during the 21st century in Switzerland and worldwide. This makes current research on fine-scale modeling of rainfall and floods even more important. In Switzerland, to ensure adequate human and economic protection, maps of probable maximum precipitation (PMP) have been produced. The PMP values were compared with the extreme precipitation measured in the different regions of the country. These PMPs are then used by hydrological models to compute probable maximum floods (PMF). This PMP-PMF method nevertheless requires a number of precautions. If it is applied incorrectly or on the basis of insufficient data, it can lead to an overestimation of flood discharges, particularly for large catchments and for mountainous regions, resulting in significant extra costs. These problems arise notably from the fact that most hydrological models distribute the extreme precipitation (PMP) uniformly in time over the entire catchment. To remedy this problem, the main objective of this thesis is to develop a distributed hydrological model, called MPF (Modeling Precipitation Flood), capable of estimating the PMF realistically from a PMP distributed in space and time by means of clouds. The MPF model comprises three main parts. In the first part, the extreme precipitation computed by a mesoscale meteorological model with a horizontal resolution of 2 km is distributed at a local scale (25 or 50 m) non-uniformly in space and time. The second part concerns the modeling of surface and subsurface water flow, including infiltration and exfiltration. The third part covers snowmelt modeling, based on a heat-transfer calculation. The MPF model was calibrated on Alpine catchments where precipitation and discharge data are available over a considerably long period that includes several heavy-rainfall episodes with high discharges. From these episodes, model input parameters such as soil roughness and the mean width of watercourses for surface flow could be estimated. Following the same procedure, the parameters used in the simulation of subsurface flow were also estimated indirectly, since direct measurements of subsurface flow and exfiltration are difficult to obtain. The spatio-temporal rainfall distribution model was also validated using radar images of the rainfall structure produced by a supercell cloud. The hyetographs obtained at several points of the terrain are very close to those recorded in the radar images.
The results of validating the model on heavy-flood episodes show good synchronization between the simulated and observed discharges. This agreement was measured with three efficiency criteria, all of which gave satisfactory values. This shows that the model developed is valid and can be used for extreme episodes such as the PMP. Simulations were carried out on several catchments using PMP-type rainfall as input. Varied conditions were used, such as saturated or unsaturated soil, or the presence of a snow cover on the terrain at the time of the PMP, leading to PMF estimates for catastrophic scenarios. Finally, the results obtained show how to better estimate the safety flood of dams from an extreme ten-thousand-year rainfall, i.e. one with a return period of 10,000 years.
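As a side note on the last point, a T-year rainfall quantile is commonly obtained from a fitted extreme-value distribution. The sketch below assumes a Gumbel (EV1) fit with illustrative location and scale parameters, which are not values from the thesis, and is only meant to show how a 10,000-year design rainfall would be read off such a fit.

    import math

    def gumbel_quantile(mu, beta, T):
        """Rainfall depth with return period T years under a Gumbel (EV1) fit."""
        p = 1.0 - 1.0 / T                      # non-exceedance probability
        return mu - beta * math.log(-math.log(p))

    # e.g. a 10,000-year daily rainfall from illustrative parameters (mm)
    print(gumbel_quantile(mu=80.0, beta=25.0, T=10_000))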
Abstract:
Because paperboard is hygroscopic, its moisture content depends on the temperature and relative humidity of the surrounding air. Excessive moisture in the board causes a loss of stiffness, makes the board tough, and leads to bulging packages at the consumer. Moisture problems arise during storage and when a cold roll is handled in warm production facilities. A roll should be kept in a tight wrapper so that it is protected from varying ambient conditions. If the wrapper is not tight and the surrounding air comes into contact with the board during cold storage, the board moisture rises above 7% within a couple of days at an air temperature of 15 °C and a relative humidity of 95%. The aim of the literature part was to survey the factors that affect the ability of fibres to absorb moisture from the surrounding air. The experimental part examined whether the increase in board moisture was caused by an increase in the equilibrium moisture content. The focus was on studying moisture changes during storage and converting, and during the use of the packaged product. Moisture penetration into the roll was determined by simulation. The water-vapour barrier properties of the wrapping materials and the effect of wrapping on the conditions inside the roll package were investigated. The equilibrium moisture content of the liquid packaging board had not changed. Practical storage tests showed that the topmost roll of a stack was the most exposed to changes in ambient air conditions. A moisture difference between the edges of this roll was also observed. The shorter the storage time after coating of the board, the better the problems caused by moisture uptake in the board are prevented. The maximum storage time for rolls is 1.5 months, at which point the board moisture was 6.5%, and about 6.7% in the topmost roll of the stack. Milk cartons made from board stored for a long time had the highest moisture content, the largest bulge and the lowest grip stiffness 10 days after filling. When the storage time is long, the board becomes tough and extensible, and it binds more moisture than it did before entering storage. Skiving reduces moisture penetration into the base board, so that stiffness is retained and package bulging is reduced.
Abstract:
The objective of the thesis was to examine the possibilities of designing better-performing nozzles for the heatset drying oven at Forest Pilot Center. To achieve the objective, two predesigned nozzle types, along with replicas of the current nozzles in the heatset drying oven, were tested on a pilot-scale dryer. During the runnability trials, the pilot dryer was installed between the last printing unit and the drying oven. The two sets of predesigned nozzles were consecutively installed in the dryer. Four web tension values and four different impingement air velocities were used, and the web behavior during the trial points was evaluated and recorded. The runnability in all trial conditions was adequate or even good. During the heat transfer trials, each nozzle type was tested at at least two different nozzle-to-surface distances and four different impingement air velocities. In a test situation, an aluminum plate fitted with thermocouples was set below a nozzle and the temperature of each block was logged. From the measurements, a heat transfer coefficient profile for the nozzle was calculated. The performance of each nozzle type in the tested conditions could then be rated and compared. The results verified that the predesigned, simpler nozzles were better than the replicas. For runnability reasons, there were rows of inclined orifices on the leading and trailing edges of the current nozzles. They were believed to deteriorate the overall performance of the nozzle, and trials were conducted to test this hypothesis. The perpendicular orifices and inclined orifices of a replica nozzle were consecutively taped shut, and the performance of the modified nozzles was measured as before and then compared to the performance of the whole nozzle. It was found that beyond a certain nozzle-to-surface distance the jets from the two nozzles collide, which deteriorates the heat transfer.
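A common way to turn such transient block temperatures into a heat transfer coefficient is a lumped-capacitance fit for each thermocouple block. The sketch below assumes that approach with illustrative aluminum properties and block thickness; it is not the exact data reduction used in the thesis.

    import numpy as np

    def htc_lumped(times, temps, t_jet, rho=2700.0, cp=900.0, thickness=0.005):
        """Estimate h [W/m2K] from one block's temperature history under an impinging jet.
        Lumped capacitance: ln((T - T_jet)/(T0 - T_jet)) = -(h / (rho*cp*thickness)) * t."""
        times = np.asarray(times, dtype=float)
        temps = np.asarray(temps, dtype=float)
        theta = (temps - t_jet) / (temps[0] - t_jet)      # dimensionless temperature
        slope = np.polyfit(times, np.log(theta), 1)[0]    # linear fit of ln(theta) vs t
        return -slope * rho * cp * thickness

    # illustrative use: a block heating from 25 C toward a 120 C jet, h = 80 W/m2K
    t = np.linspace(0, 60, 31)
    T = 120.0 + (25.0 - 120.0) * np.exp(-80.0 * t / (2700.0 * 900.0 * 0.005))
    print(htc_lumped(t, T, t_jet=120.0))   # recovers roughly 80

Repeating the fit block by block along the plate gives the heat transfer coefficient profile that the nozzles were compared on.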
Abstract:
PbO2 films were electroformed onto carbon cloth substrates (twill weave type) under acidic conditions from the nitrate precursor, varying the electrodeposition current density, temperature and pH in order to optimize the formation of the β-PbO2 phase. The crystal structure and morphology of the PbO2 films were investigated using X-ray diffraction (XRD) and scanning electron microscopy (SEM) techniques. The optimum conditions obtained for the formation of the β-PbO2 phase are presented and discussed.
Abstract:
A novel technique for selecting the poles of orthonormal basis functions (OBF) in Volterra models of any order is presented. It is well-known that the usual large number of parameters required to describe the Volterra kernels can be significantly reduced by representing each kernel using an appropriate basis of orthonormal functions. Such a representation results in the so-called OBF Volterra model, which has a Wiener structure consisting of a linear dynamic generated by the orthonormal basis followed by a nonlinear static mapping given by the Volterra polynomial series. Aiming at optimizing the poles that fully parameterize the orthonormal bases, the exact gradients of the outputs of the orthonormal filters with respect to their poles are computed analytically by using a back-propagation-through-time technique. The expressions relative to the Kautz basis and to generalized orthonormal bases of functions (GOBF) are addressed; the ones related to the Laguerre basis follow straightforwardly as a particular case. The main innovation here is that the dynamic nature of the OBF filters is fully considered in the gradient computations. These gradients provide exact search directions for optimizing the poles of a given orthonormal basis. Such search directions can, in turn, be used as part of an optimization procedure to locate the minimum of a cost-function that takes into account the error of estimation of the system output. The Levenberg-Marquardt algorithm is adopted here as the optimization procedure. Unlike previous related work, the proposed approach relies solely on input-output data measured from the system to be modeled, i.e., no information about the Volterra kernels is required. Examples are presented to illustrate the application of this approach to the modeling of dynamic systems, including a real magnetic levitation system with nonlinear oscillatory behavior.
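To make the orthonormal-basis idea concrete, a minimal sketch of a discrete Laguerre filter bank with pole a, followed by a second-order Volterra (Wiener-type) model fitted by least squares, is shown below. The pole optimization via exact back-propagated gradients described in the paper is not reproduced here, and all function and variable names are illustrative; only scipy.signal.lfilter and NumPy are assumed.

    import numpy as np
    from scipy.signal import lfilter

    def laguerre_outputs(u, a, n_funcs):
        """Filter input u through the first n_funcs discrete Laguerre filters with pole a (|a| < 1)."""
        phi = []
        x = lfilter([np.sqrt(1.0 - a**2)], [1.0, -a], u)   # first Laguerre filter (up to a delay)
        phi.append(x)
        for _ in range(1, n_funcs):
            x = lfilter([-a, 1.0], [1.0, -a], x)           # cascade the all-pass factor
            phi.append(x)
        return np.column_stack(phi)                        # N x n_funcs matrix of filter outputs

    def fit_second_order_volterra(u, y, a=0.6, n_funcs=4):
        """Least-squares fit of y from linear and quadratic terms in the Laguerre filter outputs."""
        phi = laguerre_outputs(np.asarray(u, float), a, n_funcs)
        quad = np.column_stack([phi[:, i] * phi[:, j]
                                for i in range(n_funcs) for j in range(i, n_funcs)])
        X = np.column_stack([np.ones(len(u)), phi, quad])  # static polynomial (Wiener) map
        theta, *_ = np.linalg.lstsq(X, np.asarray(y, float), rcond=None)
        return theta, X @ theta                            # parameters and fitted output

The linear dynamic part (the filter bank) followed by the static polynomial map mirrors the Wiener structure of the OBF Volterra model described above; in the paper the pole a would be tuned with the exact gradients and the Levenberg-Marquardt algorithm rather than fixed as here.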
Abstract:
Optimization methods that employ the classical Powell-Hestenes-Rockafellar augmented Lagrangian are useful tools for solving nonlinear programming problems. Their reputation decreased over the last 10 years due to the comparative success of interior-point Newtonian algorithms, which are asymptotically faster. In this research, a combination of both approaches is evaluated. The idea is to produce a competitive method that is more robust and efficient than its "pure" counterparts for critical problems. Moreover, an additional hybrid algorithm is defined, in which the interior-point method is replaced by the Newtonian resolution of a Karush-Kuhn-Tucker (KKT) system identified by the augmented Lagrangian algorithm. The software used in this work is freely available through the Tango Project web page: http://www.ime.usp.br/~egbirgin/tango/.
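For orientation, a minimal sketch of the classical Powell-Hestenes-Rockafellar augmented Lagrangian outer loop for equality constraints is given below; scipy.optimize.minimize plays the role of the inner subproblem solver, and this is a toy illustration rather than the ALGENCAN/Tango implementation referenced above.

    import numpy as np
    from scipy.optimize import minimize

    def phr_augmented_lagrangian(f, h, x0, rho=10.0, outer_iters=20, tol=1e-8):
        """Minimize f(x) subject to h(x) = 0 with the PHR augmented Lagrangian
        L_rho(x, lam) = f(x) + lam^T h(x) + (rho/2) * ||h(x)||^2."""
        x = np.asarray(x0, dtype=float)
        lam = np.zeros(len(h(x)))
        for _ in range(outer_iters):
            aug = lambda z: f(z) + lam @ h(z) + 0.5 * rho * np.dot(h(z), h(z))
            x = minimize(aug, x, method="BFGS").x          # inner (unconstrained) subproblem
            hx = h(x)
            if np.linalg.norm(hx) < tol:
                break
            lam = lam + rho * hx                           # first-order multiplier update
            rho *= 2.0                                     # tighten the penalty while infeasible
        return x, lam

    # toy example: minimize x1^2 + x2^2 subject to x1 + x2 - 1 = 0 (solution (0.5, 0.5))
    f = lambda x: x[0]**2 + x[1]**2
    h = lambda x: np.array([x[0] + x[1] - 1.0])
    print(phr_augmented_lagrangian(f, h, x0=[0.0, 0.0])[0])

In the hybrid scheme described in the abstract, a KKT system assembled from the multipliers identified by such a loop would be handed to a Newtonian (or interior-point) solver for fast local convergence.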
Abstract:
With this preliminary work, as well as with those that will follow it in the composition of a mathematics book for economists, the authors aim to record their experience over recent years teaching mathematics courses in the graduate economics programs of Fundação Getúlio Vargas, UFF (Universidade Federal Fluminense) and PUC-RJ. The discussion of which topics to cover, to what depth, and in what order recurs constantly in such courses. It is in this sense that the authors hope, with the didactic sequence that begins here, to make some contribution to the subject. The theoretical part concerning the proof of the Kuhn-Tucker theorem presented here transcribes, with the author's consent, selected passages from "Análise Convexa do Rn" by Mario Henrique Simonsen.
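For reference, the Kuhn-Tucker (KKT) first-order conditions treated in that theoretical part can be stated in a standard form, under a suitable constraint qualification: for the problem
\[
\max_{x} \; f(x) \quad \text{subject to} \quad g_i(x) \le 0, \; i = 1, \dots, m,
\]
if \(x^{*}\) is a local maximizer, then there exist multipliers \(\lambda_i \ge 0\) such that
\[
\nabla f(x^{*}) \;=\; \sum_{i=1}^{m} \lambda_i \,\nabla g_i(x^{*}), \qquad \lambda_i\, g_i(x^{*}) = 0 \quad (i = 1,\dots,m).
\]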
Abstract:
Organizations have adopted a discourse focused on Knowledge Management and knowledge sharing that reveals the relevance of the knowledge age and of the capture and transfer of individual and collective knowledge. Through its continued repetition this discourse has been legitimated, without establishing, however, what can be shared, which methods should enable it, and which personal and corporate motives can stimulate knowledge sharing. The discourse tries to convince that implementation depends exclusively on actions promoted by the firm. It also assumes the presence of the environmental conditions necessary to make knowledge sharing possible. The discourse leads one to believe that individuals and teams, formal or informal, are prepared and willing to share their knowledge, without considering their feelings, wishes or availability to do so. The research revealed the difficulty of sharing knowledge within service companies that are strongly results-oriented, owing to competition among employees for better positions, the struggle to retain power and the lack of time beyond daily tasks. Such organizations deprive people of the desire and the availability to share knowledge.
Abstract:
The objective of the present study is to test the plausibility of the thesis that a constitutional mutation has occurred in the Brazilian system of judicial review, specifically regarding the competence conferred by the Constitution on the Federal Senate to suspend the effectiveness of a law or normative act declared unconstitutional under diffuse constitutional review. This argument was employed by Justice Gilmar Mendes of the Supremo Tribunal Federal in the judgment of Reclamação 4335-5/AC. To analyze the consistency of this thesis, (a) the conditions necessary for the plausibility of a constitutional-mutation argument in the Brazilian legal order were discussed, and (b) an empirical analysis of the role of the Federal Senate under Article 52, X of the Constitution was carried out. After collecting the data and applying the concept of constitutional mutation to them, it was concluded that Justice Gilmar Mendes's thesis defending the recognition of an "authentic constitutional mutation" of Article 52, X of the Federal Constitution does not stand. As a result, it is argued that this type of argument cannot be made plausible solely on the basis of case law, doctrine and typically constitutional argumentation; an empirical basis is needed to give more solidity and consistency to any constitutional-mutation argument in the Brazilian legal order.