960 results for Monopile foundations


Relevance:

10.00%

Publisher:

Abstract:

This thesis examines foundational questions in behavioral economics (also called psychology and economics) and the neural foundations of varied sources of utility. We have three primary aims: first, to provide the field of behavioral economics with psychological theories of behavior that are derived from neuroscience, and to use those theories to identify novel evidence for behavioral biases; second, to provide neural and micro foundations for behavioral preferences that give rise to well-documented empirical phenomena in behavioral economics; and third, to show how a deep understanding of the neural foundations of these behavioral preferences can feed back into our theories of social preferences and reference-dependent utility.

The first chapter focuses on classical conditioning and its application in identifying the psychological underpinnings of a pricing phenomenon. We return to classical conditioning again in the third chapter where we use fMRI to identify varied sources of utility—here, reference dependent versus direct utility—and cross-validate our interpretation with a conditioning experiment. The second chapter engages social preferences and, more broadly, causative utility (wherein the decision-maker derives utility from making or avoiding particular choices).

Abstract:

The dissertation is concerned with the mathematical study of various network problems. First, three real-world networks are considered: (i) the human brain network, (ii) communication networks, and (iii) electric power networks. Although these networks perform very different tasks, they share similar mathematical foundations. The high-level goal is to analyze and/or synthesize each of these systems from a "control and optimization" point of view. After studying these three real-world networks, two abstract network problems, both motivated by power systems, are also explored. The first is "flow optimization over a flow network" and the second is "nonlinear optimization over a generalized weighted graph". The results derived in this dissertation are summarized below.

Brain Networks: Neuroimaging data reveals the coordinated activity of spatially distinct brain regions, which may be represented mathematically as a network of nodes (brain regions) and links (interdependencies). To obtain the brain connectivity network, the graphs associated with the correlation matrix and the inverse covariance matrix—describing marginal and conditional dependencies between brain regions—have been proposed in the literature. A question arises as to whether any of these graphs provides useful information about brain connectivity. Due to the electrical properties of the brain, this problem will be investigated in the context of electrical circuits. First, we consider an electric circuit model and show that the inverse covariance matrix of the node voltages reveals the topology of the circuit. Second, we study the problem of recovering the topology of the circuit from measurements alone: assuming the circuit is hidden inside a black box and only the nodal signals are available for measurement, the aim is to find its topology from a limited number of samples. For this purpose, we deploy the graphical lasso technique to estimate a sparse inverse covariance matrix. It is shown that the graphical lasso may find most of the circuit topology if the exact covariance matrix is well-conditioned, but it may fail to work well when this matrix is ill-conditioned. To deal with ill-conditioned matrices, we propose a small modification to the graphical lasso algorithm and demonstrate its performance. Finally, the technique developed in this work will be applied to the resting-state fMRI data of a number of healthy subjects.
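The premise that conditional dependencies (and hence a graph topology) live in the inverse covariance matrix can be illustrated with a small Gaussian simulation. This is a generic sketch (the chain graph, sample size, and threshold are ours), not the thesis's circuit model or its modified graphical lasso:

```python
import numpy as np

rng = np.random.default_rng(0)

# A chain graph 0-1-2-3 encoded in a sparse precision (inverse covariance)
# matrix: off-diagonal entries are nonzero exactly on the edges.
theta = np.array([[ 2.0, -0.8,  0.0,  0.0],
                  [-0.8,  2.0, -0.8,  0.0],
                  [ 0.0, -0.8,  2.0, -0.8],
                  [ 0.0,  0.0, -0.8,  2.0]])

# Sample node signals from the corresponding zero-mean Gaussian.
x = rng.multivariate_normal(np.zeros(4), np.linalg.inv(theta), size=200_000)

# Inverting the empirical covariance recovers the sparsity pattern:
# entries are large on the chain's edges and near zero elsewhere.
theta_hat = np.linalg.inv(np.cov(x.T))
edges = np.abs(theta_hat) > 0.3
print(edges.astype(int))
```

With far fewer samples, or an ill-conditioned covariance, the plain inverse becomes unreliable; that is precisely the regime in which the graphical lasso, and the modification proposed in the dissertation, matter.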

Communication Networks: Congestion control techniques aim to adjust the transmission rates of competing users in the Internet in such a way that the network resources are shared efficiently. Despite the progress in the analysis and synthesis of Internet congestion control, almost all existing fluid models of congestion control assume that every link in the path of a flow observes the original source rate. To address this issue, a more accurate model is derived in this work for the behavior of the network under an arbitrary congestion controller, which takes into account the effect of buffering (queueing) on data flows. Using this model, it is proved that the well-known Internet congestion control algorithms may no longer be stable under the common pricing schemes, unless a sufficient condition is satisfied. It is also shown that these algorithms are guaranteed to be stable if a new pricing mechanism is used.

Electrical Power Networks: Optimal power flow (OPF) has been one of the most studied problems for power systems since its introduction by Carpentier in 1962. This problem is concerned with finding an optimal operating point of a power network minimizing the total power generation cost subject to network and physical constraints. It is well known that OPF is computationally hard to solve due to the nonlinear interrelation among the optimization variables. The objective is to identify a large class of networks over which every OPF problem can be solved in polynomial time. To this end, a convex relaxation is proposed, which solves the OPF problem exactly for every radial network and every meshed network with a sufficient number of phase shifters, provided power over-delivery is allowed. The concept of “power over-delivery” is equivalent to relaxing the power balance equations to inequality constraints.
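The "power over-delivery" relaxation admits a compact schematic statement. The notation below is generic (ours, not the dissertation's): the exact nodal power balance is weakened to an inequality, so a bus may receive more power than it demands:

```latex
% Exact balance at bus k: net injection equals the total outgoing line flow
P_{G_k} - P_{D_k} \;=\; \sum_{j \in \mathcal{N}(k)} P_{kj}
% Over-delivery relaxation: the equality becomes an inequality
P_{G_k} - P_{D_k} \;\ge\; \sum_{j \in \mathcal{N}(k)} P_{kj}
```

Relaxing the equalities enlarges the feasible set; the dissertation's claim is that the convex relaxation then solves the OPF problem exactly for radial networks and for meshed networks with sufficiently many phase shifters.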

Flow Networks: In this part of the dissertation, the minimum-cost flow problem over an arbitrary flow network is considered. In this problem, each node is associated with some possibly unknown injection, each line has two unknown flows at its ends related to each other via a nonlinear function, and all injections and flows need to satisfy certain box constraints. This problem, named generalized network flow (GNF), is highly non-convex due to its nonlinear equality constraints. Under the assumption of monotonicity and convexity of the flow and cost functions, a convex relaxation is proposed, which always finds the optimal injections. A primary application of this work is in the OPF problem. The results of this work on GNF prove that the relaxation on power balance equations (i.e., load over-delivery) is not needed in practice under a very mild angle assumption.

Generalized Weighted Graphs: Motivated by power optimizations, this part aims to find a global optimization technique for a nonlinear optimization defined over a generalized weighted graph. Every edge of this type of graph is associated with a weight set corresponding to the known parameters of the optimization (e.g., the coefficients). The motivation behind this problem is to investigate how the (hidden) structure of a given real/complex valued optimization makes the problem easy to solve, and indeed the generalized weighted graph is introduced to capture the structure of an optimization. Various sufficient conditions are derived, which relate the polynomial-time solvability of different classes of optimization problems to weak properties of the generalized weighted graph such as its topology and the sign definiteness of its weight sets. As an application, it is proved that a broad class of real and complex optimizations over power networks are polynomial-time solvable due to the passivity of transmission lines and transformers.

Abstract:

A spectral-filter method is numerically demonstrated for obtaining sub-5 fs pulses via femtosecond filamentation in fused silica. Instead of employing spectral phase compensation, a high-pass filter is applied to select the broadened high-frequency spectral components, which lie nearly in phase in the trailing edge of the self-compressed pulse owing to self-steepening; in this way, pulses as short as a single cycle can be obtained. For instance, for an input pulse with a duration of 50 fs and an energy of 2.2 μJ, the minimum pulse duration can reach ~4 fs (about 1.5 cycles) with a properly chosen spectral filter. (C) 2008 Optical Society of America
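The mechanism can be caricatured numerically: build a spectrum whose high-frequency half is nearly flat in phase while the low-frequency half is strongly chirped, then high-pass filter it. This sketch (arbitrary units, illustrative parameters, a smooth tanh filter edge) is our toy model, not the paper's filamentation simulation:

```python
import numpy as np

N, dt = 8192, 0.01            # grid points, time step (arbitrary units)
f = np.fft.fftfreq(N, dt)      # frequency grid
f0, sigma, alpha = 5.0, 1.0, 40.0
Tc = N * dt / 2                # linear phase to center the pulse on the grid

# Gaussian band around f0: the half below f0 carries a strong quadratic
# phase (chirp), the half above f0 is flat-phase -- a caricature of the
# trailing-edge spectrum produced by self-steepening.
amp = np.exp(-((f - f0) ** 2) / (2 * sigma ** 2))
phase = np.where(f < f0, alpha * (f - f0) ** 2, 0.0)
spectrum = amp * np.exp(1j * (phase - 2 * np.pi * f * Tc))

def rms_duration(spec):
    """RMS width of the temporal intensity profile of a spectrum."""
    inten = np.abs(np.fft.ifft(spec)) ** 2
    t = np.arange(N) * dt
    t0 = (t * inten).sum() / inten.sum()
    return np.sqrt(((t - t0) ** 2 * inten).sum() / inten.sum())

# Smooth high-pass filter: keep only the nearly in-phase high frequencies.
filtered = spectrum * 0.5 * (1 + np.tanh((f - f0) / 0.1))
print(rms_duration(spectrum), rms_duration(filtered))
```

The filtered pulse comes out several times shorter than the unfiltered one, even though no phase compensation was applied.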

Abstract:

This series will include all those people who, by means of their contributions, great and small, played a part in the consolidation of ichthyology in Argentina. The general plan of this work consists of individual factsheets containing a list of works by each author, along with reference bibliography and, whenever possible, personal pictures and additional material. The factsheets will be published primarily in chronological order, although this is subject to change depending on the availability of materials for successive editions. This work represents another approach to the recovery and revalorization of those who, in diverse historical circumstances, set the foundations of Argentine ichthyology. I expect this to be the beginning of a major work describing such a significant part of the history of the natural sciences in Argentina.


Abstract:

In the quest to develop viable designs for third-generation optical interferometric gravitational-wave detectors, one strategy is to monitor the relative momentum or speed of the test-mass mirrors, rather than monitoring their relative position. The most straightforward design for a speed-meter interferometer that accomplishes this is described and analyzed in Chapter 2. This design (due to Braginsky, Gorodetsky, Khalili, and Thorne) is analogous to a microwave-cavity speed meter conceived by Braginsky and Khalili. A mathematical mapping between the microwave speed meter and the optical interferometric speed meter is developed and used to show (in accord with the speed being a quantum nondemolition observable) that in principle the interferometric speed meter can beat the gravitational-wave standard quantum limit (SQL) by an arbitrarily large amount, over an arbitrarily wide range of frequencies. However, in practice, to reach or beat the SQL, this specific speed meter requires exorbitantly high input light power. The physical reason for this is explored, along with other issues such as constraints on performance due to optical dissipation.

Chapter 3 proposes a more sophisticated version of a speed meter. This new design requires only a modest input power and appears to be a fully practical candidate for third-generation LIGO. It can beat the SQL (the approximate sensitivity of second-generation LIGO interferometers) over a broad range of frequencies (~10 to 100 Hz in practice) by a factor h/h_SQL ~ √(W_circ^SQL / W_circ). Here W_circ is the light power circulating in the interferometer arms and W_circ^SQL ≃ 800 kW is the circulating power required to beat the SQL at 100 Hz (the LIGO-II power). If squeezed vacuum (with a power squeeze factor e^{-2R}) is injected into the interferometer's output port, the SQL can be beaten with a much-reduced laser power: h/h_SQL ~ √(e^{-2R} W_circ^SQL / W_circ). For realistic parameters (e^{2R} ≃ 10 and W_circ ≃ 800 to 2000 kW), the SQL can be beaten by a factor of ~3 to 4 from 10 to 100 Hz. [However, as the power increases in these expressions, the speed meter becomes more narrowband; additional power and re-optimization of some parameters are required to maintain the wide band.] By performing frequency-dependent homodyne detection on the output (with the aid of two kilometer-scale filter cavities), one can markedly improve the interferometer's sensitivity at frequencies above 100 Hz.
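Taking the quoted scaling at face value (and ignoring the band-narrowing caveat mentioned above), the improvement factors are simple arithmetic; the sketch below is only indicative:

```python
import math

W_SQL = 800.0    # kW, circulating power needed to reach the SQL at 100 Hz
squeeze = 0.1    # power squeeze factor e^{-2R} ~ 1/10 (10 dB of squeezing)

def sql_suppression(W_circ, squeeze=1.0):
    """h / h_SQL ~ sqrt(e^{-2R} * W_SQL / W_circ)."""
    return math.sqrt(squeeze * W_SQL / W_circ)

# Without squeezing, 800 kW just reaches the SQL:
print(sql_suppression(800.0))                 # 1.0
# With 10 dB of squeezing, the same power beats the SQL by ~3.16:
print(1.0 / sql_suppression(800.0, squeeze))
```

The exact factors in the text (~3 to 4 over the full band) come from re-optimizing parameters to keep the wide band, so these point values are only order-of-magnitude checks.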

Chapters 2 and 3 are part of an ongoing effort to develop a practical variant of an interferometric speed meter and to combine the speed-meter concept with other ideas to yield a promising third-generation interferometric gravitational-wave detector that entails low laser power.

Chapter 4 is a contribution to the foundations for analyzing sources of gravitational waves for LIGO. Specifically, it presents an analysis of the tidal work done on a self-gravitating body (e.g., a neutron star or black hole) in an external tidal field (e.g., that of a binary companion). The change in the mass-energy of the body as a result of the tidal work, or "tidal heating," is analyzed using the Landau-Lifshitz pseudotensor and the local asymptotic rest frame of the body. It is shown that the work done on the body is gauge invariant, while the body-tidal-field interaction energy contained within the body's local asymptotic rest frame is gauge dependent. This is analogous to Newtonian theory, where the interaction energy is shown to depend on how one localizes gravitational energy, but the work done on the body is independent of that localization. These conclusions play a role in analyses, by others, of the dynamics and stability of the inspiraling neutron-star binaries whose gravitational waves are likely to be seen and studied by LIGO.

Abstract:

The Edge Function method, formerly developed by Quinlan (25), is applied to solve the problem of thin elastic plates resting on spring-supported foundations subjected to lateral loads. The method can be applied to plates of any convex polygonal shape; however, since most plates are rectangular, this specific class is investigated in this thesis. The method can also be applied easily to other foundation models (e.g., springs connected to each other by a membrane) as long as the resulting differential equation is linear. In Chapter VII, the solution of a specific problem is compared with a known solution from the literature. In Chapter VIII, further comparisons are given. The problems of a concentrated load on an edge, and later on a corner, of a plate (as long as the load is far from the other boundaries) are also treated in that chapter and generalized to other loading intensities and/or plate spring constants for a Poisson's ratio of 0.2.
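For reference, the governing equation of a thin plate on a spring-supported (Winkler-type) foundation, which is the setting described above, takes the standard form below; the notation is generic rather than the thesis's own:

```latex
D \,\nabla^4 w(x,y) + k\, w(x,y) = q(x,y),
\qquad D = \frac{E h^3}{12\,(1-\nu^2)},
```

where w is the lateral deflection, k the foundation (spring) modulus, q the lateral load, and D the flexural rigidity of the plate (E Young's modulus, h thickness, ν Poisson's ratio). Because this equation is linear, the Edge Function approach of superposing solutions attached to the polygon's edges carries over to other linear foundation models, as the abstract notes.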

Abstract:

This study is situated within research on documents that systematize the teacher's work, among them the teacher's manual that organizes teaching activity around the textbook. The dissertation analyzes the teacher's manuals of the Spanish-language textbooks selected by MEC in 2005 for distribution to teachers, following Law 11161, which made the teaching of Spanish mandatory in secondary education throughout the national territory. The objective was to identify the discursive images of the teacher, and of the teaching of Spanish as a foreign language, constructed in these manuals. The theoretical foundations adopted come from enunciation-based Discourse Analysis, together with the concepts of dialogism (BAKHTIN, 1979) and polyphony (BAKHTIN, 1979; DUCROT, 1987). The results point to the construction of images of the teacher as one who needs to be guided in his task, incapable of making his own choices in the classroom, a receiver of orders; as out of date with current teaching methodologies and therefore in need of professional updating; and as one who seeks instructions that facilitate his work. As for the view of language, one manual emphasizes work with reading, oriented toward a conception that values discursive aspects; others claim to follow the communicative approach, with a view of language in use, yet adopt procedures grounded in a conception of language as structure, or mix both perspectives.

Abstract:

The South Pointing Chariot is an ancient mechanism of Chinese origin, originally used as an orientation instrument; its main characteristic is the existence of a point whose absolute angular velocity is always zero. Taking its principles as a basis, it is possible to build guidance systems along stationary directions with many present-day applications, for example as a stabilizing system. To that end, its kinematic analysis must be carried out in order to understand its operation, and the results must be verified, which can be done by means of a prototype. Building the prototype encompasses its design and subsequent manufacture.
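The defining property, a point of zero absolute angular velocity, can be checked with a few lines of planar kinematics. The gear ratio below is the classical differential condition; the symbols (wheel radius r, track width b) and the rolling-without-slip assumption are ours:

```python
import numpy as np

# Planar kinematics of an idealized south-pointing chariot: wheels of
# radius r on a track of width b roll without slip, and the differential
# gear train turns the pointer relative to the chassis by
# psi = -(r / b) * (theta_R - theta_L).
r, b = 0.2, 1.0
rng = np.random.default_rng(1)

# Arbitrary wheel-rotation increments over 1000 steps (a wandering path).
d_theta_L = rng.uniform(0.0, 0.1, 1000)
d_theta_R = rng.uniform(0.0, 0.1, 1000)

# Chassis heading from the rolling constraint:
# d_phi = r * (d_theta_R - d_theta_L) / b.
phi = np.cumsum(r * (d_theta_R - d_theta_L) / b)

# Pointer angle relative to the chassis, produced by the differential.
psi = -(r / b) * np.cumsum(d_theta_R - d_theta_L)

# The absolute pointer angle stays constant along any path.
pointer_abs = phi + psi
print(np.max(np.abs(pointer_abs)))   # numerically ~0
```

Because psi exactly cancels phi, the pointer keeps its initial absolute direction no matter how the chariot maneuvers, which is the property a physical prototype would be built to verify.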

Abstract:

This dissertation analyzes social solidarity and its projection in the Brazilian constitutional system, seeking to define its contours, foundations, and limits in the implementation of public policies and judicial decisions. It also seeks to demarcate the limits and possibilities of solidarity as a value that guides the field of politics, notably in democratic practice. The study starts from a historical and philosophical analysis to contextualize solidarity as a legal principle that grounds rights and duties and that finds in demands for the recognition of differences its widest field of application. In politics, solidarity opens the way to an anti-elitist democracy that has in conflict, tolerance, and divergence the touchstones of a dynamic that respects differences and generates social cooperation through intersubjective esteem. The duties of intersubjective recognition and social esteem make possible a dialogical and interactional social construction in which individuals are respected as free and equal beings, worthy of equal respect and consideration. This claim is put to the test in the discussion of the constitutional viability of racial quotas in Brazilian public universities. Likewise, solidarity projects itself into the legal field through its enshrinement in the Brazilian Constitution of 1988 as a fundamental principle and objective of the Federative Republic of Brazil. Accordingly, the case law of the Supremo Tribunal Federal has been drawing on the fundamental principle of solidarity to ground decisions involving fundamental duties of redistribution and recognition. Such decisions allow us to outline a minimum content of this principle in light of the Brazilian social and cultural order, which, moreover, is never set aside in the course of the study. This material content finds in the duties of redistribution and recognition, especially the latter, its support of legal efficacy, enabling at times even a direct application of solidarity through those duties.

Abstract:

This work offers a critical analysis of the legal instruments used to protect the environment, especially environmental licensing. After laying out the theoretical foundations of environmental law and then examining the instruments themselves, this doctoral thesis presents a case study of the implementation of the Complexo Petroquímico do Estado do Rio de Janeiro (COMPERJ), setting out the legal and technical irregularities that marked its licensing process. Finally, the work points to deficiencies related to the performance of the Ministério Público, to political interference in a procedure that is supposed to be technical, and to the limited effectiveness of the provisions that guarantee genuine popular participation, offering some suggestions for minimizing these problems.

Abstract:

This work analyzes and compares the different structural forms a parking canopy can take, chiefly structures built in timber and in steel. For this analysis, the advantages and disadvantages of the materials were weighed in terms of functionality, maintenance, aesthetics, difficulty of execution, and economy, and a canopy design was produced using each of the two materials. For the execution of the canopy, a design is formulated that satisfies the functional requirements of the element. The entire structural system is then analyzed part by part, starting by determining the loads that will arise and must be transmitted to the ground. Once the forces to be transmitted are obtained, a hypothesis about the quality of the foundation soil is formulated in order to determine the characteristics of the foundation. A preliminary sizing of the structural system is carried out beforehand, computed with the Tricalc software. Having obtained the exact dimensions and construction details for each of the material options under analysis (timber and steel), each resulting design is evaluated economically to determine the most economical and profitable solution from the standpoint of, for example, assembly and disassembly as well as maintenance. In addition to the structural functions, the integration of timber and steel into the surroundings is also analyzed.

Abstract:

This thesis studies decision making under uncertainty and how economic agents respond to information. The classic model of subjective expected utility and Bayesian updating is often at odds with empirical and experimental results; people exhibit systematic biases in information processing and often exhibit aversion to ambiguity. The aim of this work is to develop simple models that capture observed biases and study their economic implications.

In the first chapter I present an axiomatic model of cognitive dissonance, in which an agent's response to information explicitly depends upon past actions. I introduce novel behavioral axioms and derive a representation in which beliefs are directionally updated. The agent twists the information and overweights states in which his past actions provide a higher payoff. I then characterize two special cases of the representation. In the first case, the agent distorts the likelihood ratio of two states by a function of the utility values of the previous action in those states. In the second case, the agent's posterior beliefs are a convex combination of the Bayesian belief and the one which maximizes the conditional value of the previous action. Within the second case a unique parameter captures the agent's sensitivity to dissonance, and I characterize a way to compare sensitivity to dissonance between individuals. Lastly, I develop several simple applications and show that cognitive dissonance contributes to the equity premium and price volatility, asymmetric reaction to news, and belief polarization.

The second chapter characterizes a decision maker with sticky beliefs, that is, a decision maker who does not update enough in response to information, where "enough" means as much as a Bayesian decision maker would. This chapter provides axiomatic foundations for sticky beliefs by weakening the standard axioms of dynamic consistency and consequentialism. I derive a representation in which updated beliefs are a convex combination of the prior and the Bayesian posterior. A unique parameter captures the weight on the prior and is interpreted as the agent's measure of belief stickiness or conservatism bias. This parameter is endogenously identified from preferences and is easily elicited from experimental data.
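On a finite state space, the sticky-beliefs representation reduces to a one-line formula. A minimal sketch (the function names and numbers are ours, not the chapter's):

```python
import numpy as np

def bayes_update(prior, likelihood):
    """Standard Bayesian posterior over a finite state space."""
    post = prior * likelihood
    return post / post.sum()

def sticky_update(prior, likelihood, lam):
    """Convex combination of the prior and the Bayesian posterior.

    lam in [0, 1] is the stickiness (conservatism) parameter:
    lam = 0 recovers Bayes' rule, lam = 1 ignores the signal entirely.
    """
    return lam * prior + (1 - lam) * bayes_update(prior, likelihood)

prior = np.array([0.5, 0.5])          # two states
likelihood = np.array([0.8, 0.2])     # signal favoring state 0
print(sticky_update(prior, likelihood, 0.0))  # [0.8 0.2] (Bayesian)
print(sticky_update(prior, likelihood, 0.5))  # [0.65 0.35] (half-sticky)
```

Eliciting lam from data then amounts to measuring how far observed posteriors fall short of the Bayesian benchmark.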

The third chapter deals with updating in the face of ambiguity, using the framework of Gilboa and Schmeidler. There is no consensus on the correct way to update a set of priors. Current methods either do not allow a decision maker to make an inference about her priors or require an extreme level of inference. In this chapter I propose and axiomatize a general model of updating a set of priors. A decision maker who updates her beliefs in accordance with the model can be thought of as one who chooses a threshold that is used to determine whether a prior is plausible, given some observation. She retains the plausible priors and applies Bayes' rule. This model includes generalized Bayesian updating and maximum-likelihood updating as special cases.
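A minimal sketch of this kind of rule on a finite state space. The function name, the factor-of-the-best threshold convention, and the numbers are our assumptions; strictly positive evidence is assumed so every retained prior can be conditioned:

```python
import numpy as np

def threshold_update(priors, likelihood, alpha):
    """Keep the priors that make the observation sufficiently likely,
    then apply Bayes' rule to each survivor.

    alpha in (0, 1] is the plausibility threshold: a prior survives if
    the probability it assigns to the observation is at least alpha
    times the highest such probability in the set. alpha -> 0 keeps
    every prior (generalized Bayesian updating); alpha = 1 keeps only
    the maximizers (maximum-likelihood updating).
    """
    evidences = [float(p @ likelihood) for p in priors]
    cutoff = alpha * max(evidences)
    return [p * likelihood / e
            for p, e in zip(priors, evidences) if e >= cutoff]

# Three priors over two states; the observation has likelihood (0.8, 0.2).
priors = [np.array([0.9, 0.1]), np.array([0.5, 0.5]), np.array([0.1, 0.9])]
likelihood = np.array([0.8, 0.2])
posteriors = threshold_update(priors, likelihood, alpha=0.6)
print(len(posteriors))   # the third prior (evidence 0.26 < 0.6 * 0.74) is dropped
```

Varying alpha interpolates between the two classical rules the chapter names as special cases.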