964 results for pecking order theory
Abstract:
Master's dissertation, Economics and Business Sciences, 8 January 2015, Universidade dos Açores.
Abstract:
Dual-phase functionally graded materials are a particular type of composite material whose properties are tailored to vary continuously with the composition distribution of their two constituents, and whose use is spreading across the most diverse fields of application. These materials are known to provide superior thermal and mechanical performance compared to traditional laminated composites, precisely because of this continuous variation of properties, which among other advantages yields a smoother stress distribution profile. In this paper we study the influence of different homogenization schemes, namely those due to Voigt, Hashin-Shtrikman and Mori-Tanaka, which can be used to obtain bound estimates for the material properties of particulate composite structures. To this end we also use a set of finite element models based on higher-order shear deformation theories as well as on first-order theory. The studies carried out, covering linear static analyses and free vibration analyses, show that the bound estimates are as important as the deformation kinematics assumed to analyse these types of multifunctional structures. Regarding the homogenization schemes studied, the Mori-Tanaka and Hashin-Shtrikman estimates are shown to lead to less conservative results than the Voigt rule of mixtures.
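As a rough illustration of how two of the homogenization schemes above compare, the following sketch evaluates the Voigt rule of mixtures and the standard Mori-Tanaka estimate for the effective bulk modulus of a two-phase particulate composite. The material constants are assumed, illustrative values only (aluminium-like matrix, alumina-like particles), not data from the paper:

```python
def voigt(km, ki, vi):
    """Voigt rule of mixtures (upper bound): volume-weighted average."""
    return vi * ki + (1.0 - vi) * km

def mori_tanaka_bulk(km, gm, ki, vi):
    """Mori-Tanaka estimate of the effective bulk modulus of a two-phase
    particulate composite: matrix (km, gm), spherical inclusions (ki),
    inclusion volume fraction vi."""
    vm = 1.0 - vi
    return km + vi * (ki - km) / (1.0 + vm * (ki - km) / (km + 4.0 * gm / 3.0))

# Assumed illustrative moduli in GPa.
km, gm = 106.0, 42.0   # matrix bulk and shear moduli
ki = 172.0             # inclusion bulk modulus

for vi in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"vi={vi:.2f}  Voigt={voigt(km, ki, vi):7.2f}  "
          f"Mori-Tanaka={mori_tanaka_bulk(km, gm, ki, vi):7.2f}")
```

For a stiffer particulate phase the Mori-Tanaka estimate stays at or below the Voigt bound at every volume fraction, consistent with the "less conservative" behaviour reported in the abstract.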
Abstract:
This study examines and compares the determinants of debt in manufacturing firms from 7 Southern European and Scandinavian countries for the year 2008. The results suggest that there are no significant differences between these countries, even though average debt levels across the firms of the various countries do differ significantly. A fractional regression model was used for estimation, given the weaknesses that other functional forms exhibit when the dependent variable is a proportion. Profitability (-), growth (+), and liquidity (-) point to the superiority of pecking order theory over the trade-off and agency-cost theories. Other tax shields (-), tangibility (+), and age (-) are also important factors in explaining the indebtedness of the firms analysed.
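Fractional regression of the kind referred to above (in the spirit of the Papke-Wooldridge fractional logit) keeps fitted values of a proportion such as a leverage ratio inside [0, 1] via a logit link. A minimal self-contained sketch, fitting the model by gradient ascent on the Bernoulli quasi-log-likelihood; the data and coefficients are synthetic, purely to show the estimator recovering them:

```python
import math

def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

def fractional_logit_fit(x, y, lr=0.5, steps=20000):
    """Fit E[y|x] = sigmoid(b0 + b1*x) for a fractional response y in [0, 1]
    by gradient ascent on the Bernoulli quasi-log-likelihood."""
    b0 = b1 = 0.0
    n = len(x)
    for _ in range(steps):
        g0 = g1 = 0.0
        for xi, yi in zip(x, y):
            r = yi - sigmoid(b0 + b1 * xi)   # quasi-score residual y - mu
            g0 += r
            g1 += r * xi
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

# Synthetic "leverage ratios" generated from assumed coefficients (-0.5, 1.5).
xs = [i / 10.0 - 2.0 for i in range(41)]
ys = [sigmoid(-0.5 + 1.5 * xi) for xi in xs]
b0, b1 = fractional_logit_fit(xs, ys)
print(round(b0, 2), round(b1, 2))
```

Unlike ordinary least squares, the fitted mean can never leave the unit interval, which is the fragility of other functional forms that the abstract alludes to.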
Abstract:
We investigate expressiveness and definability issues with respect to minimal models, particularly in the scope of Circumscription. First, we give a proof of the failure of the Löwenheim-Skolem Theorem for Circumscription. Then we show that, if the class of P;Z-minimal models of a first-order sentence is Δ-elementary, then it is elementary. That is, whenever the circumscription of a first-order sentence is equivalent to a first-order theory, it is equivalent to a finitely axiomatizable one. This means that classes of models of circumscribed theories are either elementary or not Δ-elementary. Finally, using the previous result, we prove that, whenever a relation $P_i$ is defined in the class of P;Z-minimal models of a first-order sentence Φ and such a class of P;Z-minimal models is Δ-elementary, there is an explicit definition ψ for $P_i$ such that the class of P;Z-minimal models of Φ is the class of models of Φ ∧ ψ. In other words, the circumscription of P in Φ with Z varied can be replaced by Φ plus this explicit definition ψ for $P_i$.
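For reference, the circumscription of P in Φ with Z varied can be recalled in its standard second-order form (McCarthy's schema; the notation below is ours, not taken from the abstract):

```latex
\mathrm{Circ}[\Phi;\, P;\, Z] \;\equiv\;
  \Phi(P, Z) \,\wedge\,
  \neg\,\exists p\,\exists z\,\bigl[\Phi(p, z) \wedge p < P\bigr],
\qquad
p < P \;:=\; p \le P \,\wedge\, \neg\,(P \le p),
```

where $p \le P$ abbreviates $\forall x\,(p(x) \rightarrow P(x))$; the models of $\mathrm{Circ}[\Phi; P; Z]$ are exactly the P;Z-minimal models of Φ discussed above.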
Abstract:
This thesis examines the determinants of the financial leverage ratio of large publicly listed companies in the Nordic telecom sector. The study is conducted as a case study covering 5 companies headquartered in Nordic countries over the period 2002-2014, using restated quarterly observations from each case company's interim reports. The chosen hypotheses are tested with multiple linear regressions, firm by firm. The findings show that the uniqueness of the telecom sector and of our sample region did not yield unequivocal determinants of the leverage ratio within the sector. However, pecking order theory's prediction regarding liquidity was confirmed by 3 out of 5 case companies, which is worth taking into account in the big picture. The findings also show that theories and earlier empirical evidence are confirmed by the case companies individually rather than systematically. Although the telecom sector is considered quite a unique industry and we did not find common relationships holding across all the Nordic case companies, we obtained unique and valuable evidence for future research on this sector.
Abstract:
Basic relationships between certain regions of space are formulated in natural language in everyday situations. For example, a customer specifies the outline of his future home to the architect by indicating which rooms should be close to each other. Qualitative spatial reasoning, as an area of artificial intelligence, tries to develop a theory of space based on similar notions. In formal ontology and in ontological computer science, mereotopology is a first-order theory, embodying mereological and topological concepts, of the relations among wholes, parts, parts of parts, and the boundaries between parts. We shall introduce abstract relation algebras and present their structural properties as well as their connection to algebras of binary relations. This will be followed by details of the expressiveness of algebras of relations for region-based models. Mereotopology has been the main basis for most region-based theories of space. Since its earliest inception, many theories have been proposed for mereotopology in artificial intelligence, among which the Region Connection Calculus is the most prominent. The expressiveness of the region connection calculus in relational logic is far greater than its original eight base relations might suggest. In this thesis we formulate ways to automatically generate representable relation algebras using spatial data based on the region connection calculus. The generation of new algebras is a two-pronged approach involving splitting existing relations to form new algebras and refining the algebras so generated. We present an implementation of a system that automates the aforementioned steps and provides an effective and convenient interface for defining new spatial relations and generating representable relation algebras.
Abstract:
Qualitative spatial reasoning (QSR) is an important field of AI that deals with qualitative aspects of spatial entities. Regions and their relationships are described in qualitative terms instead of numerical values. This approach models human reasoning about such entities more closely than other approaches. Relationships between regions that we encounter in daily life are normally formulated in natural language. For example, one can outline one's room plan to an expert by indicating which rooms should be connected to each other. Mereotopology, as an area of QSR, combines mereology, topology and algebraic methods. As mereotopology plays an important role in region-based theories of space, our focus is on one of the most widely referenced formalisms for QSR, the region connection calculus (RCC). RCC is a first-order theory based on a primitive connectedness relation, a binary symmetric relation satisfying some additional properties. Using this relation we can define a set of basic binary relations that are jointly exhaustive and pairwise disjoint (JEPD), meaning that between any two spatial entities exactly one of the basic relations holds. Basic reasoning can then be done by means of the composition operation on relations, whose results are stored in a composition table. Relation algebras (RAs) have become a central tool for spatial reasoning in QSR. These algebras are based on equational reasoning, which can be used to derive further relations between regions in a given situation. Each such algebra describes the relations between regions up to a certain degree of detail. In this thesis we will use the method of splitting atoms in an RA to reproduce known algebras such as RCC15 and RCC25 systematically and to generate new algebras, and hence a more detailed description of regions, beyond RCC25.
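The JEPD property and composition reasoning described above can be made concrete with a toy model in which closed 1-D intervals stand in for regions. The classifier below is our own illustrative construction, not code from either thesis; because it is a chain of mutually exclusive cases, exactly one of the eight RCC8 base relations is returned for any pair of intervals:

```python
def rcc8(i1, i2):
    """Classify two closed 1-D intervals into exactly one of the eight
    RCC8 base relations (JEPD holds by construction of the case split)."""
    (a1, b1), (a2, b2) = i1, i2
    if (a1, b1) == (a2, b2):
        return "EQ"                        # identical regions
    if b1 < a2 or b2 < a1:
        return "DC"                        # disconnected
    if b1 == a2 or b2 == a1:
        return "EC"                        # externally connected (touching)
    if a2 <= a1 and b1 <= b2:              # i1 properly inside i2
        return "TPP" if a1 == a2 or b1 == b2 else "NTPP"
    if a1 <= a2 and b2 <= b1:              # i2 properly inside i1
        return "TPPi" if a1 == a2 or b1 == b2 else "NTPPi"
    return "PO"                            # partial overlap

# One entry of the composition table: NTPP composed with NTPP yields NTPP.
x, y, z = (2, 3), (1, 4), (0, 5)
print(rcc8(x, y), rcc8(y, z), rcc8(x, z))
```

Splitting atoms, as in the thesis, amounts to refining such a set of base relations into finer JEPD families while keeping the composition table consistent.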
Abstract:
The present study investigated how social-cognitive development relates to children's lie-telling and the effectiveness of a novel honesty-promoting technique (i.e., self-awareness). Sixty-four children were asked not to peek at a toy in the experimenter's absence and were later asked whether they had peeked, as a measure of their honesty. Half of the children were questioned in the self-awareness condition and half in the control condition. Additionally, children completed a battery of cognitive and social-cognitive tests to assess executive functioning and theory-of-mind understanding. While first-order theory-of-mind understanding, inhibitory control, and visuospatial working memory did not significantly relate to children's lie-telling, measures of inhibitory control in conjunction with working memory, and of complex working memory, were significantly related to children's lie-telling. Finally, the novel honesty-promoting technique was effective: children in the self-awareness condition lied significantly less often than children in the control condition.
Abstract:
Starting from the seminal work of Boole, Frege and Russell, this thesis seeks to clarify what is at stake in logical pluralism in an era of proliferating non-classical logics and of developments in theoretical computer science and proof theory. Two more "historical" chapters are on the agenda: (1) the first chapter articulates the absolutism of Frege and Russell, taking care to show how it rules out the possibility of considering alternative structures and logics; (2) the fourth chapter traces the path that led Carnap to adopt the syntactic method and the principle of tolerance, and then brings out Carnapian instrumentalism in the philosophy of logic and mathematics. By way of the analysis of an intuitive interpretation of linear logic, the second chapter then turns to establishing a logico-mathematical form of logical pluralism using the theory of order relations and category theory. The third chapter maps out the positions surrounding the debate between monism and pluralism, and then offers an argument, from the standpoint of substructural logics, against the thesis that the conflict between rival logics is merely apparent. Finally, the fifth chapter shows that each of the three main approaches to the concept of logical consequence (model-theoretic, proof-theoretic and dialogical) provides a framework general enough to establish a pluralism. In short, the thesis is a defence of logical pluralism.
Negotiating order in interactions among the members of a multidisciplinary team
Abstract:
This research aims to identify the conflict-resolution and order-maintenance mechanisms used by the members of a multidisciplinary team to reach common ground. To this end, we analysed the interactions among the members of a team attached to the Personnes âgées en perte d'autonomie et déficience physique (PAPADP) programme of a CSSS, drawing on the Negotiated Order Theory (NOT) developed by Anselm Strauss. The results show that experience in negotiation and the evident legitimacy of the goal pursued are the strengths of a successful process. They also show that the analytical categories proposed by NOT should, as Strauss recommends, be adapted to each situation. Finally, the limited impact of the structural context in this study seems to answer the criticism that NOT pays it too little attention.
Abstract:
Disturbances of arbitrary amplitude are superposed on a basic flow which is assumed to be steady and either (a) two-dimensional, homogeneous, and incompressible (rotating or non-rotating) or (b) stably stratified and quasi-geostrophic. Flow over shallow topography is allowed in either case. The basic flow, as well as the disturbance, is assumed to be subject neither to external forcing nor to dissipative processes like viscosity. An exact, local 'wave-activity conservation theorem' is derived in which the density A and flux F are second-order 'wave properties' or 'disturbance properties', meaning that they are $O(a^2)$ in magnitude as disturbance amplitude $a \to 0$, and that they are evaluable correct to $O(a^2)$ from linear theory, to $O(a^3)$ from second-order theory, and so on to higher orders in a. For a disturbance in the form of a single, slowly varying, non-stationary Rossby wavetrain, $\overline{F}/\overline{A}$ reduces approximately to the Rossby-wave group velocity, the overbar denoting an appropriate averaging operator. F and A have the formal appearance of Eulerian quantities, but generally involve a multivalued function the correct branch of which requires a certain amount of Lagrangian information for its determination. It is shown that, in a certain sense, the construction of conservable, quasi-Eulerian wave properties like A is unique and that the multivaluedness is inescapable in general. The connection with the concepts of pseudoenergy (quasi-energy), pseudomomentum (quasi-momentum), and 'Eliassen-Palm wave activity' is noted. The relationship of this and similar conservation theorems to dynamical fundamentals and to Arnol'd's nonlinear stability theorems is discussed in the light of recent advances in Hamiltonian dynamics. These show where such conservation theorems come from and how to construct them in other cases.
An elementary proof of the Hamiltonian structure of two-dimensional Eulerian vortex dynamics is put on record, with explicit attention to the boundary conditions. The connection between Arnol'd's second stability theorem and the suppression of shear and self-tuning resonant instabilities by boundary constraints is discussed, and a finite-amplitude counterpart to Rayleigh's inflection-point theorem is noted.
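The conservation theorem summarised above can be written compactly (notation follows the abstract; the group-velocity symbol $\mathbf{c}_g$ is ours):

```latex
\frac{\partial A}{\partial t} + \nabla \cdot \mathbf{F} = 0,
\qquad A,\ \mathbf{F} = O(a^{2}) \ \text{as } a \to 0,
```

and, for a single slowly varying, non-stationary Rossby wavetrain,

```latex
\frac{\overline{\mathbf{F}}}{\overline{A}} \approx \mathbf{c}_g,
```

where $\mathbf{c}_g$ denotes the Rossby-wave group velocity and the overbar the averaging operator mentioned in the text.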
Abstract:
We calculate the spectra of thermal photons produced in Au + Au collisions, taking into account the nonequilibrium contribution to photon production due to finite shear viscosity. The evolution of the fireball is modeled by second-order as well as by divergence-type 2+1 dissipative hydrodynamics, both with an ideal equation of state and with one based on lattice QCD that includes an analytical crossover. The spectrum calculated in the divergence-type theory is considerably enhanced with respect to the one calculated in the second-order theory, the difference being entirely due to differences in the viscous corrections to photon production. Our results show that the differences in hydrodynamic formalisms are an important source of uncertainty in the extraction of the value of η/s from measured photon spectra. The uncertainty in the value of η/s associated with the different hydrodynamic models used to compute thermal photon spectra is larger than the one arising when matching hadron elliptic flow to RHIC data. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
Financial theory and international research indicate that controlling shareholders tend to choose capital structures that do not put their control of the company at risk, whether through takeover threats or through a higher risk of bankruptcy. These studies exploit the fact that companies issue two types of shares: voting shares and non-voting shares. This study therefore tested for a relationship between control structure and the capital-structure decisions of Brazilian companies listed on Bovespa (the São Paulo Stock Exchange) on 31/12/1995, 31/12/1996, 31/12/1997, 31/12/1998, 31/12/1999 and 31/12/2000. The analysis indicates a statistically significant influence of ownership-control structure on capital-structure decisions, as well as a pecking-order-type decision structure oriented towards maintaining control, since dilution of control is chosen only as available internal funds shrink and debt and risk grow. In addition to ownership control and profitability, the model used in the study tested the influence of industry, year and firm size as determinants of capital-structure decisions in the Brazilian market.
Abstract:
We examine the determinants of the liquidity level of Brazilian firms in a panel-data study of 295 publicly traded firms over the period 1994-2004. Liquidity was defined as the stock of cash and liquid securities divided by the firm's total assets. The empirical analysis suggests that the liquidity of Brazilian firms is not set by deliberate policies; it is an endogenous variable driven by the components of cash flow. The main results of the model suggest that liquidity increases with firm size, short-term debt and profitability (both in cash-flow and in accounting-profit terms), while it decreases with Brazil country risk and the level of working capital. Pecking order theory appears to prevail in firms' financing decisions. Some of these results, as well as the insignificance of other variables tested, run counter to theoretical expectations of how firms should behave with respect to liquidity in order to maximize firm value. Excess cash occurs in large firms with high cash generation, and these firms may be incurring high agency/expropriation costs, to the detriment chiefly of minority shareholders.
Abstract:
Empirical international studies have sought to determine which theoretical strand on capital structure best explains corporate practice, frequently setting static trade-off theories (whether based on taxes or on agency costs) against those grounded in information asymmetry. Using data on Brazilian companies listed on the São Paulo Stock Exchange (BOVESPA) over 1997-2002, this study employs two distinct regression specifications, each better suited to one of the theoretical strands, in order to assess which fits the Brazilian reality more closely and, on that basis, which theory has greater explanatory power. The results indicate, first, that the specification based on the financing-hierarchy (pecking order) predictions outperforms the one oriented towards static trade-off and, second, that the analysis of the results of both specifications also favours the pecking order. We therefore conclude that this is the theory that best describes the behaviour of Brazilian companies in their capital-structure decisions.