14 results for pecking order theory

at Universidade Federal do Rio Grande do Norte (UFRN)


Relevance:

100.00%

Publisher:

Abstract:

This study aims to investigate the influence of asset classes and the breakdown of tangibility as determinant factors of the capital structure of companies listed on the BM&FBOVESPA over the period 2008-2012. Two classes of current assets were composed; grouped by liquidity, they are also the items analyzed by financial institutions when granting credit: current resources (cash, banks and financial investments) and operating accounts (inventories and receivables). The breakdown of tangible assets was based on their main components offered as collateral for loans, such as machinery and equipment and land and buildings. To extend the analysis, three leverage metrics (book, financial and market) were applied and the sample was divided into the economic sectors adopted by the BM&FBOVESPA. A dynamic panel data model estimated by two-step system GMM was used, given its robustness to endogeneity problems and omitted-variable bias. The results suggest that current resources are determinants of capital structure, possibly because they act as proxies for financial solvency, their relationship with debt being positive. The sectoral analysis confirmed the results for current resources. Asset tangibility has an inversely proportional relationship with leverage. When tangibility is broken down into its main components, the significant negative influence of machinery and equipment is most marked in the Industrial Goods sector. This result shows that, on average, the assets most specific to a company's operating activities contribute to a lower use of third-party funds. As complementary results, leverage was found to be persistent, which is linked to the static trade-off theory. For financial leverage specifically, persistence remains relevant when the lagged current asset class variables are controlled for. The proxy for growth opportunities, measured by the market-to-book ratio, shows a contradictory coefficient sign. Company size has a positive relationship with debt, in favor of the static trade-off theory. Profitability is the most consistent variable across all estimations, showing a strong, significant negative relationship with leverage, as the pecking order theory predicts.
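
For reference, a minimal sketch of the kind of dynamic panel specification described above, estimated by two-step system (Blundell-Bond) GMM; the variable names are illustrative and not taken from the study itself:

\[
Lev_{it} = \alpha\, Lev_{i,t-1} + \beta_1\, CurrRes_{it} + \beta_2\, Tang_{it} + \beta_3\, MtB_{it} + \beta_4\, Size_{it} + \beta_5\, Prof_{it} + \eta_i + \varepsilon_{it},
\]

where \(Lev_{it}\) is one of the three leverage measures (book, financial or market) for firm \(i\) in year \(t\), \(\eta_i\) is an unobserved firm effect, and lagged levels and first differences of the regressors serve as instruments in the system GMM estimation.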

Relevance:

30.00%

Publisher:

Abstract:

At present, public organizations are employing more and more solutions that use information technology in order to offer more transparency and better services to all citizens. Integrated systems are information technologies whose core features are integration and the use of a single database. These systems bring several benefits and face some obstacles that make their adoption difficult. The conversion to an integrated system may take years; thus, the study of the adoption of this technology in public sector organizations becomes very stimulating, given some peculiarities of this sector and the features of this technology. First, information about the particular integrated system under study and about its conversion process is presented. Then, the researcher outlines the configuration of the conversion process that is the aim of this study: the agents involved, the moments, and the tools used to support the process, in order to elaborate the conversion methodology, understood as the set of procedures and tools used throughout the conversion. After this, the researcher identifies, together with all members of the conversion team, the negative and positive factors observed during the project. Finally, these factors are analysed through the lens of Hospitality Theory, which, in the researcher's opinion, was very useful for understanding the elements, events and moments that interfered with the project. The results empirically corroborate the presumptions of Hospitality Theory, while also revealing a limitation of this theory in the case under study.

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this work is to approach and understand the Social Representations (SR) (MOSCOVICI, 2003) of Physics and Chemistry held by undergraduates majoring in these fields, as well as their Social Representations of teaching. We took as a principle that, by approaching these representations, it would be possible to relate their symbolic contents in order to show, from a psychosocial viewpoint, how people in the first stages of degree courses in Physics and Chemistry become teachers. Two sources of data were used during this research: the Free Association technique (FA) (ABRIC, 1994) and the Multiple Classification Procedure (MCP) (ROAZZI, 1995). The analytical treatment of the data collected through FA followed the proposition of Grize, Vergés and Silem (1987 apud ABRIC, 1994, p. 66). MCP data were analyzed through MSA (Multidimensional Scalogram Analysis) and SSA (Singular Spectrum Analysis) associated with Facet Theory (BILSKY, 2003). The discourses of the MCP discussion groups at the moment of explanation were studied through Content Analysis as proposed by Bardin (1977) and Franco (2005). Indicative of an approach to relations with knowledge (CHARLOT, 2000), the connections that arose from the analyses showed that the group of future Physics teachers viewed this scientific field from a rationalist conception, which influenced the sense of idealization of the phenomena to be explained by Physics. Thus, Physics as school content comes to require of elementary and high school students abstract thinking as a cognitive learning skill. The identifying elements observed in the relations between the SR of Physics and of teaching arose from the antagonism between the future teachers and their own teachers, as well as from the speculation between themselves and their elementary and high school students, mainly when they had to face the act of teaching given the obstacles imposed by the educational system itself and the weaknesses of their initial preparation. The group of future Chemistry teachers showed these relations in its discourses when it conceived Chemistry as empiricist and described teaching as the transmission of this knowledge, with the didactics of Chemistry teaching directing learning through pedagogical methods meant to lead students to discoveries. The psychosocial contents built and revealed by the symbolic relations in the SR studied converge on a relation of identity. This relation revealed identifying elements for these people, resulting from the transit between their condition as students of Chemistry and as teachers in their work, which placed the current relational contents in the teaching space, named "Knowledge changing" and "Adaptability". In order to examine emerging questions about teacher education and the professionalization of teaching, we focused the psychosocial view on this transit and were able to observe epistemological, practical and pedagogical obstacles that limit the configuration of teaching work as a professional activity, especially given the particular conditions that shaped the relations of meaning with Physics, Chemistry and teaching observed in this research. Generally speaking, we noted that these obstacles point to shortcomings in pedagogical practice that mainly impair the learning process of elementary and high school students.

Relevance:

30.00%

Publisher:

Abstract:

A remarkable point in Science Teaching debates has been the need for students not only to learn theories, laws and concepts, but also to develop skills that allow them to act towards a critical citizenship. Therefore, some of the skills required for learning the natural sciences must be taught consciously, intentionally and in a planned way, as components of a basic competence. Studies over the last twenty years have shown that students and teachers have plenty of difficulties with skill development, among them the skill of interpreting Cartesian graphs, which is essential for the comprehension of Natural Science. In that sense, developing this type of professional knowledge during the initial education of future Chemistry teachers becomes strategic, not only because they need to know how to use it, but also because they need to know how to teach it. This research has as its general objective the organization, development and study of a process of formation of the skill of interpreting Cartesian graphs as part of teachers' professional knowledge. It was accomplished through a formative experience with six undergraduate students of the Chemistry Teaching Degree course of the Universidade Federal do Rio Grande do Norte (UFRN), in Brazil. In order to develop that skill, we used as reference P. Ya. Galperin's theory of the stepwise formation of mental actions and concepts and its qualitative indicators: form of the action, degree of generalization, degree of consciousness, degree of independence and degree of solidity. The research, in a qualitative approach, prioritized as data collection instruments the records of the undergraduates' activities, observation, a questionnaire and diagnostic tests. First, a teaching framework was planned for the development of the skill of interpreting Cartesian graphs based on the conceptions and steps presupposed by Galperin's theory. Second, the framework was applied and the process of skill formation was studied. The results showed that it is possible to develop the skill with consciousness of the invariant system of operations, with a high degree of generalization, and with the operational invariant internalized on the mental plane. The students attested to the contributions of this type of formative experience. The research reveals the importance of deepening teachers' comprehension of the individualities tied to the process of internalization, according to Galperin's theory, when the updating of skills as part of teachers' professional knowledge is at issue.

Relevance:

30.00%

Publisher:

Abstract:

The present study seeks to present a historical-epistemological analysis of the development of the mathematical concept of negative number. In order to do so, we analyzed the different forms and conditions of the construction of mathematical knowledge in different mathematical communities and thus identified the characteristics of the establishment of this concept. By understanding the historically constructed barriers, especially those of ontological significance, which made the concept of negative number incompatible with that of natural number and thereby hindered the development of the concept of the negative, we were able to sketch the reasons for the rejection of negative numbers by the English author Peter Barlow (1776-1862) in his An Elementary Investigation of the Theory of Numbers, published in 1811. We also show that such difficulties with the treatment of negative numbers persisted into the middle of the nineteenth century.

Relevance:

30.00%

Publisher:

Abstract:

Infographics have historically accompanied the evolution of journalism, from the incipient handmade models of the eighteenth century to the computers and sophisticated software of today. Whether to face the advent of TV and its impact on readers of the printed newspaper, or to depict the Gulf War, where photography was not allowed, infographics reached modern levels of production and publication. Technical devices enabled infographics to evolve in the environment of the internet, with possibilities for manipulation by the reader and the incorporation of video, audio and animation, giving rise to so-called interactive infographics. These digital models of information visualization have recently arrived at daily newspapers in the Northeast of Brazil and on their respective websites, with regionalized features. This paper therefore proposes to explore and describe the processes of producing interactive infographics, taking as its example the Diário do Nordeste, of Fortaleza, Ceará, whose infographics department was created one year ago. It is based on aspects that guide the theory of journalism, such as newsmaking, the filters that act on the productive routine (gatekeeping) and the construction stages of the news. The research also draws on the theoretical framework on the subject, on essential concepts and characteristics of infographics, and on methodological procedures and systematic empirical observation of the newsroom's production routines, which can attest to limitations and/or advances.

Relevance:

30.00%

Publisher:

Abstract:

The following work interprets and analyzes the problem of induction from a viewpoint founded on set theory and probability theory, as a basis for resolving its negative philosophical implications for systems of inductive logic in general. Given the importance of the problem and the relatively recent developments in these fields of knowledge (early 20th century), as well as the visible relations between them and the process of inductive inference, a field of relatively unexplored and promising possibilities has been opened. The key point of the study consists in modeling the information acquisition process using concepts of set theory, followed by a treatment using probability theory. Throughout the study, the major obstacles identified for the probabilistic justification were the problem of defining the concept of probability, the problem of defining rationality, and the subtle connection between the two. This finding called for greater care in choosing the criterion of rationality to be considered, in order to facilitate the treatment of the problem through specific situations without losing their original characteristics, so that the conclusions can be extended to classic cases such as the question of the continuity of the sunrise.
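
As a point of reference for the sunrise example, one classic probabilistic treatment (not necessarily the one adopted in this work) is Laplace's rule of succession: assuming a uniform prior over the unknown probability of success and having observed \(n\) sunrises with no failures, the probability of a sunrise on the next day is

\[
P(\text{sunrise on day } n+1 \mid n \text{ consecutive sunrises}) = \frac{n+1}{n+2}.
\]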

Relevance:

30.00%

Publisher:

Abstract:

In this work we present a proposal intended to contribute to the teaching and learning of the affine function in the first year of high school, taking as its prerequisite the mathematical knowledge of basic education. The proposal focuses on some properties, special cases and applications of affine functions, in order to show the importance of proofs while awakening students' interest by showing how this function can be used to solve everyday problems.
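
For reference, the general form of the affine function and a typical everyday application (the numbers are illustrative, not taken from the proposal itself):

\[
f(x) = ax + b, \quad a, b \in \mathbb{R},
\qquad \text{e.g. a taxi fare } C(d) = 2.50\,d + 4.00,
\]

where \(d\) is the distance travelled in kilometres, \(2.50\) is the price per kilometre and \(4.00\) is the fixed flag-down fare.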

Relevance:

30.00%

Publisher:

Abstract:

Event-B is a formal method for the modeling and verification of discrete transition systems. Event-B development yields proof obligations that must be verified (i.e. proved valid) in order to keep the produced models consistent. Satisfiability Modulo Theories (SMT) solvers are automated theorem provers used to verify the satisfiability of logic formulas with respect to a background theory (or combination of theories). SMT solvers not only handle large first-order formulas, but can also generate models and proofs, as well as identify unsatisfiable subsets of hypotheses (unsat cores). Tool support for Event-B is provided by the Rodin platform: an extensible Eclipse-based IDE that combines modeling and proving features. An SMT plug-in for Rodin has been developed with the intent of integrating alternative, efficient verification techniques into the platform. We implemented a series of complements to the SMT solver plug-in for Rodin, namely improvements to the user interface for the cases in which proof obligations are reported as invalid by the plug-in. Additionally, we modified some of the plug-in features, such as support for proof generation and unsat-core extraction, to comply with the SMT-LIB standard for SMT solvers. We undertook tests using applicable proof obligations to demonstrate the new features. The contributions described can potentially affect productivity in a positive manner.
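
As a standalone illustration (not the Rodin SMT plug-in itself), the sketch below uses the z3 Python bindings to show how an SMT solver discharges a proof obligation by checking that its negation is unsatisfiable, and how named assertions yield an unsat core identifying the hypotheses actually used; the sample obligation and the names in it are assumptions made for the example.

```python
# Minimal sketch with the z3 Python bindings (pip install z3-solver).
# Illustrative proof obligation: hypotheses h1: x > 0, h2: y >= x, h3: z == 42
# entail the goal y > -1. The obligation is valid iff the conjunction of the
# hypotheses with the negated goal is unsatisfiable; the unsat core then shows
# which hypotheses were actually needed.
from z3 import Int, Solver, Not, unsat

x, y, z = Int("x"), Int("y"), Int("z")
goal = y > -1

s = Solver()
s.set(unsat_core=True)
s.assert_and_track(x > 0, "h1")
s.assert_and_track(y >= x, "h2")
s.assert_and_track(z == 42, "h3")
s.assert_and_track(Not(goal), "neg_goal")

if s.check() == unsat:
    # Valid obligation; the core typically omits the unused hypothesis h3
    # (cores are not guaranteed to be minimal).
    print("proof obligation discharged; unsat core:", s.unsat_core())
else:
    # Invalid obligation; the model is a counterexample.
    print("counterexample:", s.model())
```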

Relevance:

30.00%

Publisher:

Abstract:

We present indefinite integration algorithms for rational functions over subfields of the complex numbers, through an algebraic approach. We study the local algorithm of Bernoulli and rational algorithms for the class of functions in question, namely the algorithms of Hermite, Horowitz-Ostrogradsky, Rothstein-Trager and Lazard-Rioboo-Trager. We also study Rioboo's algorithm for the conversion of logarithms involving complex extensions into real arctangent functions, when these logarithms arise from the integration of rational functions with real coefficients. We conclude by presenting pseudocode and code for implementation in the software Maxima for the algorithms studied in this work, as well as for auxiliary algorithms for polynomial gcd computation, partial fraction decomposition, squarefree factorization and subresultant computation, among others. We also present the Zeilberger-Almkvist algorithm for the integration of hyperexponential functions, together with its pseudocode and Maxima code. As an alternative to the algorithms of Rothstein-Trager and Lazard-Rioboo-Trager, we further present code for Bernoulli's algorithm for squarefree denominators, and another for Czichowski's algorithm, although the latter is not studied in detail in the present work, due to the theoretical basis necessary to understand it, which is beyond this work's scope. Several examples are provided in order to illustrate the working of the integration algorithms in this text.
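
As a small companion to the algorithms discussed, the sketch below uses SymPy in Python (rather than the Maxima implementations produced in the work) to integrate a rational function symbolically, showing its partial fraction decomposition and an antiderivative expressed with real logarithms and arctangents; it is an illustration, not the dissertation's code.

```python
# Minimal sketch with SymPy (pip install sympy): partial fraction decomposition
# and symbolic integration of a rational function whose denominator has
# repeated complex roots, so the antiderivative mixes log and arctangent terms.
from sympy import symbols, integrate, apart, diff, simplify

x = symbols('x')
f = (x**3 + 2) / (x**2 + 1)**2   # rational integrand, denominator (x^2 + 1)^2

print(apart(f, x))               # partial fractions: x/(x^2+1) + (2 - x)/(x^2+1)^2
F = integrate(f, x)              # antiderivative with real log and atan terms
print(F)

# Sanity check: differentiating the antiderivative recovers the integrand.
print(simplify(diff(F, x) - f))  # expected output: 0
```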

Relevance:

30.00%

Publisher:

Abstract:

Dark matter is a fundamental ingredient of modern cosmology. It is necessary in order to explain the process of structure formation in the Universe, the rotation curves of galaxies and the mass discrepancy in clusters of galaxies. However, although many efforts, both theoretical and experimental, have been made, the nature of dark matter is still unknown and the only convincing evidence for its existence is gravitational. This raises doubts about its existence and, in turn, opens the possibility that Einstein's gravity needs to be modified at some scale. In this work we study the possibility that the Eddington-Born-Infeld (EBI) modified gravity provides an alternative explanation for the mass discrepancy in clusters of galaxies. For this purpose we derive the modified Einstein field equations and find their solutions for a spherical system of identical, collisionless point particles. Then, taking into account the collisionless relativistic Boltzmann equation and using some approximations and assumptions valid for weak gravitational fields, we derive the generalized virial theorem in the framework of EBI gravity. In order to compare the predictions of EBI gravity with astrophysical observations, we estimate the order of magnitude of the geometric mass, showing that it is compatible with present observations. Finally, considering a power law for the density of galaxies in the cluster, we derive expressions for the radial velocity dispersion of the galaxies, which can be used to test some features of EBI gravity.
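
For context, the Newtonian virial relation usually taken as the baseline for cluster mass estimates, against which an extra geometric-mass contribution would be compared (an illustrative standard expression, not the EBI-specific result derived in the work):

\[
2K + W = 0 \;\Longrightarrow\; M_{\mathrm{vir}} \sim \frac{\sigma_r^{2}\, R}{G},
\]

where \(\sigma_r\) is the radial velocity dispersion of the galaxies, \(R\) a characteristic radius of the cluster and \(G\) the gravitational constant; the mass discrepancy arises because this dynamical estimate exceeds the mass inferred from luminous matter.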
