817 results for Economics theories
Abstract:
Theories of individual attitudes toward IT include task-technology fit (TTF), the technology acceptance model (TAM), the unified theory of acceptance and use of technology (UTAUT), cognitive fit, expectation disconfirmation, and computer self-efficacy. Examination of these theories reveals three main concerns. First, the theories mostly "black box" (or omit) the IT artifact. Second, appropriate mid-range theory has not been developed to contribute to disciplinary progress and to serve the needs of our practitioner community. Third, the theories overlap but are incommensurable. We propose a theoretical framework that harmonizes these attitudinal theories and shows how they can be specialized to include relevant IS phenomena.
Abstract:
Youth misuse of fire is a substantive community concern. Despite evidence indicating that youths account for a significant proportion of all deliberately lit fires within Australia, an absence of up-to-date, contextually specific research means the exact scope and magnitude of youth misuse of fire within Australia remains unknown. Although research suggests commonalities exist between youth misuse of fire and juvenile offending more broadly, misuse of fire is rarely explained using criminological theory. In light of this gap, a descriptive analysis of youth misuse of fire within New South Wales was performed. Routine Activity Theory and Crime Pattern Theory were tested to explain differences in misuse of fire across age groups. Results suggest these environmental theories offer useful frameworks for explaining youth misuse of fire in New South Wales. It is argued that Routine Activity Theory and Crime Pattern Theory can be employed to better inform youth misuse of fire policy and prevention efforts.
Abstract:
In this symposium I will discuss my work on three parallel projects I have been pursuing since 2010: Theories of Modernity and the Subject; the critique of contemporary architecture under ideological capitalism; and Fascism and Modern Architecture.
Abstract:
A novel test of recent theories of the origin of optical activity has been designed based on the inclusion of certain alkyl 2-methylhexanoates into urea channels.
Abstract:
"In Perpetual Motion is an 'historical choreography' of power, pedagogy, and the child from the 1600s to the early 1900s. It breaks new ground by historicizing the analytics of power and motion that have interpenetrated renditions of the young. Through a detailed examination of the works of John Locke, Jean-Jacques Rousseau, Johann Herbart, and G. Stanley Hall, this book maps the discursive shifts through which the child was given a unique nature, inscribed in relation to reason, imbued with an effectible interiority, and subjected to theories of power and motion. The book illustrates how developmentalist visions took hold in U.S. public school debates. It documents how particular theories of power became submerged and taken for granted as essences inside the human subject. In Perpetual Motion studiously challenges views of power as in or of the gaze, tracing how different analytics of power have been used to theorize what gazing could notice."--BOOK JACKET.
Abstract:
There are essentially two different phenomenological models available to describe the interdiffusion process in binary systems in the solid state. The first, which is used more frequently, is based on the theory of flux partitioning. The second, developed much more recently, uses the theory of dissociation and reaction. Although the theory of flux partitioning has been widely used, we found that it does not account for the mobility of both species and is therefore not suitable for use in most interdiffusion systems. We first modified this theory to take into account the mobility of both species and then extended it further to develop relations for the integrated diffusion coefficient and the ratio of the diffusivities of the species. The versatility of these two models is examined in the Co-Si system with respect to different end-member compositions. From our analysis, we found that the applicability of the theory of flux partitioning is rather limited, whereas the theory of dissociation and reaction can be used in any binary system.
Abstract:
This paper empirically examines the effect of current tax policy on home ownership, specifically looking at how developer contributions impact house prices. Developer contributions are a commonly used mechanism for local governments to pay for new urban infrastructure. This research applies a hedonic house price model to 4,699 new and 25,053 existing house sales in Brisbane from 2005 to 2011. The findings of this research are consistent with international studies supporting the proposition that developer contributions are over-passed. The study provides evidence suggesting that developer contributions are over-passed to both new and existing homes in the order of around 400%. These findings suggest that developer contributions are a significant contributor to increasing house prices and reduced housing supply, and are thus an inefficient and inequitable tax. By testing this effect on both new and existing homes, the research provides evidence that developer contributions are over-passed not only to new home buyers but also to buyers of existing homes. The price-inflationary effect of developer contributions is therefore being felt by all home buyers across the community, resulting in increased mortgage repayments of close to $1,000 per month in Australia. This is the first study to empirically examine the impact of developer contributions on house prices in Australia. These results are important because they inform governments about the outcomes of current tax policy on home ownership, providing the first evidence of its kind in Australia. This is an important contribution to the tax reform agenda in Australia.
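The over-passing magnitude described above can be illustrated with a minimal hedonic regression sketch. Everything below is synthetic and hypothetical: the covariates, coefficients, and data are invented for illustration and are not the study's model or dataset. The example only shows how a per-dollar pass-through rate of a levy can be recovered from sale prices.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 2000

# Synthetic dwelling attributes (hypothetical hedonic covariates).
bedrooms = rng.integers(2, 6, n).astype(float)
land_m2 = rng.uniform(200.0, 900.0, n)
contribution = rng.uniform(10_000.0, 40_000.0, n)  # developer contribution ($)

# Generate prices so each $1 of developer contribution adds $4 to the
# sale price, i.e. the levy is "over-passed" by roughly 400%.
price = (200_000.0 + 40_000.0 * bedrooms + 300.0 * land_m2
         + 4.0 * contribution + rng.normal(0.0, 20_000.0, n))

# Ordinary least squares via numpy: price ~ 1 + bedrooms + land + contribution.
X = np.column_stack([np.ones(n), bedrooms, land_m2, contribution])
beta, *_ = np.linalg.lstsq(X, price, rcond=None)

pass_through = beta[3]  # estimated $ of price per $ of contribution
print(f"estimated pass-through: {pass_through:.2f}")  # close to 4.0
```

A pass-through coefficient near 4 corresponds to the roughly 400% over-passing magnitude the abstract reports; a coefficient of 1 would mean the levy is passed on dollar-for-dollar.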
Abstract:
This submission addresses the problem of housing price inflation, the chronic under-supply of new housing stock, and the resultant decline in housing affordability for low- and middle-income households. It specifically focusses on the supply of medium density housing (multi-unit development) in Melbourne, although we believe that the observations made about housing supply in Melbourne are relevant to other urban centres and to other types of housing supply. In terms of medium density housing (MDH), our concern also extends to poor quality and design. Why the market tends to deliver generic apartments of poor quality and design, uncompetitive with lower density housing and amenity despite planning objectives, and how this apparently intractable problem can be overcome, is the topic of this submission...
Abstract:
Reduced economic circumstances have moved management goals towards higher profit, rather than maximum sustainable yields, in several Australian fisheries. The eastern king prawn is one such fishery, for which we have developed new methodology for stock dynamics, calculation of model-based and data-based reference points, and management strategy evaluation. The fishery is notable for the northward movement of prawns in eastern Australian waters, from the State jurisdiction of New South Wales to that of Queensland, as they grow to spawning size, so that vessels fishing in the northern deeper waters harvest more large prawns. Bioeconomic fishing data were standardized for calibrating a length-structured spatial operating model. Model simulations identified that reduced boat numbers and fishing effort could improve profitability while retaining viable fishing in each jurisdiction. Simulations also identified catch rate levels that were effective for monitoring in simple within-year effort-control rules. However, favourable performance of catch rate indicators was achieved only when a meaningful upper limit was placed on total allowed fishing effort. The methods and findings will allow improved measures for monitoring fisheries and inform decision makers on the uncertainty and assumptions affecting economic indicators.
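The interplay of catch-rate monitoring and an effort cap can be sketched generically. The rule below is a hypothetical illustration of the general kind of within-year effort-control rule the abstract mentions, not the rule actually evaluated in the eastern king prawn study; the functional form and numbers are assumptions.

```python
def next_effort(current_effort: float, cpue: float,
                target_cpue: float, effort_cap: float) -> float:
    """Scale allowed effort by observed catch rate (CPUE) relative to a
    target level, but never let it exceed the upper effort limit."""
    proposed = current_effort * (cpue / target_cpue)
    return min(proposed, effort_cap)

# If catch rates fall below target, allowed effort contracts:
print(next_effort(1000.0, cpue=40.0, target_cpue=50.0, effort_cap=1200.0))  # 800.0

# High catch rates would inflate effort without a meaningful cap,
# which is why the cap matters for the indicator to perform well:
print(next_effort(1000.0, cpue=75.0, target_cpue=50.0, effort_cap=1200.0))  # 1200.0
```

The second call shows the point the abstract makes: the catch-rate indicator only behaves sensibly when a binding upper limit on total allowed effort is in place.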
Abstract:
We report the results of two studies of aspects of the consistency of truncated nonlinear integral equation based theories of freezing: (i) We show that the self-consistent solutions to these nonlinear equations are unfortunately sensitive to the level of truncation. For the hard sphere system, if the Wertheim–Thiele representation of the pair direct correlation function is used, the inclusion of part but not all of the triplet direct correlation function contribution, as has been common, worsens the predictions considerably. We also show that the convergence of the solutions found, with respect to number of reciprocal lattice vectors kept in the Fourier expansion of the crystal singlet density, is slow. These conclusions imply great sensitivity to the quality of the pair direct correlation function employed in the theory. (ii) We show the direct correlation function based and the pair correlation function based theories of freezing can be cast into a form which requires solution of isomorphous nonlinear integral equations. However, in the pair correlation function theory the usual neglect of the influence of inhomogeneity of the density distribution on the pair correlation function is shown to be inconsistent to the lowest order in the change of density on freezing, and to lead to erroneous predictions. The Journal of Chemical Physics is copyrighted by The American Institute of Physics.
Abstract:
This study examines both theoretically and empirically how well the theories of Norman Holland, David Bleich, Wolfgang Iser and Stanley Fish can explain readers' interpretations of literary texts. The theoretical analysis concentrates on their views on language from the point of view of Wittgenstein's Philosophical Investigations. This analysis shows that many of the assumptions related to language in these theories are problematic. The empirical data show that readers often form very similar interpretations. Thus the study challenges the common assumption that literary interpretations tend to be idiosyncratic. The empirical data consist of freely worded written answers to questions on three short stories. The interpretations were made by 27 Finnish university students. Some of the questions addressed issues that were discussed in large parts of the texts; some referred to issues that were mentioned only in passing or implied. The short stories were "The Witch à la Mode" by D. H. Lawrence, "Rain in the Heart" by Peter Taylor and "The Hitchhiking Game" by Milan Kundera. According to Fish, readers create both the formal features of a text and their interpretation of it according to an interpretive strategy. People who agree form an interpretive community. However, a typical answer usually contains ideas repeated by several readers as well as observations not mentioned by anyone else. Therefore it is very difficult to determine which readers belong to the same interpretive community. Moreover, readers with opposing opinions often seem to pay attention to the same textual features and even acknowledge the possibility of an opposing interpretation; therefore they do not seem to create the formal features of the text in different ways. Iser suggests that an interpretation emerges from the interaction between the text and the reader when the reader determines the implications of the text and in this way fills the "gaps" in the text.
Iser believes that the text guides the reader, but as he also believes that meaning is on a level beyond words, he cannot explain how the text directs the reader. The similarity in the interpretations and the fact that the agreement is strongest when related to issues that are discussed broadly in the text do, however, support his assumption that readers are guided by the text. In Bleich's view, all interpretations have personal motives and each person has an idiosyncratic language system. The situation where a person learns a word determines the most important meaning it has for that person. In order to uncover the personal etymologies of words, Bleich asks his readers to associate freely on the basis of a text and note down all the personal memories and feelings that the reading experience evokes. Bleich's theory of the idiosyncratic language system seems to rely on a misconceived notion of the role that ostensive definitions have in language use. The readers' responses show that spontaneous associations to personal life seem to colour the readers' interpretations, but such instances are rather rare. According to Holland, an interpretation reflects the reader's identity theme. Language use is regulated by shared rules, but everyone follows the rules in his or her own way. Words mean different things to different people. The problem with this view is that if there is any basis for language use, it seems to be the shared way of following linguistic rules. Wittgenstein suggests that our understanding of words is related to the shared ways of using words and our understanding of human behaviour. This view seems to give better grounds for understanding similarity and differences in literary interpretations than the theories of Holland, Bleich, Fish and Iser.
Abstract:
It is shown that the Fayet-Iliopoulos D-term in N = 1 supersymmetric spontaneously broken U(1) gauge theories may get one-loop corrections, even when the trace of the U(1) charges is zero. However, these corrections are only logarithmically divergent and hence do not affect the naturalness of the theory.
Abstract:
This study concentrates on the contested concept of pastiche in literary studies. It offers the first detailed examination of the history of the concept from its origins in the seventeenth century to the present, showing how pastiche emerged as a critical concept in interaction with the emerging conception of authorial originality and the copyright laws protecting it. One of the key results of this investigation is the contextualisation of the postmodern debate on pastiche. Even though postmodern critics often emphasise the radical novelty of pastiche, they in fact resuscitate older positions and arguments without necessarily reflecting on their historical conditions. This historical background is then used to analyse the distinction between the primarily French conception of pastiche as the imitation of style and the postmodern notion of it as the compilation of different elements. The latter's vagueness and inclusiveness detracts from its value as a critical concept. The study thus concentrates on the notion of stylistic pastiche, challenging the widespread prejudice that it is merely an indication of lack of talent. Because it is multiply based on repetition, pastiche is in fact a highly ambiguous or double-edged practice that calls into question the distinction between repetition and original, thereby undermining the received notion of individual unique authorship as a fundamental aesthetic value. Pastiche does not, however, constitute a radical upheaval of the basic assumptions on which the present institution of literature relies, since, in order to mark its difference, pastiche always refers to a source outside itself against which its difference is measured. Finally, the theoretical analysis of pastiche is applied to literary works.
The pastiches written by Marcel Proust demonstrate how pastiche can become an integral part of a writer's poetics: imitation of style is shown to provide Proust with a way of exploring the role of style as a connecting point between inner vision and reality. The pastiches of the Sherlock Holmes stories by Michael Dibdin, Nicholas Meyer and the duo of Adrian Conan Doyle and John Dickson Carr illustrate the functions of pastiche within a genre, detective fiction, that is itself fundamentally repetitive. A.S. Byatt's Possession and D.M. Thomas's Charlotte use Victorian pastiches to investigate the conditions of literary creation in the age of postmodern suspicion of creativity and individuality. The study thus argues that the concept of pastiche has valuable insights to offer to literary criticism and theory, and that literary pastiches, though often dismissed in reviews and criticism, are a particularly interesting object of study precisely because of their characteristic ambiguity.