129 results for happiness, utility functions, correlation analysis, personal income, economic models
in Queensland University of Technology - ePrints Archive
Relative income, happiness, and utility: an explanation for the Easterlin paradox and other puzzles
Abstract:
The well-known Easterlin paradox points out that average happiness has remained constant over time despite sharp rises in GNP per head. At the same time, a micro literature has typically found positive correlations between individual income and individual measures of subjective well-being. This paper suggests that these two findings are consistent with the presence of relative income terms in the utility function. Income may be evaluated relative to others (social comparison) or to oneself in the past (habituation). We review the evidence on relative income from the subjective well-being literature. We also discuss the relation (or not) between happiness and utility, and review some non-happiness research (behavioral, experimental, neurological) related to income comparisons. We finally consider how relative income in the utility function can affect economic models of behavior in the domains of consumption, investment, economic growth, savings, taxation, labor supply, wages, and migration.
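The resolution of the paradox described above can be illustrated with a toy utility function. This is a hypothetical sketch, not the paper's own specification: when utility depends on income relative to a comparison level, uniform income growth leaves measured happiness flat, while richer individuals still report higher well-being in cross-section.

```python
import numpy as np

def utility(y, y_ref, alpha=1.0):
    """Toy utility with a relative-income term: log own income minus
    alpha times log of a comparison income (others' average, or own past)."""
    return np.log(y) - alpha * np.log(y_ref)

incomes = np.array([1.0, 2.0, 4.0])

# Cross-section: against the same comparison level, richer people are happier.
u_now = utility(incomes, incomes.mean())

# Time series: doubling everyone's income also doubles the comparison level,
# so with full comparison (alpha = 1) measured happiness does not move.
u_later = utility(2 * incomes, (2 * incomes).mean())
```

With `alpha` between 0 and 1 (partial comparison), growth still raises average happiness, just by less than the income gain alone would suggest.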
Abstract:
Spectroscopic studies of complex clinical fluids have made a more holistic approach to their chemical analysis increasingly popular and widely employed. The efficient and effective interpretation of multidimensional spectroscopic data relies on many chemometric techniques, one group of which comprises the so-called correlation analysis methods. Typical of these techniques are two-dimensional correlation analysis and statistical total correlation spectroscopy (STOCSY). Whilst the former has largely been applied to optical spectroscopic analysis, STOCSY was developed for, and has been applied almost exclusively to, NMR metabonomic studies. Using a 1H NMR study of human blood plasma from subjects recovering from exhaustive exercise trials, the basic concepts and applications of these techniques are examined. Typical information from their application to NMR-based metabonomics is presented and their value in aiding interpretation of NMR data obtained from biological systems is illustrated. Major energy metabolites are identified in the NMR spectra and the dynamics of their appearance in and removal from plasma during exercise recovery are illustrated and discussed. The complementary nature of two-dimensional correlation analysis and statistical total correlation spectroscopy is highlighted.
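At its core, STOCSY is a correlation of one spectral variable (a "driver" peak) against every other variable across the sample set; peaks belonging to the same metabolite correlate strongly. A minimal sketch, assuming the spectra are stored as a samples × chemical-shift-points matrix (the function name and data layout are illustrative, not taken from the paper):

```python
import numpy as np

def stocsy(spectra, driver_idx):
    """Correlate intensity at one chemical-shift index (the driver peak)
    with intensity at every other index, across all spectra."""
    X = spectra - spectra.mean(axis=0)          # centre each variable
    driver = X[:, driver_idx]
    cov = X.T @ driver / (X.shape[0] - 1)       # covariance with the driver
    return cov / (X.std(axis=0, ddof=1) * driver.std(ddof=1))
```

Points from the same molecule show correlations near 1; computing this for every driver index in turn yields a full two-dimensional correlation map.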
Abstract:
Acoustic sensors play an important role in augmenting the traditional biodiversity monitoring activities carried out by ecologists and conservation biologists. With this ability, however, comes the burden of analysing large volumes of complex acoustic data. Given this complexity, fully automated analysis for a wide range of species is still a significant challenge. This research investigates the use of citizen scientists to analyse large volumes of environmental acoustic data in order to identify bird species. Specifically, it investigates ways in which a user's efficiency can be improved through species identification tools and through reputation models that predict the accuracy of users with unknown skill levels. Initial experimental results are reported.
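One simple reputation model of the general kind alluded to above is a Beta-binomial smoother over a user's past agreement with verified labels. This is our illustrative assumption, not the paper's actual model; the prior keeps brand-new users near an accuracy estimate of 0.5 until evidence accumulates:

```python
def reputation(correct, total, prior_correct=1.0, prior_total=2.0):
    """Smoothed accuracy estimate for a citizen scientist: the posterior
    mean of a Beta(1, 1) prior updated with observed labelling outcomes."""
    return (correct + prior_correct) / (total + prior_total)
```

A user with no history scores 0.5; as verified labels accumulate, the observed accuracy dominates the prior, so predictions for experienced users track their actual skill.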
Abstract:
Non-invasive vibration analysis has been used extensively to monitor the progression of dental implant healing and stabilization. It is now being considered as a method to monitor femoral implants in transfemoral amputees. This paper evaluates two modal analysis excitation methods and investigates their capabilities in detecting changes at the interface between the implant and the bone that occur during osseointegration. Excitation of bone-implant physical models with the electromagnetic shaker provided higher coherence values and a greater number of modes over the same frequency range when compared to the impact hammer. Differences were detected in the natural frequencies and fundamental mode shape of the model when the fit of the implant was altered in the bone. The ability to detect changes in the model dynamic properties demonstrates the potential of modal analysis in this application and warrants further investigation.
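Detecting the frequency shifts described above reduces, in the simplest case, to peak-picking on the magnitude spectrum of the measured response. A minimal sketch (the sampling rate and signal names are illustrative):

```python
import numpy as np

def dominant_frequency(response, fs):
    """Estimate the dominant natural frequency of a vibration response
    by locating the peak of its magnitude spectrum (DC bin excluded)."""
    spectrum = np.abs(np.fft.rfft(response))
    freqs = np.fft.rfftfreq(len(response), d=1.0 / fs)
    return freqs[spectrum[1:].argmax() + 1]
```

A stiffening bone-implant interface during osseointegration would show up as this peak migrating upward between measurement sessions; in practice modal analysis also extracts damping and mode shapes, which this sketch omits.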
Abstract:
This research paper aims to develop a method to explore the differences in travel behaviour between disadvantaged and non-disadvantaged populations. It also aims to develop a modelling approach, or framework, for integrating disadvantage analysis into transportation planning models (TPMs). The methodology identifies significantly disadvantaged groups through a cluster analysis, and the paper presents a disadvantage-integrated TPM. This model could be useful for determining areas with concentrated disadvantaged populations and for developing and formulating disadvantage-sensitive policies. For the covering entry of this conference, please see ITRD abstract no. E214666.
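The group-identification step rests on standard cluster analysis. A minimal k-means sketch over socio-economic indicator vectors (the deterministic initialisation and variable names are our simplification, not the paper's procedure):

```python
import numpy as np

def kmeans(points, k, iters=20):
    """Plain k-means: group observations (e.g. household indicator
    vectors) into k clusters by iterative centre refinement."""
    centres = points[:k].copy()                    # simple deterministic init
    for _ in range(iters):
        # assign each observation to its nearest centre
        d = ((points[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        # move each centre to the mean of its members
        centres = np.array([points[labels == j].mean(axis=0) for j in range(k)])
    return labels, centres
```

In a disadvantage analysis, the cluster whose centre sits lowest on indicators such as income, car ownership, or service access would be flagged as the disadvantaged group for the integrated TPM.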
Abstract:
The importance of actively managing and analysing business processes is acknowledged more than ever in organisations nowadays. Business processes form an essential part of an organisation and their application areas are manifold. Most organisations keep records of various activities that have been carried out for auditing purposes, but they are rarely used for analysis purposes. This paper describes the design and implementation of a process analysis tool that replays, analyses and visualises a variety of performance metrics using a process definition and its corresponding execution logs. The replayer uses a YAWL process model example to demonstrate its capacity to support advanced language constructs.
Abstract:
The importance of actively managing and analyzing business processes is acknowledged more than ever in organizations nowadays. Business processes form an essential part of an organization and their application areas are manifold. Most organizations keep records of various activities that have been carried out for auditing purposes, but they are rarely used for analysis purposes. This paper describes the design and implementation of a process analysis tool that replays, analyzes and visualizes a variety of performance metrics using a process definition and its execution logs. Performing performance analysis on existing and planned process models offers organizations a powerful way to detect bottlenecks within their processes, allowing them to make more effective process improvement decisions. Our technique is applied to processes modeled in the YAWL language. Execution logs of process instances are compared against the corresponding YAWL process model and replayed in a robust manner, taking into account any noise in the logs. Finally, performance characteristics, obtained from replaying the log in the model, are projected onto the model.
Abstract:
Traditional crash prediction models, such as generalized linear regression models, are incapable of taking into account the multilevel data structure that extensively exists in crash data. Disregarding the possible within-group correlations can lead to models that give unreliable and biased estimates of unknowns. This study proposes a five-level hierarchy (geographic region level – traffic site level – traffic crash level – driver-vehicle unit level – vehicle-occupant level), crossed with a time level, to establish a general form of multilevel data structure in traffic safety analysis. To properly model the potential cross-group heterogeneity due to the multilevel data structure, a framework of Bayesian hierarchical models that explicitly specify the multilevel structure and correctly yield parameter estimates is introduced and recommended. The proposed method is illustrated in an individual-severity analysis of intersection crashes using Singapore crash records. The study demonstrates the importance of accounting for within-group correlations, and the flexibility and effectiveness of the Bayesian hierarchical method in modeling the multilevel structure of traffic crash data.
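The key mechanic of the hierarchical approach is partial pooling: group-level estimates are shrunk toward the overall mean in proportion to how little data the group has. In the conjugate normal case this has a closed form. The sketch below is a didactic two-level model with known variances, far simpler than the full Bayesian model the study uses:

```python
import numpy as np

def shrunken_group_means(y, groups, sigma2, tau2, mu):
    """Posterior means of group effects in a two-level normal model:
    y_ij ~ N(theta_j, sigma2), theta_j ~ N(mu, tau2).
    Sparse groups are pulled strongly toward the grand mean mu."""
    estimates = {}
    for g in np.unique(groups):
        yg = y[groups == g]
        n = len(yg)
        w = (n / sigma2) / (n / sigma2 + 1.0 / tau2)   # data precision weight
        estimates[g] = w * yg.mean() + (1.0 - w) * mu
    return estimates
```

In a crash setting the groups would be, for example, intersections nested within regions; the same pooling logic applies at every level of the hierarchy, which is what guards against the biased estimates that unpooled models produce.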
Abstract:
Developers and policy makers are consistently at odds in the debate over whether impact fees increase house prices. This debate continues despite the extensive body of theoretical and empirical international literature on the passing of impact fees on to home buyers and the corresponding increase in housing prices. In attempting to quantify this impact, over a dozen empirical studies have been carried out in the US and Canada since the 1980s. However, the methodologies used vary greatly, as do the results. Despite similar infrastructure funding policies in numerous developed countries, no such empirical work exists outside the US and Canada. The purpose of this research is to analyse the existing econometric models in order to identify, compare and contrast the theoretical bases, methodologies, key assumptions and findings of each. This will assist in identifying whether further model development is required and whether any of these models have external validity and are readily transferable outside the US. The findings conclude that there is very little explicit rationale behind the various model selections and that significant model deficiencies appear still to exist.
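Most of the surveyed studies estimate the fee effect with some form of hedonic price regression. A stripped-down OLS sketch; the variable names and the linear functional form are illustrative, and the reviewed models differ considerably on both counts:

```python
import numpy as np

def fee_premium(prices, features, fee_dummy):
    """Hedonic regression of sale price on dwelling features plus an
    impact-fee indicator; returns the estimated fee coefficient."""
    X = np.column_stack([np.ones(len(prices)), features, fee_dummy])
    beta, *_ = np.linalg.lstsq(X, prices, rcond=None)
    return beta[-1]
```

A coefficient on the fee indicator above the fee amount itself is the "over-passing" result some US studies report; the divergent findings in the literature largely come down to what sits in `features` and how the fee variable is measured.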
Abstract:
Process mining encompasses the research area concerned with knowledge discovery from information system event logs. Within this area, two prominent tasks can be discerned. First, process discovery deals with the automatic construction of a process model from an event log. Second, conformance checking focuses on assessing the quality of a discovered or designed process model with respect to the actual behavior captured in event logs. To this end, multiple techniques and metrics have been developed and described in the literature. However, the process mining domain still lacks a comprehensive framework for assessing the goodness of a process model from a quantitative perspective. In this study, we describe the architecture of an extensible framework within ProM that allows for the consistent, comparative and repeatable calculation of conformance metrics. Such a framework is considered of great value for the development and assessment of both process discovery and conformance techniques.
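A toy version of the conformance idea: represent the model as an allowed directly-follows relation and score each log move against it. Real conformance metrics, such as token-replay fitness in ProM, are considerably richer; this is only the skeleton of the fitness dimension:

```python
def replay_fitness(log, allowed, start="start"):
    """Fraction of log moves the model can reproduce, where 'allowed'
    maps each activity to the set of activities permitted to follow it."""
    ok = total = 0
    for trace in log:
        prev = start
        for act in trace:
            total += 1
            if act in allowed.get(prev, set()):
                ok += 1
            prev = act
    return ok / total if total else 1.0
```

A fitness of 1.0 means every logged behavior fits the model; quantitative frameworks of the kind described above compute several such metrics (fitness, precision, generalization) side by side so that models can be compared on a common footing.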
Abstract:
A range of authors from the risk management, crisis management, and crisis communications literature have proposed different models as means of understanding the components of crisis. A common element across these sources is a focus on preparedness practices before disturbance events and response practices during events. This paper provides a critical analysis of three key explanatory models of how crises escalate, highlighting the strengths and limitations of each approach. The paper introduces an optimised conceptual model drawing on components of the previous work, organised under the four phases of pre-event, response, recovery, and post-event. Within these four phases, a ten-step process is introduced that can enhance understanding of the progression of distinct stages of disturbance for different types of events. This crisis evolution framework is examined as a means of providing clarity and applicability to a range of infrastructure failure contexts and a path for further empirical investigation in this area.
Abstract:
Using established strategic management and business model frameworks, we map the evolution of universities in the context of their value proposition to students as consumers of their products. We argue that universities have, in the main, transitioned over time from a value-based business model to an efficiency-based business model that, for numerous reasons, is rapidly becoming unsustainable. We further argue that future university business models would benefit from a reconfiguration towards a network-value-based model. This approach requires a revised set of perceived benefits, better aligned with current and future expectations, and an alternative approach to the delivery of those benefits to learner/consumers.
Abstract:
Despite the advances that have been made in the valuation of commercial, industrial and retail property, there has not been the same progress in the valuation of rural property. Although the majority of rural property valuations require the valuer to carry out a full analysis of the economic performance of the farming operations, this information is rarely used to assess the value of the property, not even as a secondary valuation method. Over the past 20 years the nature of rural valuation practice has required rural valuers to undertake studies in both agriculture (farm management) and valuation, especially when carrying out valuation work for financial institutions. The additional farm financial information obtained by rural valuers exceeds the level of information required to value commercial, retail and industrial property by the capitalisation of net rent/profit method, and is very similar to the level of information required to value commercial and retail property by the discounted cash flow method. On this basis, valuers specialising in rural practice should have the necessary skills and information to value rural properties by an income valuation method. Although the direct comparison method has been sufficient in the past to value rural properties, its future use as the main valuation method is limited, and valuers need to adopt an income method as at least a secondary valuation method to overcome the problems associated with relying on direct comparison alone. This paper reviews the results of an extensive survey of rural property valuers in New South Wales (NSW), Australia, on the impact of farm management on rural property values and rural property income potential.
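The income methods the paper advocates reduce to two familiar calculations: direct capitalisation (value = net income / capitalisation rate) and discounted cash flow. A minimal DCF sketch, where the flat discount rate and single terminal value are illustrative assumptions rather than a prescribed rural methodology:

```python
def dcf_value(net_cashflows, discount_rate, terminal_value=0.0):
    """Present value of projected annual net farm income plus a
    discounted terminal (resale) value at the end of the hold period."""
    pv = sum(cf / (1 + discount_rate) ** t
             for t, cf in enumerate(net_cashflows, start=1))
    return pv + terminal_value / (1 + discount_rate) ** len(net_cashflows)
```

Used as a secondary method, a value produced this way from the farm's financial analysis gives the valuer an income-based cross-check on the figure derived from comparable sales.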