A modified inverse integer Cholesky decorrelation method and its performance on ambiguity resolution
Abstract:
One of the research focuses in the integer least squares problem is the decorrelation technique, which reduces the number of integer parameter search candidates and improves the efficiency of the integer parameter search method. Decorrelation remains a challenging issue in determining carrier phase ambiguities and plays a critical role in the future of high-precision GNSS positioning. Currently, three main decorrelation techniques are employed: integer Gaussian decorrelation, the Lenstra–Lenstra–Lovász (LLL) algorithm and the inverse integer Cholesky decorrelation (IICD) method. Although the performance of these three state-of-the-art methods has been proved and demonstrated, there is still potential for further improvement. The condition number is usually used as the criterion for measuring the performance of decorrelation techniques. Additionally, the number of grid points in the search space can be used directly as a performance measure, since it denotes the size of the search space. However, a smaller initial volume of the search ellipsoid does not always correspond to a smaller number of candidates. This research proposes a modified inverse integer Cholesky decorrelation (MIICD) method which improves decorrelation performance over the three existing techniques. The decorrelation performance of these methods was evaluated using the condition number of the decorrelation matrix, the number of search candidates and the initial volume of the search space. Additionally, the success rate of the decorrelated ambiguities was calculated for all methods to investigate ambiguity validation performance. The performance of the different decorrelation methods was tested and compared using both simulated and real data. The simulation scenarios employ an isotropic probabilistic model with a predetermined eigenvalue and no geometry or weighting-system constraints. The MIICD method outperformed the other three methods, improving conditioning over the LAMBDA method by 78.33% and 81.67% without and with the eigenvalue constraint respectively. The real-data scenarios cover both a single-constellation case and a dual-constellation case. Experimental results demonstrate that, compared with LAMBDA, the MIICD method reduces the condition number by 78.65% and 97.78% in the single-constellation and dual-constellation cases respectively. It also reduces the number of search candidate points by 98.92% and 100% in the single-constellation and dual-constellation cases.
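To make the criterion concrete, here is a minimal sketch (not the MIICD method itself) of a single integer Gaussian decorrelation step applied to a toy 2x2 ambiguity variance-covariance matrix, showing how the condition number used above as the performance measure shrinks; the matrix values are hypothetical.

```python
import numpy as np

# Toy ambiguity variance-covariance matrix (hypothetical values);
# the strong correlation elongates the integer search ellipsoid.
Q = np.array([[53.40, 38.40],
              [38.40, 28.00]])

def condition_number(Q):
    """Ratio of the largest to the smallest eigenvalue -- the
    criterion used to compare decorrelation techniques."""
    w = np.linalg.eigvalsh(Q)
    return w.max() / w.min()

# One integer Gaussian decorrelation step: reduce the correlation
# between the two ambiguities with an integer-rounded factor, so the
# transform (and its inverse) maps integer vectors to integer vectors.
mu = round(Q[0, 1] / Q[1, 1])
Z = np.array([[1, -mu],
              [0, 1]])
Q_dec = Z @ Q @ Z.T

print("condition number before:", condition_number(Q))      # ~318
print("condition number after :", condition_number(Q_dec))  # ~50
```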
Abstract:
The Future of Financial Regulation is an edited collection of papers presented at a major conference at the University of Glasgow in Spring 2009. It draws together a variety of perspectives on the international financial crisis which began in August 2007 and later turned into a more widespread economic crisis following the collapse of Lehman Brothers in the Autumn of 2008. Spring 2009 was in many respects the nadir, since valuations in financial markets had reached their low point and crisis management, rather than regulatory reform, was the main focus of attention. The conference and book were deliberately framed as an attempt to re-focus attention from the former to the latter. The first part of the book focuses on the context of the crisis, discussing the general characteristics of financial crises and the specific influences that were at work during this period. The second part focuses more specifically on the regulatory techniques and practices implicated in the crisis, noting in particular an over-reliance on the capacity of regulators and financial institutions to manage risk and on the capacity of markets to self-correct. The third part focuses on the role of governance and ethics in the crisis, in particular the need for a common ethical framework to underpin governance practices and to provide greater clarity in the design of accountability mechanisms. The final part focuses on the trajectory of regulatory reform, noting the considerable potential for change resulting from the role of the state in the rescue and recuperation of the financial system and stressing the need for a fundamental re-appraisal of business and regulatory models. This informative book will be of interest to financial regulators and theorists, commercial and financial law practitioners, and academics involved in the law and economics of regulation.
Abstract:
Lignocellulosic waste materials are the most promising feedstock for generation of a renewable, carbon-neutral substitute for existing liquid fuels. The development of value-added products from lignin will greatly improve the economics of producing liquid fuels from biomass. This review gives an outline of lignin chemistry, describes the current processes of lignocellulosic biomass fractionation and the lignin products obtained through these processes, and then outlines current and potential value-added applications of these products, in particular as components of polymer composites.
Research highlights: The use of lignocellulosic biomass to produce platform chemicals and industrial products enhances the sustainability of natural resources and improves environmental quality by reducing greenhouse and toxic emissions. In addition, the development of lignin-based products improves the economics of producing liquid transportation fuel from lignocellulosic feedstock. Value adding can be achieved by converting lignin to functionally equivalent products that rely on its intrinsic properties. This review outlines lignin chemistry and some potential high-value products that can be made from lignin.
Keywords: Lignocellulose materials; Lignin chemistry; Application
Abstract:
The explanation of social inequalities in education is still a debated issue in economics. Recent empirical studies tend to downplay the potential role of credit constraints. This article tests a different potential explanation of social inequalities in education, specifically that social differences in aspiration levels result in different educational choices. This explanation has long existed in the sociology of education, and it can be justified if aspiration levels are seen as reference points in a prospect theory framework. In order to test this explanation, this article applies the methods of experimental economics to the issue of educational choice and behaviour. One hundred and twenty-nine individuals participated in an experiment in which they had to perform a task over 15 stages grouped in three blocks or levels. In order to continue through the experiment, a minimum level of success was required at the end of each level. Rewards depended on the final level successfully reached. At the end of each level, participants could either choose to stop and take their reward or pay a cost to continue further in order to possibly receive higher rewards. To test the impact of aspiration levels, outcomes were presented either as gains or as losses relative to an initial sum. In accordance with the theoretical predictions, participants in the loss framing group chose to go further in the experiment. There was also a significant and interesting gender effect in the loss framing treatment, such that males performed better and reached higher levels.
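A hedged illustration of the mechanism the framing manipulation relies on: under a prospect-theory value function, outcomes coded as losses relative to a reference point weigh more heavily than equivalent gains, which is consistent with loss-framed participants paying to continue. The functional form below is the standard Kahneman-Tversky one with commonly cited parameter values; it is not estimated from this experiment.

```python
def value(x, alpha=0.88, lam=2.25):
    """Prospect-theory value function: outcomes are judged relative
    to a reference point, and losses loom larger than gains (lam > 1)."""
    return x**alpha if x >= 0 else -lam * (-x)**alpha

# The same objective outcome under the two framings (hypothetical payoff):
print(value(40))   # ~25.7: stopping framed as a gain above a zero reference
print(value(-40))  # ~-57.8: stopping framed as a loss from an initial sum
                   # hurts more, pushing loss-framed participants to continue
```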
Abstract:
Throughout the twentieth century the economies of the Middle East rose and fell many times in response to the external environment, including European de-colonization and competition between the US and the former USSR to provide military and economic aid after World War II. Throughout these upheavals the Middle East has remained internationally significant, politically and economically, not least for the region's large reserves of oil and gas, as discussed in the Introduction to this volume. In recent decades, Western nations have moved to invest in the Middle East's rapidly developing technology, tourism and education industries. For its part, Iran has been the world's fourth-largest provider of petroleum and second-largest provider of natural gas and, despite years of political unrest, has expanded rapidly into information technology and telecommunications. Increased involvement in the global economy has meant that Iran has invested heavily in education and training and moved to modernize its management practices. Hitherto there has been little academic research into management in either Western or local organizations in Iran. This chapter seeks to address that gap in knowledge by exploring business leadership in Iran, with particular reference to cultural and institutional impacts.
Abstract:
Advances in information and communication technologies have brought about an information revolution, leading to fundamental changes in the way information is collected or generated, shared and distributed. The internet and digital technologies are re-shaping research, innovation and creativity. Economic research has highlighted the importance of information flows and the availability of information for access and re-use. Information is crucial to the efficiency of markets, and enhanced information flows promote creativity, innovation and productivity. There is a rapidly expanding body of literature which supports the economic and social benefits of enabling access to and re-use of public sector information. (Note that a substantial research project associated with QUT’s Intellectual Property: Knowledge, Culture and Economy (IPKCE) Research Program is engaged in a comprehensive study and analysis of the literature on the economics of access to public sector information.)
Abstract:
This paper provides an analysis of why many ‘stars’ tend to fade away rather than enjoying ongoing branding advantages from their reputations. We propose a theory of market overshooting in creative industries that is based on Schumpeterian competition between producers to maintain the interest of boundedly rational fans. As creative producers compete by offering further artistic novelty, this escalation of product complexity eventually leads to overshooting. We propose this as a theory of endogenous cycles in the creative industries.
Abstract:
The standard approach to tax compliance applies the economics-of-crime methodology pioneered by Becker (1968): in its first application, due to Allingham and Sandmo (1972), it models the behaviour of agents as a decision involving a choice of the extent of their income to report to the tax authorities, given a certain institutional environment represented by parameters such as the probability of detection and the penalties applied in the event the agent is caught. While this basic framework yields important insights on tax compliance behaviour, it has some critical limitations. Specifically, it indicates a level of compliance that is significantly below what is observed in the data. This thesis revisits the original framework with a view towards addressing this issue, and examining the political economy implications of tax evasion for progressivity in the tax structure. The approach followed involves building a macroeconomic, dynamic equilibrium model for the purpose of examining these issues, using a step-wise model building procedure that starts with some very simple variations of the basic Allingham and Sandmo construct, which are eventually integrated into a dynamic general equilibrium overlapping generations framework with heterogeneous agents. One of the variations involves incorporating the Allingham and Sandmo construct into a two-period model of a small open economy of the type originally attributed to Fisher (1930). A further variation of this simple construct involves allowing agents to initially decide whether to evade taxes or not. In the event they decide to evade, the agents then have to decide the extent of income or wealth they wish to under-report. We find that the ‘evade or not’ assumption has strikingly different and more realistic implications for the extent of evasion, and demonstrate that it is a more appropriate modelling strategy in the context of macroeconomic models, which are essentially dynamic in nature and involve consumption smoothing across time and across various states of nature. Specifically, since deciding to undertake tax evasion affects the consumption smoothing ability of the agent by creating two states of nature, in which the agent is either ‘caught’ or ‘not caught’, there is a possibility that their utility under certainty, when they choose not to evade, is higher than the expected utility obtained when they choose to evade. Furthermore, the simple two-period model incorporating an ‘evade or not’ choice can be used to demonstrate some strikingly different political economy implications relative to its Allingham and Sandmo counterpart. In variations of the two models that allow for voting on the tax parameter, we find that agents typically choose to vote for a high degree of progressivity by choosing the highest available tax rate from the menu of choices available to them. There is, however, a small range of inequality levels for which agents in the ‘evade or not’ model vote for a relatively low value of the tax rate. The final steps in the model building procedure involve grafting the two-period models with a political economy choice into a dynamic overlapping generations setting with more general, non-linear tax schedules and a ‘cost-of-evasion’ function that is increasing in the extent of evasion. Results based on numerical simulations of these models show further improvement in the model's ability to match empirically plausible levels of tax evasion.
In addition, the differences between the political economy implications of the ‘evade or not’ version of the model and its Allingham and Sandmo counterpart are now very striking: there is now a large range of values of the inequality parameter for which agents in the ‘evade or not’ model vote for a low degree of progressivity. This is because, in the ‘evade or not’ version of the model, low values of the tax rate encourage a large number of agents to choose the ‘not-evade’ option, so that the redistributive mechanism is more ‘efficient’ relative to situations in which tax rates are high. Some further implications of the models of this thesis relate to whether variations in the level of inequality, and parameters such as the probability of detection and the penalties for tax evasion, matter for the political economy results. We find that (i) the political economy outcomes for the tax rate are quite insensitive to changes in inequality, and (ii) the voting outcomes change in non-monotonic ways in response to changes in the probability of detection and penalty rates. Specifically, the model suggests that changes in inequality should not matter, although the political outcome for the tax rate for a given level of inequality is conditional on whether there is a large or small extent of evasion in the economy. We conclude that further theoretical research into macroeconomic models of tax evasion is required to identify the structural relationships underpinning the link between inequality and redistribution in the presence of tax evasion. The models of this thesis provide a necessary first step in that direction.
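As a companion to the narrative above, the following is a minimal numerical sketch of the two decision structures being contrasted, assuming CRRA utility, a flat tax, a penalty levied on the evaded tax, and entirely hypothetical parameter values (none taken from the thesis): it finds the Allingham-Sandmo optimal report by grid search, then applies the ‘evade or not’ check by comparing the certain utility of full compliance with the expected utility of optimal evasion.

```python
import numpy as np

# Hypothetical parameters, not taken from the thesis.
W, tau = 100.0, 0.3    # income and flat tax rate
p, s = 0.05, 2.0       # detection probability; penalty multiple on evaded tax
sigma = 2.0            # CRRA risk aversion

def u(c):
    """CRRA utility."""
    return c**(1 - sigma) / (1 - sigma)

def expected_utility(x):
    """Allingham-Sandmo choice: report x of income W; if caught,
    pay a penalty of s*tau on the undeclared part (W - x)."""
    c_free = W - tau * x                          # not caught
    c_caught = W - tau * x - s * tau * (W - x)    # caught and fined
    return (1 - p) * u(c_free) + p * u(c_caught)

# Interior Allingham-Sandmo optimum via grid search over reports.
grid = np.linspace(0.0, W, 10001)
best = grid[np.argmax([expected_utility(x) for x in grid])]

# 'Evade or not': evasion splits consumption into two states of nature,
# so the certain utility of full compliance can dominate.
evade = expected_utility(best) > u(W - tau * W)
print(f"optimal report if evading: {best:.1f}; choose to evade: {evade}")
```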
Abstract:
The suitability of Role Based Access Control (RBAC) is being challenged in dynamic environments like healthcare. In an RBAC system, a user's legitimate access may be denied if their need has not been anticipated by the security administrator at the time of policy specification. Alternatively, even when the policy is correctly specified, an authorised user may accidentally or intentionally misuse the granted permissions. At the heart of the challenge is the intrinsic unpredictability of users' operational needs, as well as their incentives to misuse permissions. In this paper we propose a novel Budget-aware Role Based Access Control (B-RBAC) model that extends RBAC with the explicit notions of budget and cost, where users are assigned a limited budget through which they pay for the cost of the permissions they need. We propose a model in which the values of resources are explicitly defined and the RBAC policy is used as a reference point to discriminate the prices of access permissions, as opposed to representing hard and fast rules for making access decisions. This approach has several desirable properties. It enables users to acquire unassigned permissions if they deem them necessary. However, users' misuse capability is always bounded by their allocated budget and is further adjustable through the discrimination of permission prices. Finally, it provides a uniform mechanism for the detection and prevention of misuse.
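The pricing mechanism can be sketched in a few lines; the roles, permissions, prices and budget below are hypothetical placeholders, and a real B-RBAC deployment would also need auditing and price administration.

```python
# Minimal sketch of the budget-and-price idea (hypothetical names/values):
# the RBAC policy discriminates prices rather than hard-coding deny rules.
ROLE_PERMISSIONS = {"nurse": {"read_record"}}
BASE_COST = {"read_record": 1.0, "write_record": 5.0}
PREMIUM = 10.0  # multiplier for permissions outside the user's role

def price(role, permission):
    """Assigned permissions are cheap; unassigned ones remain
    acquirable but at a premium."""
    base = BASE_COST[permission]
    return base if permission in ROLE_PERMISSIONS[role] else base * PREMIUM

def request(user, role, permission):
    """Grant access iff the user can pay; any misuse is therefore
    bounded by the allocated budget."""
    cost = price(role, permission)
    if user["budget"] >= cost:
        user["budget"] -= cost
        return True
    return False

alice = {"budget": 60.0}
print(request(alice, "nurse", "read_record"))   # True: assigned, costs 1
print(request(alice, "nurse", "write_record"))  # True: unassigned, costs 50
print(request(alice, "nurse", "write_record"))  # False: budget exhausted
```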
Abstract:
How to Improve Pupils' Literacy? A Cost-Effectiveness Analysis of the Action Lecture
This article presents a cost-effectiveness analysis of an innovative teaching method run in some nursery and primary schools in Paris. This project, named Action Lecture, is designed to improve pupils' abilities and taste for literacy. We report the results of an evaluation of this program. We describe the experimental protocol that was built to estimate the program's impact on several types of indicators (academic abilities, attitude toward reading, school life) by comparing the evolution of treatment schools and control schools. Data were processed following a Differences-in-Differences (DID) method. We then use the estimate of the impact on academic achievement to conduct a cost-effectiveness analysis, taking a class-size reduction program as a benchmark. The results are positive for the Action Lecture program both in terms of impact evaluation and in terms of cost-effectiveness ratio.
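For readers unfamiliar with the two computations chained here, the following sketch shows a Differences-in-Differences estimate from group means followed by a cost-effectiveness ratio against a benchmark; every number is a hypothetical placeholder, not a result from the study.

```python
def did(treat_pre, treat_post, control_pre, control_post):
    """Differences-in-Differences: the change in treatment schools
    minus the change in control schools."""
    return (treat_post - treat_pre) - (control_post - control_pre)

# Hypothetical mean test scores before and after the program.
effect = did(treat_pre=50.0, treat_post=58.0,
             control_pre=51.0, control_post=54.0)  # = 5.0 points

cost_per_pupil = 120.0            # hypothetical program cost
ratio = cost_per_pupil / effect   # cost per point of improvement

# Compare with a benchmark, e.g. a class-size reduction program.
benchmark_ratio = 400.0 / 4.0     # hypothetical cost and effect
print(f"Action Lecture: {ratio:.0f} per point; benchmark: {benchmark_ratio:.0f} per point")
```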
Abstract:
In the electricity market environment, coordinating the reliability and economics of a power system is of great significance in determining the available transfer capability (ATC). In addition, the risks associated with uncertainties should be properly addressed in the ATC determination process so as to maximize risk-benefits. Against this background, it is necessary that the ATC be optimally allocated and utilized within the relevant security constraints. First, non-sequential Monte Carlo simulation is employed to derive the probability density distribution of the ATC of designated areas, incorporating uncertainty factors. Second, on that basis, a multi-objective optimization model is formulated to determine the multi-area ATC so as to maximize risk-benefits. Then, the developed model is solved using the fast non-dominated sorting genetic algorithm (NSGA-II), which decreases the risk caused by uncertainties while coordinating the ATCs of different areas. Finally, the IEEE 118-bus test system is used to demonstrate the essential features of the developed model and the employed algorithm.
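The first step, deriving an ATC distribution by non-sequential Monte Carlo sampling, can be sketched as follows; the availabilities, capacities and stand-in ATC evaluation are hypothetical, since a real study would solve a constrained power flow for each sampled state, and the NSGA-II stage is omitted.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Hypothetical tie-line availabilities and capacities.
line_availability = np.array([0.98, 0.97, 0.99, 0.95])
line_capacity = np.array([120.0, 80.0, 100.0, 60.0])
BASE_TRANSFER = 250.0  # hypothetical existing transfer commitments

def sample_atc():
    """Non-sequential Monte Carlo: draw component states independently
    of time and evaluate the resulting ATC. A real evaluation would
    solve a security-constrained power flow here."""
    up = rng.random(line_availability.size) < line_availability
    return (line_capacity * up).sum() - BASE_TRANSFER

samples = np.array([sample_atc() for _ in range(10_000)])
print("mean ATC:", samples.mean())
print("5% quantile (risk tail):", np.quantile(samples, 0.05))
```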
Abstract:
The economics of supporting learning has seen institutional encouragement of a wide range of blended learning initiatives in face-to-face and online teaching and learning. This has become one of the key drivers for the adoption of technology in teaching, in a manner occasionally guilty of putting the cart before the horse. Learning spaces are increasingly equipped with a dizzying array of technological options, testifying to institutional and governmental investment and commitment to supporting face-to-face blended learning (QUT, 2011, C/4.2). Yet innovation within traditional learning and teaching models faces a number of challenges, both at an institutional level and at the teaching coal face. Web 2.0 technologies present a vast array of opportunities to harness and capture the attention of students in engaging learning opportunities. This presentation will explore technologies supportive of active learning pedagogies.
Abstract:
With the growing size and variety of social media files on the web, it is becoming critical to efficiently organize them into clusters for further processing. This paper presents a novel scalable constrained document clustering method that harnesses the power of search engines capable of dealing with large text data. Instead of calculating the distance between a document and all of the clusters’ centroids, a neighborhood of best cluster candidates is chosen using a document ranking scheme. To make the method faster and less memory-dependent, in-memory and in-database processing are combined in a semi-incremental manner. The method has been extensively tested in the social event detection application. Empirical analysis shows that the proposed method is efficient in both computation and memory usage while producing notable accuracy.
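A hedged sketch of the candidate-selection idea: cluster term profiles are kept in an inverted index, and each incoming document is matched only against the top-ranked candidate clusters instead of every centroid. The ranking below is a crude stand-in for a real search engine's scoring, and all names are hypothetical.

```python
from collections import defaultdict

index = defaultdict(set)  # term -> clusters whose profile contains it
profiles = {}             # cluster -> profile terms, kept for the later
                          # distance computation within the neighborhood

def add_to_index(cluster, terms):
    """Register a cluster's term profile with the inverted index."""
    profiles[cluster] = terms
    for t in terms:
        index[t].add(cluster)

def candidate_clusters(doc_terms, k=3):
    """Rank clusters by overlap with the document's terms and keep
    the top k, so distances are computed only for this neighborhood."""
    scores = defaultdict(int)
    for t in doc_terms:
        for c in index[t]:
            scores[c] += 1
    return sorted(scores, key=scores.get, reverse=True)[:k]

add_to_index("c1", {"concert", "music", "stage"})
add_to_index("c2", {"football", "match", "goal"})
print(candidate_clusters({"music", "concert", "crowd"}))  # ['c1']
```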