947 results for market segmentation theory


Relevance:

30.00%

Publisher:

Abstract:

This thesis deals with inflation theory, focusing on the model of Jarrow & Yildirim, which is nowadays used when pricing inflation derivatives. After recalling the main results on short-rate and forward interest rate models, the dynamics of the main components of the market are derived. The most important inflation-indexed derivatives are then explained (zero-coupon swap, year-on-year swap, cap and floor), and their pricing procedure is shown step by step. Calibration is explained and performed both with a common method and with a heuristic, non-standard one. The model is also enriched with credit risk, which makes it possible to take into account the possibility of bankruptcy of the counterparty to a contract. In this context, the general pricing method is derived, with the introduction of defaultable zero-coupon bonds, and the Monte Carlo method is treated in detail and used to price a concrete example of a contract. Appendices: A: martingale measures, Girsanov's theorem and the change of numeraire. B: some aspects of the theory of stochastic differential equations, in particular the solution of linear SDEs and the Feynman-Kac theorem, which shows the connection between SDEs and partial differential equations. C: some useful results about the normal distribution.
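As a small illustration of the Monte Carlo pricing step described above, the sketch below prices the inflation leg of a zero-coupon inflation-indexed swap under deliberately simplified assumptions (constant nominal and real rates, lognormal CPI). It is not the thesis's Jarrow & Yildirim implementation, and all parameter values are hypothetical.

import numpy as np

def zciis_inflation_leg(I0, T, r_n, r_r, sigma, notional=1.0, n_paths=100_000, seed=0):
    # Monte Carlo value of the payoff notional * (I(T)/I(0) - 1) paid at T,
    # assuming the CPI drifts at r_n - r_r under the nominal risk-neutral measure.
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths)
    I_T = I0 * np.exp((r_n - r_r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)
    payoff = notional * (I_T / I0 - 1.0)
    return np.exp(-r_n * T) * payoff.mean()   # discount at the nominal rate

# Sanity check: in this simplified setting the value should approach
# notional * (exp(-r_r * T) - exp(-r_n * T)).
print(zciis_inflation_leg(I0=100.0, T=5.0, r_n=0.03, r_r=0.01, sigma=0.02))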

Relevance:

30.00%

Publisher:

Abstract:

Since the advent of liberalization, airports and carriers have experienced significant changes. The greatest improvement in airport management has been a more commercial and efficient style of operation. The forms of economic regulation and the characteristics of managerial governance are investigated. Twelve countries were selected to examine the state of air transport worldwide, including both countries with mature systems and emerging ones. The distribution of traffic was analysed with the HHI index in order to highlight airports with a concentration above 0.25 (in accordance with United States regulation); the airport system was analysed with the Gini index and the dominance index. Finally, game theory proved to be a valid tool for studying the air transport market, including through the use of DP-type games.
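For reference, the two concentration measures mentioned above can be computed as in the short sketch below; the traffic figures are hypothetical and this is not the study's own code.

import numpy as np

def hhi(traffic):
    # Herfindahl-Hirschman index of traffic shares on a 0-1 scale;
    # values above 0.25 are flagged as highly concentrated in the abstract.
    shares = np.asarray(traffic, dtype=float)
    shares = shares / shares.sum()
    return float(np.sum(shares ** 2))

def gini(traffic):
    # Gini coefficient of the traffic distribution across the airports of a system.
    x = np.sort(np.asarray(traffic, dtype=float))
    n = x.size
    cum = np.cumsum(x)
    return float((n + 1 - 2 * np.sum(cum) / cum[-1]) / n)

traffic = [52_000_000, 18_000_000, 9_000_000, 4_000_000, 1_500_000]  # hypothetical passengers
print(hhi(traffic), gini(traffic))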

Relevance:

30.00%

Publisher:

Abstract:

The dissertation contains five parts: an introduction, three major chapters, and a short conclusion. The first chapter starts from a survey and discussion of the corporate law and financial development literature. The methods commonly used in these cross-sectional analyses are biased, as legal origins are no longer valid instruments; hence model uncertainty becomes a salient problem. The Bayesian Model Averaging algorithm is applied to test the robustness of the empirical results in Djankov et al. (2008). The analysis finds that their constructed legal index is not robustly correlated with most of the stock market outcome variables. The second chapter looks into the effects of minority shareholder protection in the corporate governance regime on entrepreneurs' ex ante incentives to undertake an IPO. Most of the current literature focuses on the beneficial effect of minority shareholder protection on valuation, while overlooking its private costs in terms of the entrepreneur's control. As a result, when minority shareholder protection improves, the entrepreneur trades off the costs of monitoring against the benefits of cheaper sources of finance. The theoretical predictions are empirically tested using panel data and the GMM-sys estimator. The third chapter investigates corporate law and corporate governance reform in China. Chinese corporate law regards shareholder control as a means to the end of pursuing the interests of stakeholders, which is inefficient. The chapter combines recent developments in the theory of the firm, namely team production theory and property rights theory, to address this problem. Enlightened shareholder value, which emphasizes the long-term valuation of the firm, should be adopted as the objective of listed firms. In addition, a move from the mandatory division of power between the shareholder meeting and the board to a default regime is proposed.
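The robustness exercise described for the first chapter rests on the standard Bayesian Model Averaging identities; in generic notation (not the dissertation's own), with candidate models M_1, ..., M_K:

P(M_k \mid y) = \frac{P(y \mid M_k)\,P(M_k)}{\sum_{j=1}^{K} P(y \mid M_j)\,P(M_j)}, \qquad
P(\beta \mid y) = \sum_{k=1}^{K} P(\beta \mid y, M_k)\,P(M_k \mid y),

and the robustness of a regressor such as the legal index is judged by its posterior inclusion probability, \sum_{k:\,\beta \in M_k} P(M_k \mid y).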

Relevance:

30.00%

Publisher:

Abstract:

This dissertation comprises three essays on the Turkish labor market. The first essay describes the distinctive features of the Turkish labor market, with the aim of understanding the factors behind its long-standing poor performance relative to its European counterparts. The analysis is based on a cross-country comparison among selected European Union countries. Among all the indicators of labor market flexibility, non-wage cost rigidities are regarded as one of the most important factors slowing down employment creation in Turkey. The second essay focuses on an employment subsidy policy that reduces non-wage costs through social security premium incentives granted to women and young men. Exploiting a difference-in-difference-in-differences strategy, I evaluate the effectiveness of this policy in creating employment for the target group. The results, net of the effect of the recent crisis, suggest that the policy accounts for a 1.4% to 1.6% increase in the probability of being hired for women aged 30 to 34 relative to men of the same age group in the periods shortly after the announcement of the policy. In the third essay, I analyze the labor supply response of married women to their husbands' job losses, the so-called added worker effect (AWE). I empirically test the AWE hypothesis for the global economic crisis of 2008 in the Turkish context. Identification is achieved by exploiting the exogenous variation in the output of male-dominated sectors hard-hit by the crisis and the gender segmentation that characterizes the Turkish labor market. Findings based on the instrumental variable approach suggest that the added worker effect explains up to 64% of the observed increase in female labor force participation in Turkey. The size of the effect depends on how long it takes for wives to adjust their labor supply to their husbands' job losses.
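As an illustration of the triple-difference strategy mentioned in the second essay, a generic specification (notation mine, not the dissertation's exact estimating equation) would be:

Y_{igt} = \alpha + \beta_1 \mathrm{Post}_t + \beta_2 \mathrm{Eligible}_g + \beta_3 \mathrm{Female}_i
 + \beta_4 (\mathrm{Post}_t \times \mathrm{Eligible}_g) + \beta_5 (\mathrm{Post}_t \times \mathrm{Female}_i)
 + \beta_6 (\mathrm{Eligible}_g \times \mathrm{Female}_i)
 + \delta\,(\mathrm{Post}_t \times \mathrm{Eligible}_g \times \mathrm{Female}_i)
 + X_{igt}'\gamma + \varepsilon_{igt},

where Y_{igt} is an indicator for being hired and \delta captures the subsidy effect for eligible women after the announcement.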

Relevance:

30.00%

Publisher:

Abstract:

This dissertation is modeled on the Turkish college admission procedure. It starts from the purpose of reducing the inefficiencies in the Turkish market. To this end, we propose a mechanism under a new market structure, which we prefer to call semi-centralization. In chapter 1, we give a brief summary of matching theory. We present the first examples in matching history together with the most general papers and mechanisms. In chapter 2, we propose our mechanism. In its real-life application, that is, Turkish university placement, the mechanism reduces the inefficiencies of the current system. The success of the mechanism depends on the preference profile. It is easy to show that under complete information the mechanism implements the full set of stable matchings for a given profile. In chapter 3, we refine our basic mechanism. The modification of the mechanism has a crucial effect on the results. The new mechanism is what we call a middle mechanism. On one subdomain this mechanism coincides with the original basic mechanism, while on the other it gives the same results as Gale and Shapley's algorithm. In chapter 4, we apply our basic mechanism to the well-known roommate problem. Since the roommate problem is a one-sided game, we first propose an auxiliary function to convert the game into a semi-centralized two-sided game, the framework for which our basic mechanism is designed. We show that this process succeeds in finding a stable matching whenever one exists. We also show that, using purified orderings, our mechanism easily tells us whether a profile lacks stability. Finally, we show a method to find all stable matchings when multiple stable matchings exist. The method is simply to run the mechanism for each of the top agents in the social preference.
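Chapter 3 benchmarks the proposed mechanism against Gale and Shapley's algorithm. For readers unfamiliar with that benchmark, the sketch below implements standard one-to-one deferred acceptance with unit capacities; it is not the semi-centralized mechanism proposed in the dissertation, and the agent names are hypothetical.

def deferred_acceptance(proposer_prefs, receiver_prefs):
    # proposer_prefs / receiver_prefs: dict mapping agent -> list of acceptable partners, best first.
    # Proposer-optimal deferred acceptance; each receiver holds at most one proposer (unit capacity).
    rank = {r: {p: i for i, p in enumerate(prefs)} for r, prefs in receiver_prefs.items()}
    free = list(proposer_prefs)              # proposers who still need to propose
    next_choice = {p: 0 for p in proposer_prefs}
    match = {}                               # receiver -> currently held proposer
    while free:
        p = free.pop()
        if next_choice[p] >= len(proposer_prefs[p]):
            continue                         # p has exhausted the list and stays unmatched
        r = proposer_prefs[p][next_choice[p]]
        next_choice[p] += 1
        current = match.get(r)
        if current is None:
            match[r] = p
        elif rank[r].get(p, float("inf")) < rank[r].get(current, float("inf")):
            match[r] = p                     # r trades up; the displaced proposer proposes again
            free.append(current)
        else:
            free.append(p)                   # r rejects p; p moves further down its list
    return {p: r for r, p in match.items()}

students = {"s1": ["c1", "c2"], "s2": ["c1", "c2"], "s3": ["c2", "c1"]}
colleges = {"c1": ["s2", "s1", "s3"], "c2": ["s1", "s3", "s2"]}
print(deferred_acceptance(students, colleges))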

Relevance:

30.00%

Publisher:

Abstract:

Mr. Pechersky set out to examine a specific feature of the employer-employee relationship in Russian business organisations. He wanted to study to what extent the so-called "moral hazard" problem is being solved (if it is being solved at all), whether there is a relationship between pay and performance, and whether there is a correspondence between economic theory and Russian reality. Finally, he set out to construct a model of the Russian economy that reflects the way it actually functions better than certain other well-known models do (for example, models of incentive compensation, the Shapiro-Stiglitz model, etc.). His report was presented to the RSS in the form of a series of manuscripts in English and Russian, and on disc, with many tables and graphs.

He begins by pointing out the different kinds of randomness that exist in the relationship between employee and employer. Firstly, results are frequently affected by circumstances outside the employee's control that have nothing to do with how intelligently, honestly, and diligently the employee has worked. When rewards are based on results, uncontrollable randomness in the employee's output induces randomness in their income. A second source of randomness involves outside events beyond the employee's control that may affect his or her ability to perform as contracted. A third source of randomness arises when the performance itself (rather than the result) is measured, and the performance evaluation procedures include random or subjective elements. Mr. Pechersky's study shows that in Russia the third source of randomness plays an important role.

Moreover, he points out that employer-employee relationships in Russia are sometimes the opposite of those in the West. Drawing on game theory, he characterises the Western system as follows. The two players are the principal and the agent, who are usually representative individuals. The principal hires an agent to perform a task, and the agent acquires an information advantage concerning his actions or the outside world at some point in the game, i.e. it is assumed that the employee is better informed. In Russia, on the other hand, incentive contracts are typically negotiated in situations in which the employer has the information advantage concerning the outcome. Mr. Pechersky schematises it thus: compensation (the wage) is W and consists of a base amount plus a portion that varies with the outcome x, so W = a + bx, where b measures the intensity of the incentives provided to the employee. This means that one contract is said to provide stronger incentives than another if it specifies a higher value of b. This is the incentive contract as it operates in the West. The key feature distinguishing the Russian case is that x is observed by the employer but not by the employee. The employer promises to pay in accordance with an incentive scheme, but since the outcome is not observable by the employee the contract cannot be enforced, and the question arises: is there any incentive for the employer to fulfil his or her promises?

Mr. Pechersky considers two simple models of employer-employee relationships displaying this type of information asymmetry. In a static framework the result obtained is somewhat surprising: at the Nash equilibrium the employer pays nothing, even though his objective function contains a quadratic term reflecting negative consequences for the employer if the actual level of compensation deviates from the expectations of the employee. This can lead, for example, to labour turnover or to the costs of a bad reputation. In a dynamic framework, the conclusion can be formulated as follows: the higher the discount factor, the higher the incentive for the employer to be honest in his or her relationship with the employee. If the discount factor is taken to be a parameter reflecting the degree of (un)certainty (the higher the degree of uncertainty, the lower the discount factor), we can conclude that the answer to the question posed depends on the stability of the political, social and economic situation in the country.

Mr. Pechersky believes that the strength of a market system with private property lies not just in providing the information needed to compute an efficient allocation of resources. At least equally important is the manner in which it accepts individually self-interested behaviour, but then channels this behaviour in desired directions. People do not have to be cajoled, artificially induced, or forced to do their part in a well-functioning market system. Instead, they are simply left to pursue their own objectives as they see fit. Under the right circumstances, people are led by Adam Smith's "invisible hand" of impersonal market forces to take the actions needed to achieve an efficient, co-ordinated pattern of choices. The problem, as Mr. Pechersky sees it, is that there is no reason to believe that the circumstances in Russia are right and that the invisible hand is doing its work properly. Political instability, social tension and other circumstances prevent it from doing so. Mr. Pechersky believes that the discount factor plays a crucial role in employer-employee relationships. Such relationships can be considered satisfactory from a normative point of view only in those cases where the discount factor is sufficiently large. Unfortunately, in modern Russia the evidence points to the typical discount factor being relatively small. This can be explained as a manifestation of economic agents' aversion to risk. Mr. Pechersky hopes that when political stabilisation occurs, the discount factors of economic agents will increase, and their behaviour will be explicable in terms of more traditional models.
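The contract and the dynamic honesty result summarised above can be stated compactly; the following is an illustrative, folk-theorem-style formulation in generic notation, not the exact model in the report:

W = a + b x, \qquad b \ \text{measuring the intensity of incentives,}

and, in the repeated relationship with discount factor \delta, the employer honours the promised incentive payment only if the discounted value of continuing an honest relationship outweighs the one-period gain from withholding it, schematically

\frac{\delta}{1-\delta}\, V_{\text{honest}} \;\ge\; b\,x ,

which holds for \delta sufficiently close to one, matching the conclusion that honesty becomes sustainable as the discount factor rises.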

Relevance:

30.00%

Publisher:

Abstract:

High-density oligonucleotide expression arrays are a widely used tool for the measurement of gene expression on a large scale. Affymetrix GeneChip arrays appear to dominate this market. These arrays use short oligonucleotides to probe for genes in an RNA sample. Due to optical noise, non-specific hybridization, probe-specific effects, and measurement error, ad hoc measures of expression that summarize probe intensities can lead to imprecise and inaccurate results. Various researchers have demonstrated that expression measures based on simple statistical models can provide great improvements over the ad hoc procedure offered by Affymetrix. Recently, physical models based on molecular hybridization theory have been proposed as useful tools for predicting, for example, non-specific hybridization. These physical models show great potential in terms of improving existing expression measures. In this paper we demonstrate that the system producing the measured intensities is too complex to be fully described with these relatively simple physical models, and we propose empirically motivated stochastic models that complement the above-mentioned molecular hybridization theory to provide a comprehensive description of the data. We discuss how the proposed model can be used to obtain improved measures of expression that are useful to data analysts.
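As an illustration of the kind of stochastic description referred to above, a commonly used additive decomposition of observed probe intensity (generic, not necessarily the paper's exact specification) is:

PM_{ij} = O_{ij} + N_{ij} + S_{ij},

where O_{ij} is optical noise, N_{ij} non-specific hybridization and S_{ij} the probe-specific signal for array i and probe j, with the signal often modelled on the log scale as \log_2 S_{ij} = \mu_i + \alpha_j + \varepsilon_{ij}, combining an expression level \mu_i, a probe effect \alpha_j and a measurement error term.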

Relevance:

30.00%

Publisher:

Abstract:

One of the most influential statements in the anomie theory tradition has been Merton's argument that the volume of instrumental property crime should be higher where there is a greater imbalance between the degree of commitment to monetary success goals and the degree of commitment to legitimate means of pursuing such goals. Contemporary anomie theories stimulated by Merton's perspective, most notably Messner and Rosenfeld's institutional anomie theory (IAT), have expanded the scope conditions by emphasizing lethal criminal violence as an outcome to which anomie theory is highly relevant, and virtually all contemporary empirical studies have focused on applying the perspective to explaining spatial variation in homicide rates. In the present paper, we argue that current explications of Merton's theory and IAT have not adequately conveyed the relevance of the core features of the anomie perspective to lethal violence. We propose an expanded anomie model in which an unbalanced pecuniary value system, the core causal variable in Merton's theory and IAT, translates into higher levels of homicide primarily in indirect ways: by increasing levels of firearm prevalence, drug market activity, and property crime, and by enhancing the degree to which these factors stimulate lethal outcomes. Using aggregate-level data collected during the mid-to-late 1970s for a sample of relatively large social aggregates within the U.S., we find a significant effect on homicide rates of an interaction term reflecting high levels of commitment to monetary success goals and low levels of commitment to legitimate means. Virtually all of this effect is accounted for by the higher levels of property crime and drug market activity that occur in areas with an unbalanced pecuniary value system. Our analysis also reveals that property crime is more apt to lead to homicide under conditions of high structural disadvantage. These and other findings underscore the potential value of elaborating the anomie perspective to explicitly account for lethal violence.
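The reported interaction effect can be expressed in a generic aggregate-level specification (notation mine, not the paper's exact model):

\log(\text{Homicide}_i) = \beta_0 + \beta_1\,\text{Goals}_i + \beta_2\,\text{Means}_i
 + \beta_3\,(\text{Goals}_i \times \text{Means}_i) + X_i'\gamma + \varepsilon_i,

where Goals_i and Means_i measure commitment to monetary success goals and to legitimate means in aggregate i; the mediation claim corresponds to the interaction coefficient \beta_3 shrinking toward zero once property crime and drug market activity are added to the controls X_i.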

Relevance:

30.00%

Publisher:

Abstract:

Since the UsedSoft ruling of the CJEU in 2012, there has been the distinct feeling that, like the big bang, UsedSoft signals the start of a new beginning. As we enter this brave new world, the Copyright Directive will be read anew: misalignments in the treatment of physical and digital content will be resolved; accessibility and affordability for consumers will be heightened; and lock-in will be reduced as e-exhaustion takes hold. With UsedSoft as a precedent, the Court can do nothing but keep expanding its own ruling. For big bang theorists, it is only a matter of time until the digital first sale meteor strikes non-software downloads as well. This paper looks at whether the UsedSoft ruling could indeed be the beginning of a wider doctrine of e-exhaustion, or whether it is simply a one-shot comet restrained by the provisions of the Computer Programs Directive on which it was based. Fighting the latter corner, we have the strict word of the law: in the UsedSoft ruling, the Court appears willing to bypass the international legal framework of the WCT. As far as expansion goes, the Copyright Directive was conceived specifically to implement the WCT, and thus the legislative intent is clear. The Court would not, surely, invoke its modicum of creativity there also... With perhaps undue haste in a digital market of many unknowns, it seems this might well be the case. Provoking the big bang theory of e-exhaustion, the UsedSoft ruling can be read as distinctly purposive, but rather than having copyright norms in mind, the standard for the Court is the same free movement rules that underpin the exhaustion doctrine in the physical world. With an endowed sense of principled equivalence, the Court clearly wishes the tangible and intangible rules to be aligned. Against the backdrop of the European internal market, perhaps few legislative instruments would staunchly stand in its way. With firm objectives in mind, the UsedSoft ruling could be a rather disruptive meteor indeed.

Relevance:

30.00%

Publisher:

Abstract:

Information theory-based metrics such as mutual information (MI) are widely used as similarity measures for multimodal registration. Nevertheless, such a metric may lead to matching ambiguity in non-rigid registration, and maximization of MI alone does not necessarily produce an optimal solution. In this paper, we propose a segmentation-assisted similarity metric based on point-wise mutual information (PMI). This similarity metric, termed SPMI, enhances registration accuracy by considering tissue classification probabilities as prior information, generated by an expectation maximization (EM) algorithm. Diffeomorphic demons is then adopted as the registration model and is optimized in a hierarchical framework (H-SPMI) based on different levels of anatomical structure as prior knowledge. The proposed method is evaluated using BrainWeb synthetic data and clinical fMRI images. Both qualitative and quantitative assessments were performed, as well as a sensitivity analysis with respect to segmentation error. Compared to purely intensity-based approaches that only maximize mutual information, the proposed algorithm provides significantly better accuracy on both synthetic and clinical data.
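For reference, the point-wise quantity the metric builds on, and its relation to standard mutual information, are given by the standard definitions below; the SPMI weighting itself is the paper's contribution and is not reproduced here.

\mathrm{PMI}(a,b) = \log \frac{p(a,b)}{p(a)\,p(b)}, \qquad
\mathrm{MI}(A,B) = \sum_{a,b} p(a,b)\,\mathrm{PMI}(a,b) = \sum_{a,b} p(a,b)\,\log\frac{p(a,b)}{p(a)\,p(b)},

where p(a,b) is the joint intensity distribution of the two images and p(a), p(b) are the marginals.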

Relevance:

30.00%

Publisher:

Abstract:

Purpose: This paper furthers the analysis of patterns regulating capitalist accumulation based on a historical anthropology of economic activities revolving around and within the Mauritian Export Processing Zone (EPZ).

Design/methodology/approach: This paper uses fieldwork in Mauritius to interrogate and critique two important concepts in contemporary social theory, "embeddedness" and "the informal economy." These are viewed in the wider frame of social anthropology's engagement with (neoliberal) capitalism.

Findings: A process-oriented revision of Polanyi's work on embeddedness and the "double movement" is proposed to help us situate EPZs within ongoing power struggles found throughout the history of capitalism. This helps us to challenge the notion of economic informality as supplied by Hart and others.

Social implications: Scholars and policymakers have tended to see economic informality as a force from below, able to disrupt the legal-rational nature of capitalism as practiced from on high. Similarly, there is a view that a precapitalist embeddedness, a "human economy," has many good things to offer. However, this paper shows that the practices of the state and multinational capitalism, in EPZs and elsewhere, exactly match the practices that are envisioned as the cure to the pitfalls of capitalism.

Value of the paper: Setting aside the formal-informal distinction in favor of a process-oriented analysis of embeddedness allows us better to understand the shifting struggles among the state, capital, and labor.

Relevance:

30.00%

Publisher:

Abstract:

Spurred by the consumer market, companies increasingly deploy smartphones and tablet computers in their operations. Unlike private users, however, companies typically struggle to cover their needs with existing applications, and therefore expand mobile software platforms through customized applications from multiple software vendors. Companies thereby combine the concepts of multi-sourcing and software platform ecosystems in a novel platform-based multi-sourcing setting. This implies, however, a clash of two different approaches to coordinating the underlying one-to-many inter-organizational relationships. So far, little is known about the impact of merging coordination approaches. Relying on convention theory, we address this gap by analyzing a platform-based multi-sourcing project between a client and six software vendors that develop twenty-three custom-made applications on a common platform (Android). In doing so, we aim to understand how unequal coordination approaches merge, and whether and for what reasons particular coordination mechanisms, design decisions, or practices disappear while new ones emerge.

Relevance:

30.00%

Publisher:

Abstract:

We propose a way to incorporate non-tariff barriers (NTBs) into computable general equilibrium (CGE) models for the four workhorse models of the modern trade literature. CGE models feature intermediate linkages and thus allow us to study global value chains (GVCs). We show that the Ethier-Krugman monopolistic competition model, the Melitz firm heterogeneity model and the Eaton and Kortum model can all be defined as an Armington model with generalized marginal costs, generalized trade costs and a demand externality. As is already known in the literature, in both the Ethier-Krugman model and the Melitz model generalized marginal costs are a function of the quantity of factor input bundles. In the Melitz model, generalized marginal costs are also a function of the price of the factor input bundles: lower factor prices raise the number of firms that can enter the market profitably (the extensive margin), reducing the generalized marginal costs of a representative firm. For the same reason, the Melitz model features a demand externality: in a larger market more firms can enter. We implement the different models in a CGE setting with multiple sectors, intermediate linkages, non-homothetic preferences and detailed data on trade costs. We find the largest welfare effects from trade cost reductions in the Melitz model. We also employ the Melitz model to mimic changes in NTBs with a fixed-cost character by analysing the effect of changes in fixed trade costs. While we work here with a model calibrated to the GTAP database, the methods developed can also be applied to CGE models based on the WIOD database.
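The common structure the abstract refers to can be sketched with a standard CES Armington demand system, here in generic notation and not necessarily the paper's exact formulation:

q_{ij} = \left(\frac{p_{ij}}{P_j}\right)^{-\sigma} Q_j, \qquad
p_{ij} = \frac{\tau_{ij}\,\tilde{c}_i}{A_{ij}}, \qquad
P_j = \Big(\sum_i p_{ij}^{\,1-\sigma}\Big)^{\frac{1}{1-\sigma}},

where \tilde{c}_i is a generalized marginal cost, \tau_{ij} a generalized trade cost and A_{ij} a demand shifter; the Ethier-Krugman, Melitz and Eaton-Kortum structures are then absorbed into how \tilde{c}_i, \tau_{ij} and A_{ij} depend on factor prices, input bundles and market size.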

Relevance:

30.00%

Publisher:

Abstract:

Recent theoretical work has examined the spatial distribution of unemployment using the efficiency wage model as the mechanism by which unemployment arises in the urban economy. This paper extends the standard efficiency wage model in order to allow for behavioral substitution between leisure time at home and effort at work. In equilibrium, residing at a location with a long commute reduces the time available for leisure at home and therefore affects the trade-off between effort at work and the risk of unemployment. This model implies an empirical relationship between expected commutes and labor market outcomes, which is tested using the Public Use Microdata Sample of the 2000 U.S. Decennial Census. The empirical results suggest that efficiency wages operate primarily for blue-collar workers, i.e. workers who tend to be in occupations that face higher levels of supervision. For this subset of workers, longer commutes imply higher levels of unemployment and higher wages, both of which are consistent with shirking and leisure being substitutable.
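One schematic way to see the mechanism is through a Shapiro-Stiglitz-style no-shirking condition in which the disutility of effort rises with commute time; this is an illustrative formulation in generic notation, not the paper's model:

w(t) \;\ge\; r V_u + e(t)\,\frac{r + b + q}{q}, \qquad e'(t) > 0,

where t is commute time, e(t) the effort cost given the leisure remaining after commuting, q the detection rate for shirking, b the job separation rate, r the interest rate and V_u the value of unemployment. Longer expected commutes then require a higher efficiency wage and a larger unemployment penalty to deter shirking, consistent with the finding that longer commutes are associated with higher wages and higher unemployment for closely supervised workers.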

Relevance:

30.00%

Publisher:

Abstract:

Regulatory change not seen since the Great Depression swept the U.S. banking industry beginning in the early 1980s, culminating in the Interstate Banking and Branching Efficiency Act of 1994. Significant consolidation has occurred in the banking industry. This paper considers the market-power versus the efficient-structure theories of the positive correlation between banking concentration and performance on a state-by-state basis. Temporal causality tests imply that bank concentration leads bank profitability, supporting the market-power, rather than the efficient-structure, theory of that positive correlation. Our finding suggests that bank regulators, by focusing on local banking markets, missed the initial stages of an important structural change at the state level.
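A temporal (Granger-type) causality test of the kind described can be sketched as below using statsmodels; the file, column names and state code are hypothetical, and this is not the paper's actual code or data.

import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

df = pd.read_csv("state_banking_panel.csv")        # hypothetical annual state-level data
one_state = df[df["state"] == "OH"].sort_values("year")

# statsmodels tests whether the series in the second column Granger-causes the first:
# here, whether concentration (e.g., deposit HHI) leads bank profitability (e.g., ROA).
market_power = grangercausalitytests(one_state[["roa", "hhi"]], maxlag=2)

# Reversing the columns tests the efficient-structure direction
# (profitability leading concentration).
efficient_structure = grangercausalitytests(one_state[["hhi", "roa"]], maxlag=2)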