815 results for [JEL:D46] Microeconomics - Market Structure and Pricing - Value Theory
Abstract:
The development of the ecosystem approach and of models for the management of ocean marine resources requires easy access to standard, validated datasets of historical catch data for the main exploited species. Such datasets are used to measure the impact of biomass removal by fisheries and to evaluate model skill, while the use of a standard dataset facilitates model inter-comparison. North Atlantic albacore tuna is exploited all year round by longline fisheries and, in summer and autumn, by surface fisheries, with fishery statistics compiled by the International Commission for the Conservation of Atlantic Tunas (ICCAT). Catch and effort data with geographical coordinates were extracted for this species at a monthly resolution on 1° or 5° squares, with a careful definition of fisheries and data screening. In total, thirteen fisheries were defined for the period 1956-2010, covering the longline, troll, mid-water trawl and bait-fishing gears. However, the spatialised catch-effort data available in the ICCAT database represent only a fraction of the total catch. Length frequencies of the catch were also extracted according to the fishery definitions above for the period 1956-2010, with a quarterly temporal resolution and spatial resolutions varying from 1° x 1° to 10° x 20°. The measurement resolution also varies, with size bins of 1, 2 or 5 cm (fork length). Data screening revealed inconsistencies: a relatively large number of samples exceeded 150 cm, whereas studies on albacore growth suggest that fish rarely exceed 130 cm. A threshold of 130 cm was therefore fixed, somewhat arbitrarily, and all length-frequency data above this value were removed from the original dataset.
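As an illustration of the 130 cm screening rule described above, a minimal Python sketch follows (not the authors' code; the file and column names are hypothetical):

import pandas as pd

# Hypothetical extract of the ICCAT length-frequency records described above.
lf = pd.read_csv("albacore_length_frequencies.csv")

# Albacore growth studies suggest fish rarely exceed 130 cm fork length,
# so records above that threshold are treated as inconsistent and dropped.
THRESHOLD_CM = 130
screened = lf[lf["fork_length_cm"] <= THRESHOLD_CM].copy()
print(f"Removed {len(lf) - len(screened)} records above {THRESHOLD_CM} cm")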
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
OBJECTIVES We sought to assess the prognostic utility of brachial artery reactivity (BAR) in patients at risk of cardiovascular events. BACKGROUND Impaired flow-mediated vasodilation measured by BAR is a marker of endothelial dysfunction. Brachial artery reactivity is influenced by risk factors and is responsive to various pharmacological and other treatments. However, its prognostic importance is uncertain, especially relative to other predictors of outcome. METHODS A total of 444 patients at risk of cardiovascular events, based on the presence of risk factors or known or suspected cardiovascular disease, were prospectively enrolled to undergo BAR and follow-up. We took a full clinical history, performed BAR, and obtained carotid intima-media thickness (IMT), left ventricular mass and ejection fraction. Patients were followed up for cardiovascular events and all-cause mortality. Multivariate Cox regression analysis was performed to assess the independent association of the investigation variables with outcomes. RESULTS The patients exhibited abnormal BAR (5.2 ± 6.1% [mean ± SD]) but normal nitrate-mediated dilation (9.9 ± 7.2%) and normal mean IMT (0.67 ± 0.12 mm). Forty-nine deaths occurred over the median follow-up period of 24 months (interquartile range 10 to 34). Patients in the lowest tertile of BAR (<2%) had significantly more events than those in the mid and highest tertiles combined (p = 0.029, log-rank test). However, mean IMT (rather than flow-mediated dilation) was the vascular factor independently associated with mortality, even in the subgroup (n = 271) with no coronary artery disease and low risk. CONCLUSIONS Brachial artery reactivity is lower in patients with events, but is not an independent predictor of cardiovascular outcomes in this cohort of patients. © 2004 by the American College of Cardiology Foundation.
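A minimal sketch of the kind of multivariate Cox regression reported above, using the lifelines library; the data file and column names are hypothetical, not the study's:

import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical cohort file: one row per patient, follow-up time in months,
# an all-cause mortality indicator, and the candidate predictors.
df = pd.read_csv("cohort.csv")
predictors = ["bar_fmd", "mean_imt", "lv_mass", "ejection_fraction", "age"]

cph = CoxPHFitter()
cph.fit(df[["time_months", "death"] + predictors],
        duration_col="time_months", event_col="death")
cph.print_summary()  # hazard ratios show which variables are independently associated with mortality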
Abstract:
Based on a newly established sequencing strategy notable for its efficiency, simplicity and ease of manipulation, the sequences of four novel cyclotides (macrocyclic knotted proteins) isolated from the Australian plant Viola hederaceae were determined. The three-dimensional solution structure of V. hederaceae leaf cyclotide-1 (vhl-1), a leaf-specific expressed 31-residue cyclotide, has been determined using two-dimensional ¹H NMR spectroscopy. vhl-1 adopts a compact and well defined structure including a distorted triple-stranded β-sheet, a short 3₁₀-helical segment and several turns. It is stabilized by three disulfide bonds, which, together with backbone segments, form a cyclic cystine knot motif. The three disulfide bonds are almost completely buried in the protein core, and the six cysteines contribute only 3.8% to the molecular surface. A pH titration experiment revealed that the folding of vhl-1 shows little pH dependence and allowed pKa values of 3.0 for Glu3 and ~5.0 for Glu14 to be determined. Met7 was found to be oxidized in the native form, consistent with the fact that its side chain protrudes into the solvent, occupying 7.5% of the molecular surface. vhl-1 shows anti-HIV activity with an EC50 value of 0.87 μM.
Abstract:
The temperature dependence of the X-ray crystal structure and powder EPR spectrum of [(HC(Ph₂PO)₃)₂Cu](ClO₄)₂·2H₂O is reported, and the structure at room temperature confirms that reported previously. Below ~100 K the data imply a geometry with near elongated tetragonal symmetry for the [(HC(Ph₂PO)₃)₂Cu]²⁺ complex, but on warming the two longer Cu-O bond lengths and the corresponding g-values progressively converge, and by 340 K the bond lengths correspond to a compressed tetragonal geometry. The data may be interpreted satisfactorily by assuming an equilibrium among the energy levels of a CuO₆ polyhedron subject to Jahn-Teller vibronic coupling and a lattice strain. However, agreement with experiment is obtained only if the orthorhombic component of the lattice strain decreases to a negligible value as the temperature approaches 340 K.
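The equilibrium among energy levels invoked above is usually formulated as a thermal (Boltzmann) average over the low-lying vibronic levels; a hedged sketch of that standard relation, not necessarily the authors' exact expression, is
\[
\bar{g}_i(T) = \frac{\sum_n g_{i,n}\, e^{-E_n/k_\mathrm{B}T}}{\sum_n e^{-E_n/k_\mathrm{B}T}},
\qquad
\bar{r}_i(T) = \frac{\sum_n r_{i,n}\, e^{-E_n/k_\mathrm{B}T}}{\sum_n e^{-E_n/k_\mathrm{B}T}},
\]
where the E_n are the energies of the vibronic levels of the CuO₆ polyhedron (including the lattice-strain contribution) and g_{i,n}, r_{i,n} are the g-value and Cu-O bond length along axis i in level n; increasing population of the higher levels on warming makes the two longer Cu-O distances and their g-values converge.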
Abstract:
Selection of a power market structure from the available alternatives is an important activity within an overall power sector reform programme. The evaluation criteria for selection are both subjective and objective in nature, and the selection among alternatives is characterised by their conflicting nature. This study demonstrates a methodology for power market structure selection using the analytic hierarchy process (AHP), a multiple attribute decision-making technique, to model the selection problem with the active participation of relevant stakeholders in a workshop environment. The methodology is applied to a hypothetical case of a State Electricity Board reform in India.
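A minimal sketch of the AHP priority calculation underlying the methodology above: the principal eigenvector of a pairwise-comparison matrix gives the criteria weights, and the consistency ratio checks the coherence of the stakeholders' judgements. The 3 x 3 matrix is a made-up example, not data from the study.

import numpy as np

# Hypothetical pairwise comparisons of three evaluation criteria (Saaty's 1-9 scale).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                     # normalised priority vector

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)         # consistency index
RI = {3: 0.58, 4: 0.90, 5: 1.12}             # Saaty's random indices
print("weights:", weights, "consistency ratio:", ci / RI[n])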
Abstract:
Purpose – The purpose of this paper is to investigate the joint effects of market orientation (MO) and corporate social responsibility (CSR) on firm performance. Design/methodology/approach – Data were collected via a questionnaire survey of star-rated hotels in China, and a total of 143 valid responses were received. The hypotheses were tested by employing structural equation modelling with a maximum likelihood estimation option. Findings – It was found that although both MO and CSR can enhance performance, once the effects of CSR are accounted for, the direct effects of MO on performance diminish considerably, to the point of being almost non-existent. Although this result may be due to the fact that the research was conducted in China, a country where CSR might be crucially important to performance given the country's socialist legacy, it nonetheless provides strong evidence that MO's impact on organizational performance is mediated by CSR. Research limitations/implications – The main limitations include the use of cross-sectional data, the subjective measurement of performance and the uniqueness of the research setting (China). The findings provide an additional important insight into the processes by which a market-oriented culture is transformed into superior organizational performance. Originality/value – This paper is one of the first to examine the joint effects of MO and CSR on business performance. The empirical evidence from China adds to the existing literature on the respective importance of MO and CSR.
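A minimal regression-based sketch of the mediation logic reported in the findings (a simplification, not the structural equation model the authors estimated); the data file and variable names are hypothetical.

import pandas as pd
import statsmodels.api as sm

# Hypothetical survey data: one row per hotel with scale scores for
# market orientation (mo), CSR (csr) and performance.
df = pd.read_csv("hotel_survey.csv")

total = sm.OLS(df["performance"], sm.add_constant(df[["mo"]])).fit()          # total effect of MO
a_path = sm.OLS(df["csr"], sm.add_constant(df[["mo"]])).fit()                 # MO should predict the mediator
direct = sm.OLS(df["performance"], sm.add_constant(df[["mo", "csr"]])).fit()  # direct effect with CSR controlled

# If CSR mediates the MO-performance link, the direct MO coefficient
# should shrink towards zero relative to the total effect.
print(total.params["mo"], direct.params["mo"], a_path.params["mo"])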
Abstract:
In future massively distributed service-based computational systems, resources will span many locations, organisations and platforms. In such systems, the ability to allocate resources in a desired configuration, in a scalable and robust manner, will be essential. We build upon a previous evolutionary market-based approach to achieving resource allocation in decentralised systems by considering heterogeneous providers. In such scenarios, providers may be said to value their resources differently. We demonstrate how, given such valuations, the outcome allocation may be predicted. Furthermore, we describe how the approach may be used to achieve a stable, uneven load balance of our choosing. We analyse the system's expected behaviour and validate our predictions in simulation. Our approach is fully decentralised; no part of the system is weaker than any other. No cooperation between nodes is assumed; only self-interest is relied upon. A particular desired allocation is achieved transparently to users, as no modification to the buyers is required.
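As a generic toy illustration of how heterogeneous valuations can skew a decentralised, self-interested allocation (this is not the paper's evolutionary mechanism; the numbers are invented):

# Each provider asks a price reflecting its private valuation; each buyer
# independently picks the cheapest provider with spare capacity.
valuations = {"p1": 1.0, "p2": 2.0, "p3": 4.0}   # hypothetical per-unit valuations
capacity_left = {p: 100 for p in valuations}
load = {p: 0 for p in valuations}

for _ in range(250):                             # 250 buyers, one unit each
    open_providers = [p for p in valuations if capacity_left[p] > 0]
    cheapest = min(open_providers, key=lambda p: valuations[p])
    load[cheapest] += 1
    capacity_left[cheapest] -= 1

print(load)  # demand concentrates on providers that value their resources least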
Abstract:
Purpose – The purpose of this paper is to present a conceptual framework in order to analyse and understand the twin developments of successful microeconomic reform on the one hand and failed macroeconomic stabilisation attempts on the other in Hungary. The case study also attempts to explore the reasons why Hungarian policymakers were willing to initiate reforms in the micro sphere, but were reluctant to initiate major changes in public finances, both before and after the regime change of 1989/1990. Design/methodology/approach – The paper applies a path-dependent approach by carefully analysing Hungary's Communist and post-Communist economic development. The study restricts itself to a positive analysis, although normative statements can also be drawn accordingly. Findings – The study demonstrates that Hungary's deteriorating economic performance is not a recent phenomenon. By providing a path-dependent explanation, it argues that both Communist and post-Communist governments used the general budget as a buffer to compensate the losers of economic reforms, especially microeconomic restructuring. The gradualist success of the country – which dates back to at least 1968 – in the field of liberalisation, marketisation and privatisation was accompanied by constant overspending in the general government. Practical implications – Hungary has been one of the worst-hit countries in the 2008/2009 financial crisis, not just in Central and Eastern Europe but in the whole world. The capacity and opportunity for strengthening international investors' confidence are, however, not without doubt. The current deterioration is deeply rooted in failed past macroeconomic management. The dissolution of fiscal laxity and state paternalism in a broader context therefore requires an all-encompassing reform of the general government, which may trigger serious challenges to the political regime as well. Originality/value – The study aims to show that a relatively high ratio of redistribution, a high and persistent public deficit and accelerating indebtedness are not recent phenomena in Hungary. In fact, these trends characterised the country well before the transformation of 1989/1990 and have continued in the post-socialist years too. To explain this, the study argues that over the last few decades the hardening of the budget constraint of firms has come at the cost of maintaining the soft budget constraint of the state.
Abstract:
Extreme stock price movements are of great concern to both investors and the entire economy. For investors, a single negative return, or a combination of several smaller returns, can possibly wipe out so much capital that the firm or portfolio becomes illiquid or insolvent. If enough investors experience this loss, it could shock the entire economy. An example of such a case is the stock market crash of 1987. Furthermore, there has been considerable recent interest in the increasing volatility of stock prices. This study presents an analysis of extreme stock price movements. The data utilized were the daily returns for the Standard and Poor's 500 index from January 3, 1978 to May 31, 2001. Research questions were analyzed using the statistical models provided by extreme value theory. One of the difficulties in examining stock price data is that there is no consensus regarding the correct shape of the distribution function generating the data. An advantage of extreme value theory is that no detailed knowledge of this distribution function is required to apply the asymptotic theory; we focus on the tail of the distribution. Extreme value theory allows us to estimate a tail index, which we use to derive bounds on returns that are exceeded only with very low probability. Such information is useful in evaluating the volatility of stock prices. There are three possible limit laws for the maximum: Fréchet (heavy-tailed), Gumbel (light-tailed) or Weibull (bounded tail). Results indicated that extreme returns during the time period studied follow a Fréchet distribution. Thus, this study finds that extreme value analysis is a valuable tool for examining stock price movements and can be more efficient than the usual variance in measuring risk.
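A minimal sketch of the tail-index estimation step described above, using the Hill estimator on the loss tail of a return series; the simulated returns merely stand in for the S&P 500 data, and the choice of estimator is an assumption rather than necessarily the one used in the thesis.

import numpy as np

rng = np.random.default_rng(0)
returns = rng.standard_t(df=3, size=5000) * 0.01   # placeholder for daily S&P 500 returns
losses = -returns[returns < 0]                     # left tail expressed as positive losses

k = 100                                            # number of upper order statistics
x = np.sort(losses)[-(k + 1):]                     # threshold plus the k largest losses
hill = np.mean(np.log(x[1:] / x[0]))               # Hill estimate of 1/alpha
print("estimated tail index alpha:", 1.0 / hill)   # a finite alpha points to a Fréchet-type heavy tail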
Abstract:
Most research on stock prices is based on the present value model or the more general consumption-based model. When applied to real economic data, both are found unable to account for both the stock price level and its volatility. The three essays here attempt both to build a more realistic model and to check whether there is still room for bubbles in explaining fluctuations in stock prices. In the second chapter, several innovations are simultaneously incorporated into the traditional present value model in order to produce more accurate model-based fundamental prices. These innovations comprise replacing the narrower, more commonly used traditional dividends with broad dividends, a nonlinear artificial neural network (ANN) forecasting procedure for these broad dividends instead of the more common linear forecasting models, and a stochastic discount rate in place of the constant discount rate. Empirical results show that the model described above predicts fundamental prices better than alternative models using a linear forecasting process, narrow dividends, or a constant discount factor. Nonetheless, actual prices remain largely detached from fundamental prices, and the bubble-like deviations are found to coincide with business cycles. The third chapter examines possible cointegration of stock prices with fundamentals and non-fundamentals. The output gap is introduced to form the non-fundamental part of stock prices. I use a trivariate vector autoregression (TVAR) model and a single-equation model to run cointegration tests between these three variables. Neither of the cointegration tests shows strong evidence of explosive behavior in the DJIA and S&P 500 data. I then apply a sup augmented Dickey-Fuller test to check for the existence of periodically collapsing bubbles in stock prices. Such bubbles are found in the S&P data during the late 1990s. Employing the econometric tests from the third chapter, I continue in the fourth chapter to examine whether bubbles exist in the stock prices of conventional economic sectors on the New York Stock Exchange. The 'old economy' as a whole is not found to have bubbles, but periodically collapsing bubbles are found in the Material and Telecommunication Services sectors and the Real Estate industry group.
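A minimal sketch of a sup (recursive) augmented Dickey-Fuller test of the kind applied in the third chapter: ADF statistics are computed on expanding windows and the supremum is taken. The test for explosive behaviour is right-tailed, so critical values must come from simulation rather than the standard ADF tables; the series below is a placeholder, not the DJIA or S&P 500 data.

import numpy as np
from statsmodels.tsa.stattools import adfuller

def sadf(series, min_window=40):
    # Supremum of ADF statistics over expanding windows starting at the first observation.
    stats = [adfuller(series[:end], regression="c", autolag="AIC")[0]
             for end in range(min_window, len(series) + 1)]
    return max(stats)

rng = np.random.default_rng(1)
prices = 100 + np.cumsum(rng.normal(size=300))     # placeholder price series (a random walk)
print("SADF statistic:", sadf(prices))             # large positive values suggest explosive, bubble-like episodes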
Abstract:
This case study examines the factors that shaped the identity and landscape of a small island urban village between the north and south forks of the Middle River, north of an urban area in Broward County, Florida. The purpose of the study is to understand how Wilton Manors was transformed from a "whites only" enclave into the contemporary upscale, diverse, and third gayest city in the U.S., by positing that a dichotomy exists for urban places between their exchange value, as seen by Logan and Molotch, and the use value produced through everyday activity, following Lefebvre. Qualitative methods were used to gather evidence for reaching conclusions about the relationship among the worldview of residents, the tension between exchange value and use value in the restructuration of the city, and the transformation of Wilton Manors at the end of the 1990s. Semi-structured, in-depth interviews were conducted with 21 contemporary participants. In addition, thirteen CDs of interviews with selected members of founding families, originally taped in the 1970s, were analyzed using a grounded theory approach. My findings indicate that Wilton Manors' residents share a common worldview which incorporates social inclusion as a use value and individual agency in the community. This shared worldview can be traced to selected city pioneers whose civic-mindedness helped shape the city's identity and laid the foundation for future restructuration. Currently, residents' quality of life, reflected in the city's use value, is more significant than exchange value as a primary force in the decisions that are made about the city's development. With innovative ideas, buildings emulating the new urban mixed-use design, and a reputation as the third gayest city in the United States, Wilton Manors reflects a worldview in which residents protect use value as primary over market value in the decisions that shape their city, though not without contestation.
Abstract:
For producers motivated by their new status as self-employed, landowning, capitalist coffee growers, specialty coffee presents an opportunity to proactively change the way they participate in the international market. Now responsible for determining their own path, many producers have jumped at the chance to enhance the value of their product and participate in the new "fair trade" market. But recent trends in the international coffee price have led many producers to wonder why their efforts to produce a certified Fair Trade and organic product are not generating the price advantage they had anticipated. My study incorporates data collected over eighteen months of fieldwork, including more than 45 interviews with coffee producers and fair trade roasters, 90 surveys of coffee growers, and ongoing participant observation, to understand how fair trade certification, as both a market system and a development program, meets the expectations of the coffee growers. By comparing three coffee cooperatives that have engaged the Fair Trade system to disparate ends, the results of this investigation are three case studies that demonstrate how global processes of certification, commodity trade, market interaction, and development aid effect social and cultural change within communities. This study frames several lessons learned in terms of (1) the socioeconomic impacts of fair trade, (2) the characteristics associated with positive development encounters, and (3) the potential for commodity producers to capture value further along their global value chain. Commodity chain comparisons indicate that the Fair Trade certified cooperative receives the highest per-pound price, though these findings are complicated by the costs associated with certification and by producers' perceptions of an "unjust" system. Fair trade-supported projects are seen as more "successful" in the eyes of recipients, though their attention to detail can just as easily result in "failure". Finally, survey results reveal just how limited the market knowledge of producers in each cooperative is, though fair trade does, in fact, provide a rare opportunity for producers to learn about consumer demand for coffee quality. Though bittersweet, the fair trade experiences described here present a learning opportunity for a wide range of audiences, from the certified to the certifiers to the concerned public and conscientious consumer.