918 results for "Least squares method"


Relevance: 80.00%

Abstract:

Metals price risk management is a key issue in metal markets because uncertain commodity price fluctuations, exchange rates, and interest rate changes pose a large price risk to both metals producers and consumers. It is therefore a concern for all participants in metal markets, including producers, consumers, merchants, banks, investment funds, speculators, and traders. Managing price risk provides stable income for both producers and consumers, and so increases the chance that a firm will invest in attractive projects. The purpose of this research is to evaluate risk management strategies in the copper market. The main tools and strategies of price risk management are hedging and derivatives such as futures contracts, swaps, and options contracts. Hedging is a transaction designed to reduce or eliminate price risk. Derivatives are financial instruments whose returns are derived from other financial instruments, and they are commonly used for managing financial risks. Although derivatives have existed in some form for centuries, their growth has accelerated rapidly during the last 20 years; today they are widely used by financial institutions, corporations, professional investors, and individuals. This project focuses on the over-the-counter (OTC) market and its products, such as exotic options, particularly Asian options. The first part of the project describes basic derivatives and risk management strategies; it also discusses basic concepts of spot and futures (forward) markets, the benefits and costs of risk management, and the risks and rewards of positions in derivative markets. The second part considers the valuation of commodity derivatives.
In this part, the DerivaGem options pricing model is applied to Asian call and put options on London Metal Exchange (LME) copper, because it is important to understand how Asian options are valued and to compare theoretical option values with their observed market values. Predicting future trends in copper prices is essential for managing market price risk successfully, so the third part discusses econometric commodity models. Based on this literature review, the fourth part of the project reports the construction and testing of an econometric model designed to forecast the monthly average price of copper on the LME. More specifically, this part shows how LME copper prices can be explained by means of a simultaneous equation structural model (two-stage least squares regression) connecting supply and demand variables. The simultaneous econometric model built for the copper industry is:

    Q_t^D = e^{-5.0485} · P_{t-1}^{-0.1868} · GDP_t^{1.7151} · e^{0.0158·IP_t}
    Q_t^S = e^{-3.0785} · P_{t-1}^{0.5960} · T_t^{0.1408} · P_{OIL(t)}^{-0.1559} · USDI_t^{1.2432} · LIBOR_{t-6}^{-0.0561}
    Q_t^D = Q_t^S

with the reduced-form price equation

    P_{t-1}^{CU} = e^{-2.5165} · GDP_t^{2.1910} · e^{0.0202·IP_t} · T_t^{-0.1799} · P_{OIL(t)}^{0.1991} · USDI_t^{-1.5881} · LIBOR_{t-6}^{0.0717}

where Q_t^D and Q_t^S are world demand for and supply of copper at time t, respectively. P_{t-1} is the lagged price of copper, which is the focus of the analysis in this part. GDP_t is world gross domestic product at time t, which represents aggregate economic activity. Industrial production should also be considered, so global industrial production growth, denoted IP_t, is included in the model. T_t is a time variable, which is a useful proxy for technological change. A proxy for the cost of energy in producing copper is the price of oil at time t, denoted P_{OIL(t)}. USDI_t is the U.S. dollar index at time t, an important variable for explaining copper supply and copper prices. Finally, LIBOR_{t-6} is the 6-month lagged 1-year London Interbank Offered Rate. Although the model could be applied to other base metals' industries, omitted exogenous variables, such as the price of a substitute or a combined variable related to the prices of substitutes, have not been considered in this study. Based on this econometric model and a Monte Carlo simulation analysis, the probabilities that the monthly average copper prices in 2006 and 2007 will exceed specific option strike prices are estimated. The final part evaluates risk management strategies, including options strategies, metal swaps, and simple options, in relation to the simulation results. Basic options strategies, such as bull spreads, bear spreads, and butterfly spreads, created using both call and put options for 2006 and 2007, are evaluated. Each risk management strategy in 2006 and 2007 is then analyzed based on the data of the day and the price prediction model. As a result, applications stemming from this project include valuing Asian options, developing a copper price prediction model, forecasting and planning, and decision making for price risk management in the copper market.
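The final step described above, using the reduced-form price equation inside a Monte Carlo simulation to estimate the probability that the price exceeds a strike, can be sketched as follows. The coefficients are those reported in the abstract; the distributions of the exogenous drivers are purely illustrative assumptions, not the study's actual inputs.

```python
import numpy as np

rng = np.random.default_rng(42)

def reduced_form_price(gdp, ip, t, p_oil, usdi, libor):
    # Reduced-form copper price equation with the coefficients reported above
    return (np.exp(-2.5165) * gdp ** 2.1910 * np.exp(0.0202 * ip)
            * t ** -0.1799 * p_oil ** 0.1991 * usdi ** -1.5881 * libor ** 0.0717)

# Hypothetical distributions for the exogenous drivers -- illustrative only
n = 100_000
gdp = rng.normal(45.0, 1.5, n)               # world GDP index
ip = rng.normal(4.0, 1.0, n)                 # industrial production growth, %
t = np.full(n, 40.0)                         # time trend
p_oil = rng.lognormal(np.log(60.0), 0.2, n)  # oil price
usdi = rng.normal(85.0, 3.0, n)              # U.S. dollar index
libor = rng.normal(5.0, 0.5, n)              # 1-year LIBOR, lagged 6 months

prices = reduced_form_price(gdp, ip, t, p_oil, usdi, libor)
strike = np.median(prices)                   # illustrative option strike
prob_above = float((prices > strike).mean()) # P(price > strike)
```

Repeating this for each month's forecast and each strike of interest yields the exceedance probabilities used to evaluate the option strategies.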

Relevance: 80.00%

Abstract:

Heroin prices reflect supply and demand and, as in any other market, profits motivate participation. The intent of this research is to examine how changes in Afghan opium production, driven by political conflict and government policies, affect Europe's heroin market. Whether the Taliban remain in power or a new Afghan government is formed, the changes will affect the heroin market in Europe to some degree. In the heroin market, the degree of change depends on many socioeconomic forces, such as law enforcement, corruption, and proximity to Afghanistan. An econometric model that examines the degree of these socioeconomic effects has not previously been applied to the heroin trade in Afghanistan. This research uses a two-stage least squares econometric model to estimate the supply and demand of heroin in 36 countries from the Middle East to Western Europe in 2008. Applying the two-stage least squares model to the heroin market in Europe, we attempt to predict the socioeconomic consequences of Afghan opium production.
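A minimal sketch of the two-stage least squares idea on synthetic supply-and-demand data (all variable names and numbers here are hypothetical, not from the study): price is endogenous because it shares an unobserved demand shock with quantity, so plain OLS is biased, while instrumenting price with a supply shifter recovers the true demand slope.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Synthetic market: the price is endogenous because it shares the demand shock u
z = rng.normal(size=n)                               # instrument (supply shifter)
u = rng.normal(size=n)                               # unobserved demand shock
price = 1.0 + 2.0 * z + u + rng.normal(size=n)
quantity = 3.0 - 1.5 * price + 2.0 * u + rng.normal(size=n)  # true slope: -1.5

def two_stage_least_squares(y, endog, instr):
    # Stage 1: project the endogenous regressor onto the instrument set
    Z = np.column_stack([np.ones_like(instr), instr])
    gamma, *_ = np.linalg.lstsq(Z, endog, rcond=None)
    endog_hat = Z @ gamma
    # Stage 2: regress the outcome on the fitted (now exogenous) regressor
    X = np.column_stack([np.ones_like(endog_hat), endog_hat])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta                                      # [intercept, slope]

beta_2sls = two_stage_least_squares(quantity, price, z)

# Plain OLS for comparison -- biased because Cov(price, u) > 0
X = np.column_stack([np.ones(n), price])
beta_ols, *_ = np.linalg.lstsq(X, quantity, rcond=None)
```

The 2SLS slope lands near the true value of -1.5, whereas the OLS slope is pulled toward zero by the shared demand shock.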

Relevance: 80.00%

Abstract:

INTRODUCTION: Ultra-high-field whole-body systems (7.0 T) have high potential for future human in vivo magnetic resonance imaging (MRI). In musculoskeletal MRI, biochemical imaging of articular cartilage may benefit in particular. Delayed gadolinium-enhanced MRI of cartilage (dGEMRIC) and T2 mapping have shown potential at 3.0 T. Whereas dGEMRIC allows determination of the glycosaminoglycan content of articular cartilage, T2 mapping is a promising tool for the evaluation of water and collagen content. In addition, the evaluation of zonal variation, based on tissue anisotropy, provides an indicator of the nature of cartilage, i.e., hyaline or hyaline-like articular cartilage. Thus, the aim of our study was to show the feasibility of in vivo dGEMRIC and T2 and T2* relaxation measurements at 7.0 T MRI, and to evaluate the potential of T2 and T2* measurements in an initial patient study after matrix-associated autologous chondrocyte transplantation (MACT) in the knee. MATERIALS AND METHODS: MRI was performed on a whole-body 7.0 T MR scanner using a dedicated circularly polarized knee coil. The protocol consisted of an inversion recovery sequence for dGEMRIC, a multi-echo spin-echo sequence for standard T2 mapping, a gradient-echo sequence for T2* mapping, and a morphologic PD SPACE sequence. Twelve healthy volunteers (mean age, 26.7 +/- 3.4 years) and 4 patients (mean age, 38.0 +/- 14.0 years) were enrolled 29.5 +/- 15.1 months after MACT. For dGEMRIC, 5 healthy volunteers (mean age, 32.4 +/- 11.2 years) were included. T1 maps were calculated using a nonlinear, two-parameter least squares fit analysis. Using a region-of-interest analysis, mean cartilage relaxation times were determined as T1(0) for precontrast measurements and T1(Gd) for postcontrast gadopentetate dimeglumine [Gd-DTPA(2-)] measurements.
T2 and T2* maps were obtained using a pixelwise, monoexponential, non-negative least squares fit analysis; region-of-interest analysis was carried out for deep and superficial cartilage aspects. Statistical evaluation was performed by analyses of variance. RESULTS: Mean T1 (dGEMRIC) values for healthy volunteers showed slightly different results for femoral [T1(0): 1259 +/- 277 ms; T1(Gd): 683 +/- 141 ms] compared with tibial cartilage [T1(0): 1093 +/- 281 ms; T1(Gd): 769 +/- 150 ms]. Global mean T2 relaxation times for healthy volunteers showed comparable results for femoral (T2: 56.3 +/- 15.2 ms; T2*: 19.7 +/- 6.4 ms) and patellar (T2: 54.6 +/- 13.0 ms; T2*: 19.6 +/- 5.2 ms) cartilage, but lower values for tibial cartilage (T2: 43.6 +/- 8.5 ms; T2*: 16.6 +/- 5.6 ms). All healthy cartilage sites showed a significant increase from deep to superficial cartilage (P < 0.001). Within healthy cartilage sites in MACT patients, adequate values were found for T2 (56.6 +/- 13.2 ms) and T2* (18.6 +/- 5.3 ms), which also showed significant stratification. Within cartilage repair tissue, global mean values showed no difference, with 55.9 +/- 4.9 ms for T2 and 16.2 +/- 6.3 ms for T2*. However, zonal assessment showed only a slight, nonsignificant increase from deep to superficial cartilage (T2: P = 0.174; T2*: P = 0.150). CONCLUSION: In vivo T1 dGEMRIC assessment in healthy cartilage, and T2 and T2* mapping in healthy and reparative articular cartilage, appear feasible at 7.0 T MRI. For T2 and T2*, the zonal variation of articular cartilage could also be evaluated at 7.0 T. This zonal assessment of deep and superficial cartilage aspects shows promising results for differentiating healthy from affected articular cartilage. In future studies, optimized protocol selection and sophisticated coil technology, together with the increased signal at ultra-high-field MRI, may lead to advanced biochemical cartilage imaging.
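A monoexponential T2 fit of the kind described can be sketched with a bounded (non-negative) least squares fit for a single simulated pixel; the echo times, signal level, and noise here are assumed values, not the study's acquisition parameters.

```python
import numpy as np
from scipy.optimize import curve_fit

def mono_exp(te, s0, t2):
    # Monoexponential signal decay: S(TE) = S0 * exp(-TE / T2)
    return s0 * np.exp(-te / t2)

# Assumed echo times (ms) and noise level for one simulated cartilage pixel
te = np.arange(10.0, 90.0, 10.0)
true_s0, true_t2 = 1000.0, 55.0   # T2 near the healthy femoral values reported
rng = np.random.default_rng(1)
signal = mono_exp(te, true_s0, true_t2) + rng.normal(0.0, 5.0, te.size)

# A lower bound of 0 on both parameters mirrors the non-negativity constraint
(fit_s0, fit_t2), _ = curve_fit(mono_exp, te, signal,
                                p0=(signal[0], 40.0), bounds=(0.0, np.inf))
```

Repeating this fit pixel by pixel over the multi-echo image stack produces the T2 map; the T2* map is obtained the same way from the gradient-echo data.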

Relevance: 80.00%

Abstract:

OBJECTIVE: The aim of our study was to correlate global T2 values of microfracture repair tissue (RT) with clinical outcome in the knee joint. METHODS: We assessed 24 patients treated with microfracture in the knee joint. Magnetic resonance (MR) examinations were performed on a 3 T MR unit; T2 relaxation times were obtained with a multi-echo spin-echo technique. T2 maps were obtained using a pixelwise, monoexponential, non-negative least squares fit analysis. Slices covering the cartilage RT were selected, and region-of-interest analysis was done. An individual T2 index was calculated from the global mean T2 of the RT and the global mean T2 of normal hyaline cartilage. The Lysholm score and the International Knee Documentation Committee (IKDC) knee evaluation forms were used for the assessment of clinical outcome. Bivariate correlation analysis and a paired, two-tailed t-test were used for statistics. RESULTS: Global T2 values of the RT [mean 49.8 ms, standard deviation (SD) 7.5] differed significantly (P < 0.001) from global T2 values of normal hyaline cartilage (mean 58.5 ms, SD 7.0). The T2 index ranged from 61.3 to 101.5. We found the T2 index to correlate with the outcome of the Lysholm score (r(s) = 0.641, P < 0.001) and the IKDC subjective knee evaluation form (r(s) = 0.549, P = 0.005), whereas there was no correlation with the IKDC knee form (r(s) = -0.284, P = 0.179). CONCLUSION: These findings indicate that T2 mapping is sensitive for assessing RT function and provides information additional to morphologic MRI in the monitoring of microfracture.
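The abstract does not spell out the T2 index formula; a ratio-in-percent definition (repair-tissue T2 over normal-cartilage T2, times 100) is an assumption on our part, but it is consistent with the reported 61.3-101.5 range and can be computed as:

```python
# Group means from the abstract; the ratio-in-percent definition is assumed
mean_t2_rt = 49.8       # global mean T2 of microfracture repair tissue, ms
mean_t2_normal = 58.5   # global mean T2 of normal hyaline cartilage, ms

t2_index = 100.0 * mean_t2_rt / mean_t2_normal   # falls within the reported range
```

An index near 100 would indicate repair tissue whose T2 matches normal hyaline cartilage.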

Relevance: 80.00%

Abstract:

Private car ownership in China increased greatly from 1980 to 2009 with the development of the economy. To explain the relationship between car ownership and economic and social changes, an ordinary least squares linear regression model is developed using car ownership per capita as the dependent variable and GDP, savings deposits, and highway mileage per capita as the independent variables. The model is tested and corrected for econometric problems such as spurious correlation and cointegration. Finally, the regression model is used to project oil consumption by the Chinese transportation sector through 2015. The results show that in 2015 private cars will consume about 2.0 million barrels of oil per day in the conservative scenario and about 2.6 million barrels per day in the high-case scenario. Both are higher than the 2009 consumption level of 1.9 million barrels per day. The results also show that the annual growth rate of oil demand for transportation from 2010 to 2015 is 2.7%-3.1% per year in the conservative scenario and 6.9%-7.3% per year in the high-case scenario. As a result, actions such as increasing oil efficiency need to be taken to meet the challenge of rising oil demand.
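An OLS fit plus a simple residual-autocorrelation check of the kind used to screen for spurious regression between trending series can be sketched as follows; all series are synthetic stand-ins, not the Chinese data.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 30  # annual observations, as for 1980-2009

# Hypothetical per-capita series (synthetic, trending like the real data would)
gdp = np.linspace(1.0, 10.0, n) + rng.normal(0.0, 0.1, n)
savings = 0.5 * gdp + rng.normal(0.0, 0.1, n)
highway = 0.2 * gdp + rng.normal(0.0, 0.05, n)
cars = 0.1 + 0.8 * gdp + 0.3 * savings + 0.5 * highway + rng.normal(0.0, 0.1, n)

# OLS: car ownership per capita on GDP, savings deposits, highway mileage
X = np.column_stack([np.ones(n), gdp, savings, highway])
beta, *_ = np.linalg.lstsq(X, cars, rcond=None)
resid = cars - X @ beta

# Durbin-Watson statistic: values near 2 suggest no first-order autocorrelation;
# values near 0 with trending series are a classic symptom of spurious regression
dw = np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)
```

When the statistic signals autocorrelated residuals, the usual remedies are differencing the series or testing for cointegration before trusting the levels regression.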

Relevance: 80.00%

Abstract:

With the economic development of China, the demand for electricity generation is rapidly increasing. To explain electricity generation, we use GDP, the ratio of urban to rural population, the average per capita income of urban residents, the industrial electricity price in Beijing, and the policy shift that took place in China. Ordinary least squares (OLS) is used to develop a model for the 1979-2009 period. During the design of the model, econometric methods are used to test and refine it. The final model is used to forecast total electricity generation and to assess the possible role of photovoltaic generation. Due to high resource demand and serious environmental problems, China is pushing to develop the photovoltaic industry. The system price of photovoltaics is falling; therefore, photovoltaics may become competitive in the future.

Relevance: 80.00%

Abstract:

Gamma-radiation exposure has both short- and long-term adverse health effects. The threat of modern terrorism places human populations at risk for radiological exposures, yet current medical countermeasures to radiation exposure are limited. Here we describe metabolomics for gamma-radiation biodosimetry in a mouse model. Mice were gamma-irradiated at doses of 0, 3 and 8 Gy (2.57 Gy/min), and urine samples collected over the first 24 h after exposure were analyzed by ultra-performance liquid chromatography-time-of-flight mass spectrometry (UPLC-TOFMS). Multivariate data were analyzed by orthogonal partial least squares (OPLS). Both 3- and 8-Gy exposures yielded distinct urine metabolomic phenotypes. The top 22 ions for 3 and 8 Gy were analyzed further, including tandem mass spectrometric comparison with authentic standards, revealing that N-hexanoylglycine and beta-thymidine are urinary biomarkers of exposure to 3 and 8 Gy, 3-hydroxy-2-methylbenzoic acid 3-O-sulfate is elevated in urine of mice exposed to 3 but not 8 Gy, and taurine is elevated after 8 but not 3 Gy. Gene Expression Dynamics Inspector (GEDI) self-organizing maps showed clear dose-response relationships for subsets of the urine metabolome. This approach is useful for identifying mice exposed to gamma radiation and for developing metabolomic strategies for noninvasive radiation biodosimetry in humans.

Relevance: 80.00%

Abstract:

The work environment characteristics of job stress, job variety, job autonomy, and supervision are theorized to affect the job satisfaction and organizational commitment of social and human service workers. Most research to date has focused upon the impact of these variables on job satisfaction, with little attention being paid to organizational commitment. To determine the effects these characteristics have on both job satisfaction and organizational commitment, data from a survey of social and human service employees across Northwest Ohio were examined. In Ordinary Least Squares regression, all four job characteristics had a significant impact on job satisfaction, while only job variety and supervision had statistically significant effects on organizational commitment.

Relevance: 80.00%

Abstract:

Since the Sarbanes-Oxley Act was passed in 2002, it has become commonplace in the advertising industry to use creativity-award-show prizes instead of gross income figures to attract new customers. Therefore, achieving a top creativity ranking and winning creativity awards have become high priorities in the advertising industry. Agencies and marketers have always wondered what elements in the advertising creation process would lead to the winning of creativity awards. Although this debate has been dominated by pure speculation about the success of different routines, approaches and strategies in winning creativity awards, for the first time our study delivers an empirical insight into the key drivers of creativity award success. We investigate what strategies and which elements of an advertising campaign are truly likely to lead to winning the maximum number of creativity awards. Using a sample of 108 campaigns, we identify factors that influence campaign success at international advertising award shows. We identify innovativeness and the integration of multiple channels as the key drivers of creativity award success. In contrast to industry beliefs, meaningful or personally connecting approaches do not seem to generate a significant benefit in terms of winning creativity awards. Finally, our data suggest that the use of so-called “fake campaigns” to win more creativity awards does not prove to be effective.

Relevance: 80.00%

Abstract:

A number of studies have shown that Fourier transform infrared spectroscopy (FTIRS) can be applied to quantitatively assess lacustrine sediment constituents. In this study, we developed FTIRS-based calibration models for the quantitative determination of biogenic silica (BSi; n = 420; gradient: 0.9-56.5%), total organic carbon (TOC; n = 309; gradient: 0-2.9%), and total inorganic carbon (TIC; n = 152; gradient: 0-0.4%) in a 318 m long sediment record with a basal age of 3.6 million years from Lake El'gygytgyn, Far East Russian Arctic. The developed partial least squares (PLS) regression models yield high cross-validated R² (R²CV = 0.86-0.91) and low root mean square errors of cross-validation (RMSECV; 3.1-7.0% of the gradient for the different properties). By applying these models to 6771 samples from the entire sediment record, we obtained detailed insight into bioproductivity variations in Lake El'gygytgyn throughout the middle to late Pliocene and Quaternary. High accumulation rates of BSi indicate a productivity maximum during the middle Pliocene (3.6-3.3 Ma), followed by gradually decreasing rates during the late Pliocene and Quaternary. The average BSi accumulation during the middle Pliocene was ~3 times higher than the maximum accumulation rates during the past 1.5 million years. The indicated progressive deterioration of environmental and climatic conditions in the Siberian Arctic starting at ca. 3.3 Ma is consistent with the first occurrence of glacial periods and the eventual complete establishment of glacial-interglacial cycles during the Quaternary.
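A PLS1 calibration with k-fold cross-validation, reporting RMSECV as in the abstract, can be sketched in plain NumPy; the "spectra" below are synthetic, with an assumed BSi absorption band and one interfering band, not the Lake El'gygytgyn data.

```python
import numpy as np

def pls1_fit(X, y, n_comp):
    """PLS1 via NIPALS-style deflation; returns (coef, x_mean, y_mean)."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xk, yk = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_comp):
        w = Xk.T @ yk                    # weight: covariance direction with y
        w = w / np.linalg.norm(w)
        t = Xk @ w                       # scores
        tt = t @ t
        p = Xk.T @ t / tt                # X loadings
        q.append((yk @ t) / tt)          # y loading
        W.append(w)
        P.append(p)
        Xk = Xk - np.outer(t, p)         # deflate X
        yk = yk - q[-1] * t              # deflate y
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    coef = W @ np.linalg.solve(P.T @ W, q)   # coefficients in original X space
    return coef, x_mean, y_mean

def pls1_predict(model, X):
    coef, x_mean, y_mean = model
    return (X - x_mean) @ coef + y_mean

# Synthetic "spectra": an assumed BSi band plus one interfering band
rng = np.random.default_rng(3)
n, p = 200, 60
wn = np.linspace(0.0, 1.0, p)
band_bsi = np.exp(-((wn - 0.3) / 0.05) ** 2)
band_other = np.exp(-((wn - 0.7) / 0.08) ** 2)
y = rng.uniform(0.9, 56.5, n)                       # BSi gradient as in the study
X = np.outer(y, band_bsi) + np.outer(rng.uniform(0.0, 20.0, n), band_other)
X = X + rng.normal(0.0, 0.05, X.shape)

# 10-fold cross-validation -> root mean square error of cross-validation (RMSECV)
folds = np.array_split(rng.permutation(n), 10)
sq_err = []
for test_idx in folds:
    train_idx = np.setdiff1d(np.arange(n), test_idx)
    model = pls1_fit(X[train_idx], y[train_idx], n_comp=2)
    sq_err.append((pls1_predict(model, X[test_idx]) - y[test_idx]) ** 2)
rmsecv = float(np.sqrt(np.concatenate(sq_err).mean()))
```

Expressing RMSECV as a percentage of the calibration gradient, as the abstract does, allows properties with very different ranges (BSi vs. TIC) to be compared on one scale.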

Relevance: 80.00%

Abstract:

The purpose of this research is to examine the relative profitability of firms within the nursing facility industry in Texas. The variables expected to affect profitability and of importance to the design and implementation of regulatory policy are examined. To facilitate this inquiry, the specific questions addressed are: (1) Do differences in ownership form affect profitability (defined as operating income before fixed costs)? (2) What impact does regional location have on profitability? (3) Do patient case-mix and access to care by Medicaid patients differ between proprietary and non-profit firms and between facilities located in urban versus rural regions, and what association exists between these variables and profitability? (4) Are economies of scale present in the nursing home industry? (5) Do nursing facilities operate in a competitive output market, characterized by the inability of a single firm to influence the market price? Prior studies have principally employed a cost function to assess efficiency differences between classifications of nursing facilities. The inherent weakness of this approach is that it considers only technical efficiency, not both technical and price efficiency, the two components of overall economic efficiency. One firm is more technically efficient than another if it can produce a given quantity of output at the least possible cost. Price efficiency means that scarce resources are directed toward their most valued use.
Assuming similar prices in both input and output markets, differences in overall economic efficiency between firm classes are assessed through profitability, hence a profit function. Using the framework of the profit function, data from 1990 Medicaid Cost Reports for Texas, and ordinary least squares regression, the findings of the study indicate (1) similar profitability between nursing facilities organized as for-profit versus non-profit and located in urban versus rural regions, (2) an inverse association of both payor mix and patient case-mix with profitability, (3) strong evidence for the presence of scale economies, and (4) the existence of a competitive market structure. The paper concludes with implications regarding reimbursement methodology and construction moratorium policies in Texas.

Relevance: 80.00%

Abstract:

The desire to promote efficient allocation of health resources and effective patient care has focused attention on home care as an alternative to acute hospital service. In particular, clinical home care is suggested as a substitute for the final days of a hospital stay. This dissertation evaluates the relationship between hospital and home care services for residents of British Columbia, Canada, beginning in 1993/94, using data from the British Columbia Linked Health Database. Lengths of stay for patients referred to home care following hospital discharge are compared with those for patients not referred to home care. Ordinary least squares regression analysis adjusts for age, gender, admission severity, comorbidity, complications, income, and other patient, physician, and hospital characteristics. Home care clients tend to have longer stays in hospital than patients not referred to home care (β = 2.54, p = 0.0001). Longer hospital stays are evident for all home care client groups, as well as for both older and younger patients. Sensitivity analyses for referral time to direct care and for extreme lengths of stay are consistent with these findings. Two-stage regression analysis indicates that selection bias is not significant. Patients referred to clinical home care also have different health service utilization following discharge compared with patients not referred to home care. Home care nursing clients use more medical services to complement home care. Rehabilitation clients initially substitute home care for physiotherapy services but are later more likely to be admitted to residential care. All home care clients are more likely to be readmitted to hospital during the one-year follow-up period. There is also a strong complementary association between direct care referral and homemaker support. Rehabilitation clients have a greater risk of dying during the year following discharge.

These results suggest that home care is currently used as a complement to, rather than a substitute for, some acute health services. Organizational and resource issues may contribute to the longer stays of home care clients. Program planning and policies are required if home care is to provide an effective substitute for acute hospital days.

Relevance: 80.00%

Abstract:

Application of biogeochemical models to the study of marine ecosystems is pervasive, yet objective quantification of these models' performance is rare. Here, 12 lower trophic level models of varying complexity are objectively assessed in two distinct regions (equatorial Pacific and Arabian Sea). Each model was run within an identical one-dimensional physical framework. A consistent variational adjoint implementation assimilating chlorophyll-a, nitrate, export, and primary productivity was applied and the same metrics were used to assess model skill. Experiments were performed in which data were assimilated from each site individually and from both sites simultaneously. A cross-validation experiment was also conducted whereby data were assimilated from one site and the resulting optimal parameters were used to generate a simulation for the second site. When a single pelagic regime is considered, the simplest models fit the data as well as those with multiple phytoplankton functional groups. However, those with multiple phytoplankton functional groups produced lower misfits when the models are required to simulate both regimes using identical parameter values. The cross-validation experiments revealed that as long as only a few key biogeochemical parameters were optimized, the models with greater phytoplankton complexity were generally more portable. Furthermore, models with multiple zooplankton compartments did not necessarily outperform models with single zooplankton compartments, even when zooplankton biomass data are assimilated. Finally, even when different models produced similar least squares model-data misfits, they often did so via very different element flow pathways, highlighting the need for more comprehensive data sets that uniquely constrain these pathways.
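The least squares model-data misfit used to score each model can be illustrated with a toy weighted metric; the observation values and error scales (sigma) below are invented for illustration, not taken from the equatorial Pacific or Arabian Sea data sets.

```python
import numpy as np

def normalized_misfit(sim, obs, sigma):
    """Average weighted squared model-data misfit; lower values mean better skill."""
    return float(np.mean(((sim - obs) / sigma) ** 2))

# Invented observations for two assimilated data types
obs_chl = np.array([0.20, 0.25, 0.30, 0.28])   # chlorophyll-a
obs_no3 = np.array([5.0, 4.5, 4.8, 5.2])       # nitrate

# Invented output from one candidate biogeochemical model
sim_chl = np.array([0.22, 0.24, 0.33, 0.26])
sim_no3 = np.array([5.1, 4.4, 4.9, 5.0])

# The total cost sums over data types, each weighted by its assumed error scale,
# so variables with very different magnitudes contribute comparably
total_misfit = (normalized_misfit(sim_chl, obs_chl, sigma=0.05)
                + normalized_misfit(sim_no3, obs_no3, sigma=0.5))
```

A variational adjoint scheme searches parameter space for the values that minimize exactly this kind of cost; the point made in the abstract is that two models can reach similar minima through very different internal flow pathways.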

Relevance: 80.00%

Abstract:

16S rRNA genes and transcripts of Acidobacteria were investigated in 57 grassland and forest soils of three different geographic regions. Acidobacteria contributed 9-31% of bacterial 16S rRNA genes, whereas the relative abundances of the respective transcripts were 4-16%. The specific cellular 16S rRNA content (determined as the molar ratio of rRNA to rRNA genes) ranged between 3 and 80, indicating a low in situ growth rate. Correlations with flagellate numbers, vascular plant diversity, and soil respiration suggest that biotic interactions are important determinants of Acidobacteria 16S rRNA transcript abundances in soils. While the phylogenetic composition of Acidobacteria differed significantly between grassland and forest soils, high-throughput denaturing gradient gel electrophoresis and terminal restriction fragment length polymorphism fingerprinting detected 16S rRNA transcripts of most phylotypes in situ. Partial least squares regression suggested that chemical soil conditions such as pH, total nitrogen, C:N ratio, ammonia concentrations, and total phosphorus affect the composition of this active fraction of Acidobacteria. Transcript abundance for individual Acidobacteria phylotypes was found to correlate with particular physicochemical parameters (pH, temperature, nitrogen or phosphorus) and, most notably, biological parameters (respiration rates, abundances of ciliates or amoebae, vascular plant diversity), providing culture-independent evidence for a distinct niche specialization of different Acidobacteria, even within the same subdivision.