882 results for mixed stock analysis
Abstract:
Despite the long-standing practice of technical analysis by many participants in the Indian stock market, its exact standing as a tool for forecasting share prices has not been settled. There is no evidence that its role in predicting share price behaviour has been definitely established, nor of the extent to which technical tools are valid (how far reliable) in the Indian stock market. The problem is the vacuum in securities market analysis in which an unvalidated tool is practised, i.e., whether to hold on to technical analysis or to drop it. Again, as already stated in this chapter, its validity need not continue forever; it may become futile, as has happened in developed markets. Continued practice of a tool that is valid only during discontinuous times is also an error. The efficacy of different market phenomena in foretelling the extent and direction of price movements, and their reliability, remains unproven. Further study is therefore required so that this controversy may be settled. A solution to the problem requires enquiring into and establishing whatever applicability technical analysis has in the Indian stock market. The study has two broad objectives for confirming the applicability, if any, of technical analysis in the Indian stock market: first, to ascertain the current validity of the 'traditional holding with respect to patterns', and second, to ascertain the 'consistent superiority', if any, of technical indicators over non-signal strategies in return generation. The study analyses five patterns that are widely known and commonly found in publications: (1) Symmetrical Triangles, (2) Rising Wedges, (3) Falling Wedges, (4) Head and Shoulders Top, and (5) Head and Shoulders Bottom.
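A minimal sketch, in Python with pandas, of how a signal strategy can be compared with a non-signal (buy-and-hold) benchmark in return generation; the moving-average crossover rule, window lengths and column handling are illustrative stand-ins, not the indicators or test design of the study.

```python
import pandas as pd

def compare_signal_vs_buy_and_hold(prices: pd.Series,
                                   fast: int = 20,
                                   slow: int = 50) -> pd.Series:
    """Compare a moving-average crossover rule with buy-and-hold.

    `prices` is a daily close series; `fast` and `slow` are illustrative
    window lengths, not parameters from the study.
    """
    returns = prices.pct_change().fillna(0.0)

    # Signal: long (1) when the fast MA is above the slow MA, flat (0) otherwise.
    fast_ma = prices.rolling(fast).mean()
    slow_ma = prices.rolling(slow).mean()
    position = (fast_ma > slow_ma).astype(float).shift(1).fillna(0.0)

    strategy_return = (1.0 + position * returns).prod() - 1.0
    buy_and_hold_return = (1.0 + returns).prod() - 1.0
    return pd.Series({"signal_strategy": strategy_return,
                      "buy_and_hold": buy_and_hold_return})
```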
Abstract:
This study focuses on identifying return-generating factors and the extent of their influence on share prices. The outcome will be a tool for investment analysis in the hands of investors, portfolio managers, and mutual funds, who are most concerned with changing share prices. Since the study takes into account the influence of macroeconomic variables on variations in share returns, the government can use the outcome to frame suitable long-term policies, which will help nurture a healthy economy and, in turn, a healthy stock market. As every company's management tries to maximize the wealth of its shareholders, a clear idea of the return-generating variables and their influence will help management frame policies to maximize shareholder wealth.
Abstract:
Knowledge discovery in databases is the non-trivial process of identifying valid, novel, potentially useful, and ultimately understandable patterns from data. The term data mining refers to the process of performing exploratory analysis on the data and building models from it. To infer patterns from data, data mining involves different approaches such as association rule mining, classification techniques, or clustering techniques. Among the many data mining techniques, clustering plays a major role, since it helps to group related data for assessing properties and drawing conclusions. Most clustering algorithms act on a dataset with a uniform format, since the similarity or dissimilarity between data points is a significant factor in finding the clusters. If a dataset consists of mixed attributes, i.e., a combination of numerical and categorical variables, a preferred approach is to convert the different formats into a uniform one. The research study explores various techniques for converting mixed data sets to a numerical equivalent, so as to make them suitable for statistical and similar algorithms. The results of clustering mixed-category data after conversion to a numeric data type are demonstrated using a crime data set. The thesis also proposes an extension to the well-known algorithm for handling mixed data types, to deal with data sets having only categorical data. The proposed conversion has been validated on a data set corresponding to breast cancer. Moreover, another issue with the clustering process is the visualization of its output. Different geometric techniques such as scatter plots or projection plots are available, but none of them display the result as a projection of the whole database; they instead provide attribute-pairwise analysis.
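As a minimal sketch of the general idea (not the thesis's specific conversion), categorical columns can be one-hot encoded and the result standardized before a standard algorithm such as k-means is applied; the toy data below are illustrative.

```python
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Illustrative mixed-attribute data: one numeric and two categorical columns.
df = pd.DataFrame({
    "age":     [23, 45, 31, 52, 36, 29],
    "city":    ["A", "B", "A", "C", "B", "A"],
    "offense": ["theft", "fraud", "theft", "assault", "fraud", "theft"],
})

# Convert to a uniform numeric format: one-hot encode the categorical
# columns, then scale everything so no column dominates the distance.
numeric = pd.get_dummies(df, columns=["city", "offense"])
scaled = StandardScaler().fit_transform(numeric)

# Cluster the now purely numeric data.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scaled)
print(labels)
```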
Abstract:
The creation of the European Monetary Union triggered fierce controversy in academic circles. Three and a half years after the start of the monetary union, this paper takes stock of developments in the central problem areas underlying that debate: labour markets, inflation, and the budget situation. An analysis of the competing theses in the light of the macroeconomic data available so far leads to a mixed verdict: while developments in labour markets and inflation have so far tended to be positive, the intended improvement in the budget and debt situation has stagnated. However, since global economic influences have so far been favourable, or have had a similar effect on all member countries, a hard test of the macroeconomic framework is still to come.
Abstract:
Consumers are becoming more concerned about food quality, especially regarding how, when and where foods are produced (Haglund et al., 1999; Kahl et al., 2004; Alföldi et al., 2006). Therefore, during recent years there has been growing interest in methods for food quality assessment, especially in picture-developing methods as a complement to traditional chemical analysis of single compounds (Kahl et al., 2006). Biocrystallization, one of the picture-developing methods, is based on the crystallographic phenomenon that, when aqueous solutions of CuCl2 dihydrate are crystallized with the addition of organic solutions originating, e.g., from crop samples, biocrystallograms with reproducible crystal patterns are generated (Kleber & Steinike-Hartung, 1959). Its output is a crystal pattern on glass plates from which different variables (numbers) can be calculated using image analysis. However, there is a lack of a standardized evaluation method to quantify the morphological features of the biocrystallogram image. Therefore, the main aims of this research are (1) to optimize an existing statistical model in order to describe all the effects that contribute to the experiment, (2) to investigate the effect of image parameters on the texture analysis of the biocrystallogram images, i.e., region of interest (ROI), color transformation and histogram matching, on samples from the project 020E170/F financed by the Federal Ministry of Food, Agriculture and Consumer Protection (BMELV); the samples are wheat and carrots from controlled field and farm trials, and (3) to relate the strongest texture-parameter effect to the visual evaluation criteria that have been developed by a group of researchers (University of Kassel, Germany; Louis Bolk Institute (LBI), Netherlands; and Biodynamic Research Association Denmark (BRAD), Denmark) in order to clarify the relation between the texture parameters and the visual characteristics of an image. The refined statistical model was built using an lme model with repeated measurements via crossed effects, programmed in R (version 2.1.0). The validity of the F and P values was checked against the SAS program. While the ANOVA gives the same F values, the P values are larger in R because of its more conservative approach. The refined model yields more significant P values. The optimization of the image analysis deals with the following parameters: ROI (region of interest, the area around the geometrical center), color transformation (calculation of the one-dimensional gray-level value from the three-dimensional color information of the scanned picture, which is necessary for the texture analysis), and histogram matching (normalization of the histogram of the picture to enhance the contrast and to minimize errors from lighting conditions). The samples were wheat from the DOC trial with 4 field replicates for the years 2003 and 2005, "market samples" (organic and conventional neighbors with the same variety) for 2004 and 2005, carrots obtained from the University of Kassel (2 varieties, 2 nitrogen treatments) for the years 2004, 2005 and 2006, and "market samples" of carrots for the years 2004 and 2005. The criterion for the optimization was the repeatability of the differentiation of the samples over the different harvests (years). Different ROIs were found for different samples, reflecting the different pictures.
The color transformation that differentiates most efficiently relies on the gray scale, i.e., the equal color transformation. The second dimension of the color transformation appeared only in some years as an effect of color wavelength (hue) for carrots treated with different nitrate fertilizer levels. The best histogram matching is to a Gaussian distribution. The approach was then to find a connection between the variables from textural image analysis and the different visual criteria. The relation between the texture parameters and the visual evaluation criteria was limited to the carrot samples, in particular because these could be well differentiated by the texture analysis. It was possible to connect groups of variables from the texture analysis with groups of criteria from the visual evaluation. These selected variables were able to differentiate the samples but not to classify them according to treatment. By contrast, with the visual criteria, which describe the picture as a whole, classification was possible in 80% of the sample cases. This clearly shows the limits of the single-variable approach of the image analysis (texture analysis).
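The texture-analysis step described above can be sketched minimally in Python with scikit-image (recent versions expose graycomatrix/graycoprops): grayscale conversion of the scanned image, a central region of interest, and a handful of GLCM texture variables. The file name, ROI size, and GLCM settings are illustrative assumptions, not the study's actual parameters.

```python
import numpy as np
from skimage import io
from skimage.feature import graycomatrix, graycoprops

# Load a scanned biocrystallogram (file name is a placeholder) and reduce the
# three-dimensional color information to one gray-level dimension by simple
# averaging, i.e., an "equal" color transformation.
rgb = io.imread("biocrystallogram.png")
gray = rgb[..., :3].mean(axis=2).astype(np.uint8)

# Region of interest: a square window around the geometric center of the plate.
h, w = gray.shape
half = min(h, w) // 4                      # illustrative ROI half-width
roi = gray[h // 2 - half:h // 2 + half, w // 2 - half:w // 2 + half]

# Gray-level co-occurrence matrix and a few standard texture variables.
glcm = graycomatrix(roi, distances=[1], angles=[0, np.pi / 2],
                    levels=256, symmetric=True, normed=True)
for prop in ("contrast", "homogeneity", "energy", "correlation"):
    print(prop, graycoprops(glcm, prop).mean())
```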
Abstract:
Soil fertility constraints to crop production have been recognized widely as a major obstacle to food security and agro-ecosystem sustainability in sub-Saharan West Africa. As such, they have led to a multitude of research projects and policy debates on how best they should be overcome. Conclusions, based on long-term multi-site experiments, are lacking with respect to a regional assessment of phosphorus and nitrogen fertilizer effects, surface mulched crop residues, and legume rotations on total dry matter of cereals in this region. A mixed model time-trend analysis was used to investigate the effects of four nitrogen and phosphorus rates, annually applied crop residue dry matter at 500 and 2000 kg ha^-1, and cereal-legume rotation versus continuous cereal cropping on the total dry matter of cereals and legumes. The multi-factorial experiment was conducted over four years at eight locations, with annual rainfall ranging from 510 to 1300 mm, in Niger, Burkina Faso, and Togo. With the exception of phosphorus, treatment effects on legume growth were marginal. At most locations, except for typical Sudanian sites with very low base saturation and high rainfall, phosphorus effects on cereal total dry matter were much lower with rock phosphate than with soluble phosphorus, unless the rock phosphate was combined with an annual seed-placement of 4 kg ha^-1 phosphorus. Across all other treatments, nitrogen effects were negligible at 500 mm annual rainfall but at 900 mm, the highest nitrogen rate led to total dry matter increases of up to 77% and, at 1300 mm, to 183%. Mulch-induced increases in cereal total dry matter were larger with lower base saturation, reaching 45% on typical acid sandy Sahelian soils. Legume rotation effects tended to increase over time but were strongly species-dependent.
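A minimal sketch, assuming a long-format pandas DataFrame of plot-level observations, of how a mixed model with location as a random effect and a year trend can be fitted in Python with statsmodels; the file and column names (`tdm`, `n_rate`, `p_rate`, `residue`, `rotation`, `year`, `site`) are hypothetical placeholders, not the study's actual variables or model.

```python
import pandas as pd
import statsmodels.formula.api as smf

# df: one row per plot and year, with total dry matter (tdm) and the treatments.
df = pd.read_csv("cereal_trials.csv")   # hypothetical file

# Fixed effects: N rate, P rate, mulch level, rotation, and a year trend;
# random intercept for location to capture site-to-site variation.
model = smf.mixedlm(
    "tdm ~ n_rate + p_rate + residue + rotation + year",
    data=df,
    groups=df["site"],
)
result = model.fit()
print(result.summary())
```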
Abstract:
During recent years, quantum information processing and the study of N-qubit quantum systems have attracted a lot of interest, both in theory and experiment. Apart from the promise of performing efficient quantum information protocols, such as quantum key distribution, teleportation or quantum computation, these investigations have also revealed a great deal of difficulties which still need to be resolved in practice. Quantum information protocols rely on the application of unitary and non-unitary quantum operations that act on a given set of quantum mechanical two-state systems (qubits) to form (entangled) states, in which the information is encoded. The overall system of qubits is often referred to as a quantum register. Today the entanglement in a quantum register is known as the key resource for many protocols of quantum computation and quantum information theory. However, despite the successful demonstration of several protocols, such as teleportation or quantum key distribution, there are still many open questions about how entanglement affects the efficiency of quantum algorithms or how it can be protected against noisy environments. To facilitate the simulation of such N-qubit quantum systems and the analysis of their entanglement properties, we have developed the Feynman program. The program package provides all necessary tools to define and work with quantum registers, quantum gates and quantum operations. Using an interactive and easily extendible design within the framework of the computer algebra system Maple, the Feynman program is a powerful toolbox not only for teaching the basic and more advanced concepts of quantum information but also for studying their physical realization in the future. To this end, the Feynman program implements a selection of algebraic separability criteria for bipartite and multipartite mixed states as well as the most frequently used entanglement measures from the literature. Additionally, the program supports the work with quantum operations and their associated (Jamiolkowski) dual states. Based on the implementation of several popular decoherence models, we provide tools especially for the quantitative analysis of quantum operations. As an application of the developed tools, we further present two case studies in which the entanglement in two atomic processes is investigated. In particular, we have studied the change of the electron-ion spin entanglement in atomic photoionization and the photon-photon polarization entanglement in the two-photon decay of hydrogen. The results show that both processes are, in principle, suitable for the creation and control of entanglement. Apart from process-specific parameters like initial atom polarization, it is mainly the process geometry which offers a simple and effective instrument to adjust the final-state entanglement. Finally, for the case of the two-photon decay of hydrogenlike systems, we study the difference between nonlocal quantum correlations, as given by the violation of the Bell inequality, and the concurrence as a true entanglement measure.
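As a minimal, self-contained numerical illustration of one quantity mentioned above, the concurrence of a two-qubit mixed state can be computed from the standard Wootters formula; this sketch uses Python and is independent of the Maple-based Feynman program itself.

```python
import numpy as np

def concurrence(rho: np.ndarray) -> float:
    """Wootters concurrence of a two-qubit density matrix rho (4x4)."""
    sy = np.array([[0, -1j], [1j, 0]])
    spin_flip = np.kron(sy, sy)
    rho_tilde = spin_flip @ rho.conj() @ spin_flip
    # Square roots of the eigenvalues of rho * rho_tilde, in decreasing order.
    eigvals = np.linalg.eigvals(rho @ rho_tilde)
    lam = np.sqrt(np.sort(np.abs(eigvals))[::-1])
    return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

# Example: a Werner state p|Phi+><Phi+| + (1-p) I/4 is entangled for p > 1/3.
phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)
p = 0.8
rho = p * np.outer(phi_plus, phi_plus) + (1 - p) * np.eye(4) / 4
print(concurrence(rho))   # ~0.7 for p = 0.8
```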
Abstract:
We take stock of the present position of compositional data analysis, of what has been achieved in the last 20 years, and then make suggestions as to what may be sensible avenues of future research. We take an uncompromisingly applied mathematical view: the challenge of solving practical problems should motivate our theoretical research, and any new theory should be thoroughly investigated to see if it may provide answers to previously abandoned practical considerations. Indeed, a main theme of this lecture will be to demonstrate this applied mathematical approach by a number of challenging examples.
Abstract:
What are the fundamental entities in social networks, and what information is contained in social graphs? We will discuss some selected concepts in social network analysis, such as one- and two-mode networks, prestige and centrality, and cliques, clans and clubs. Readings: Web tool predicts election results and stock prices, J. Palmer, New Scientist, 07 February (2008). Optional: Social Network Analysis: Methods and Applications, S. Wasserman and K. Faust (1994).
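A minimal sketch of two of these concepts, degree/betweenness centrality and cliques, on a toy graph using the networkx library; the graph itself is illustrative.

```python
import networkx as nx

# A small illustrative friendship network.
g = nx.Graph()
g.add_edges_from([("Ann", "Bob"), ("Ann", "Cat"), ("Bob", "Cat"),
                  ("Cat", "Dan"), ("Dan", "Eve"), ("Eve", "Cat")])

# Centrality: who occupies a prominent position in the network?
print(nx.degree_centrality(g))
print(nx.betweenness_centrality(g))

# Cohesive subgroups: maximal cliques (clans and clubs relax this idea
# by allowing indirect ties up to a given distance).
print(list(nx.find_cliques(g)))
```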
Abstract:
Financial integration has been pursued aggressively across the globe in the last fifty years; however, there is no conclusive evidence on the diversification gains (or losses) of such efforts. These gains (or losses) are related to the degree of comovement and synchronization among increasingly integrated global markets. We quantify the degree of comovement within the integrated Latin American market (MILA). We use dynamic correlation models to quantify comovements across securities, as well as a direct integration measure. Our results show an increase in comovements when we look at the country indexes; however, the increase in the trend of correlation predates the institutional efforts to establish an integrated market in the region. On the other hand, when we look at sector indexes and an integration measure, we find a decrease in comovements among a representative sample of securities from the integrated market.
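A minimal sketch, assuming a pandas DataFrame of daily index returns for the member markets, of one way to track time-varying comovement; the rolling-window average pairwise correlation used here is a simple stand-in for the paper's dynamic correlation models, and the file and column names are hypothetical.

```python
import pandas as pd

# returns: daily index returns with one column per market, e.g.
# columns = ["Chile", "Colombia", "Peru"], indexed by date.
returns = pd.read_csv("mila_index_returns.csv", index_col=0, parse_dates=True)

window = 250   # roughly one trading year; illustrative choice

def mean_pairwise_corr(frame: pd.DataFrame) -> float:
    corr = frame.corr()
    n = corr.shape[0]
    # Average of the off-diagonal entries of the correlation matrix.
    return (corr.values.sum() - n) / (n * (n - 1))

# Time-varying comovement: average pairwise correlation over a rolling window.
comovement = pd.Series(
    {end: mean_pairwise_corr(returns.iloc[i - window:i])
     for i, end in enumerate(returns.index) if i >= window}
)
print(comovement.tail())
```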
Abstract:
We use a large firm level data set to investigate the determinants of foreign direct investment (FDI) in Colombia. We estimate econometric models for the determinants of the probability that a firm receives FDI, as well as for the factors that help to explain the foreign share in a firm's capital. The results show that firms listed on the stock market, involved in foreign trade activities, and operating in sectors with greater capital intensity are more likely to be recipients of FDI. Also, the probability of a firm receiving FDI is directly related to its size.
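A minimal sketch, assuming a firm-level pandas DataFrame with a binary FDI indicator, of the kind of binary-response model described; the file and variable names are hypothetical placeholders, not the paper's actual specification.

```python
import pandas as pd
import statsmodels.formula.api as smf

# df: one row per firm, with a 0/1 indicator of whether it receives FDI.
df = pd.read_csv("colombian_firms.csv")   # hypothetical file

# Probability of receiving FDI as a function of listing status, trade
# activity, capital intensity, and firm size (log assets).
logit = smf.logit(
    "receives_fdi ~ listed + foreign_trade + capital_intensity + log_size",
    data=df,
).fit()
print(logit.summary())
print(logit.get_margeff().summary())   # average marginal effects
```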
Abstract:
The problem of stability analysis for a class of neutral systems with mixed time-varying neutral, discrete and distributed delays and nonlinear parameter perturbations is addressed. By introducing a novel Lyapunov-Krasovskii functional and combining the descriptor model transformation, the Leibniz-Newton formula, some free-weighting matrices, and a suitable change of variables, new sufficient conditions are established for the stability of the considered system, which are neutral-delay-dependent, discrete-delay-range-dependent, and distributed-delay-dependent. The conditions are presented in terms of linear matrix inequalities (LMIs) and can be efficiently solved using convex programming techniques. Two numerical examples are given to illustrate the efficiency of the proposed method.
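As a minimal sketch of how such LMI conditions are checked numerically by convex programming, the snippet below tests a standard delay-independent Lyapunov-Krasovskii condition (far simpler than the delay-range-dependent conditions of the paper) for a toy two-dimensional delay system using cvxpy; the system matrices are illustrative.

```python
import cvxpy as cp
import numpy as np

# Toy system x'(t) = A x(t) + Ad x(t - h) with an arbitrary constant delay h.
A = np.array([[-3.0, 0.0], [0.0, -3.0]])
Ad = np.array([[1.0, 0.5], [0.5, 1.0]])
n = A.shape[0]

P = cp.Variable((n, n), symmetric=True)
Q = cp.Variable((n, n), symmetric=True)

# Delay-independent Lyapunov-Krasovskii condition as an LMI:
# [A'P + PA + Q, P Ad; Ad'P, -Q] < 0 with P > 0 and Q > 0.
lmi = cp.bmat([[A.T @ P + P @ A + Q, P @ Ad],
               [Ad.T @ P,            -Q]])
eps = 1e-6
constraints = [P >> eps * np.eye(n),
               Q >> eps * np.eye(n),
               lmi << -eps * np.eye(2 * n)]

problem = cp.Problem(cp.Minimize(0), constraints)
problem.solve(solver=cp.SCS)
print("LMI feasible (stability certified):", problem.status == cp.OPTIMAL)
```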
Abstract:
The objective of this paper is to introduce a different approach, called the ecological-longitudinal, to carrying out pooled analysis in time-series ecological studies. Because it gives a larger number of data points and, hence, increases the statistical power of the analysis, this approach, unlike conventional ones, allows the accommodation of random effect models, of lags, and of interactions between pollutants and between pollutants and meteorological variables, which are hardly implementable in conventional approaches. Design: The approach is illustrated by providing quantitative estimates of the short-term effects of air pollution on mortality in three Spanish cities, Barcelona, Valencia and Vigo, for the period 1992–1994. Because the dependent variable was a count, a Poisson generalised linear model was first specified. Several modelling issues are worth mentioning. Firstly, because the relations between mortality and explanatory variables were nonlinear, cubic splines were used for covariate control, leading to a generalised additive model (GAM). Secondly, the effects of the predictors on the response were allowed to occur with some lag. Thirdly, the residual autocorrelation, due to imperfect control, was controlled for by means of an autoregressive Poisson GAM. Finally, the longitudinal design demanded consideration of individual heterogeneity, requiring the use of mixed models. Main results: The estimates of the relative risks obtained from the individual analyses varied across cities, particularly those associated with sulphur dioxide. The highest relative risks corresponded to black smoke in Valencia. These estimates were higher than those obtained from the ecological-longitudinal analysis. Relative risks estimated from this latter analysis were practically identical across cities: 1.00638 (95% confidence interval 1.0002, 1.0011) for a black smoke increase of 10 μg/m3 and 1.00415 (95% CI 1.0001, 1.0007) for an increase of 10 μg/m3 of sulphur dioxide. Because the statistical power is higher than in the individual analyses, more interactions were statistically significant, especially those among air pollutants and meteorological variables. Conclusions: Air pollutant levels were related to mortality in the three cities of the study, Barcelona, Valencia and Vigo. These results are consistent with similar studies in other cities and with other multicentric studies, and coherent both with previous individual analyses for each city and with multicentric studies for all three cities.
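A minimal sketch, assuming a daily pandas DataFrame of death counts, pollutant levels and meteorological covariates for one city, of a Poisson regression with spline terms and a one-day pollutant lag; the file and variable names are hypothetical, and the model is a simplified stand-in for the autoregressive Poisson GAM with random effects used in the paper.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Daily series for one city: deaths, black smoke, SO2, temperature, humidity.
df = pd.read_csv("city_daily.csv", parse_dates=["date"])   # hypothetical file

# Lag the pollutants by one day so today's deaths depend on yesterday's levels.
df["black_smoke_lag1"] = df["black_smoke"].shift(1)
df["so2_lag1"] = df["so2"].shift(1)

# Poisson regression with cubic regression splines for meteorology
# (patsy's bs() inside the formula) as a simplified stand-in for a GAM.
model = smf.glm(
    "deaths ~ black_smoke_lag1 + so2_lag1 + bs(temperature, df=4) + bs(humidity, df=4)",
    data=df.dropna(),
    family=sm.families.Poisson(),
).fit()
print(model.summary())
```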
Abstract:
We run a standard income convergence analysis for the last decade and confirm an already established finding in the growth economics literature: EU countries are converging. Regions in Europe are also converging. But within countries, regional disparities are on the rise. At the same time, there is probably no reason for EU Cohesion Policy to be concerned with what happens inside countries. Ultimately, our data show that national governments redistribute well across regions, whether they are fiscally centralised or decentralised. It is difficult to establish whether Structural and Cohesion Funds play any role in recent growth convergence patterns in Europe. Generally, macroeconomic simulations produce better results than empirical tests. It is thus possible that Structural Funds do not fully realise their potential, either because they are not efficiently allocated, or are badly managed, or are used for the wrong investments, or a combination of all three. The approach to assessing the effectiveness of EU funds should be consistent with the rationale behind the post-1988 EU Cohesion Policy. Standard income convergence analysis is certainly not sufficient and should be accompanied by an assessment of the changes in the efficiency of the capital stock in the recipient countries or regions, as well as by a more qualitative assessment. EU funds for competitiveness and employment should be allocated by looking at each region's capital efficiency, to maximise growth-generating effects, or on a purely competitive basis.
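A minimal sketch of a standard unconditional beta-convergence test of the kind referred to, assuming a pandas DataFrame of regional GDP per capita at the start and end of the decade; the file and column names are illustrative.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# df: one row per region with GDP per capita in the first and last year.
df = pd.read_csv("regional_gdp.csv")   # hypothetical file
years = 10

df["log_initial"] = np.log(df["gdp_pc_start"])
df["growth"] = (np.log(df["gdp_pc_end"]) - df["log_initial"]) / years

# Beta convergence: a negative coefficient on initial income means poorer
# regions grow faster, i.e., incomes are converging.
fit = smf.ols("growth ~ log_initial", data=df).fit()
print(fit.params["log_initial"], fit.pvalues["log_initial"])
```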
Abstract:
The influence of surface waves and an applied wind stress is studied in an ensemble of large eddy simulations to investigate the nature of deeply penetrating jets into an unstratified mixed layer. The influence of a steady monochromatic surface wave propagating parallel to the wind direction is parameterized using the wave-filtered Craik-Leibovich equations. Tracer trajectories and instantaneous downwelling velocities reveal classic counterrotating Langmuir rolls. The associated downwelling jets penetrate to depths in excess of the wave's Stokes depth scale, δs. Qualitative evidence suggests the depth of the jets is controlled by the Ekman depth scale. Analysis of turbulent kinetic energy (tke) budgets reveals a dynamical distinction between Langmuir turbulence and shear-driven turbulence. In the former, tke production is dominated by Stokes shear and a vertical flux term transports tke to a depth where it is dissipated. In the latter, tke production is from the mean shear and is locally balanced by dissipation. We define the turbulent Langmuir number Lat = (v*/Us)^(1/2) (v* is the ocean's friction velocity and Us is the surface Stokes drift velocity) and a turbulent anisotropy coefficient Rt = <w'w'>/(<u'u'> + <v'v'>). The transition between shear-driven and Langmuir turbulence is investigated by varying the external wave parameters δs and Lat and by diagnosing Rt and the Eulerian mean and Stokes shears. When either Lat or δs is sufficiently small, the Stokes shear dominates the mean shear and the flow is preconditioned to Langmuir turbulence and the associated deeply penetrating jets.
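A minimal sketch of the two diagnostics defined above, evaluated for illustrative values of the friction velocity, Stokes drift and velocity variances (the Rt definition follows the vertical-over-horizontal variance form given above):

```python
import numpy as np

def turbulent_langmuir_number(v_star: float, u_stokes: float) -> float:
    """La_t = (v* / Us)^(1/2)."""
    return np.sqrt(v_star / u_stokes)

def anisotropy_coefficient(ww: float, uu: float, vv: float) -> float:
    """R_t = <w'w'> / (<u'u'> + <v'v'>): vertical over horizontal variance."""
    return ww / (uu + vv)

# Illustrative values: v* = 0.01 m/s and Us = 0.1 m/s give La_t ~ 0.32,
# in the range usually associated with Langmuir turbulence.
print(turbulent_langmuir_number(0.01, 0.1))
print(anisotropy_coefficient(ww=2.0e-5, uu=1.5e-5, vv=1.5e-5))
```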