Abstract:
The main object of the present paper is to give formulas and methods that enable us to determine the minimum number of repetitions or of individuals necessary to guarantee, to some extent, the success of an experiment. The theoretical basis of all the processes consists essentially in the following. Knowing the frequency of the desired events p and of the undesired events q, we may calculate the frequency of all possible combinations to be expected in n repetitions by expanding the binomial (p + q)^n. Determining which of these combinations we want to avoid, we calculate their total frequency, selecting the value of the exponent n of the binomial in such a way that this total frequency is equal to or smaller than the accepted limit of precision:

n! p^n [ (q/p)^n/n! + (q/p)^(n-1)/(1!(n-1)!) + (q/p)^(n-2)/(2!(n-2)!) + (q/p)^(n-3)/(3!(n-3)!) + ... ] ≤ P.lim ----------(1b)

There does not exist an absolute limit of precision, since its value depends not only upon psychological factors in our judgement but is at the same time a function of the number of repetitions. For this reason I have proposed (1, 56) two relative values, one equal to 1/(5n) as the lowest value of probability and the other equal to 1/(10n) as the highest value of improbability, leaving between them what may be called the "region of doubt". However, these formulas cannot be applied in our case, since the number n is just the unknown quantity. Thus we have to use, instead of the more exact values of these two formulas, the conventional limits of P.lim equal to 0.05 (precision 5%), 0.01 (precision 1%), and 0.001 (precision 0.1%). The binomial formula as explained above (cf. formula 1, p. 85) is, however, of rather limited applicability owing to the excessive calculation required, and we thus have to procure approximations as substitutes.
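The binomial criterion of formula (1b) can be checked numerically. The sketch below is an illustration, not from the paper (the function name is mine): it sums the exact binomial tail of the unwanted outcomes and searches for the smallest n that keeps that total frequency at or below the chosen limit of precision.

```python
from math import comb

def min_repetitions(p, k_max, p_lim):
    """Smallest n such that the probability of observing at most k_max
    desired events in n repetitions does not exceed p_lim (cf. formula 1b)."""
    q = 1.0 - p
    n = k_max + 1
    while True:
        # exact binomial tail: the unwanted terms of the expansion of (p + q)^n
        tail = sum(comb(n, k) * p**k * q**(n - k) for k in range(k_max + 1))
        if tail <= p_lim:
            return n
        n += 1
```

For example, with p = 0.5 and the 5% limit, avoiding a complete absence of the desired type (k_max = 0) requires n = 5, since 0.5^5 ≈ 0.031 ≤ 0.05 while 0.5^4 = 0.0625 is still too large.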
We may use, without loss of precision, the following approximations: a) the normal or Gaussian distribution, when the expected frequency p has any value between 0.1 and 0.9 and when n is greater than ten; b) the Poisson distribution, when the expected frequency p is smaller than 0.1. Tables V to VII show for some special cases that these approximations are very satisfactory. The practical solution of the problems stated in the introduction can now be given:

A) What is the minimum number of repetitions necessary in order to avoid that any one of a treatments, varieties, etc. may accidentally always be the best; the best and second best; the first, second and third best; or, finally, one of the m best treatments, varieties, etc.? Using the first term of the binomial, we have the following equation for n:

n = log P.lim / log(m/a) = log P.lim / (log m − log a) --------------(5)

B) What is the minimum number of individuals necessary in order that a certain type, expected with the frequency p, may appear in at least one, two, three, or a = m + 1 individuals?

1) For p between 0.1 and 0.9, using the Gaussian approximation, we require np − δ√(np(1−p)) = m = a − 1 and obtain:

b = δ√((1−p)/p),  c = m/p -------------------(7)
n = [(b + √(b² + 4c))/2]²,  n′ = 1/p,  n(cor) = n + n′ ----------(8)

We have to use the correction n′ when p has a value between 0.25 and 0.75. The Greek letter delta represents in the present case the unilateral limits of the Gaussian distribution for the three conventional limits of precision: 1.64, 2.33 and 3.09, respectively. If we are only interested in having at least one individual, m becomes equal to zero and the formula reduces to:

for a = 1: c = 0,  n = [(b + b)/2]² = b² = δ²(1−p)/p,  n′ = 1/p,  n(cor) = n + n′ -----------------(9)

2) If p is smaller than 0.1 we may use Table 1 in order to find the mean m of a Poisson distribution and determine n = m/p.

C) What is the minimum number of individuals necessary for distinguishing two frequencies p1 and p2?
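Formulas (5) and (7)-(9) translate directly into a short computation. This is a sketch under the reconstruction above, not code from the paper; the function names and the rounding up to the next whole individual are my own choices.

```python
from math import ceil, log, sqrt

# unilateral Gaussian limits for the three conventional limits of precision
DELTA_ONE_SIDED = {0.05: 1.64, 0.01: 2.33, 0.001: 3.09}

def min_repetitions_best(a, m, p_lim):
    """Formula (5): smallest n so that none of the a treatments is
    accidentally among the m best in all n repetitions."""
    return ceil(log(p_lim) / (log(m) - log(a)))

def min_individuals(p, a, p_lim):
    """Formulas (7)-(9): Gaussian approximation of the minimum number of
    individuals so that a type of frequency p appears in at least a of them."""
    delta = DELTA_ONE_SIDED[p_lim]
    m = a - 1
    b = delta * sqrt((1.0 - p) / p)
    c = m / p
    n = ((b + sqrt(b * b + 4.0 * c)) / 2.0) ** 2
    if 0.25 <= p <= 0.75:
        n += 1.0 / p  # correction n' = 1/p in the central range
    return ceil(n)
```

With a = 10 treatments and the 5% limit, two repetitions already make it improbable that one treatment is always the best, since (1/10)² = 0.01 ≤ 0.05; and for p = 0.5 at the same limit, at least one individual of the desired type requires n = ceil(1.64² + 2) = 5.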
1) When p1 and p2 are values between 0.1 and 0.9 we have:

n = [ δ(√(p1(1−p1)) + √(p2(1−p2))) / (p1 − p2) ]²,  n′ = 1/(p1 − p2),  n(cor) = n + n′ ------------(13)

We again have to use the unilateral limits of the Gaussian distribution. The correction n′ should be used if at least one of the values p1 or p2 lies between 0.25 and 0.75. A more complicated formula may be used in cases where we want to increase the precision:

b = δ(√(p1(1−p1)) + √(p2(1−p2))) / (p1 − p2),  c = m/(p1 − p2),
n = [(b + √(b² + 4c))/2]²,  n′ = 1/(p1 − p2),  n(cor) = n + n′ ---------(14)

2) When both p1 and p2 are smaller than 0.1, we determine the quotient p1/p2 and procure the corresponding number m2 of a Poisson distribution in Table 2. The value of n is found by the equation:

n = m2/p2 -------------(15)

D) What is the minimum number necessary for distinguishing three or more frequencies p1, p2, p3? 1) If the frequencies p1, p2, p3 are values between 0.1 and 0.9, we have to solve the individual equation for each pair and use the highest value of n thus determined:

n(1.2) = [ δ(√(p1(1−p1)) + √(p2(1−p2))) / (p1 − p2) ]² -- (16)

Delta now represents the bilateral limits of the Gaussian distribution: 1.96, 2.58, 3.29. 2) No table was prepared for the relatively rare case of a comparison of three or more frequencies below 0.1; in such cases extremely high numbers would be required.

E) A process is given which serves to solve two problems of an informatory nature: a) if a special type appears in n individuals with a frequency p(obs), what may be the corresponding ideal value of p(esp); or b) if we study samples of n individuals and expect a certain type with a frequency p(esp), what may be the extreme limits of p(obs) in individual families? 1) If we are dealing with values between 0.1 and 0.9 we may use Table 3.
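Formula (13) likewise admits a direct numerical sketch. Again this follows the reconstruction above rather than the paper's own text; the function name and the dictionary of Gaussian limits are illustrative.

```python
from math import ceil, sqrt

# unilateral limits (problems B, C) and bilateral limits (problem D)
DELTA = {"one-sided": {0.05: 1.64, 0.01: 2.33, 0.001: 3.09},
         "two-sided": {0.05: 1.96, 0.01: 2.58, 0.001: 3.29}}

def min_individuals_two_freqs(p1, p2, p_lim, sides="one-sided"):
    """Formula (13): minimum n needed to distinguish frequencies p1 and p2."""
    delta = DELTA[sides][p_lim]
    n = (delta * (sqrt(p1 * (1 - p1)) + sqrt(p2 * (1 - p2))) / (p1 - p2)) ** 2
    if 0.25 <= p1 <= 0.75 or 0.25 <= p2 <= 0.75:
        n += 1.0 / abs(p1 - p2)  # correction n' in the central range
    return ceil(n)
```

For instance, distinguishing p1 = 0.5 from p2 = 0.25 at the 5% limit with unilateral limits gives n = 42, a concrete illustration of how large the minimum numbers of problem C can become even for well-separated frequencies. For problem D the same computation is repeated for every pair of frequencies with the two-sided limits, and the largest n is retained.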
To solve the first question we select the respective horizontal line for p(obs), determine which column corresponds to our value of n, and find the respective value of p(esp) by interpolating between columns. In order to solve the second problem we start with the respective column for p(esp) and find the horizontal line for the given value of n, either directly or by approximation and interpolation. 2) For frequencies smaller than 0.1 we have to use Table 4 and transform the fractions p(esp) and p(obs) into numbers of the Poisson series by multiplying by n. In order to solve the first problem, we verify in which line the lower Poisson limit is equal to m(obs) and transform the corresponding value of m into the frequency p(esp) by dividing by n. The observed frequency may thus be a chance deviate of any value between 0.0... and the value given by dividing the value of m in the table by n. In the second case we first transform the expectation p(esp) into a value of m and procure, in the horizontal line corresponding to m(esp), the extreme values of m, which must then be transformed, by dividing by n, into values of p(obs).

F) Partial and progressive tests may be recommended in all cases where there is a lack of material, or where the loss of time is less important than the cost of large-scale experiments, since in many cases the minimum number necessary to guarantee the results within the limits of precision is rather large. One should not forget that the minimum number really represents at the same time a maximum number, necessary only if one takes essentially the unfavorable variations into consideration; smaller numbers may frequently already give satisfactory results. For instance, by definition, we know that a frequency of p means that we expect one individual in every total of 1/p. If there were no chance variations, this number 1/p would be sufficient, and if there were favorable variations a still smaller number might yield one individual of the desired type.
Thus, trusting to luck, one may start the experiment with numbers smaller than the minimum calculated according to the formulas given above, and increase the total until the desired result is obtained; this may well be before the "minimum number" is reached. Some concrete examples of this partial or progressive procedure are given from our genetic experiments with maize.
Abstract:
All Catalan public universities, among other institutions, use the VTLS software to manage their libraries. For this purpose, each university has its own system on which the database of its catalogue is installed locally. The CBUC holds the collective catalogue, the result of merging all the holdings; it is used for copy cataloguing and for searching all the holdings simultaneously. This collaboration scheme has been a pioneer in Spain for many years, following the main international trends. At present, however, technological and functional reasons make a change of system inevitable, and the CBUC members are open to new technical and organisational solutions that would provide new functional and economic advantages. The CBUC has therefore analysed the opportunity of opening a new framework of collaboration, based on establishing a single, shared installation to host the new computer system. After analysing the different possible scenarios, the proposed option is for each institution to have its own application while sharing a single computer system. The success of similar international experiences supports such a collaboration. The Finnish case is especially representative: all university libraries had used the VTLS software for years and evolved jointly towards a new common system, integrating the seventeen local servers of the individual institutions into a single global system for all of them (Linnea2). This collaboration would involve acquiring a single computer system, installed at CESCA, to host the applications of the different institutions, which would access it through the Anella Científica. The CBUC would administer the applications.
The main advantages of this solution would be: · A substantial increase in system availability. · Significant cost savings (between 1,742,000 and 2,149,000 euros over five years).
Abstract:
A theory of network-entrepreneurs or "spin-off system" is presented in this paper for the creation of firms based on the community's social governance. It is argued that the firm's capacity for accumulation depends on the presence of employees belonging to the same social/ethnic group, with expectations of "inheriting" the firm and becoming entrepreneurs once they have been selected for their merits and loyalty towards their patrons. Such accumulation is possible because of the credibility of the patrons' promises of supporting newcomers, due to high social cohesion and specific social norms prevailing in the community. This theory is exemplified through the case of the Barcelonnettes, a group of immigrants from the Alps in the South of France (Provence) who came to Mexico in the 19th century.
Application of DEA to the analysis of profits in a vertically forward-integrated system
Abstract:
This paper designs three DEA models based on a production system whose components are arranged in series and vertically integrated forward. The first model seeks to optimise the profits of the aggregate system, as well as their improvement in each of the subsystems. The second model adds, to the previous objective, transfer constraints on the specific resources associated with each subsystem, and the third model estimates the variation interval for the transfer prices of the intermediate inputs between the two subsystems. The models were programmed and simulated in the GAMS software, using data generated by a Cobb-Douglas production function for the intermediate inputs and the final outputs.
Abstract:
Does shareholder value orientation lead to shareholder value creation? This article proposes methods to quantify both shareholder value orientation and shareholder value creation. Through the application of these models it is possible to quantify both dimensions and examine statistically to what extent shareholder value orientation explains shareholder value creation. The scoring model developed in this paper makes it possible to quantify the orientation of managers towards the objective of maximizing shareholder wealth. The method evaluates information that comes from the companies and scores the value orientation on a scale from 0 to 10 points. Analytically, the variable value orientation is operationalized by expressing it as the general attitude of managers toward the objective of value creation, investment policy and behavior, flexibility, and eight further value drivers. The value creation model works with market data such as stock prices and dividend payments. Both methods were applied to a sample of 38 blue-chip companies: 32 firms belonged to the share index IBEX 35 on July 1st, 1999; one company represents the "new economy", listed in the Spanish New Market as of July 1st, 2001; and 5 European multinational groups formed part of the EuroStoxx 50 index, also on July 1st, 2001. The research period comprised the financial years 1998, 1999, and 2000. A regression analysis showed that between 15.9% and 23.4% of shareholder value creation can be explained by shareholder value orientation.
Abstract:
This research focuses on local governments, and its objective is to analyse the role of debt in political budget cycles, testing whether the use of this financial instrument follows a strategic temporal distribution around elections, and whether the existence of these cycles can help explain the debt accumulated by local governments. For the empirical test we use budget data from the Catalan municipalities with a population above 10,000 inhabitants, for which data are available for the period 1988-1999, 86 municipalities in total. The methodology is based on a paired-samples t-test.
Abstract:
This paper examines the governance of Spanish banks around two main issues. First, does a poor economic performance activate those governance interventions that favor the removal of executive directors and the merger of non-performing banks? And second, does the relationship between governance intervention and economic performance vary with the ownership form of the bank? Our results show that a bad performance does activate governance mechanisms in banks, although in the case of Savings Banks intervention is confined to a merger or acquisition. Nevertheless, the distinct ownership structure of Savings Banks does not fully protect non-performing banks from disappearing. Product-market competition compensates for the weak internal governance mechanisms that result from an ownership form which gives voice to several stakeholder groups.
Abstract:
This paper proposes a two-dimensional Strategic Performance Measure (SPM) to evaluate the achievement of sustained superior performance. This proposal builds primarily on the fact that, under the strategic management perspective, a firm's prevalent objective is the pursuit of sustained superior performance. Three basic conceptual dimensions stem from this objective: relativity, sign dependence, and dynamism. These are the foundations of the SPM, which carries out a separate evaluation of the attained superior performance and of its sustainability over time. In contrast to existing measures of performance, the SPM provides: (i) a dynamic approach by considering the progress or regress in performance over time, and (ii) a cardinal measurement of performance differences and its changes over time. The paper also proposes an axiomatic framework that a measure of strategic performance should comply with to be theoretically and managerially sound. Finally, an empirical illustration of the Spanish banking sector during 1987-1999 is herein provided by discussing some relevant cases.
Abstract:
This study analyses efficiency levels in Spanish local governments and their determining factors through the application of DEA (Data Envelopment Analysis) methodology. It aims to find out to what extent inefficiency arises from external factors beyond the control of the entity or, on the other hand, how much is due to inadequate management of productive resources. The results show that, on the whole, there is still a wide margin within which managers could increase local government efficiency levels, although a great deal of inefficiency is shown to be due to exogenous factors. Specifically, the size of the entity, per capita tax revenue, per capita grants, and the amount of commercial activity are found to be among the factors determining local government inefficiency.
Abstract:
This paper analyzes the employment relationship on the basis of the notion of access. We argue that the degree of access provided by a job is an incentive that activates the employee's self-actualization needs. We investigate the effect of access on workers' performance through an agency model and provide a number of propositions with practical implications for personnel policies. Our results are consistent with the intuition that emerges from real business practice, as well as with many of the arguments on the substitutive role between monetary and non-monetary incentives frequently reported in the literature.
Abstract:
This paper investigates the selection of governance forms in interfirm collaborations, taking into account the predictions of transaction costs and property rights theories. Transaction costs arguments are often used to justify the introduction of hierarchical controls in collaborations, but the ownership dimension of going from "contracts" to "hierarchies" has been ignored in the past, and with it the so-called "costs of ownership". The theoretical results, tested with a sample of collaborations in which Spanish firms participate, indicate that the costs of ownership may offset the benefits of hierarchical controls and therefore limit their diffusion. Evidence is also reported of possible complementarities between reputation effects and forms of ownership that go together with hierarchical controls (i.e. joint ventures), in contrast with the generally assumed substitutability between the two.
Abstract:
Material throughput is a means of measuring the so-called social metabolism, or physical dimensions of a society’s consumption, and can be taken as an indirect and approximate indicator of sustainability. Material flow accounting can be used to test the dematerialisation hypothesis, the idea that technological progress causes a decrease in total material used (strong dematerialisation) or material used per monetary unit of output (weak dematerialisation). This paper sets out the results of a material flow analysis for Spain for the period from 1980 to 2000. The analysis reveals that neither strong nor weak dematerialisation took place during the period analysed. Although the population did not increase considerably, materials mobilised by the Spanish economy (DMI) increased by 85% in absolute terms, surpassing GDP growth. In addition, Spain became more dependent on external trade in physical terms. In fact, its imports are more than twice the amount of its exports in terms of weight.
Abstract:
The study of the relationship between economic activity and the environment around us is an old one in economics. Nevertheless, economic theory does seem to have forgotten it of late. This is why the emergence of the discipline known today as "ecological economics" implied a break with the way conventional economic theory described our relationship with the environment. This chapter aims to describe briefly what we believe are its main distinguishing characteristics: the incommensurability of values, its analysis in biophysical terms, and its implications for policy-making, which make it an example of what is known as post-normal science.