29 results for Socio-technical styles of production
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
Paper presented at the 2001 seminar of the International Chair in Olympism (IOC-UAB). The seminar offered a general reflection, from the time of the bid onwards, on the experience of the 2000 Games in Sydney and Australia.
Abstract:
Research report based on a stay at the University of London from 3 March to 10 April 2007. It comprises the writing of an article on a methodological issue central to the social sciences, in both their theoretical and applied dimensions: the articulation between ethnographic research and abstract models. Both ethnography, in its many ways of describing observable reality, and models, in their attempt to reduce complexity in order to highlight causal connections, are instruments of the social sciences. Models change the world: thanks to their abstract quality they can present not only an image of how things work, but also underline the processual aspect of those connections, thereby making it possible to establish prospective propositions and to guide public development policies. At the root of action we always find some form of modelling, even in the realm of the subjective dispositions that lead people to make everyday decisions. Often, however, reality escapes the matrix of the models, and change and adaptation take unsuspected and unplanned paths. This project seeks to build the possibility of a constructive, creative and non-hierarchical dialogue between models of economic development and ethnography(...)
Abstract:
Recent decisions by the Spanish national competition authority (TDC) mandate payment systems to include only two costs when setting their domestic multilateral interchange fees (MIF): a fixed processing cost and a variable cost for the risk of fraud. This artificial lowering of MIFs will not lower consumer prices, because of uncompetitive retailing; it will, however, lead to higher cardholder fees and, likely, new prices for point-of-sale terminals, delaying the development of the immature Spanish card market. Also, to the extent that increased cardholder fees do not offset the fall in MIF revenue, the task of issuing new cards will be underpaid relative to the task of acquiring new merchants, causing an imbalance between the two sides of the networks. Moreover, the pricing scheme arising from the decisions will cause unbundling and underprovision of those services whose costs are excluded. Indeed, the payment guarantee and the free funding period will tend to be removed from the package of services currently provided, to be either provided by third parties, provided by issuers for a separate fee, or not provided at all, especially to small and medium-sized merchants. Transaction services will also suffer from the TDC precluding their being priced in variable terms.
Abstract:
Our task in this paper is to analyze the organization of trading in the era of quantitative finance. To do so, we conduct an ethnography of arbitrage, the trading strategy that best exemplifies finance in the wake of the quantitative revolution. In contrast to value and momentum investing, we argue, arbitrage involves an art of association: the construction of equivalence (comparability) of properties across different assets. In place of essential or relational characteristics, the peculiar valuation that takes place in arbitrage is based on an operation that makes something the measure of something else, associating securities to each other. The process of recognizing opportunities and the practices of making novel associations are shaped by the specific socio-spatial and socio-technical configurations of the trading room. Calculation is distributed across persons and instruments as the trading room organizes interaction among diverse principles of valuation.
Abstract:
We find that over the period 1950-1990, US states absorbed increases in the supply of schooling due to tighter compulsory schooling and child labor laws mostly through within-industry increases in the schooling intensity of production. Shifts in the industry composition towards more schooling-intensive industries played a less important role. To understand this finding theoretically, we consider a free trade model with two goods/industries, two skill types, and many regions that produce a fixed range of differentiated varieties of the same goods. We find that a calibrated version of the model can account for shifts in schooling supply being mostly absorbed through within-industry increases in the schooling intensity of production even if the elasticity of substitution between varieties is substantially higher than estimates in the literature.
Abstract:
Do high levels of human capital foster economic growth by facilitating technology adoption? If so, countries with more human capital should have adopted more rapidly the skilled-labor-augmenting technologies becoming available since the 1970s. High human capital levels should therefore have translated into faster growth in more human-capital-intensive industries than in less human-capital-intensive ones in the 1980s. Theories of international specialization point to human capital accumulation as another important determinant of growth in human-capital-intensive industries. Using data for a large sample of countries, we find significant positive effects of human capital levels and human capital accumulation on output and employment growth in human-capital-intensive industries.
Abstract:
I study the optimal project choice when the principal relies on the agent in charge of production for project evaluation. The principal has to choose between a safe project generating a fixed revenue and a risky project generating an uncertain revenue. The agent has private information about the production cost under each project, but also about the signal regarding the profitability of the risky project. If the signal favoring the adoption of the risky project is good news to the agent, integrating production and project evaluation tasks does not generate any loss compared to the benchmark in which the principal herself receives the signal. By contrast, if it is bad news, task integration creates an endogenous reservation utility which is type-dependent and thereby generates countervailing incentives, which can make a bias toward either project optimal. These results can offer an explanation for why good firms can go bad, and a rationale for the separation of day-to-day operating decisions from long-term strategic decisions stressed by Williamson.
Abstract:
Earthquakes represent a major hazard for populations around the world, causing frequent loss of life, human suffering and enormous damage to homes, other buildings and infrastructure. The Technology Resources for Earthquake Monitoring and Response (TREMOR) Team of 36 space professionals analysed this problem over the course of the International Space University Summer Session Program and published their recommendations in the form of a report. The TREMOR Team proposes a series of space- and ground-based systems to provide improved capability to manage earthquakes. The first proposed system is a prototype earthquake early-warning system that improves the existing knowledge of earthquake precursors and addresses the potential of these phenomena. Thus, the system will at first enable the definitive assessment of whether reliable earthquake early warning is possible through precursor monitoring. Should the answer be affirmative, the system itself would then form the basis of an operational early-warning system. To achieve these goals, the authors propose a multi-variable approach in which the system will combine, integrate and process precursor data from space- and ground-based seismic monitoring systems (already existing and newly proposed systems) and data from a variety of related sources (e.g. historical databases, space weather data, fault maps). The second proposed system, the prototype earthquake simulation and response system, coordinates the main components of the response phase to reduce the time delays of response operations, increase the level of precision in the data collected, facilitate communication amongst teams, enhance rescue and aid capabilities, and so forth. It is based in part on an earthquake simulator that will provide pre-event (if early warning is proven feasible) and post-event damage assessment and detailed data of the affected areas to corresponding disaster management actors by means of a geographic information system (GIS) interface.
This is coupled with proposed mobile satellite communication hubs to provide links between response teams. Business- and policy-based implementation strategies for these proposals, such as the establishment of a non-governmental organisation to develop and operate the systems, are included.
Abstract:
The factor structure of a back-translated Spanish version (Lega, Caballo and Ellis, 2002) of the Attitudes and Beliefs Inventory (ABI) (Burgess, 1990) is analyzed in a sample of 250 university students. The Spanish version of the ABI is a 48-item self-report inventory using a 5-point Likert scale that assesses rational and irrational attitudes and beliefs. 24 items cover two dimensions of irrationality: a) areas of content (3 subscales), and b) styles of thinking (4 subscales). An Exploratory Factor Analysis (Parallel Analysis with the Unweighted Least Squares method and Promin rotation) was performed with the FACTOR 9.20 software (Lorenzo-Seva and Ferrando, 2013). The results reproduced the main four styles of irrational thinking in relation to the three specific contents of irrational beliefs. However, two factors showed a complex configuration with important cross-loadings of different items in content and style. More analyses are needed to review the specific content and style of such items.
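The factor-retention step mentioned above can be sketched in a few lines. The snippet below is an illustrative Horn-style parallel analysis on synthetic data (not the ABI dataset, and not the exact procedure implemented in FACTOR): factors are retained while the observed correlation-matrix eigenvalues exceed the mean eigenvalues obtained from random data of the same shape.

```python
import numpy as np

def parallel_analysis(data, n_iter=100, seed=0):
    """Horn's parallel analysis: retain factors whose observed
    correlation-matrix eigenvalues exceed the mean eigenvalues
    from random normal data of the same shape."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs_eig = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    rand_eig = np.zeros(p)
    for _ in range(n_iter):
        rand = rng.standard_normal((n, p))
        rand_eig += np.linalg.eigvalsh(np.corrcoef(rand, rowvar=False))[::-1]
    rand_eig /= n_iter
    return int(np.sum(obs_eig > rand_eig))

# Toy data: 250 "respondents", 8 items driven by two latent factors.
rng = np.random.default_rng(1)
f = rng.standard_normal((250, 2))
loadings = np.array([[1, 0], [1, 0], [1, 0], [1, 0],
                     [0, 1], [0, 1], [0, 1], [0, 1]], dtype=float)
items = f @ loadings.T + 0.5 * rng.standard_normal((250, 8))
print(parallel_analysis(items))  # → 2
```

With two strong latent factors and modest noise, the first two observed eigenvalues clearly exceed the random-data baseline, so two factors are retained.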
Abstract:
What determines which inputs are initially considered and eventually adopted in the production of new or improved goods? Why are some inputs much more prominent than others? We model the evolution of input linkages as a process where new producers first search for potentially useful inputs and then decide which ones to adopt. A new product initially draws a set of 'essential suppliers'. The search stage is then confined to the network neighborhood of the latter, i.e., to the inputs used by the essential suppliers. The adoption decision is driven by a tradeoff between the benefits accruing from input variety and the costs of input adoption. This has important implications for the number of forward linkages that a product (input variety) develops over time. Input diffusion is fostered by network centrality: an input that is initially represented in many network neighborhoods is subsequently more likely to be adopted. This mechanism also delivers a power-law distribution of forward linkages. Our predictions continue to hold when varieties are aggregated into sectors. We can thus test them, using detailed sectoral US input-output tables. We show that initial network proximity of a sector in 1967 significantly increases the likelihood of adoption throughout the subsequent four decades. The same is true for rapid productivity growth in an input-producing sector. Our empirical results highlight two conditions for new products to become central nodes: initial network proximity to prospective adopters, and technological progress that reduces their relative price. Semiconductors met both conditions.
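The search-and-adopt mechanism described above can be illustrated with a toy simulation. The parameter values, the uniform sampling of essential suppliers, and the fixed adoption probability are assumptions for illustration, not the paper's calibrated model; the point is only that confining search to the neighborhoods of essential suppliers concentrates forward linkages on a few early, well-connected inputs.

```python
import random

def grow_input_network(n_products=2000, n_essential=2, p_adopt=0.3, seed=0):
    """Toy version of the search-and-adopt process: each new product
    draws essential suppliers uniformly at random, then searches only
    their input neighborhoods, adopting each candidate input with
    probability p_adopt."""
    random.seed(seed)
    inputs_of = [set(), {0}]          # product 1 uses product 0 as input
    for new in range(2, n_products):
        essential = random.sample(range(new), n_essential)
        adopted = set(essential)      # essential suppliers are always used
        for s in essential:           # search confined to their neighborhoods
            for candidate in inputs_of[s]:
                if random.random() < p_adopt:
                    adopted.add(candidate)
        inputs_of.append(adopted)
    forward = [0] * n_products        # forward linkages = times used as input
    for adopted in inputs_of:
        for i in adopted:
            forward[i] += 1
    return forward

fwd = sorted(grow_input_network(), reverse=True)
print(fwd[0], fwd[len(fwd) // 2])  # heavy right tail: max far above median
```

Because an input adopted early sits in many subsequent network neighborhoods, its adoption probability compounds, producing the skewed forward-linkage distribution the abstract describes.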
Abstract:
Marx and the writers who followed him produced a number of theories of the breakdown of capitalism. The majority of these theories were based on the historical tendencies: the rise in the composition of capital and the share of capital, and the fall in the rate of profit. However, these theories were never modeled with mainstream rigour. This paper presents a constant-wage model, with capital, labour and land as factors of production, which reproduces the historical tendencies and so can be used as a foundation for the various theories. The use of Chaplygin's theorem in the proof of the main result also gives the paper a technical interest.
Abstract:
This paper sets out a Marxian model that is based on the one by Stephen Marglin with one sector and continuous substitution. It is extended by adding technical progress and land as a factor of production. It is then shown that capital accumulation causes the preconditions for the breakdown of capitalism to emerge; that is, it causes the organic composition of capital to rise, the rate of profit to fall and the rate of exploitation to rise. A compressed history of the idea of the breakdown of capitalism is then set out and an explanation is given as to how the model relates to this and how it may serve as the basis for further research.
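The arithmetic behind the falling rate of profit can be shown in a minimal sketch. Writing c for constant capital, v for variable capital and s for surplus value, the rate of profit is r = s/(c+v); dividing through by v gives r = e/(q+1), where e = s/v is the rate of exploitation and q = c/v the organic composition. The snippet below holds e constant for simplicity (the model in the paper lets it rise), which already makes r fall as q rises:

```python
# Tendential fall of the profit rate, r = s / (c + v) = e / (q + 1),
# where e = s/v (rate of exploitation) and q = c/v (organic composition).
def profit_rate(e, q):
    """Profit rate for a given rate of exploitation e and organic composition q."""
    return e / (q + 1)

# Holding e fixed, a rising organic composition drives the profit rate down.
rates = [round(profit_rate(1.0, q), 3) for q in (1, 2, 4, 8)]
print(rates)  # → [0.5, 0.333, 0.2, 0.111]
```

A rising e slows, but cannot in general reverse, the decline once q grows without bound, since r = e/(q+1) falls whenever q grows faster than e.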
Abstract:
In most initiatives to publish Open Educational Resources (OER), the production of OER is the activity with the highest costs. Based on the literature and personal experience, a list of relevant characteristics of production processes for OER is determined. Three cases are compared with each other on these characteristics. The largest influences on costs are human costs and the type of OER created.