60 results for Implicit techniques
Abstract:
Alan S. Milward was an economic historian who developed an implicit theory of historical change. His interpretation, which was neither liberal nor Marxist, posited that social, political, and economic change, for it to be sustainable, had to be a gradual process rather than one resulting from a sudden, cataclysmic revolutionary event occurring in one sector of the economy or society. Benign change depended much less on natural resource endowment or technological developments than on the ability of state institutions to respond to changing political demands from within each society. State bureaucracies were fundamental to formulating those political demands and advising politicians of ways to meet them. Since each society was different, there was no single model of development to be adopted or which could be imposed successfully by one nation-state on others, either through force or through foreign aid programs. Nor could development be promoted simply by copying the model of a more successful economy. Each nation-state had to find its own response to the political demands arising from within its society. Integration occurred when a number of nation-states shared similar political objectives which they could not meet individually but could meet collectively. It was not simply the result of their increasing interdependence. It was how and whether nation-states responded to these domestic demands that determined the nature of historical change.
Abstract:
Economists and economic historians want to know how much better life is today than in the past. Fifty years ago economic historians found surprisingly small gains from 19th century US railroads, while more recently economists have found relatively large gains from electricity, computers and cell phones. In each case the implicit or explicit assumption is that researchers were measuring the value of a new good to society. In this paper we use the same techniques to find the value to society of making existing goods cheaper. Henry Ford did not invent the car, and the inventors of mechanised cotton spinning in the industrial revolution invented no new product. But both made existing products dramatically cheaper, bringing them into the reach of many more consumers. That in turn has potentially large welfare effects. We find that the consumer surplus of Henry Ford's production line was around 2% by 1923, 15 years after Ford began to implement the moving assembly line, while the mechanisation of cotton spinning was worth around 6% by 1820, 34 years after its initial invention. Both are large: of the same order of magnitude as consumer expenditure on these items, and as large as or larger than the value of the internet to consumers. On the social savings measure traditionally used by economic historians, these process innovations were worth 15% and 18% respectively, making them more important than railroads. Our results remind us that process innovations can be at least as important for welfare and productivity as the invention of new products.
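The social savings measure mentioned in this abstract is conventionally the cost saved by producing today's quantity at the old price, expressed as a share of total expenditure or GDP. A minimal sketch, in which the function name and all numbers are hypothetical illustrations rather than the paper's data:

```python
def social_savings(p_old, p_new, quantity, gdp):
    """Social savings: cost of buying today's quantity at the old price
    minus the actual cost, expressed as a share of GDP."""
    return (p_old - p_new) * quantity / gdp

# Hypothetical illustration: a good whose price falls from 100 to 40,
# with 50 million units consumed, in a 500-billion-unit economy.
share = social_savings(100, 40, 50e6, 500e9)
# share == 0.006, i.e. 0.6% of GDP
```

The measure deliberately ignores demand response, which is why the abstract contrasts it with consumer-surplus estimates.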
Abstract:
Recent empirical findings suggest that spreads quoted in dealership markets might be uncompetitive. This paper analyzes theoretically whether price competition between risk-averse market-makers leaves room for implicit collusive behavior. We compare the spread and risk-sharing efficiency arising in several market structures differing in terms of (i) the priority rule followed in case of ties, and (ii) the type of schedules market makers may use, namely general schedules, linear schedules, or limit orders. In general, competitive pricing does not arise in equilibrium, and there is a conflict between risk-sharing efficiency and the tightness of the spread. This conflict can be mitigated by an appropriate market structure design. The limit order market is the only market structure in which the competitive equilibrium is the unique equilibrium.
Abstract:
There are two fundamental puzzles about trade credit: why does it appear to be so expensive, and why do input suppliers engage in the business of lending money? This paper addresses and answers both questions by analysing the interaction between the financial and the industrial aspects of the supplier-customer relationship. It examines how, in a context of limited enforceability of contracts, suppliers may have a comparative advantage over banks in lending to their customers because they hold the extra threat of stopping the supply of intermediate goods. Suppliers may also act as lenders of last resort, providing insurance against liquidity shocks that may endanger the survival of their customers. The relatively high implicit interest rates of trade credit result from the existence of default and insurance premia. The implications of the model are examined empirically using parametric and nonparametric techniques on a panel of UK firms.
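The "expensive" implicit interest rate of trade credit that this abstract refers to is conventionally computed from early-payment discount terms. A sketch using the textbook annualization formula with standard "2/10 net 30" terms (the function name and the example terms are illustrative assumptions, not taken from the paper):

```python
def implicit_annual_rate(discount, discount_days, net_days):
    """Annualized implicit interest rate of forgoing an early-payment
    discount, compounded over the extra days of credit obtained."""
    periods_per_year = 365 / (net_days - discount_days)
    per_period = discount / (1 - discount)  # cost of keeping the money
    return (1 + per_period) ** periods_per_year - 1

# "2/10 net 30": pay 2% less within 10 days, or the full price within 30.
rate = implicit_annual_rate(0.02, 10, 30)
# roughly 0.446, i.e. about 44.6% per year
```

Rates of this magnitude are what make the default and insurance premia explanation in the abstract necessary.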
Abstract:
The current operational very short-term and short-term quantitative precipitation forecasts (QPF) at the Meteorological Service of Catalonia (SMC) are produced by three different methodologies: advection of the radar reflectivity field (ADV); identification, tracking and forecasting of convective structures (CST); and numerical weather prediction (NWP) models using observational data assimilation (radar, satellite, etc.). These precipitation forecasts have different characteristics, lead times and spatial resolutions. The objective of this study is to combine these methods in order to obtain a single, optimized QPF at each lead time. This combination (blending) of the radar forecasts (ADV and CST) and the precipitation forecast from the NWP model is carried out by means of different methodologies according to the prediction horizon. Firstly, in order to take advantage of the rainfall location and intensity from radar observations, a phase-correction technique is applied to the NWP output to derive an additional corrected forecast (MCO). To select the best precipitation estimate in the first and second hours (t+1 h and t+2 h), the information from radar advection (ADV) and the corrected model output (MCO) are mixed using different weights, which vary dynamically according to indexes that quantify the quality of these predictions. This procedure is able to integrate the skill in rainfall location and patterns given by the advection of the radar reflectivity field with the capacity of NWP models to generate new precipitation areas. From the third hour (t+3 h) onwards, as radar-based forecasting generally has low skill, only the quantitative precipitation forecast from the model is used. This blending of different sources of prediction is verified for different types of episodes (convective, moderately convective and stratiform) in order to obtain a robust methodology that can be implemented in an operational and dynamic way.
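The dynamically weighted blending step described in this abstract might be sketched roughly as below. The function, the form of the quality indexes, and the simple normalized weighting are assumptions made for illustration; they are not the SMC operational scheme:

```python
def blend_qpf(radar_fc, nwp_fc, radar_skill, nwp_skill, lead_time_h):
    """Blend radar-advection and NWP precipitation forecasts.

    radar_fc, nwp_fc : gridded rain values (flattened to lists here)
    radar_skill, nwp_skill : quality indexes in [0, 1] for each source
    lead_time_h : forecast horizon in hours
    """
    if lead_time_h >= 3:
        # Radar extrapolation loses skill quickly: NWP only from t+3 h.
        return list(nwp_fc)
    total = radar_skill + nwp_skill
    # Weights vary dynamically with the quality indexes of each source.
    w_radar = radar_skill / total if total > 0 else 0.5
    return [w_radar * r + (1 - w_radar) * m
            for r, m in zip(radar_fc, nwp_fc)]
```

For example, with equal skill indexes at t+1 h the two sources are simply averaged, while at t+3 h the NWP field is returned unchanged.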
Abstract:
The current state of regional and urban science has been much discussed and a number of studies have speculated on possible future trends in the development of the discipline. However, there has been little empirical analysis of current publication patterns in regional and urban journals. This paper studies the kinds of topics, techniques and data used in articles published in nine top international journals during the 1990s with the aim of identifying current trends in this research field.
Abstract:
A practical activity designed to introduce wavefront coding techniques as a method to extend the depth of field in optical systems is presented. The activity is suitable for advanced undergraduate students, since it combines different topics in optical engineering such as optical system design, aberration theory, Fourier optics, and digital image processing. This paper provides the theoretical background and technical information needed to perform the experiment. The proposed activity requires students to develop a wide range of skills, since they are expected to deal with optical components, including spatial light modulators, and to write scripts to perform some of the calculations.
Abstract:
Surface topography and light scattering were measured on 15 samples ranging from those having smooth surfaces to others with ground surfaces. The measurement techniques included an atomic force microscope, mechanical and optical profilers, confocal laser scanning microscope, angle-resolved scattering, and total scattering. The samples included polished and ground fused silica, silicon carbide, sapphire, electroplated gold, and diamond-turned brass. The measurement instruments and techniques had different surface spatial wavelength band limits, so the measured roughnesses were not directly comparable. Two-dimensional power spectral density (PSD) functions were calculated from the digitized measurement data, and we obtained rms roughnesses by integrating areas under the PSD curves between fixed upper and lower band limits. In this way, roughnesses measured with different instruments and techniques could be directly compared. Although smaller differences between measurement techniques remained in the calculated roughnesses, these could be explained mostly by surface topographical features such as isolated particles that affected the instruments in different ways.
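The band-limited rms roughness described in this abstract is the square root of the area under the PSD between fixed spatial-frequency limits. A minimal one-dimensional sketch (the study uses two-dimensional PSDs) with simple trapezoidal integration; the function name and sampling are assumptions:

```python
import math

def rms_roughness(freqs, psd, f_min, f_max):
    """Band-limited rms roughness from a sampled 1-D power spectral
    density: sigma = sqrt( integral of PSD over [f_min, f_max] ),
    computed by trapezoidal integration with clipping at the band edges."""
    area = 0.0
    for (f1, p1), (f2, p2) in zip(zip(freqs, psd),
                                  zip(freqs[1:], psd[1:])):
        lo, hi = max(f1, f_min), min(f2, f_max)
        if hi <= lo:
            continue  # segment lies outside the band limits
        # linear interpolation of the PSD at the clipped endpoints
        p_lo = p1 + (p2 - p1) * (lo - f1) / (f2 - f1)
        p_hi = p1 + (p2 - p1) * (hi - f1) / (f2 - f1)
        area += 0.5 * (p_lo + p_hi) * (hi - lo)
    return math.sqrt(area)
```

Using the same fixed band limits for every instrument is what makes the roughness values comparable across techniques, as the abstract explains.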
Abstract:
Two general approaches are used in the investigation of metal complexation by electroanalytical tools. The first, known as hard modelling, is based on the formulation of a joint physicochemical model for the electrode and complexation processes and on the analytical or numerical solution of that model. Fitting the model parameters to the experimental data then yields the desired information about the complexation process. The second approach, known as soft modelling, is based on identifying a complexation model from the numerical and statistical analysis of the data, without any prior assumption of a model. This approach, which has been used extensively with spectroscopic data, has very rarely been applied to electrochemical data. In this article we deal with the formulation of a model (hard modelling) for metal complexation in systems with mixtures of ligands, including macromolecular ligands, and with the application of
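The hard-modelling workflow described above (formulate a physicochemical model, then fit its parameters to the data) can be sketched with a deliberately simple 1:1 complexation model and a grid search; the function names, the binding model, and the synthetic data are illustrative assumptions, not the ligand-mixture model of the article:

```python
def bound_fraction(ligand_conc, k):
    """1:1 complexation model: fraction of metal bound at free-ligand
    concentration [L], theta = K[L] / (1 + K[L])."""
    return k * ligand_conc / (1 + k * ligand_conc)

def fit_stability_constant(ligand_concs, measured, k_grid):
    """Hard-modelling fitting step: pick the stability constant K whose
    model curve best matches the data in a least-squares sense."""
    def sse(k):
        return sum((bound_fraction(c, k) - y) ** 2
                   for c, y in zip(ligand_concs, measured))
    return min(k_grid, key=sse)

# Synthetic data generated with K = 100; the fit should recover it.
concs = [0.001, 0.01, 0.05, 0.1]
data = [bound_fraction(c, 100.0) for c in concs]
best_k = fit_stability_constant(concs, data, [10.0, 50.0, 100.0, 500.0])
# best_k == 100.0
```

Soft modelling would instead infer the number and nature of complexes from the data matrix itself, with no such model assumed in advance.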
Abstract:
If single-case experimental designs are to be used to establish guidelines for evidence-based interventions in clinical and educational settings, numerical values that reflect treatment effect sizes are required. The present study compares four recently developed procedures for quantifying the magnitude of an intervention effect using data with known characteristics. Monte Carlo methods were used to generate AB-design data with potential confounding variables (serial dependence, linear and curvilinear trend, and heteroscedasticity between phases) and two types of treatment effect (level and slope change). The results suggest that data features are important for choosing the appropriate procedure and, thus, inspecting the graphed data visually is a necessary initial stage. In the presence of serial dependence or a change in data variability, the Nonoverlap of All Pairs (NAP) and the Slope and Level Change (SLC) were the only techniques of the four examined that performed adequately. Introducing a data-correction step in NAP renders it unaffected by linear trend, as is also the case for the Percentage of Nonoverlapping Corrected Data and SLC. The performance of these techniques indicates that professionals' judgments concerning treatment effectiveness can be readily complemented by both visual and statistical analyses. A flowchart to guide the selection of techniques according to the data characteristics identified by visual inspection is provided.
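The Nonoverlap of All Pairs (NAP) index mentioned above has a simple standard definition: the share of all baseline/intervention pairs in which the intervention value improves on the baseline value, with ties counted as half. A minimal sketch, assuming higher scores mean improvement (the example data are hypothetical):

```python
def nap(phase_a, phase_b):
    """Nonoverlap of All Pairs: share of (A, B) pairs in which the
    intervention-phase value exceeds the baseline value, with ties
    counted as 0.5 (improvement assumed to be an increase)."""
    pairs = [(a, b) for a in phase_a for b in phase_b]
    wins = sum(1.0 if b > a else 0.5 if b == a else 0.0
               for a, b in pairs)
    return wins / len(pairs)

# Hypothetical AB data: baseline phase, then intervention phase.
# nap([2, 3, 4], [4, 5, 6]) -> (8 + 0.5) / 9, about 0.944
```

Because NAP compares every pair of observations, it ignores the ordering within phases, which is why the abstract notes a correction step is needed when the data contain a trend.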
Abstract:
Transmission electron microscopy is a proven technique in the field of cell biology and a very useful tool in biomedical research. Innovation and improvements in equipment together with the introduction of new technology have allowed us to improve our knowledge of biological tissues, to visualize structures better and both to identify and to locate molecules. Of all the types of microscopy exploited to date, electron microscopy is the one with the most advantageous resolution limit and therefore it is a very efficient technique for deciphering the cell architecture and relating it to function. This chapter aims to provide an overview of the most important techniques that we can apply to a biological sample, tissue or cells, to observe it with an electron microscope, from the most conventional to the latest generation. Processes and concepts are defined, and the advantages and disadvantages of each technique are assessed, along with the image and information that we can obtain by using each one of them.
Abstract:
This Handbook contains a collection of articles describing instrumental techniques used for Materials, Chemical and Biosciences research that are available at the Scientific and Technological Centers of the University of Barcelona (CCiTUB). The CCiTUB are a group of facilities of the UB that provide both the research community and industry with ready access to a wide range of major instrumentation. Together with the latest equipment and technology, the CCiTUB provide expertise in addressing the methodological research needs of the user community and they also collaborate in R+D+i projects with industry. CCiTUB specialists include technical and Ph.D.-level professional staff members who are actively engaged in methodological research. Detailed information on the centers' resources and activities can be found at the CCiTUB website www.ccit.ub.edu ...
Abstract:
This article summarizes the basic principles of light microscopy, with examples of applications in biomedicine that illustrate the capabilities of the technique.