93 results for Precaution Adoption Process
Abstract:
Collage is a pattern-based visual design authoring tool for the creation of collaborative learning scripts computationally modelled with IMS Learning Design (LD). The pattern-based visual approach aims to provide teachers with design ideas based on broadly accepted practices. In addition, it seeks to hide the LD notation so that teachers can easily create their own designs. The use of visual representations supports both the understanding of the design ideas and the usability of the authoring tool. This paper presents a multi-case study comprising three different cases that evaluate the approach from different perspectives. The first case includes workshops where teachers use Collage. The second case involves the design of a scenario proposed by a third party using related approaches. The third case analyzes a situation where students follow a design created with Collage. The cross-case analysis provides a global understanding of the possibilities and limitations of the pattern-based visual design approach.
Abstract:
We study the minimum mean square error (MMSE) and the multiuser efficiency η of large dynamic multiple-access communication systems in which optimal multiuser detection is performed at the receiver, while the number and identities of active users are allowed to change at each transmission time. The system dynamics are governed by a Markov model describing the evolution of the channel occupancy, and a large-system analysis is performed as the number of observations grows large. Starting from the equivalent scalar channel and the fixed-point equation tying multiuser efficiency to the MMSE, we extend the equation to the case of a dynamic channel and derive lower and upper bounds for the MMSE (and thus for η as well) that hold in the limit of large signal-to-noise ratios and increasingly large observation time T.
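The fixed-point structure referred to in the abstract can be illustrated with a toy sketch. For a unit-power Gaussian input, MMSE(γ) = 1/(1 + γ), and a Tse–Hanly-style fixed-point equation for the multiuser efficiency η can be solved by simple iteration. The equation form, the load β and the SNR below are illustrative assumptions, not parameters or results from the paper.

```python
# Toy fixed-point iteration for multiuser efficiency (eta).
# Assumes Gaussian inputs, where mmse(g) = 1 / (1 + g); beta and snr
# are arbitrary illustrative values, not taken from the paper.

def mmse_gaussian(gamma):
    """MMSE of a unit-power Gaussian input at effective SNR gamma."""
    return 1.0 / (1.0 + gamma)

def multiuser_efficiency(beta, snr, tol=1e-12, max_iter=1000):
    """Solve eta = 1 / (1 + beta * snr * mmse(eta * snr)) by iteration."""
    eta = 1.0  # start from the single-user value
    for _ in range(max_iter):
        new_eta = 1.0 / (1.0 + beta * snr * mmse_gaussian(eta * snr))
        if abs(new_eta - eta) < tol:
            return new_eta
        eta = new_eta
    return eta

eta = multiuser_efficiency(beta=0.5, snr=10.0)
print(round(eta, 4))
```

The iteration is a contraction here and converges quickly; the returned η lies strictly between 0 (fully degraded) and 1 (single-user performance).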
Abstract:
This work briefly analyses the difficulties of adopting the Semantic Web and, in particular, proposes systems for assessing the current level of migration to the different technologies that make up the Semantic Web. It focuses on the presentation and description of two tools, DigiDocSpider and DigiDocMetaEdit, designed with the aim of verifying, evaluating, and promoting its implementation.
Abstract:
Economists and economic historians want to know how much better life is today than in the past. Fifty years ago economic historians found surprisingly small gains from 19th century US railroads, while more recently economists have found relatively large gains from electricity, computers and cellphones. In each case the implicit or explicit assumption is that researchers were measuring the value of a new good to society. In this paper we use the same techniques to find the value to society of making existing goods cheaper. Henry Ford did not invent the car, and the inventors of mechanised cotton spinning in the industrial revolution invented no new product. But both made existing products dramatically cheaper, bringing them into the reach of many more consumers. That in turn has potentially large welfare effects. We find that the consumer surplus of Henry Ford's production line was around 2% by 1923, 15 years after Ford began to implement the moving assembly line, while the mechanisation of cotton spinning was worth around 6% by 1820, 34 years after its initial invention. Both are large: of the same order of magnitude as consumer expenditure on these items, and as large or larger than the value of the internet to consumers. On the social savings measure traditionally used by economic historians, these process innovations were worth 15% and 18% respectively, making them more important than railroads. Our results remind us that process innovations can be at least as important for welfare and productivity as the invention of new products.
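The social savings measure mentioned above is conventionally computed as the price reduction times the quantity consumed at the new price, expressed as a share of total income. The sketch below shows that arithmetic; every number in it is invented purely for illustration and has no connection to the paper's estimates.

```python
# Toy illustration of the "social savings" measure: the saving from a
# price-reducing process innovation is (old price - new price) times
# the quantity actually consumed at the new price. All numbers below
# are invented for illustration; none are estimates from the paper.

def social_savings(p_old, p_new, q_new):
    """Social savings of a price-reducing process innovation."""
    return (p_old - p_new) * q_new

def social_savings_share(p_old, p_new, q_new, gdp):
    """Social savings expressed as a share of total income (GDP)."""
    return social_savings(p_old, p_new, q_new) / gdp

share = social_savings_share(p_old=100.0, p_new=40.0, q_new=250.0, gdp=100_000.0)
print(share)  # 0.15, i.e. 15% of income in this made-up example
```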
Abstract:
We study a dynamic general equilibrium model where innovation takes the form of the introduction of new goods whose production requires skilled workers. Innovation is followed by a costly process of standardization, whereby these new goods are adapted to be produced using unskilled labor. Our framework highlights a number of novel results. First, standardization is both an engine of growth and a potential barrier to it. As a result, growth is an inverse U-shaped function of the standardization rate (and of competition). Second, we characterize the growth- and welfare-maximizing speed of standardization. We show how the optimal protection of intellectual property rights affecting the cost of standardization varies with the skill endowment, the elasticity of substitution between goods and other parameters. Third, we show that, depending on how competition between innovating and standardizing firms is modelled and on parameter values, a new type of multiplicity of equilibria may arise. Finally, we study the implications of our model for the skill premium and we illustrate novel reasons for linking North-South trade to intellectual property rights protection.
Abstract:
In this paper we argue that inventory models are probably not useful models of household money demand because the majority of households do not hold any interest-bearing assets. The relevant decision for most people is not the fraction of assets to be held in interest-bearing form, but whether to hold any such assets at all. The implications of this realization are interesting and important. We find that (a) the elasticity of money demand is very small when the interest rate is small, (b) the probability that a household holds any amount of interest-bearing assets is positively related to the level of financial assets, and (c) the cost of adopting financial technologies is positively related to age and negatively related to the level of education. Unlike traditional methods of money demand estimation, our methodology allows for the estimation of the interest elasticity at low values of the nominal interest rate. The finding that the elasticity is very small for interest rates below 5 percent suggests that the welfare costs of inflation are small. At interest rates of 6 percent, the elasticity is close to 0.5. We find that roughly one half of this elasticity can be attributed to the Baumol-Tobin or intensive margin, and half of it can be attributed to the new adopters or extensive margin. The intensive margin is less important at lower interest rates and more important at higher interest rates.
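The "Baumol-Tobin or intensive margin" mentioned above refers to the classical square-root rule of money demand, whose interest elasticity is exactly 1/2. The sketch below verifies that numerically; the cost, spending and rate values are invented for illustration and are not data from the paper.

```python
# Baumol-Tobin money demand: M = sqrt(c * Y / (2 * i)), where c is the
# fixed cost per trip to the bank, Y is total spending and i is the
# nominal interest rate. The interest elasticity of this demand rule
# is exactly 1/2; the numbers below are invented for illustration.
import math

def money_demand(c, Y, i):
    """Average money holdings under the Baumol-Tobin square-root rule."""
    return math.sqrt(c * Y / (2.0 * i))

def interest_elasticity(c, Y, i, di=1e-6):
    """Numerical elasticity -d(log M)/d(log i) of money demand."""
    m0, m1 = money_demand(c, Y, i), money_demand(c, Y, i + di)
    return -(math.log(m1) - math.log(m0)) / (math.log(i + di) - math.log(i))

print(round(interest_elasticity(c=2.0, Y=40_000.0, i=0.06), 3))  # 0.5
```

This is only the intensive margin; the abstract's point is that the extensive margin (whether to adopt the financial technology at all) contributes roughly the other half of the measured elasticity at a 6 percent rate.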
Abstract:
Human beings increase their productivity by specializing their resources and exchanging their products. The organization of exchange is costly, however, because specialized activities need coordination and incentives have to be aligned. This work first describes how these exchanges are organized in an institutional environment. It then focuses on the dual effect of this environment: as with any other specialized resource, institutions may be used for expropriation purposes. They enjoy specialization advantages in safeguarding exchange, but they also make possible new forms of opportunism, causing new costs of exchange. Three perverse tendencies are identified. First, in the legal field there is a surplus of mandatory rules and, at the same time, a deficit of default rules. Second, the courts' activity is biased against the quasi-judicial role of the parties and the market. Third, market enforcement is based on reputational assets that are badly exposed to opportunism.
Abstract:
This paper argues that any specific utility or disutility for gambling must be excluded from expected utility because such a theory is consequential, while a pleasure or displeasure for gambling is a matter of process, not of consequences. A (dis)utility for gambling is modeled as a process utility which monotonically combines with expected utility restricted to consequences. This allows for a process (dis)utility for gambling to be revealed. As an illustration, the model shows how empirical observations in the Allais paradox can reveal a process disutility of gambling. A more general model of rational behavior combining processes and consequences is then proposed and discussed.
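The Allais paradox mentioned above hinges on an algebraic identity: in the standard Allais lotteries, the expected-utility gap between the first pair equals the gap between the second pair for any utility function, so preferences over the two pairs must flip together under expected utility. The payoffs below are the textbook Allais values (in millions), not figures from the paper.

```python
# Standard Allais lotteries, payoffs in millions: each lottery is a
# list of (probability, payoff) pairs. For ANY utility u,
# EU(A) - EU(B) equals EU(C) - EU(D), so an expected-utility maximizer
# cannot prefer A over B while also preferring D over C -- the common
# empirical pattern the paper builds on.
A = [(1.00, 1.0)]
B = [(0.10, 5.0), (0.89, 1.0), (0.01, 0.0)]
C = [(0.11, 1.0), (0.89, 0.0)]
D = [(0.10, 5.0), (0.90, 0.0)]

def eu(lottery, u):
    """Expected utility of a lottery under utility function u."""
    return sum(p * u(x) for p, x in lottery)

# Check the identity for a few candidate utility functions.
for u in (lambda x: x, lambda x: x ** 0.5, lambda x: -1.0 / (1.0 + x)):
    gap_ab = eu(A, u) - eu(B, u)
    gap_cd = eu(C, u) - eu(D, u)
    assert abs(gap_ab - gap_cd) < 1e-12
print("EU gaps coincide for every utility tried")
```

Since the observed A-over-B, D-over-C pattern contradicts this identity, something outside the consequences must be at work, which is exactly the opening the abstract's process (dis)utility fills.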
Abstract:
The examinations taken by high-school graduates in Spain and the role of the examination in the university admissions process are described. The following issues arising in the assessment of the process are discussed: reliability of grading, comparability of the grades and scores (equating), and maintenance of standards. The compilation and use of information about the grading process, and its integration into operational grading, are proposed. Various schemes for score adjustment are reviewed and the feasibility of their implementation is discussed. The advantages of pretesting items and of empirical checks of experts' judgements are pointed out. The paper concludes with an outline of a planned reorganisation of higher education in Spain, and with a call for a comprehensive programme of empirical research concurrent with the operation of the examination and scoring system.
Abstract:
Achieving high quality in final products in the pharmaceutical industry is a challenge that requires the control and supervision of all the manufacturing steps. This requirement has created the need for fast and accurate analytical methods. Near-infrared (NIR) spectroscopy, together with chemometrics, fulfills this growing demand. The speed with which these combined techniques provide relevant information, and the versatility of their application to different types of samples, make them among the most appropriate. This study focuses on the development of a calibration model able to determine amounts of API in industrial granulates using NIR, chemometrics and process-spectra methodology.
Abstract:
This document illustrates the application of the Business Process Management (BPM) methodology to the case of a multinational company in the electronics sector. The exceptional processes of Supply Chain Operations in the EMEA region (Europe, Middle East and Africa) were taken as the object of study. The initial situation was analyzed, in which the appearance of quality incidents in finished products ready for delivery to customers triggered a series of uncoordinated actions with unsatisfactory results. All the departments involved committed resources, time and effort without being aligned with one another. Through the systematic application of the BPM methodology, defined in 10 phases, a complete solution for the exceptional processes was developed. The document describes in detail the Reflash process and the documentation needed to bring the process under control and into continuous improvement.
Abstract:
Tests of converting mathematical formulae from office word processors and from LaTeX, with viewing in HTML and MathML. The best result is achieved with MSWord+MathType and IE+MathPlayer.