875 results for Technological forecasting
Abstract:
In 2007, a General Observation Period (GOP) was performed within the German Priority Program on Quantitative Precipitation Forecasting (PQP). By optimizing the use of existing instrumentation, a large data set from in-situ and remote sensing instruments, with a special focus on water-cycle variables, was gathered over the full annual cycle. The area of interest covered central Europe, with increasing focus towards the Black Forest, where the Convective and Orographically-induced Precipitation Study (COPS) took place from June to August 2007. The GOP thus includes a variety of precipitation systems, allowing the COPS results to be related to a larger spatial scale. For timely use of the data, forecasts from the numerical weather prediction models COSMO-EU and COSMO-DE of the German Meteorological Service were tailored to match the observations, and model evaluation was performed in a near-real-time environment. The ultimate goal is to identify and distinguish between different kinds of model deficits and to improve process understanding.
The regional distribution of technological development: evidence from foreign-owned firms in Germany
Abstract:
Ensemble forecasting of nonlinear systems involves the use of a model to run forward a discrete ensemble (or set) of initial states. Data assimilation techniques tend to focus on estimating the true state of the system, even though model error limits the value of such efforts. This paper argues for choosing the initial ensemble in order to optimise forecasting performance rather than estimate the true state of the system. Density forecasting and choosing the initial ensemble are treated as one problem. Forecasting performance can be quantified by some scoring rule. In the case of the logarithmic scoring rule, theoretical arguments and empirical results are presented. It turns out that, if the underlying noise dominates model error, we can diagnose the noise spread.
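The logarithmic scoring rule mentioned above can be illustrated with a minimal sketch: an ensemble is dressed with Gaussian kernels to form a predictive density, and the score is the negative log of that density at the verifying observation (lower is better). The kernel dressing and the `bandwidth` parameter are illustrative assumptions, not the paper's method.

```python
import numpy as np

def log_score(ensemble, observation, bandwidth=0.5):
    """Logarithmic score of a density forecast built from an ensemble.

    The ensemble is turned into a predictive density via Gaussian
    kernel dressing (bandwidth is an assumed tuning parameter); the
    score is -log of that density at the observation: lower is better.
    """
    ensemble = np.asarray(ensemble, dtype=float)
    # Gaussian kernel density estimate evaluated at the observation
    z = (observation - ensemble) / bandwidth
    density = np.mean(np.exp(-0.5 * z**2) / (bandwidth * np.sqrt(2 * np.pi)))
    return -np.log(density)

# An ensemble centred near the truth scores better than a displaced one
rng = np.random.default_rng(0)
truth = 1.0
good = rng.normal(truth, 1.0, size=50)
bad = rng.normal(truth + 3.0, 1.0, size=50)
print(log_score(good, truth) < log_score(bad, truth))  # True
```

Under this score, choosing the initial ensemble to maximize the predictive density at future verifications is exactly the "optimise forecasting performance" criterion the abstract contrasts with state estimation.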
Abstract:
The relevance of regional policy for less favoured regions (LFRs) reveals itself when policy-makers must reconcile competitiveness with social cohesion through the adaptation of competition or innovation policies. The vast literature in this area generally builds on an overarching concept of ‘social capital’ as the necessary relational infrastructure for collective action, diversification and policy integration, in a context much influenced by a dynamic of industrial change and a necessary balance between the creation and diffusion of ‘knowledge’ through learning. This relational infrastructure or ‘social capital’ is centred on people’s willingness to cooperate and ‘envision’ futures as a result of “social organization, such as networks, norms and trust that facilitate action and cooperation for mutual benefit” (Putnam, 1993: 35). Advocates of this interpretation of ‘social capital’ have adopted the ‘new growth’ thinking behind ‘systems of innovation’ and ‘competence building’, arguing that networks have the potential to make both public administration and markets more effective, as well as making ‘learning’ trajectories more inclusive of the development of society as a whole. This essay aims to better understand the role of ‘social capital’ in the production and reproduction of uneven regional development patterns, and to critically assess the limits of a ‘systems concept’ and an institution-centred approach to comparative studies of regional innovation. These aims are discussed in light of two assertions: i) learning behaviour, from an economic point of view, has its determinants, and ii) the positive economic outcomes of ‘social capital’ cannot be taken as a given. It is suggested that an agent-centred approach to comparative research best addresses the ‘learning’ determinants and the consequences of social networks on regional development patterns. A brief discussion of the current debate on innovation surveys is provided to illustrate this point.
Abstract:
Oxford University Press’s response to technological change in printing and publishing processes in this period can be considered in three phases: an initial period when the computerization of typesetting was seen as offering both cost savings and the ability to produce new editions of existing works more quickly; an intermediate phase when the emergence of standards in desktop computing allowed experiments with the sale of software as well as packaged electronic publications; and a third phase when the availability of the world wide web as a means of distribution allowed OUP to return to publishing in its traditional areas of strength albeit in new formats. Each of these phases demonstrates a tension between a desire to develop centralized systems and expertise, and a recognition that dynamic publishing depends on distributed decision-making and innovation. Alongside these developments in production and distribution lay developments in computer support for managerial and collaborative publishing processes, often involving the same personnel and sometimes the same equipment.
Abstract:
For forecasting and economic analysis, many variables are used in logarithms (logs). In time series analysis, this transformation is often considered to stabilize the variance of a series. We investigate the conditions under which taking logs is beneficial for forecasting. Forecasts based on the original series are compared to forecasts based on logs. For a range of economic variables, substantial forecasting improvements from taking logs are found if the log transformation actually stabilizes the variance of the underlying series. Using logs can be damaging to forecast precision if a stable variance is not achieved.
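The effect described above can be sketched on simulated data: for a series whose noise is multiplicative (so the log transform stabilizes its variance), a trend fitted in logs forecasts better out of sample than the same model fitted in levels. The series, fitting method, and sample split are illustrative assumptions, not the paper's setup.

```python
import numpy as np

# A series with roughly constant variance in logs (multiplicative
# noise): the log transform stabilizes it, so forecasting in logs
# should pay off. Purely illustrative data.
rng = np.random.default_rng(1)
t = np.arange(120)
y = np.exp(0.05 * t + rng.normal(0, 0.1, size=t.size))

train, test = t < 100, t >= 100

# Linear trend fitted to the raw series, extrapolated
b1, b0 = np.polyfit(t[train], y[train], 1)
fc_levels = b0 + b1 * t[test]

# Linear trend fitted to the logged series, forecast retransformed
c1, c0 = np.polyfit(t[train], np.log(y[train]), 1)
fc_logs = np.exp(c0 + c1 * t[test])

def rmse(fc):
    return np.sqrt(np.mean((y[test] - fc) ** 2))

print(rmse(fc_logs) < rmse(fc_levels))  # True: logs win here
```

Reversing the construction (additive noise on a linear trend) reverses the ranking, which is the abstract's point: the gain hinges on whether logs actually stabilize the variance.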
Abstract:
This paper investigates whether using natural logarithms (logs) of price indices for forecasting inflation rates is preferable to employing the original series. Univariate forecasts of annual inflation rates for a number of European countries and the USA, based on monthly seasonal consumer price indices, are considered. Both stochastic-seasonality and deterministic-seasonality models are used. In many cases, forecasts based on the original variables result in substantially smaller root mean squared errors than models based on logs. In turn, when forecasts based on logs are superior, the gains are typically small. This outcome casts doubt on the common practice in the academic literature of forecasting inflation rates based on differences of logs.
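The "differences of logs" practice the abstract questions can be made concrete: annual inflation from a monthly price index is exactly P_t/P_{t-12} - 1, while the log difference log P_t - log P_{t-12} is only a first-order approximation of it. The index values below are made up for illustration.

```python
import numpy as np

# 13 months of a hypothetical price index (illustrative numbers)
p = np.array([100.0, 100.5, 101.1, 101.4, 102.0, 102.3,
              102.9, 103.2, 103.8, 104.1, 104.9, 105.2, 105.8])

exact = p[12] / p[0] - 1               # exact annual inflation rate
approx = np.log(p[12]) - np.log(p[0])  # annual difference of logs

print(round(exact, 4), round(approx, 4))  # 0.058 0.0564
```

The gap between the two grows with the inflation rate, so models specified in log differences are not forecasting quite the same quantity as models specified on the original series.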
Abstract:
Recent research has shown that Lighthill–Ford spontaneous gravity wave generation theory, when applied to numerical model data, can help predict areas of clear-air turbulence. It is hypothesized that this is because spontaneously generated atmospheric gravity waves may initiate turbulence by locally modifying the stability and wind shear. Building on the original research, this paper describes the creation of an ‘operational’ algorithm (ULTURB) with three modifications to the original method: (1) extending the altitude range for which the method is effective downward to the top of the boundary layer, (2) adding turbulent kinetic energy production from the environment to the locally produced turbulent kinetic energy production, and (3) transforming turbulent kinetic energy dissipation into eddy dissipation rate, the turbulence metric that has become the worldwide ‘standard’. In a comparison of ULTURB with the original method and with the Graphical Turbulence Guidance second version (GTG2) automated procedure for forecasting mid- and upper-level aircraft turbulence, ULTURB performed better for all turbulence intensities. Since ULTURB, unlike GTG2, is founded on a self-consistent dynamical theory, it may offer forecasters better insight into the causes of clear-air turbulence and may ultimately enhance its predictability.
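The third modification, converting turbulent kinetic energy dissipation to eddy dissipation rate, can be sketched in a few lines: EDR is the cube root of the dissipation rate ε. The intensity thresholds below are illustrative assumptions for the sketch, not ULTURB's actual categories.

```python
import numpy as np

def edr(epsilon):
    """Eddy dissipation rate (m^(2/3) s^-1) from TKE dissipation
    epsilon (m^2 s^-3): EDR is the cube root of epsilon."""
    return np.cbrt(epsilon)

def intensity(edr_value):
    """Map EDR to a turbulence category (thresholds are assumed
    for illustration, not taken from ULTURB)."""
    if edr_value < 0.15:
        return "light or none"
    if edr_value < 0.35:
        return "moderate"
    return "severe"

for eps in (1e-4, 1e-2, 0.1):
    print(eps, round(float(edr(eps)), 3), intensity(edr(eps)))
```

Expressing the output in EDR rather than raw ε is what lets a forecast product be compared directly against aircraft-reported turbulence, since EDR is the reporting standard the abstract refers to.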