43 results for forecast
Abstract:
"A recent study by the consultancy Nielsen estimates at €49,650 million the annual losses worldwide arising from investment in ineffective advertising." This news reflects a fact that causes alarm in society. Such an amount wasted on carrying out a project that produces no benefit is a cause of concern among economists and marketing professionals. Certainly, we would understand if many people threw up their hands at the evidence of such squandering. At this point, two facts must be stated. First, we must clarify that this is a fictitious headline and the figures given are not real. Unfortunately, second, it must be noted that the real estimated figures are, optimistically, double those cited above ("Advertising expenditure forecast 2008", ZenithOptimedia; Nielsen Facts 2008). Marketing experts are currently searching for new formulas that increase the effectiveness and reduce the cost of their advertising campaigns, and that is the terrain of viral marketing, a phenomenon in full expansion thanks to the growing importance of the Internet in our lives. This is what drove us to investigate the discipline and its methods: to discover what lay behind the label "viral", to realise its unstoppable growth, and to see how we had been at once executioners and martyrs in the propagation of advertising messages. In our work we want to give a new meaning to the concept of saving in viral marketing, trying to discover whether it is possible to carry out a low-cost transformation of the concept of viral marketing and apply it successfully to a sector of the population in which we can control the effects of a campaign of these characteristics; this is why we chose the students of the Ciutadella campus of Universitat Pompeu Fabra as the target of our campaign. In addition, we set out to analyse the savings of our campaign through a cost-result comparison with other traditional marketing channels, using a novel yet precise concept, "real advertising effectiveness", in which we draw on socio-psychological studies of consumers to establish benchmarks closer to reality than the most common methods of measuring advertising results. To do so, it is necessary to create a completely new identity. This is where the fictitious brand, franchise, company, organisation and philosophy Kimbi arises. And the challenge demands a notable effort: to make a fictitious commercial and corporate identity, one that offers no service or product, known to a sector of the population; to create expectations in the target, attract its attention, engrave our identifiers in their minds and hope that the virus of the "Kimbi movement" spreads among them successfully. In other words, to compete in the saturated advertising landscape against a multitude of multinationals that already have loyal users and an established reputation, that often offer products and services free of charge, and that have vast amounts of resources and capital to advertise their commercial identity across mainstream media in order to attract the target and make an advertising impact. We hope to have achieved, at least, that the reader feels the desire to see the work. Do you want to discover the result?
Abstract:
This paper combines multivariate density forecasts of output growth, inflation and interest rates from a suite of models. An out-of-sample weighting scheme based on the predictive likelihood as proposed by Eklund and Karlsson (2005) and Andersson and Karlsson (2007) is used to combine the models. Three classes of models are considered: a Bayesian vector autoregression (BVAR), a factor-augmented vector autoregression (FAVAR) and a medium-scale dynamic stochastic general equilibrium (DSGE) model. Using Australian data, we find that, at short forecast horizons, the Bayesian VAR model is assigned the most weight, while at intermediate and longer horizons the factor model is preferred. The DSGE model is assigned little weight at all horizons, a result that can be attributed to the DSGE model producing density forecasts that are very wide when compared with the actual distribution of observations. While a density forecast evaluation exercise reveals little formal evidence that the optimally combined densities are superior to those from the best-performing individual model, or a simple equal-weighting scheme, this may be a result of the short sample available.
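A minimal sketch of the kind of predictive-likelihood weighting described above, not the paper's actual implementation: each model is reduced here to a Gaussian predictive density with made-up parameters, weights are made proportional to the out-of-sample predictive likelihood, and the combined density is the resulting mixture.

```python
# Sketch only: hypothetical models and data, weights from predictive likelihoods.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
y_holdout = rng.normal(2.5, 1.0, size=40)          # pretend realised output growth

# Each "model" is summarised by a predictive mean and standard deviation.
models = {"BVAR": (2.4, 0.9), "FAVAR": (2.8, 1.2), "DSGE": (2.0, 3.0)}

# Log predictive likelihood of each model over the hold-out sample.
log_pl = {m: stats.norm(mu, sd).logpdf(y_holdout).sum() for m, (mu, sd) in models.items()}

# Normalise to weights (subtract the max first for numerical stability).
max_lpl = max(log_pl.values())
raw = {m: np.exp(v - max_lpl) for m, v in log_pl.items()}
weights = {m: r / sum(raw.values()) for m, r in raw.items()}
print(weights)  # the wide-density "DSGE" entry should receive little weight

# Combined predictive density: a weighted mixture of the individual densities.
def combined_pdf(x):
    return sum(weights[m] * stats.norm(mu, sd).pdf(x) for m, (mu, sd) in models.items())

print(combined_pdf(2.5))
```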
Abstract:
Purpose - There has been much research on manufacturing flexibility, but supply chain flexibility is still an under-investigated area. This paper focuses on supply flexibility, the aspects of flexibility related to the upstream supply chain. Our purpose is to investigate why and how firms increase supply flexibility. Methodology/Approach - An exploratory multiple case study was conducted. We analyzed seven Spanish manufacturers from different sectors (automotive, apparel, electronics and electrical equipment). Findings - The results show that there are some major reasons why firms need supply flexibility (manufacturing schedule fluctuations, JIT purchasing, manufacturing slack capacity, low level of parts commonality, demand volatility, demand seasonality and forecast accuracy), and that companies increase this type of flexibility by implementing two main strategies: increasing suppliers' responsiveness capability and flexible sourcing. The results also suggest that the supply flexibility strategy selected depends on two factors: the supplier searching and switching costs, and the type of uncertainty (mix, volume or delivery). Research limitations - This paper has some limitations common to all case studies, such as the subjectivity of the analysis and the questionable generalizability of the results (since the sample of firms is not statistically significant). Implications - Our study contributes to the existing literature by empirically investigating the main reasons why companies need to increase supply flexibility and how they increase this flexibility, and by suggesting some factors that could influence the selection of a particular supply flexibility strategy.
Abstract:
Any electoral system has an electoral formula that converts vote proportions into parliamentary seats. Pre-electoral polls usually focus on estimating vote proportions and then applying the electoral formula to give a forecast of the parliament's composition. We here describe the problems arising from this approach: there is always a bias in the forecast. We study the origin of the bias and some methods to evaluate and to reduce it. We propose some rules to compute the sample size required for a given forecast accuracy. We show by Monte Carlo simulation the performance of the proposed methods using data from Spanish elections in recent years. We also propose graphical methods to visualize how electoral formulae and parliamentary forecasts work (or fail).
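A minimal sketch of the mechanism this abstract discusses, under the assumption of a D'Hondt electoral formula and invented vote shares: it allocates seats from vote proportions and uses Monte Carlo simulation of polls to expose the bias of the seat forecast.

```python
# Sketch only: D'Hondt allocation with hypothetical parties and poll sizes.
import numpy as np

def dhondt(votes, seats):
    """Allocate `seats` among parties by the D'Hondt highest-averages rule."""
    votes = np.asarray(votes, dtype=float)
    alloc = np.zeros(len(votes), dtype=int)
    for _ in range(seats):
        quotients = votes / (alloc + 1)
        alloc[np.argmax(quotients)] += 1
    return alloc

true_shares = np.array([0.42, 0.35, 0.15, 0.08])
true_seats = dhondt(true_shares, 15)

# Monte Carlo: simulate polls of size n, apply the formula, compare with the truth.
rng = np.random.default_rng(1)
n = 1000
sims = rng.multinomial(n, true_shares, size=5000) / n
seat_forecasts = np.array([dhondt(s, 15) for s in sims])
print("true seats:         ", true_seats)
print("mean forecast seats:", seat_forecasts.mean(axis=0))  # the gap is the bias
```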
Abstract:
We propose a new family of density functions that possess both flexibility and closed form expressions for moments and anti-derivatives, making them particularly appealing for applications. We illustrate its usefulness by applying our new family to obtain density forecasts of U.S. inflation. Our methods generate forecasts that improve on standard methods based on AR-ARCH models relying on normal or Student's t-distributional assumptions.
Abstract:
In liberalized electricity markets, generation companies must build an hourly bid that is sent to the market operator. The price at which the energy will be paid is unknown during the bidding process and has to be forecast. In this work we apply forecasting factor models to this framework and study their suitability.
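A rough illustration of a forecasting factor model in this setting, using simulated hourly price curves rather than real market data: principal-component factors are extracted from the daily 24-hour curves, each factor is forecast with a simple AR(1), and the next day's hourly curve is rebuilt from the forecast factors.

```python
# Sketch only: simulated prices; the factor count and AR(1) choice are illustrative.
import numpy as np

rng = np.random.default_rng(2)
prices = 50 + 10 * np.sin(np.linspace(0, 2 * np.pi, 24)) + rng.normal(0, 3, size=(200, 24))

mean_curve = prices.mean(axis=0)
X = prices - mean_curve
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 3
factors = U[:, :k] * s[:k]           # daily factor scores
loadings = Vt[:k]                    # hourly loadings

# Forecast each factor one day ahead with a simple AR(1) fit.
next_factors = []
for j in range(k):
    f = factors[:, j]
    phi = np.dot(f[:-1], f[1:]) / np.dot(f[:-1], f[:-1])
    next_factors.append(phi * f[-1])

forecast_curve = mean_curve + np.array(next_factors) @ loadings
print(forecast_curve.round(1))       # forecast hourly prices for the next day
```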
Abstract:
The growing use of new technologies in our society allows companies to reach the customer more quickly, providing information in an agile and orderly way. With this objective, an online shop has been created that will be the part visible to the users and customers of the company PRINTONER S.L, dedicated to the consumables sector, especially recycled products. For the company, one of the main objectives is to offer customers the possibility of buying its products conveniently over the Internet: by logging in with a username and password, they can see all the available product references, place orders and track their status until delivery. In addition to the sections aimed at users and customers, an administration area has been created where the company's managers can manage all the products and modify and view orders. Moreover, since orders are stored in a database together with the products sold, the company's invoicing system will be integrated, something that until now was done manually and clumsily. A section will also be developed where managers can record repairs and computer sales to be invoiced or used for statistical purposes in the future. All this leads us to implement a system of registered users with different permissions and different levels of access to the application, up to a total of five. We have tried to make the application a tailor-made system that meets all the requirements the company has set, with the expectation that a stock management system and other improvements can be added later to offer its customers an unbeatable service. To carry out this work, freely distributed technology has been used, namely the PHP language and the MySQL database; beyond a matter of philosophy, this choice aims to minimise the cost of the application. The company's goal with this project is to offer a better image and service, effectiveness and speed throughout the sales process, and to reduce invoicing and advertising costs, since the website can be promoted much more via the Internet.
Abstract:
In the scope of the European project Hydroptimet, INTERREG IIIB-MEDOCC programme, a limited area model (LAM) intercomparison of intense events that caused much damage to people and territory is performed. As the comparison is limited to single case studies, the work is not meant to provide a measure of the different models' skill, but to identify the key model factors useful for giving a good forecast of this kind of meteorological phenomenon. This work focuses on the Spanish flash-flood event also known as the "Montserrat-2000" event. The study is performed using forecast data from seven operational LAMs, placed at the partners' disposal via the Hydroptimet ftp site, and observed data from the Catalonia rain gauge network. To improve the event analysis, satellite rainfall estimates have also been considered. For the statistical evaluation of quantitative precipitation forecasts (QPFs), several non-parametric skill scores based on contingency tables have been used. Furthermore, for each model run it has been possible to identify the regions of Catalonia affected by misses and false alarms using the contingency table elements. Moreover, the standard "eyeball" analysis of forecast and observed precipitation fields has been supported by the use of a state-of-the-art diagnostic method, the contiguous rain area (CRA) analysis. This method makes it possible to quantify the spatial shift of the forecast error and to identify the error sources that affected each model's forecasts. High-resolution modelling and domain size seem to play a key role in providing a skillful forecast. Further work is needed to support this statement, including verification using a wider observational data set.
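A small sketch of the contingency-table skill scores mentioned above (probability of detection, false alarm ratio and critical success index), computed on synthetic rainfall fields; the threshold and data are illustrative only.

```python
# Sketch only: hypothetical forecast/observed rainfall and an arbitrary 20 mm threshold.
import numpy as np

def contingency_scores(forecast, observed, threshold=20.0):
    """Compare forecast vs observed rainfall (mm) against a threshold."""
    f = forecast >= threshold
    o = observed >= threshold
    hits = np.sum(f & o)
    misses = np.sum(~f & o)
    false_alarms = np.sum(f & ~o)
    pod = hits / (hits + misses)                    # probability of detection
    far = false_alarms / (hits + false_alarms)      # false alarm ratio
    csi = hits / (hits + misses + false_alarms)     # critical success index
    return {"POD": pod, "FAR": far, "CSI": csi}

rng = np.random.default_rng(3)
obs = rng.gamma(2.0, 10.0, size=500)                # synthetic observed rainfall
fc = obs + rng.normal(0, 15, size=500)              # an imperfect model forecast
print(contingency_scores(fc, obs))
```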
Abstract:
The right of a person to be protected from natural hazards is a characteristic of the social and economic development of society. This paper is a contribution to the reflection on the role of Civil Protection organizations in a modern society. The paper is based on the inaugural lecture given by the authors at the 9th Plinius Conference on Mediterranean Storms. Two major issues are considered. The first one is sociological: the Civil Protection organizations and the administration responsible for land use planning should be perceived as being as reliable as possible, in order to reach consensus on the restrictions they impose, temporarily or permanently, on the free individual use of the territory, as well as on the entire warning system. The second one is technological: in order to be reliable, they have to issue timely alerts and warnings to the population at large, but such alarms should be as "true" as possible. With this aim, the paper summarizes the historical evolution of risk assessment, starting from the original concept of "hazard", introducing the concepts of "scenario of event" and "scenario of risk", and ending with a discussion of the uncertainties and limits of the most advanced and efficient tools to predict, forecast and observe the ground effects affecting people and their properties. The discussion is centred on the case of heavy rain and flood events in the north-west of the Mediterranean Region.
Abstract:
The current operational very short-term and short-term quantitative precipitation forecast (QPF) at the Meteorological Service of Catalonia (SMC) is made by three different methodologies: advection of the radar reflectivity field (ADV), identification, tracking and forecasting of convective structures (CST), and numerical weather prediction (NWP) models using observational data assimilation (radar, satellite, etc.). These precipitation forecasts have different characteristics, lead times and spatial resolutions. The objective of this study is to combine these methods in order to obtain a single, optimized QPF at each lead time. This combination (blending) of the radar forecasts (ADV and CST) and the precipitation forecast from the NWP model is carried out by means of different methodologies according to the prediction horizon. Firstly, in order to take advantage of the rainfall location and intensity from radar observations, a phase correction technique is applied to the NWP output to derive an additional corrected forecast (MCO). To select the best precipitation estimate in the first and second hour (t+1 h and t+2 h), the information from radar advection (ADV) and the corrected outputs from the model (MCO) are mixed using different weights, which vary dynamically according to indexes that quantify the quality of these predictions. This procedure integrates the skill in rainfall location and patterns given by the advection of the radar reflectivity field with the capacity of the NWP models to generate new precipitation areas. From the third hour (t+3 h), as radar-based forecasting generally has low skill, only the quantitative precipitation forecast from the model is used. This blending of different sources of prediction is verified for different types of episodes (convective, moderately convective and stratiform) in order to obtain a robust methodology that can be implemented operationally and dynamically.
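A toy sketch of the blending idea, with hypothetical quality indexes standing in for the dynamic weights described in the abstract: radar-advection and corrected NWP fields are mixed for t+1 h and t+2 h, and only the NWP field is kept from t+3 h onwards.

```python
# Sketch only: synthetic fields and invented quality indexes, not the SMC scheme.
import numpy as np

def blend_qpf(radar_qpf, nwp_qpf, lead_time_h, q_radar, q_nwp):
    """Weighted combination of two precipitation fields for one lead time."""
    if lead_time_h >= 3:                 # radar advection has low skill at longer leads
        return nwp_qpf
    w_radar = q_radar / (q_radar + q_nwp)
    return w_radar * radar_qpf + (1.0 - w_radar) * nwp_qpf

rng = np.random.default_rng(4)
radar = rng.gamma(2.0, 2.0, size=(100, 100))   # mm, advected radar field
nwp = rng.gamma(2.0, 2.5, size=(100, 100))     # mm, phase-corrected model field
blended_t1 = blend_qpf(radar, nwp, lead_time_h=1, q_radar=0.8, q_nwp=0.5)
blended_t3 = blend_qpf(radar, nwp, lead_time_h=3, q_radar=0.2, q_nwp=0.5)
print(blended_t1.mean(), blended_t3.mean())
```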
Abstract:
A broad class of dark energy models, which have been proposed in attempts to solve the cosmological constant problems, predicts a late-time variation of the equation of state with redshift. The variation occurs as a scalar field picks up speed on its way to negative values of the potential. The negative potential energy eventually turns the expansion into contraction and the local universe undergoes a big crunch. In this paper we show that cross-correlations of the cosmic microwave background anisotropy and the matter distribution, in combination with other cosmological data, can be used to forecast the imminence of such a cosmic doomsday.
Abstract:
The present work deals with quantifying group characteristics. Specifically, dyadic measures of interpersonal perceptions were used to forecast group performance. Forty-six groups of students, 24 groups of four people and 22 of five, were studied in a real educational assignment context, and marks were gathered as an indicator of group performance. Our results show that dyadic measures of interpersonal perceptions account for final marks. By means of linear regression analysis, 85% and 85.6% of group performance was explained for group sizes of four and five, respectively. Comparable results reported in the scientific literature based on the individualistic approach are no larger than 18%. The results of the present study support the utility of dyadic approaches for predicting group performance in social contexts.
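A minimal sketch, on simulated data, of the kind of regression behind the quoted figures: a group-performance indicator (marks) is regressed on aggregated dyadic predictors by ordinary least squares and the explained variance R^2 is computed. The predictors and coefficients are invented.

```python
# Sketch only: simulated groups and dyadic summaries, not the study's data.
import numpy as np

rng = np.random.default_rng(5)
n_groups = 46
X = rng.normal(size=(n_groups, 3))              # e.g. mean, spread, reciprocity of dyadic ratings
beta = np.array([1.5, -0.8, 0.6])
marks = 7.0 + X @ beta + rng.normal(0, 0.5, n_groups)

Xd = np.column_stack([np.ones(n_groups), X])    # add intercept
coef, *_ = np.linalg.lstsq(Xd, marks, rcond=None)
fitted = Xd @ coef
r2 = 1 - np.sum((marks - fitted) ** 2) / np.sum((marks - marks.mean()) ** 2)
print(f"R^2 = {r2:.2f}")
```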
Abstract:
We propose new methods for evaluating predictive densities. The methods include Kolmogorov-Smirnov and Cramér-von Mises-type tests for the correct specification of predictive densities robust to dynamic mis-specification. The novelty is that the tests can detect mis-specification in the predictive densities even if it appears only over a fraction of the sample, due to the presence of instabilities. Our results indicate that our tests are well sized and have good power in detecting mis-specification in predictive densities, even when it is time-varying. An application to density forecasts of the Survey of Professional Forecasters demonstrates the usefulness of the proposed methodologies.
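A sketch of a standard PIT-based check, simpler than the robust tests the paper proposes: realisations are transformed with the candidate predictive CDF and the resulting probability integral transforms are tested for uniformity with a Kolmogorov-Smirnov test. The data and candidate density are made up.

```python
# Sketch only: a deliberately mis-specified normal predictive density for t(4) data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
y = rng.standard_t(df=4, size=300)              # realised data, fat-tailed

# Candidate predictive density: standard normal.
pit = stats.norm.cdf(y)                         # probability integral transforms
ks_stat, p_value = stats.kstest(pit, "uniform")
print(f"KS statistic = {ks_stat:.3f}, p-value = {p_value:.3f}")
# A small p-value signals that the predictive density is mis-specified.
```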
Abstract:
The main goal of this article is to provide an answer to the question: "Does anything forecast exchange rates, and if so, which variables?". It is well known that exchange rate fluctuations are very difficult to predict using economic models, and that a random walk forecasts exchange rates better than any economic model (the Meese and Rogoff puzzle). However, the recent literature has identified a series of fundamentals/methodologies that claim to have resolved the puzzle. This article provides a critical review of the recent literature on exchange rate forecasting and illustrates the new methodologies and fundamentals that have been recently proposed in an up-to-date, thorough empirical analysis. Overall, our analysis of the literature and the data suggests that the answer to the question "Are exchange rates predictable?" is "It depends": on the choice of predictor, forecast horizon, sample period, model, and forecast evaluation method. Predictability is most apparent when one or more of the following hold: the predictors are Taylor rule or net foreign assets, the model is linear, and a small number of parameters are estimated. The toughest benchmark is the random walk without drift.
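A toy, simulated-data version of a Meese-Rogoff style comparison: the out-of-sample RMSE of a driftless random walk is compared with that of a simple regression of exchange rate changes on a fundamental-based predictor. The series and the linear predictor are invented for illustration.

```python
# Sketch only: simulated exchange rate and "fundamental", not real data.
import numpy as np

rng = np.random.default_rng(7)
T = 300
fundamental = np.cumsum(rng.normal(0, 1, T))
s = 0.1 * fundamental + np.cumsum(rng.normal(0, 1, T))   # exchange rate (log level)

errors_rw, errors_model = [], []
for t in range(150, T - 1):
    # Random walk without drift: forecast of s[t+1] is s[t].
    errors_rw.append(s[t + 1] - s[t])
    # Regression of past changes on the lagged fundamental gap.
    dy = np.diff(s[:t + 1])
    x = fundamental[:t] - s[:t]
    b = np.dot(x, dy) / np.dot(x, x)
    errors_model.append(s[t + 1] - (s[t] + b * (fundamental[t] - s[t])))

print("RMSE random walk:", np.sqrt(np.mean(np.square(errors_rw))))
print("RMSE model      :", np.sqrt(np.mean(np.square(errors_model))))
```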
Abstract:
Parole is an institution that is not applied as often as it should be to achieve adequate resocialization and reintegration. To reduce incarceration rates and the costs they entail, as well as to bring the rate of conditional release in Catalonia into line with that of the rest of Spain, possible improvements to the granting of the final prison grade are proposed. The proposals were developed from an empirical study based on an exhaustive review of the reintegration prognosis reports of the Treatment Board (Junta de Tratamiento) and the resolutions of the Prison Supervision Prosecutor (Fiscal de Vigilancia Penitenciaria). General and specific proposals are formulated. The former are aimed at changing society's punitiveness and the way parole is implemented. The latter are oriented towards extending the final prison grade also to inmates with a high risk of recidivism, provided they receive intensive intervention; towards improving and treating both static and dynamic factors (work habits, drug addiction, family support) to facilitate access to parole under the current requirements; towards raising awareness of the importance of satisfying civil liability; and towards following the risk-need-responsivity model.