880 results for Forecasting and replenishment (CPFR)
Abstract:
Companies compete globally to turn new technologies into successful business models. They therefore increasingly face the challenge of identifying technological potentials at an early stage, evaluating them, and developing strategies to exploit them. This is the central subject of the foresight and planning of new technology paths. In this work, a guide for the strategy finding, development, and commercialization of newly emerging technologies is developed and applied together with four companies. The starting point of the work is a systematic review of the state of research on the foresight and planning of new technologies. A descriptive model of the emergence of new technology paths in technology-based innovation systems is then developed. On the basis of this model, different categories of influencing factors are defined, which serve as an analytical framework for the emerging technology. Drawing on the processes, team structures, and methods documented in the literature (e.g., roadmaps, scenarios, database analyses), a six-stage approach for conducting the foresight and planning of new technology paths is designed. This approach is applied in four companies for the foresight and planning of new technologies. The technologies examined belong to the fields of biotechnology, nanotechnology, environmental technology, and sensor technology. The central result of the work is an approach for the foresight and planning of new technology paths adapted in line with the experience gained in the companies. This approach is specified for further application depending on company and technology characteristics, with particular attention to the organizational units to be involved, the influencing factors to be considered, and the applicable methods. The work is aimed at people in management positions in strategic technology management and in corporate research and development who develop strategies for newly emerging technologies. The results are also of interest to researchers working on methods for foresight and strategy development for new technologies.
Abstract:
This book is aimed at a broad audience: scientists, humanists, engineers, and in general anyone who ventures to think and work in non-disciplinary terms. The merit of work in the complexity sciences surely consists in "un-disciplining" the sciences, an expression already in vogue, which points on the one hand toward the idea of a "third culture" (Brockman) and on the other toward a "new alliance" (I. Prigogine). There is clearly a critical mass of people working on complexity who are deeply and seriously interested in these topics. The proof is the range of academic and scientific events, compilations, and publications, of ever higher quality and broad inter- and transdisciplinary coverage. This book seeks to contribute to this history and to these processes.
Abstract:
Contents: Introduction. 1. Emotional intelligence, transformational leadership, and gender: factors influencing organizational performance / Ana María Galindo Londoño, Sara Urrego Mayorga; Advisor: Juan Carlos Espinosa Méndez. 2. The role of women in leadership / Andrea Patricia Cuestas Díaz; Advisor: Françoise Venezia Contreras Torres. 3. Transformational leadership, organizational climate, job satisfaction, and performance: a literature review / Juliana Restrepo Orozco, Ángela Marcela Ochoa Rodríguez; Advisor: Françoise Venezia Contreras Torres. 4. "E-Leadership": a perspective on the world of globalized companies / Ángela Beatriz Morales Morales, Mónica Natalia Aguilera Velandia; Advisor: Juan Carlos Espinosa. 5. Leadership and culture: a review / Daniel Alejandro Romero Galindo; Advisor: Françoise Venezia Contreras Torres. 6. Research on the nature of managerial work: a literature review / Julián Felipe Rodríguez Rivera, María Isabel Álvarez Rodríguez; Advisor: Juan Javier Saavedra Mayorga. 7. Women in senior management in the Colombian context / Ana María Moreno, Juliana Moreno Jaramillo; Advisor: Françoise Venezia Contreras Torres. 8. The influence of personality on George W. Bush's discourse and leadership after September 11, 2001 / Karen Eliana Mesa Torres; Advisor: Juan Carlos Espinosa. 9. Research on the field of followership: a literature review / Christian D. Báez Millán, Leidy J. Pinzón Porras; Advisor: Juan Javier Saavedra Mayorga. 10. Leadership from the perspective of power and influence: a literature review / Lina María García, Juan Sebastián Naranjo; Advisor: Juan Javier Saavedra Mayorga. 11. Managerial work for leaders and managers: an integrative view of organizational roles / Lina Marcela Escobar Campos, Daniel Mora Barrero; Advisor: Rafael Piñeros. 12. Emotional involvement in decision making / Lina Rocío Poveda C., Gloria Johanna Rueda L.; Advisor: Françoise Contreras T. 13. Stress and its relationship with leadership / María Camila García Sierra, Diana Paola Rocha Cárdenas; Advisor: Juan Carlos Espinosa. 14. "Burnout and engagement" / María Paola Jaramillo Barrios, Natalia Rojas Mancipe; Advisor: Rafael Piñeros.
Abstract:
The Gauss–Newton algorithm is an iterative method regularly used for solving nonlinear least squares problems. It is particularly well suited to the treatment of very large scale variational data assimilation problems that arise in atmosphere and ocean forecasting. The procedure consists of a sequence of linear least squares approximations to the nonlinear problem, each of which is solved by an “inner” direct or iterative process. In comparison with Newton’s method and its variants, the algorithm is attractive because it does not require the evaluation of second-order derivatives in the Hessian of the objective function. In practice the exact Gauss–Newton method is too expensive to apply operationally in meteorological forecasting, and various approximations are made in order to reduce computational costs and to solve the problems in real time. Here we investigate the effects on the convergence of the Gauss–Newton method of two types of approximation used commonly in data assimilation. First, we examine “truncated” Gauss–Newton methods where the inner linear least squares problem is not solved exactly, and second, we examine “perturbed” Gauss–Newton methods where the true linearized inner problem is approximated by a simplified, or perturbed, linear least squares problem. We give conditions ensuring that the truncated and perturbed Gauss–Newton methods converge and also derive rates of convergence for the iterations. The results are illustrated by a simple numerical example. A practical application to the problem of data assimilation in a typical meteorological system is presented.
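As a concrete companion to the abstract above, here is a minimal NumPy sketch of a Gauss–Newton outer loop whose inner linear least squares problem is solved inexactly by conjugate gradients, mimicking the "truncated" variant studied in the paper. The toy exponential-fitting problem, data, and tolerances are invented for illustration; this is not the paper's operational data assimilation code.

```python
import numpy as np

def cg_normal_eq(Jx, rx, rel_tol, max_inner=200):
    """Approximately solve (J^T J) dx = -J^T r by conjugate gradients,
    stopping once the residual falls by rel_tol (a 'truncated' inner solve)."""
    A, b = Jx.T @ Jx, -Jx.T @ rx
    dx = np.zeros_like(b)
    res = b.copy()          # residual of the normal equations
    p = res.copy()
    res0 = np.linalg.norm(res)
    for _ in range(max_inner):
        if np.linalg.norm(res) <= rel_tol * res0:
            break
        Ap = A @ p
        alpha = (res @ res) / (p @ Ap)
        dx = dx + alpha * p
        res_new = res - alpha * Ap
        p = res_new + ((res_new @ res_new) / (res @ res)) * p
        res = res_new
    return dx

def truncated_gauss_newton(r, J, x0, inner_tol=1e-2, tol=1e-8, max_iter=50):
    """Gauss-Newton for min 0.5*||r(x)||^2 with an inexact inner solve."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        dx = cg_normal_eq(J(x), r(x), inner_tol)
        x = x + dx
        if np.linalg.norm(dx) < tol:
            break
    return x

# Toy example (hypothetical data): fit y = a * exp(b * t).
t = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(-1.5 * t)
r = lambda p: p[0] * np.exp(p[1] * t) - y
J = lambda p: np.column_stack([np.exp(p[1] * t),
                               p[0] * t * np.exp(p[1] * t)])
print(truncated_gauss_newton(r, J, np.array([1.0, 0.0])))  # ~ [2.0, -1.5]
```

Note that no second derivatives appear anywhere: only the residual r and its Jacobian J are evaluated, which is the attraction of Gauss–Newton over Newton's method mentioned in the abstract.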
Abstract:
Data assimilation – the set of techniques whereby information from observing systems and models is combined optimally – is rapidly becoming prominent in endeavours to exploit Earth Observation for Earth sciences, including climate prediction. This paper explains the broad principles of data assimilation, outlining different approaches (optimal interpolation, three-dimensional and four-dimensional variational methods, the Kalman Filter), together with the approximations that are often necessary to make them practicable. After pointing out a variety of benefits of data assimilation, the paper then outlines some practical applications of the exploitation of Earth Observation by data assimilation in the areas of operational oceanography, chemical weather forecasting and carbon cycle modelling. Finally, some challenges for the future are noted.
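As a concrete illustration of the variational and sequential methods listed above, the sketch below implements the linear-Gaussian analysis step that underlies optimal interpolation, 3D-Var, and the Kalman filter update. The state size, covariance matrices, observation operator, and observation values are toy assumptions chosen only to show the structure.

```python
import numpy as np

# Minimal analysis-step sketch: minimize the 3D-Var cost function
#   J(x) = 0.5*(x-xb)^T B^{-1} (x-xb) + 0.5*(y-Hx)^T R^{-1} (y-Hx).
# For a linear H the minimizer has the closed form
#   xa = xb + K (y - H xb),  with gain  K = B H^T (H B H^T + R)^{-1},
# which is also the Kalman filter update. All values are illustrative.

n, m = 3, 2                      # state and observation dimensions
xb = np.array([1.0, 2.0, 3.0])   # background (model forecast) state
B = 0.5 * np.eye(n)              # background error covariance
H = np.array([[1.0, 0.0, 0.0],   # linear observation operator
              [0.0, 0.0, 1.0]])
R = 0.1 * np.eye(m)              # observation error covariance
y = np.array([1.4, 2.6])         # observations

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)  # gain matrix
xa = xb + K @ (y - H @ xb)                    # analysis state
print(xa)
```

The analysis pulls the background toward the observations in proportion to the relative sizes of B and R; the 4D-Var and Kalman-filter variants discussed in the paper extend this same weighting over a time window.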
Abstract:
Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al. 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours when running on 40 processors, and produces roughly 20 GB of output as 50,000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command utilities for data pre-processing and post-processing prior to job resubmission. Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) output from the model is transferred back to the user while the run is in progress, to prevent it from accumulating on the remote system and to allow the user to monitor the model; (2) the client component is a command-line program that can easily be incorporated into existing model workflow scripts. G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight, and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat, and is therefore easy for system administrators to install and maintain. G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al. 2008) and POLCOMS (Holt et al. 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al. 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al. 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al. 2008), which aims to simulate the world's coastal oceans. A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) the scientist prepares input files on his or her local machine; (2) using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource; (3) the scientist runs the relevant workflow script on his or her local machine, unmodified except that calls to run the model (e.g. with "mpirun") are simply replaced with calls to "GRexRun"; (4) the G-Rex middleware automatically handles the uploading of input files to the remote resource, and the downloading of output files back to the user, including their deletion from the remote system, during the run; (5) the scientist monitors the output files, using familiar analysis and visualization tools on his or her own local machine. G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front-end to larger-scale Grid resources such as the UK National Grid Service.
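The abstract notes that G-Rex's REST style lets any basic HTTP client drive a model run. The sketch below illustrates that general interaction pattern in Python; the endpoint paths, JSON fields, and job states are invented placeholders, not the real G-Rex API, which is not described here.

```python
import time
import requests

# Hypothetical REST interaction in the spirit of G-Rex: every endpoint
# path, JSON field, and job state below is an invented placeholder.
BASE = "http://cluster.example.org/grex"  # assumed server URL

def run_remote_model(input_files):
    # Create a job resource on the remote service.
    job_id = requests.post(f"{BASE}/jobs", json={"app": "nemo"}).json()["id"]
    job = f"{BASE}/jobs/{job_id}"

    # Upload the input files prepared on the local machine.
    for name, path in input_files.items():
        with open(path, "rb") as f:
            requests.put(f"{job}/inputs/{name}", data=f)

    # Start the run, then poll. Output is fetched while the run is in
    # progress so it never accumulates on the remote system.
    requests.post(f"{job}/start")
    while requests.get(f"{job}/status").json()["state"] != "COMPLETE":
        for name in requests.get(f"{job}/outputs").json()["new_files"]:
            with open(name, "wb") as f:
                f.write(requests.get(f"{job}/outputs/{name}").content)
        time.sleep(30)

run_remote_model({"namelist.cfg": "namelist.cfg"})
```

In practice the GRexRun client performs this loop for the scientist, which is why only the model-launch line of an existing workflow script needs to change.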
Abstract:
A new field of study, "decadal prediction," is emerging in climate science. Decadal prediction lies between seasonal/interannual forecasting and longer-term climate change projections, and focuses on time-evolving regional climate conditions over the next 10–30 yr. Numerous assessments of climate information user needs have identified this time scale as being important to infrastructure planners, water resource managers, and many others. It is central to the information portfolio required to adapt effectively to and through climatic changes. At least three factors influence time-evolving regional climate at the decadal time scale: 1) climate change commitment (further warming as the coupled climate system comes into adjustment with increases of greenhouse gases that have already occurred), 2) external forcing, particularly from future increases of greenhouse gases and recovery of the ozone hole, and 3) internally generated variability. Some decadal prediction skill has been demonstrated to arise from the first two of these factors, and there is evidence that initialized coupled climate models can capture mechanisms of internally generated decadal climate variations, thus increasing predictive skill globally and particularly regionally. Several methods have been proposed for initializing global coupled climate models for decadal predictions, all of which involve global time-evolving three-dimensional ocean data, including temperature and salinity. An experimental framework to address decadal predictability/prediction is described in this paper and has been incorporated into the coordinated Coupled Model Intercomparison Project phase 5 (CMIP5) experiments, some of which will be assessed for the IPCC Fifth Assessment Report (AR5). These experiments will likely guide work in this emerging field over the next 5 yr.
Abstract:
The level of insolvencies in the construction industry is high when compared to other industry sectors. Given the management expertise and experience that is available to the construction industry, it seems strange that, according to the literature, the major causes of failure are lack of financial control and poor management. This indicates that with good cash flow management, companies could be kept operating and financially healthy; it is possible to prevent failure. Although there are financial models that can be used to predict failure, they are based on company accounts, which have been shown to be an unreliable source of data. There are models available for cash flow management and forecasting, and these could be used as a starting point for managers in rethinking their cash flow management practices. The research reported here has reached the stage of formulating researchable questions for an in-depth study, including issues such as how contractors manage their cash flow, how payment practices can be managed without damaging others in the supply chain, and the relationships between companies' financial structures and the payment regimes to which they are subjected.
Abstract:
The problem of modeling solar energetic particle (SEP) events is important to both space weather research and forecasting, and yet it has seen relatively little progress. Most important SEP events are associated with coronal mass ejections (CMEs) that drive coronal and interplanetary shocks. These shocks can continuously produce accelerated particles from the ambient medium to well beyond 1 AU. This paper describes an effort to model real SEP events using a Center for Integrated Space Weather Modeling (CISM) MHD solar wind simulation including a cone model of CMEs to initiate the related shocks. In addition to providing observation-inspired shock geometry and characteristics, this MHD simulation describes the time-dependent observer field line connections to the shock source. As a first approximation, we assume a shock jump-parameterized source strength and spectrum, and that scatter-free transport occurs outside of the shock source, thus emphasizing the role the shock evolution plays in determining the modeled SEP event profile. Three halo CME events, on May 12, 1997, November 4, 1997, and December 13, 2006, are used to test the modeling approach. While challenges arise in the identification and characterization of the shocks in the MHD model results, this approach illustrates the importance to SEP event modeling of globally simulating the underlying heliospheric event. The results also suggest the potential utility of such a model for forecasting and for interpretation of separated multipoint measurements such as those expected from the STEREO mission.
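The abstract's first approximation (a shock-parameterized source plus scatter-free transport) can be caricatured in a few lines: the 1 AU intensity profile is just the shock-source history shifted by the particle travel time along the connection to the observer. Everything numerical below, including the radial connection, constant shock speed, and power-law source scaling, is an invented toy, not the CISM simulation.

```python
import numpy as np

# Toy scatter-free SEP profile: particles injected at a moving shock
# arrive at 1 AU after a free-streaming delay. All numbers assumed.
AU = 1.496e11                   # metres
v_particle = 0.1 * 3.0e8        # ~0.1c proton speed (illustrative)
v_shock = 1.0e6                 # 1000 km/s constant shock speed (assumed)

def source_strength(r_shock):
    """Assumed jump-parameterized source: strong near the Sun,
    weakening as the shock moves out (arbitrary power law)."""
    return (r_shock / (0.05 * AU)) ** -2

t = np.linspace(600.0, 2.0e5, 400)              # s after CME onset
r_shock = np.minimum(0.05 * AU + v_shock * t, AU)
travel = (AU - r_shock) / v_particle            # remaining path, radial
t_arrive = t + travel                           # scatter-free arrival time
intensity = source_strength(r_shock)            # profile seen at 1 AU

for ta, inten in zip(t_arrive[::100] / 3600.0, intensity[::100]):
    print(f"t = {ta:6.1f} h  intensity ~ {inten:.3e} (arbitrary units)")
```

The point of the caricature is the one the paper makes: even with a trivial source and transport model, the time-dependent shock location and connection geometry dominate the shape of the event profile.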
Abstract:
The effects of the 2003 European heat wave have highlighted the need for society to prepare itself for and cope more effectively with heat waves. This is particularly important in the context of predicted climate change and the likelihood of more frequent extreme climate events; to date, heat as a natural hazard has been largely ignored. In order to develop better coping strategies, this report explores the factors that shape the social impacts of heat waves, and sets out a programme of research to address the considerable knowledge gaps in this area. Heat waves, or periods of anomalous warmth, do not affect everyone; it is the vulnerable individuals or sectors of society who will most experience their effects. The main factors of vulnerability are being elderly, living alone, having a pre-existing disease, being immobile or suffering from mental illness and being economically disadvantaged. The synergistic effects of such factors may prove fatal for some. Heat waves have discernible impacts on society including a rise in mortality, an increased strain on infrastructure (power, water and transport) and a possible rise in social disturbance. Wider impacts may include effects on the retail industry, ecosystem services and tourism. Adapting to more frequent heat waves should include soft engineering options and, where possible, avoid the widespread use of air conditioning which could prove unsustainable in energy terms. Strategies for coping with heat include changing the way in which urban areas are developed or re-developed, and setting up heat watch warning systems based around weather and seasonal climate forecasting and intervention strategies. Although heat waves have discernible effects on society, much remains unknown about their wider social impacts, diffuse health issues and how to manage them.
Abstract:
Temperature is one of the most prominent environmental factors that determine plant growth, development, and yield. Cool and moist conditions are most favorable for wheat. Wheat is likely to be highly vulnerable to further warming because current temperatures are already close to or above the optimum. In this study, the impacts of warming and extreme high temperature stress on wheat yield over China were investigated by using the general large area model (GLAM) for annual crops. The results showed that each 1°C rise in daily mean temperature would reduce the average wheat yield in China by about 4.6%–5.7%, mainly due to the shorter growth duration, except for a small increase in yield at some grid cells. When the maximum temperature exceeded 30.5°C, the simulated grain-set fraction declined from 1 at 30.5°C to close to 0 at about 36°C. When the total grain-set was lower than the critical fractional grain-set (0.575–0.6), harvest index and potential grain yield were reduced. In order to reduce the negative impacts of warming, it is crucial to take serious actions to adapt to climate change, for example by shifting sowing dates, adjusting crop distribution and structure, breeding heat-resistant varieties, and improving the monitoring, forecasting, and early warning of extreme climate events.
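The quoted heat-stress numbers are easy to turn into a small sketch. The linear decline of grain-set fraction between 30.5°C and 36°C and the critical fractional grain-set range come from the abstract; the linear shape of the decline and the yield arithmetic are illustrative assumptions, not GLAM's internal formulation.

```python
# Sketch of the heat-stress rule quoted in the abstract: grain-set
# fraction falls from 1 at 30.5 C to ~0 at 36 C (assumed linear here),
# and yield is penalized below the critical grain-set (toy rule).

T_FULL, T_ZERO = 30.5, 36.0        # deg C, from the abstract
CRITICAL_GRAIN_SET = 0.6           # upper end of quoted 0.575-0.6 range

def grain_set_fraction(t_max: float) -> float:
    """Fraction of grains set for a given daily maximum temperature."""
    if t_max <= T_FULL:
        return 1.0
    if t_max >= T_ZERO:
        return 0.0
    return (T_ZERO - t_max) / (T_ZERO - T_FULL)

def relative_yield(t_max: float) -> float:
    """Toy assumption: below the critical grain-set, potential yield
    is scaled down proportionally."""
    f = grain_set_fraction(t_max)
    return 1.0 if f >= CRITICAL_GRAIN_SET else f / CRITICAL_GRAIN_SET

for t in (29.0, 31.0, 33.0, 35.0, 37.0):
    print(f"Tmax = {t:4.1f} C -> grain set {grain_set_fraction(t):.2f}, "
          f"relative yield {relative_yield(t):.2f}")
```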
Abstract:
Early and effective flood warning is essential to initiate timely measures to reduce loss of life and economic damage. The availability of several global ensemble weather prediction systems through the "THORPEX Interactive Grand Global Ensemble" (TIGGE) archive provides an opportunity to explore new dimensions in early flood forecasting and warning. TIGGE data has been used as meteorological input to the European Flood Alert System (EFAS) for a case study of a flood event in Romania in October 2007. Results illustrate that awareness for this case of flooding could have been raised as early as 8 days before the event and how the subsequent forecasts provide increasing insight into the range of possible flood conditions. This first assessment of one flood event illustrates the potential value of the TIGGE archive and the grand-ensemble approach to raise preparedness and thus to reduce the socio-economic impact of floods.
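One way to see how a grand ensemble raises early warning is to compute, at each forecast lead time, the fraction of members exceeding a critical discharge. The sketch below is a generic illustration with made-up ensemble values and threshold; it is not the EFAS algorithm.

```python
import numpy as np

# Generic ensemble flood-alert sketch (not the EFAS algorithm):
# exceedance probability = fraction of members above a threshold.
rng = np.random.default_rng(0)

n_members, n_days = 51, 10          # ensemble size assumed, 10-day horizon
threshold = 800.0                   # m^3/s, assumed critical discharge

# Made-up forecast discharges: the flood signal grows and the ensemble
# spread narrows as the event approaches.
mean_scenario = np.linspace(300.0, 900.0, n_days)
spread = np.linspace(400.0, 100.0, n_days)
members = mean_scenario + spread * rng.standard_normal((n_members, n_days))

prob_exceed = (members > threshold).mean(axis=0)
for day, p in enumerate(prob_exceed, start=1):
    flag = "ALERT" if p > 0.5 else ""
    print(f"lead day {day:2d}: P(Q > {threshold:.0f}) = {p:.2f} {flag}")
```

Raising an alert when the exceedance probability crosses a chosen level, days before any single deterministic forecast would, is exactly the preparedness gain the case study describes.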
Abstract:
A global river routing scheme coupled to the ECMWF land surface model is implemented and tested within the framework of the Global Soil Wetness Project II, to evaluate the feasibility of modelling global river runoff at a daily time scale. The exercise is designed to provide benchmark river runoff predictions needed to verify the land surface model. Ten years of daily runoff produced by the HTESSEL land surface scheme are input into the TRIP2 river routing scheme in order to generate daily river runoff, which is then compared to river runoff observations from the Global Runoff Data Centre (GRDC) in order to evaluate the potential and the limitations. A notable source of inaccuracy is bias between observed and modelled discharges, which is not primarily due to the modelling system but instead to the forcing and the quality of observations, and seems uncorrelated with river catchment size. A global sensitivity analysis and a Generalised Likelihood Uncertainty Estimation (GLUE) uncertainty analysis are applied to the global routing model. The groundwater delay parameter is identified as the most sensitive calibration parameter. Significant uncertainties are found in the results, and those due to the parameterisation of the routing model are quantified. The difficulty involved in parameterising global river discharge models is discussed. Detailed river runoff simulations are shown for the river Danube, which match observed river runoff well in upstream river transects. Results show that although there are errors in the runoff predictions, model results are encouraging and certainly indicative of useful runoff predictions, particularly for the purpose of verifying the land surface scheme hydrologically. The potential of this modelling system for future applications such as river runoff forecasting and climate impact studies is highlighted. Copyright © 2009 Royal Meteorological Society.
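The GLUE procedure mentioned above is straightforward to sketch: sample parameter sets from a prior range, run the model for each, score each set with a likelihood measure, and keep the "behavioural" sets to bound predictions. The toy routing model (a single linear reservoir controlled by a groundwater-delay-like parameter), the synthetic observations, and the thresholds below are illustrative assumptions, not the paper's TRIP2 setup.

```python
import numpy as np

# Minimal GLUE sketch: Monte Carlo sampling, likelihood scoring,
# behavioural selection. Toy model and numbers are assumptions.
rng = np.random.default_rng(42)

def toy_routing(delay_days, runoff):
    """Linear-reservoir routing: each day, Q = S / delay."""
    storage, out = 0.0, []
    for r in runoff:
        storage += r
        q = storage / delay_days
        storage -= q
        out.append(q)
    return np.array(out)

# Synthetic 'observations' generated with a known delay of 8 days.
runoff = rng.gamma(2.0, 2.0, size=365)
obs = toy_routing(8.0, runoff) + 0.2 * rng.standard_normal(365)

# 1. Sample the uncertain parameter from its prior range.
delays = rng.uniform(1.0, 30.0, size=2000)

# 2. Likelihood measure: Nash-Sutcliffe efficiency per sample.
def nse(sim, obs):
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

scores = np.array([nse(toy_routing(d, runoff), obs) for d in delays])

# 3. Keep behavioural sets (NSE above a chosen threshold) and report
#    the parameter uncertainty they imply.
behavioural = delays[scores > 0.7]
print(f"{behavioural.size} behavioural sets; "
      f"delay 5-95% bounds: {np.percentile(behavioural, [5, 95])}")
```

The width of the behavioural parameter range is the GLUE-style uncertainty estimate; in the paper this is what quantifies how much of the discharge error is attributable to the routing model's parameterisation.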