Abstract:
The estimation of water and solute transit times in catchments is crucial for predicting the response of hydrosystems to external forcings (climatic or anthropogenic). The hydrogeochemical signatures of tracers (either natural or anthropogenic) in streams have been widely used to estimate transit times in catchments, as they integrate the various processes involved. However, most of these tracers are well suited only for catchments with mean transit times shorter than about 4-5 years. Since the second half of the 20th century, the intensification of agriculture has led to a general increase in the nitrogen load of rivers. As nitrate is mainly transported by groundwater in agricultural catchments, this signal can be used to estimate transit times greater than several years, even though nitrate is not a conservative tracer. Conceptual hydrological models can be used to estimate catchment transit times provided their consistency is demonstrated, based on their ability to simulate the stream chemical signatures at various time scales and catchment internal processes such as N storage in groundwater. The objective of this study was to assess whether a conceptual lumped model was able to simulate the observed patterns of nitrogen concentration at various time scales, from seasonal to pluriannual, and thus whether it was relevant for estimating nitrogen transit times in headwater catchments. A conceptual lumped model, representing shallow groundwater flow as two parallel linear stores with double porosity and riparian processes by a constant nitrogen removal function, was applied to two paired agricultural catchments belonging to the Research Observatory ORE AgrHys. The Generalized Likelihood Uncertainty Estimation (GLUE) approach was used to estimate parameter values and uncertainties. The model performance was assessed on (i) its ability to simulate the contrasted patterns of stream flow and stream nitrate concentrations at seasonal and inter-annual time scales, (ii) its ability to simulate the patterns observed in groundwater at the same temporal scales, and (iii) the consistency of long-term simulations using the calibrated model with the general pattern of the nitrate concentration increase in the region since the beginning of the intensification of agriculture in the 1960s. The simulated nitrate transit times were found to be more sensitive to climate variability than to parameter uncertainty, and average values were consistent with results from other studies in the same region involving modeling and groundwater dating. This study shows that a simple model can be used to simulate the main dynamics of nitrogen in an intensively polluted catchment and then to estimate the transit times of these pollutants in the system, which is crucial for guiding the design and assessment of mitigation plans. (C) 2015 Elsevier B.V. All rights reserved.
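As a rough illustration of the model structure described above (two parallel linear stores whose parameters are sampled with the GLUE approach), the following Python sketch shows one possible implementation; the parameter names, prior ranges, likelihood measure (Nash-Sutcliffe efficiency) and behavioural threshold are illustrative assumptions, not the authors' actual configuration.

```python
# Illustrative sketch (not the authors' code): two parallel linear stores for shallow
# groundwater flow, combined with a GLUE-style behavioural parameter search.
import numpy as np

def two_store_model(precip, k_fast, k_slow, split):
    """Route effective rainfall through two parallel linear reservoirs."""
    s_fast = s_slow = 0.0
    q = np.zeros(len(precip))
    for t, p in enumerate(precip):
        s_fast += split * p
        s_slow += (1.0 - split) * p
        q_fast, q_slow = k_fast * s_fast, k_slow * s_slow  # linear outflow laws
        s_fast -= q_fast
        s_slow -= q_slow
        q[t] = q_fast + q_slow
    return q

def nash_sutcliffe(obs, sim):
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def glue_sample(precip, q_obs, n_samples=5000, threshold=0.5, seed=0):
    """Keep 'behavioural' parameter sets whose likelihood exceeds the threshold."""
    rng = np.random.default_rng(seed)
    behavioural = []
    for _ in range(n_samples):
        theta = (rng.uniform(0.1, 0.9),    # k_fast (assumed range)
                 rng.uniform(0.001, 0.1),  # k_slow (assumed range)
                 rng.uniform(0.0, 1.0))    # split of recharge between the two stores
        eff = nash_sutcliffe(q_obs, two_store_model(precip, *theta))
        if eff > threshold:
            behavioural.append((theta, eff))
    return behavioural
```

The spread of transit-time estimates across the behavioural parameter sets retained by such a procedure is what allows parameter uncertainty to be compared against climate variability, as done in the study.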
Abstract:
Prediction of queue waiting times of jobs submitted to production parallel batch systems is important for providing overall estimates to users and can also help meta-schedulers make scheduling decisions. In this work, we have developed a framework for predicting ranges of queue waiting times for jobs by employing multi-class classification of similar jobs in history. Our hierarchical prediction strategy first predicts the point wait time of a job using a dynamic k-Nearest Neighbor (kNN) method. It then performs a multi-class classification using Support Vector Machines (SVMs) among all the classes of jobs. The probabilities given by the SVM for the class predicted using kNN and its neighboring classes are used to provide a set of ranges of predicted wait times with probabilities. We have used these predictions and probabilities in a meta-scheduling strategy that distributes jobs to different queues/sites in a multi-queue/grid environment to minimize the wait times of the jobs. Experiments with different production supercomputer job traces show that our prediction strategies give correct predictions for about 77-87% of the jobs and improve accuracy by about 12% compared to the next best existing method. Experiments with our meta-scheduling strategy, using different production and synthetic job traces for various system sizes, partitioning schemes and workloads, show that it gives much improved performance compared to existing scheduling policies, reducing the overall average queue waiting times of the jobs by about 47%.
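A minimal sketch of the two-stage idea described above (a kNN point estimate followed by SVM class probabilities over wait-time ranges) might look as follows; the job features, class boundaries, value of k and the use of scikit-learn are illustrative assumptions, and the paper's dynamic kNN is simplified here to a static one.

```python
# Rough sketch of the hierarchical prediction: a kNN point estimate of wait time,
# then SVM class probabilities for the predicted class and its neighbouring classes,
# reported as (range, probability) pairs. Data, features and class edges are hypothetical.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVC

# Hypothetical job features: requested cores, requested walltime (s), queue length at submission.
X_hist = np.array([[64, 3600, 10], [128, 7200, 25], [32, 1800, 5], [256, 14400, 40]] * 50, dtype=float)
y_wait = np.array([600, 5400, 120, 20000] * 50, dtype=float)   # historical wait times (s)
class_edges = np.array([0, 300, 3600, 14400, np.inf])           # wait-time ranges = classes
y_class = np.digitize(y_wait, class_edges) - 1

knn = KNeighborsRegressor(n_neighbors=5).fit(X_hist, y_wait)
svm = SVC(probability=True).fit(X_hist, y_class)

def predict_ranges(job):
    point = knn.predict([job])[0]                                # stage 1: point wait-time estimate
    c = int(np.digitize([point], class_edges)[0] - 1)            # class containing the point estimate
    proba = svm.predict_proba([job])[0]                          # stage 2: SVM class probabilities
    ranges = []
    for cls in (c - 1, c, c + 1):                                # predicted class and its neighbours
        if 0 <= cls < len(class_edges) - 1 and cls in svm.classes_:
            idx = list(svm.classes_).index(cls)
            ranges.append(((class_edges[cls], class_edges[cls + 1]), proba[idx]))
    return ranges
```

In a meta-scheduling setting, the returned range-probability pairs can then be compared across queues/sites so that each job is routed to the one with the lowest expected wait.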
Abstract:
Complex-systems-inspired analysis suggests the hypothesis that financial meltdowns are abrupt critical transitions that occur when the system reaches a tipping point. Theoretical and empirical studies of climatic and ecological dynamical systems have shown that the approach to tipping points is preceded by a generic phenomenon called critical slowing down, i.e. an increasingly slow response of the system to perturbations. Therefore, it has been suggested that critical slowing down may be used as an early warning signal of imminent critical transitions. Whether financial markets exhibit critical slowing down prior to meltdowns remains unclear. Here, our analysis reveals that three major US markets (Dow Jones Index, S&P 500 and NASDAQ) and two European markets (DAX and FTSE) did not exhibit critical slowing down prior to major financial crashes over the last century. However, all markets showed strong trends of rising variability, quantified by time series variance and spectral power at low frequencies, prior to crashes. These results suggest that financial crashes are not critical transitions that occur in the vicinity of a tipping point. Using a simple model, we argue that financial crashes are likely to be stochastic transitions, which can occur even when the system is far away from the tipping point. Specifically, we show that a gradually increasing strength of stochastic perturbations may have caused abrupt transitions in the financial markets. Broadly, our results highlight the importance of stochastically driven abrupt transitions in real-world scenarios. Our study offers rising variability as a precursor of financial meltdowns, albeit with the limitation that it may signal false alarms.
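The indicators discussed above can be sketched as follows; the window length, detrending by log returns and the low-frequency cut-off are assumed choices for illustration, not the study's exact settings.

```python
# Illustrative computation of early-warning indicators on a price series:
# lag-1 autocorrelation as a proxy for critical slowing down, and rolling variance /
# low-frequency spectral power as measures of rising variability.
import numpy as np

def rolling_indicators(prices, window=250):
    returns = np.diff(np.log(prices))                 # log returns as a crude detrending step
    ac1, var, lowfreq = [], [], []
    for i in range(window, len(returns)):
        w = returns[i - window:i]
        w = w - w.mean()
        ac1.append(np.corrcoef(w[:-1], w[1:])[0, 1])  # lag-1 autocorrelation
        var.append(w.var())                           # rolling variance
        psd = np.abs(np.fft.rfft(w)) ** 2             # periodogram of the window
        lowfreq.append(psd[1:window // 20].sum() / psd[1:].sum())  # share of power at low frequencies
    return np.array(ac1), np.array(var), np.array(lowfreq)

# A rising trend in ac1 before a crash would indicate critical slowing down; the study
# reports rising variance and low-frequency power before crashes without such a trend in ac1.
```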
Abstract:
[ES] Universities currently face increasing funding needs at a time when the prevailing view is that public authorities should play a smaller role in that funding. Moreover, some researchers point out that the University appears to be losing ground as a centre for the generation of knowledge, which, as a resource in itself, is receiving growing attention from companies and governments and is regarded as the key resource for achieving competitive advantage. Under these circumstances, we consider that the University must make greater efforts to orient its research towards the application of scientific and technical knowledge, in collaboration with companies and public administrations, as indicated by the triple helix model of Etzkowitz and Leydesdorff. This paper also presents how the management of scientific and technical knowledge is approached at the Universidad del País Vasco/Euskal Herriko Unibertsitatea, as well as a research project whose aim is to develop a model for managing this knowledge in the context described.
Abstract:
This paper analyzes auctions where bidders face financial constraints that may force them to resell part of the property of the good (or subcontract part of a project) in a resale market. First we show that the inefficient speculative equilibria of second-price auctions (Garratt and Tröger, 2006) generalize to situations with partial resale where only the high-value bidder is financially constrained. However, when all players face financial constraints the inefficient speculative equilibria disappear. Therefore, for auctioning big facilities or contracts where all bidders are financially constrained and there is a resale market, the second-price auction remains a simple and appropriate mechanism for achieving an efficient allocation.
Abstract:
An overview of the financial X-ray service. JL
Abstract:
In contrast to cost modeling activities, the pricing of services must be simple and transparent. Calculating and thus knowing price structures would not only help identify the level of detail required for cost modeling of individual institutions, but also help develop a "public" market for services, as well as clarify the division of tasks and the modeling of funding and revenue streams for data preservation at public institutions. This workshop built on the results of the workshop "The Costs and Benefits of Keeping Knowledge", which took place on 11 June 2012 in Copenhagen. This expert workshop aimed at:
• Identifying ways for data repositories to abstract from their complicated cost structures and arrive at one transparent pricing structure which can be aligned with available and plausible funding schemes. Those repositories will probably need a stable institutional funding stream for data management and preservation. Are there any estimates for this, absolute or as a percentage of overall cost? Part of the revenue will probably have to come through data management fees upon ingest. How could that be priced? Per dataset, per GB, or as a percentage of research cost? Will it be necessary to charge access prices, as they contradict the open science paradigm?
• What are the price components for pricing individual services, and which prices are currently being paid, e.g. to commercial providers? What are the description and conditions of the service(s) delivered and guaranteed?
• What types of risks are inherent in these pricing schemes?
• How can services and prices be defined in an all-inclusive and simple manner, so as to enable researchers to apply for a specific amount when asking for funding of data-intensive projects?