921 results for localisation simple


Relevance:

20.00%

Publisher:

Abstract:

This paper considers a general and informationally efficient approach to determine the optimal access pricing rule for interconnected networks. It shows that there exists a simple rule that achieves the Ramsey outcome as the unique equilibrium when networks compete in linear prices without network-based price discrimination. The approach is informationally efficient in the sense that the regulator is required to know only the marginal cost structure, i.e. the marginal cost of making and terminating a call. The approach is general in that access prices can depend not only on the marginal costs but also on the retail prices, which can be observed by consumers and therefore by the regulator as well. In particular, I consider the set of linear access pricing rules which includes any fixed access price, the Efficient Component Pricing Rule (ECPR) and the Modified ECPR as special cases. I show that in this set, there is a unique rule that implements the Ramsey outcome as the unique equilibrium independently of the underlying demand conditions.
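To fix ideas, here is a schematic rendering of the family of rules just described; the notation is standard in the access-pricing literature and is not taken from the paper. With $c_O$ and $c_T$ the marginal costs of originating and terminating a call and $p$ the retail price observed by the regulator, a linear access pricing rule sets the per-unit access charge as
\[ a = \alpha + \beta\, p . \]
A fixed access price corresponds to $\beta = 0$, while the ECPR is the special case $a = c_T + (p - c_O - c_T) = p - c_O$, i.e. the termination cost plus the foregone retail margin.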

Relevance:

20.00%

Publisher:

Abstract:

We present a simple randomized procedure for the prediction of a binary sequence. The algorithm uses ideas from recent developments in the theory of prediction of individual sequences. We show that if the sequence is a realization of a stationary and ergodic random process, then the average number of mistakes converges, almost surely, to that of the optimum, given by the Bayes predictor.
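As a rough illustration of the kind of procedure involved, the following is a minimal Python sketch of a randomized forecaster that exponentially weights a small pool of simple experts. The expert pool, learning rate, and test sequence are illustrative choices, not the paper's algorithm.

import numpy as np

rng = np.random.default_rng(0)

def run_forecaster(sequence, eta=0.5):
    """Predict each bit before seeing it; return the total number of mistakes."""
    # Expert 0 always predicts 0, expert 1 always predicts 1,
    # expert 2 repeats the previous bit, expert 3 predicts its complement.
    weights = np.ones(4)
    mistakes = 0
    prev = 0
    for y in sequence:
        experts = np.array([0, 1, prev, 1 - prev])
        p_one = weights @ experts / weights.sum()   # weighted vote for "1"
        guess = int(rng.random() < p_one)           # randomized prediction
        mistakes += int(guess != y)
        weights *= np.exp(-eta * (experts != y))    # penalize wrong experts
        prev = y
    return mistakes

bits = rng.integers(0, 2, size=1000)                # a toy binary sequence
print(run_forecaster(bits), "mistakes on", len(bits), "bits")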

Relevance:

20.00%

Publisher:

Abstract:

This paper shows how risk may aggravate fluctuations in economies with imperfect insurance and multiple assets. A two-period job-matching model is studied, in which risk-averse agents act both as workers and as entrepreneurs. They choose between two types of investment: one type is riskless, while the other is a risky activity that creates jobs. Equilibrium is unique under full insurance. If investment is fully insured but unemployment risk is uninsured, then precautionary saving behavior dampens output fluctuations. However, if both investment and employment are uninsured, then an increase in unemployment gives agents an incentive to shift investment away from the risky asset, further increasing unemployment. This positive feedback may lead to multiple Pareto-ranked equilibria. An overlapping-generations version of the model may exhibit poverty traps or persistent multiplicity. Greater insurance is doubly beneficial in this context, since it can both prevent multiplicity and promote risky investment.

Relevance:

20.00%

Publisher:

Abstract:

The n-octanol/water partition coefficient (log Po/w) is a key physicochemical parameter for drug discovery, design, and development. Here, we present a physics-based approach that shows a strong linear correlation between the computed solvation free energy in implicit solvents and the experimental log Po/w on a cleansed data set of more than 17,500 molecules. After internal validation by five-fold cross-validation and data randomization, the predictive power of the most interesting multiple linear model, based solely on two GB/SA parameters, was tested on two different external sets of molecules. On the Martel druglike test set, the predictive power of the best model (N = 706, r = 0.64, MAE = 1.18, and RMSE = 1.40) is similar to that of six well-established empirical methods. On the 17-drug test set, our model outperformed all compared empirical methodologies (N = 17, r = 0.94, MAE = 0.38, and RMSE = 0.52). The physical basis of our original GB/SA approach, together with its predictive capacity, computational efficiency (1 to 2 s per molecule), and three-dimensional molecular graphics capability, lays the foundations for a promising predictor, the implicit log P method (iLOGP), to complement the portfolio of drug design tools developed and provided by the SIB Swiss Institute of Bioinformatics.
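For a sense of the modeling step, the sketch below (not the authors' code) fits a two-descriptor multiple linear regression for log P with five-fold cross-validation; the descriptor values are synthetic placeholders standing in for a GB solvation free energy term and a surface-area term.

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold, cross_val_score

rng = np.random.default_rng(0)
n = 500
dg_solv = rng.normal(-8.0, 3.0, n)          # placeholder GB solvation free energies
sasa = rng.normal(350.0, 60.0, n)           # placeholder surface-area term
X = np.column_stack([dg_solv, sasa])
logp = 0.2 * dg_solv + 0.01 * sasa + rng.normal(0, 0.5, n)   # synthetic "experimental" logP

model = LinearRegression()
cv = KFold(n_splits=5, shuffle=True, random_state=0)
mae = -cross_val_score(model, X, logp, cv=cv, scoring="neg_mean_absolute_error")
print("5-fold CV MAE: %.2f +/- %.2f" % (mae.mean(), mae.std()))

model.fit(X, logp)                          # final model on the full toy set
print("logP ~ %.3f * dG_solv + %.4f * SASA + %.2f"
      % (model.coef_[0], model.coef_[1], model.intercept_))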

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a simple Optimised Search Heuristic for the Job Shop Scheduling problem that combines a GRASP heuristic with a branch-and-bound algorithm. The proposed method is compared with similar approaches and leads to better results in terms of solution quality and computing times.
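For readers unfamiliar with GRASP, the generic skeleton below sketches the greedy randomized construction plus improvement loop; in the paper's variant the improvement phase is handled by a branch-and-bound algorithm rather than the placeholder local search used here, and the toy cost function at the end is purely illustrative.

import random

def grasp(elements, cost, improve, iterations=100, alpha=0.3, seed=0):
    """Generic GRASP loop: greedy randomized construction + improvement phase."""
    rng = random.Random(seed)
    best, best_cost = None, float("inf")
    for _ in range(iterations):
        remaining, solution = list(elements), []
        while remaining:
            # Restricted candidate list (RCL): the alpha-fraction of cheapest
            # extensions of the current partial solution.
            remaining.sort(key=lambda e: cost(solution + [e]))
            rcl = remaining[: max(1, int(alpha * len(remaining)))]
            pick = rng.choice(rcl)
            solution.append(pick)
            remaining.remove(pick)
        solution = improve(solution)        # local search (or exact subsolver)
        c = cost(solution)
        if c < best_cost:
            best, best_cost = solution, c
    return best, best_cost

# Toy usage: order numbers so that the sum of running prefix maxima is small.
nums = [7, 3, 9, 1, 5]
cost = lambda seq: sum(max(seq[: i + 1]) for i in range(len(seq))) if seq else 0
print(grasp(nums, cost, improve=lambda s: sorted(s)))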

Relevance:

20.00%

Publisher:

Abstract:

Although correspondence analysis (CA) is now widely available in statistical software packages and applied in a variety of contexts, notably the social and environmental sciences, there are still some misconceptions about this method as well as unresolved issues which remain controversial to this day. In this paper we hope to settle these matters, namely (i) the way CA measures variance in a two-way table and how to compare variances between tables of different sizes, (ii) the influence, or rather lack of influence, of outliers in the usual CA maps, (iii) the scaling issue and the biplot interpretation of maps, (iv) whether or not to rotate a solution, and (v) the statistical significance of results.
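Point (i) can be made concrete with a short numerical sketch (the table below is made up): in CA the variance of a two-way table is its total inertia, the chi-square statistic divided by the grand total, which also equals the sum of the squared singular values of the matrix of standardized residuals.

import numpy as np

N = np.array([[30.0, 10.0,  5.0],
              [10.0, 40.0, 15.0],
              [ 5.0, 10.0, 25.0]])        # a made-up two-way contingency table

P = N / N.sum()                           # correspondence matrix
r = P.sum(axis=1)                         # row masses
c = P.sum(axis=0)                         # column masses

# Standardized residuals; their squared Frobenius norm is the total inertia,
# i.e. the chi-square statistic divided by the grand total of the table.
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
print("total inertia:", (S ** 2).sum())

# The principal inertias of the CA solution are the squared singular values of S.
sv = np.linalg.svd(S, compute_uv=False)
print("principal inertias:", sv ** 2)     # they sum to the total inertia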

Relevance:

20.00%

Publisher:

Abstract:

This paper proposes a new time-domain test of a process being I(d), 0 < d ≤ 1, under the null, against the alternative of being I(0) with deterministic components subject to structural breaks at known or unknown dates, with the goal of disentangling the existing identification issue between long memory and structural breaks. Denoting by AB(t) the different types of structural breaks in the deterministic components of a time series considered by Perron (1989), the test statistic proposed here is based on the t-ratio (or the infimum of a sequence of t-ratios) of the estimated coefficient on yt-1 in an OLS regression of Δd yt on a simple transformation of the above-mentioned deterministic components and yt-1, possibly augmented by a suitable number of lags of Δd yt to account for serial correlation in the error terms. The case where d = 1 coincides with the Perron (1989) or the Zivot and Andrews (1992) approaches if the break date is known or unknown, respectively. The statistic is labelled as the SB-FDF (Structural Break-Fractional Dickey-Fuller) test, since it is based on the same principles as the well-known Dickey-Fuller unit root test. Both its asymptotic behavior and finite sample properties are analyzed, and two empirical applications are provided.
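Schematically, and with notation chosen here for illustration rather than taken from the paper, the augmented test regression has the form
\[ \Delta^{d} y_t = \pi' z_t(\lambda) + \phi\, y_{t-1} + \sum_{i=1}^{k} \gamma_i\, \Delta^{d} y_{t-i} + \varepsilon_t , \]
where $z_t(\lambda)$ collects the suitably transformed deterministic break components AB(t) for a break at fraction $\lambda$; the SB-FDF statistic is the t-ratio on $\hat\phi$ when the break date is known, or $\inf_\lambda t_{\hat\phi}(\lambda)$ when it is not, with large negative values rejecting I(d) in favour of I(0) with breaks.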

Relevance:

20.00%

Publisher:

Abstract:

Flood risk maps should show floods in relation to the potential impacts they may have on people, property, and activities. It is therefore necessary to add the concept of vulnerability to the mere study of the physical phenomenon. Thus, maps of flood damage risk are the true risk maps, since they are produced, on the one hand, from cartography that locates and characterizes the physical phenomenon of flooding and, on the other, from cartography that locates and characterizes the exposed elements. The use of so-called "new technologies", such as GIS, remote sensing, hydrological sensors, and the Internet, offers great potential for the development of flood risk maps, which remains, for the time being, a field open to research.

Relevance:

20.00%

Publisher:

Abstract:

Series: Encyclopédie populaire, ou Les sciences, les arts et les métiers mis à la portée de toutes les classes

Relevance:

20.00%

Publisher:

Abstract:

PURPOSE: The prognostic impact of complete response (CR) achievement in multiple myeloma (MM) has been shown mostly in the context of autologous stem-cell transplantation. Other levels of response have been defined because, even with high-dose therapy, CR is a relatively rare event. The purpose of this study was to analyze the prognostic impact of very good partial response (VGPR) in patients treated with high-dose therapy. PATIENTS AND METHODS: All patients were included in the Intergroupe Francophone du Myelome 99-02 and 99-04 trials and treated with vincristine, doxorubicin, and dexamethasone (VAD) induction therapy followed by double autologous stem-cell transplantation (ASCT). Best post-ASCT response assessment was available for 802 patients. RESULTS: With a median follow-up of 67 months, median event-free survival (EFS) and 5-year EFS were 42 months and 34%, respectively, for the 405 patients who achieved at least VGPR after ASCT, versus 32 months and 26% for the 288 patients who achieved only partial remission (P = .005). Five-year overall survival (OS) was significantly superior in patients achieving at least VGPR (74% v 61%; P = .0017). In multivariate analysis, achievement of less than VGPR was an independent factor predicting shorter EFS and OS. Response to VAD had no impact on EFS or OS. The impact of VGPR achievement on EFS and OS was significant in patients with International Staging System stages 2 to 3 and in patients with poor-risk cytogenetics t(4;14) or del(17p). CONCLUSION: In the context of ASCT, achievement of at least VGPR is a simple prognostic factor that has importance in intermediate- and high-risk MM and can be informative in more patients than CR.