960 results for Averaging Theorem
Abstract:
This paper reworks and amplifies Reichert's proof of his 1969 theorem, which asserts that any impedance function of a one-port electrical network that can be realised with two reactive elements and an arbitrary number of resistors can also be realised with two reactive elements and three resistors.
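As a brief orientation (conventional network-synthesis notation, not drawn from the abstract itself): a one-port built from two reactive elements and resistors has an impedance function of at most biquadratic form,

\[ Z(s) \;=\; \frac{a s^{2} + b s + c}{d s^{2} + e s + f}, \qquad a,\dots,f \ge 0, \]

so the theorem says that every such realisable \(Z(s)\) also admits a realisation using just three resistors alongside the two reactive elements.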
Abstract:
This paper studies the radiation properties of an immiscible blend of nylon1010 and HIPS. The gel fraction increased with increasing radiation dose. The network formed mostly in the nylon1010 phase; networks were found in both nylon1010 and HIPS once the dose reached 0.85 MGy or more. The Charlesby-Pinner equation and the modified Zhang-Sun-Qian equation were used to model the relationship between dose and sol fraction. The latter equation fits these polymer blends well, and its relationship showed better linearity than that of the Charlesby-Pinner equation. The conditions for network formation were also studied using the mathematical expectation theorem for a binary system. Thermal properties of the blends were examined by DSC. The crystallization temperature decreases with increasing dose because the cross-linking reaction inhibits the crystallization process and destroys the crystals. The melting temperature also decreased with increasing radiation dose; the dual melting peak gradually merged into a single peak, and the higher-temperature melting peak disappeared at high doses. However, radiation-induced crystallization was observed as an increase in the heat of fusion at low doses, while at higher doses the crystals are damaged by radiation. A similar conclusion may be drawn from the DSC traces of the crystallized blends: as the radiation dose increases, the heat of fusion and the heat of crystallization both decrease dramatically.
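For reference, the standard Charlesby-Pinner relation mentioned above (conventional symbols; the paper's own notation may differ) links the sol fraction \(s\) to the absorbed dose \(D\):

\[ s + \sqrt{s} \;=\; \frac{p_0}{q_0} + \frac{1}{q_0\, u_1\, D}, \]

where \(p_0\) and \(q_0\) are the chain-scission and cross-linking densities per unit dose and \(u_1\) is the number-average degree of polymerisation of the un-irradiated polymer; the modified Zhang-Sun-Qian equation referred to in the abstract is a variant of this relation reported to fit the blends with better linearity.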
Abstract:
The digital divide continues to challenge political and academic circles worldwide. A range of policy solutions is briefly evaluated, from laissez-faire on the right to “arithmetic” egalitarianism on the left. The article recasts the digital divide as a problem for the social distribution of presumptively important information (e.g., electoral data, news, science) within postindustrial society. Endorsing in general terms the left-liberal approach of differential or “geometric” egalitarianism, it seeks to invest this with greater precision, and therefore utility, by means of a possibly original synthesis of the ideas of John Rawls and R. H. Tawney. It is argued that, once certain categories of information are accorded the status of “primary goods,” their distribution must then comply with principles of justice as articulated by those major 20th-century exponents of ethical social democracy. The resultant Rawls-Tawney theorem, if valid, might augment the portfolio of options for interventionist information policy in the 21st century.
Abstract:
Gough, John, 'Quantum Stratonovich Stochastic Calculus and the Quantum Wong-Zakai Theorem', Journal of Mathematical Physics, 47, 113509 (2006).
Abstract:
In this work we revisit the problem of hedging a contingent claim under the mean-square criterion. We prove that in an incomplete market a probability measure can be identified under which the discounted price process becomes a martingale; this is in fact a new proposition on the martingale representation theorem. The results also identify a weight function that serves as an approximation to the Radon-Nikodým derivative of the unique neutral martingale measure.
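A hedged sketch of the mean-square criterion referred to above, in conventional quadratic-hedging notation rather than the paper's own: for a claim \(H\), initial capital \(V_0\), and a self-financing strategy \(\vartheta\) traded against the price process \(S\), one minimises

\[ \mathbb{E}\!\left[\Big(H - V_0 - \int_0^T \vartheta_t\, dS_t\Big)^{2}\right], \]

and the change to the identified martingale measure \(\tilde{\mathbb{P}}\) is encoded by the Radon-Nikodým derivative \(d\tilde{\mathbb{P}}/d\mathbb{P}\), which the weight function in the abstract is said to approximate.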
Abstract:
Numerical approximation of the long-time behavior of a stochastic differential equation (SDE) is considered. Error estimates for time-averaging estimators are obtained and then used to show that the stationary behavior of the numerical method converges to that of the SDE. The error analysis is based on using an associated Poisson equation for the underlying SDE. The main advantages of this approach are its simplicity and universality. It works equally well for a range of explicit and implicit schemes, including those with simple simulation of random variables, and for hypoelliptic SDEs. To simplify the exposition, we consider only the case where the state space of the SDE is a torus, and we study only smooth test functions. However, we anticipate that the approach can be applied more widely. An analogy between our approach and Stein's method is indicated. Some practical implications of the results are discussed.
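For concreteness, a hedged rendering in standard notation (not the paper's own): writing \(\mathcal{L}\) for the generator of the SDE and \(\pi\) for its invariant measure, the time-averaging estimator of \(\pi(f) = \int f\, d\pi\) along a numerical trajectory \((X_n)_{n \ge 0}\) is

\[ \hat{\pi}_N(f) \;=\; \frac{1}{N} \sum_{n=0}^{N-1} f(X_n), \]

and the error analysis proceeds via the Poisson equation \(\mathcal{L}\phi = f - \pi(f)\), whose solution \(\phi\) is used to control the bias and fluctuations of \(\hat{\pi}_N(f)\).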
Abstract:
This chapter presents a model averaging approach in the M-open setting, using sample re-use methods to approximate the predictive distribution of future observations. It first reviews the standard M-closed Bayesian Model Averaging approach and decision-theoretic methods for producing inferences and decisions. It then reviews model selection from the M-complete and M-open perspectives, before formulating a Bayesian solution to model averaging in the M-open perspective. It constructs optimal weights for M-open Model Averaging (MOMA) using a decision-theoretic framework, in which models are treated as part of the ‘action space’ rather than as unknown states of nature. Using ‘incompatible’ retrospective and prospective models for data from a case-control study, the chapter demonstrates that MOMA gives better predictive accuracy than the proxy models. It concludes with open questions and future directions.
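For context, the standard M-closed Bayesian Model Averaging identity that the chapter reviews (conventional notation, not the chapter's own): with candidate models \(M_1,\dots,M_K\), the predictive distribution of a future observation \(\tilde{y}\) given data \(y\) is

\[ p(\tilde{y} \mid y) \;=\; \sum_{k=1}^{K} p(\tilde{y} \mid y, M_k)\, p(M_k \mid y), \]

whereas the M-open approach described above replaces the posterior model probabilities \(p(M_k \mid y)\) with decision-theoretically optimal weights estimated by sample re-use, since no model in the list is assumed to be true.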