953 results for Operator driven reliability
Abstract:
Vekua operators map harmonic functions defined on a domain in ℝ² to solutions of elliptic partial differential equations on the same domain, and vice versa. In this paper, following the original work of I. Vekua (Ilja Vekua (1907–1977), Soviet-Georgian mathematician), we define Vekua operators for the Helmholtz equation in a completely explicit fashion, in any space dimension N ≥ 2. We prove (i) that they actually transform harmonic functions and Helmholtz solutions into each other; (ii) that they are inverse to each other; and (iii) that they are continuous in any Sobolev norm in star-shaped Lipschitz domains. Finally, we define and compute the generalized harmonic polynomials as the Vekua transforms of harmonic polynomials. These results are instrumental in proving approximation estimates for solutions of the Helmholtz equation in spaces of circular, spherical, and plane waves.
Abstract:
We embark upon a systematic investigation of the operator space structure of JC*-triples via a study of the TROs (ternary rings of operators) they generate. Our approach is to introduce and develop a variety of universal objects, including universal TROs, by which means we are able to describe all possible operator space structures of a JC*-triple. Via the concept of reversibility we obtain characterisations of universal TROs over a wide range of examples. We apply our results to obtain explicit descriptions of the operator space structures of Cartan factors regardless of dimension.
Abstract:
Operator spaces of Hilbertian JC∗-triples E are considered in the light of the universal ternary ring of operators (TRO) introduced in recent work. For these operator spaces, it is shown that their triple envelope (in the sense of Hamana) is the TRO they generate, that a complete isometry between any two of them is always the restriction of a TRO isomorphism, and that distinct operator space structures on a fixed E are never completely isometric. In the infinite-dimensional cases, operator space structure is shown to be characterized by severe and definite restrictions upon finite-dimensional subspaces. Injective envelopes are explicitly computed.
Abstract:
Reliability analysis of probabilistic forecasts, in particular through the rank histogram or Talagrand diagram, is revisited. Two shortcomings are pointed out: firstly, a uniform rank histogram is but a necessary condition for reliability; secondly, if the forecast is assumed to be reliable, an indication is needed of how far a histogram is expected to deviate from uniformity merely due to randomness. Concerning the first shortcoming, it is suggested that forecasts be grouped or stratified along suitable criteria, and that reliability be analyzed individually for each forecast stratum. A reliable forecast should have uniform histograms for all individual forecast strata, not only for all forecasts as a whole. As to the second shortcoming, instead of the observed frequencies, the probability of the observed frequency is plotted, providing an indication of the likelihood of the result under the hypothesis that the forecast is reliable. Furthermore, a goodness-of-fit statistic is discussed which is essentially the reliability term of the Ignorance score. The discussed tools are applied to medium-range forecasts of 2 m temperature anomalies at several locations and lead times. The forecasts are stratified along the expected ranked probability score. Those forecasts which feature a high expected score turn out to be particularly unreliable.
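As a minimal illustration (not taken from the paper), a rank histogram can be computed by ranking each observation within its own ensemble; the synthetic data below stands in for real forecast–observation pairs:

```python
import numpy as np

def rank_histogram(ensemble, obs):
    """Talagrand diagram: rank of each observation within its ensemble.

    ensemble: array of shape (n_cases, n_members)
    obs:      array of shape (n_cases,)
    Returns counts over the n_members + 1 possible ranks.
    """
    n_cases, n_members = ensemble.shape
    # rank = number of ensemble members strictly below the observation
    ranks = (ensemble < obs[:, None]).sum(axis=1)
    return np.bincount(ranks, minlength=n_members + 1)

# Synthetic check: a reliable forecast (observations drawn from the same
# distribution as the members) should give a roughly flat histogram.
rng = np.random.default_rng(0)
ens = rng.normal(size=(10000, 9))
obs = rng.normal(size=10000)
counts = rank_histogram(ens, obs)
```

A goodness-of-fit test against the flat histogram, as the abstract suggests, would then quantify how large the observed deviations are under the reliability hypothesis.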
Abstract:
Scoring rules are an important tool for evaluating the performance of probabilistic forecasting schemes. A scoring rule is called strictly proper if its expectation is optimal if and only if the forecast probability represents the true distribution of the target. In the binary case, strictly proper scoring rules allow for a decomposition into terms related to the resolution and the reliability of a forecast. This fact is particularly well known for the Brier Score. In this article, this result is extended to forecasts for finite-valued targets. Both resolution and reliability are shown to have a positive effect on the score. It is demonstrated that resolution and reliability are directly related to forecast attributes that are desirable on grounds independent of the notion of scores. This finding can be considered an epistemological justification of measuring forecast quality by proper scoring rules. A link is provided to the original work of DeGroot and Fienberg, extending their concepts of sufficiency and refinement. The relation to the conjectured sharpness principle of Gneiting et al. is elucidated.
Abstract:
Proper scoring rules provide a useful means to evaluate probabilistic forecasts. Independently of scoring rules, it has been argued that reliability and resolution are desirable forecast attributes. The mathematical expectation of the score allows for a decomposition into reliability- and resolution-related terms, demonstrating a relationship between scoring rules and reliability/resolution. A similar decomposition holds for the empirical (i.e. sample-average) score over an archive of forecast–observation pairs. This empirical decomposition, though, provides an overly optimistic estimate of the potential score (i.e. the optimum score which could be obtained through recalibration), showing that a forecast assessment based solely on the empirical resolution and reliability terms will be misleading. The differences between the theoretical and empirical decompositions are investigated, and specific recommendations are given on how to obtain better estimators of reliability and resolution in the case of the Brier and Ignorance scoring rules.
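The empirical (binned) decomposition discussed above can be sketched numerically. The following is a minimal illustration of the standard Murphy decomposition of the Brier score for binary targets; the bin count and data are illustrative choices, not the paper's setup:

```python
import numpy as np

def brier_decomposition(p, y, n_bins=10):
    """Empirical Murphy decomposition: BS = reliability - resolution + uncertainty.

    p: forecast probabilities in [0, 1]; y: binary outcomes (0/1).
    The identity is exact when forecasts are constant within each bin,
    and only approximate otherwise (the bias the abstract discusses).
    """
    p, y = np.asarray(p, float), np.asarray(y, float)
    ybar = y.mean()
    uncertainty = ybar * (1.0 - ybar)
    bins = np.minimum((p * n_bins).astype(int), n_bins - 1)
    rel = res = 0.0
    for k in range(n_bins):
        mask = bins == k
        if not mask.any():
            continue
        w = mask.mean()              # fraction of cases in bin k
        pk = p[mask].mean()          # mean forecast in bin k
        ok = y[mask].mean()          # observed frequency in bin k
        rel += w * (pk - ok) ** 2    # reliability term (0 is best)
        res += w * (ok - ybar) ** 2  # resolution term (larger is better)
    bs = np.mean((p - y) ** 2)
    return bs, rel, res, uncertainty

# Tiny example with forecasts constant within bins, so the identity is exact.
p = np.array([0.1, 0.1, 0.8, 0.8])
y = np.array([0, 1, 1, 1])
bs, rel, res, unc = brier_decomposition(p, y)
```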
Abstract:
This paper reports the results of a parametric CFD study on idealized city models to investigate the potential of slope flow in ventilating a city located in a mountainous region when the background synoptic wind is absent. Examples of such a city include Tokyo in Japan, Los Angeles and Phoenix in the US, and Hong Kong. Two types of buoyancy-driven flow are considered, i.e., slope flow from the mountain slope (katabatic wind at night and anabatic wind in the daytime), and wall flow due to heated/cooled urban surfaces. The combined buoyancy-driven flow system can serve the purpose of dispersing the accumulated urban air pollutants when the background wind is weak or absent. The microscopic picture of ventilation performance within the urban structures was evaluated in terms of air change rate (ACH) and age of air. The simulation results reveal that the slope flow plays an important role in ventilating the urban area, especially in calm conditions. Katabatic flow at night is conducive to mitigating the nocturnal urban heat island. In the present parametric study, the mountain slope angle and mountain height are assumed to be constant, and the changing variables are heating/cooling intensity and building height. For a typical mountain of 500 m inclined at an angle of 20° to the horizontal level, the interactive structure is very much dependent on the ratio of heating/cooling intensity as well as building height. When the building is lower than 60 m, the slope wind dominates. When the building is as high as 100 m, the contribution from the urban wall flow cannot be ignored. It is found that katabatic wind can be very beneficial to the thermal environment as well as air quality at the pedestrian level. The air change rate for the pedestrian volume can be as high as 300 ACH.
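For reference, the air change rate (ACH) used above is simply the fresh-air volume flow normalized by the ventilated volume per hour. A minimal sketch with hypothetical numbers (not the paper's CFD results), chosen only to reproduce the quoted order of magnitude:

```python
def air_change_rate(flow_m3_s, volume_m3):
    """Air changes per hour: fresh-air volume flow divided by zone volume."""
    return flow_m3_s * 3600.0 / volume_m3

# Hypothetical pedestrian volume of 30 m x 30 m x 2 m ventilated at 150 m^3/s:
ach = air_change_rate(150.0, 30.0 * 30.0 * 2.0)  # 300 ACH
```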
Abstract:
This article examines the problems of elite capture in community driven development (CDD). Drawing on two case studies of non-governmental organisation (NGO) intervention in rural Mozambique, the authors consider two important variables – 1) the diverse and complex contributions of local elites to CDD in different locations, and 2) the roles that non-elites play in monitoring and controlling leader activities – to argue that donors should be cautious about automatically assuming the prevalence of malevolent patrimonialism and its ill-effects in their projects. This is because the ‘checks and balances’ on elite behaviour that exist within locally-defined and historically-rooted forms of community-based governance are likely to be more effective than those introduced by the external intervener.
Abstract:
Natural ventilation relies on less controllable natural driving forces, so it needs more careful artificial control, and its prediction, design and analysis become all the more important. This paper presents both theoretical and numerical simulations for predicting the natural ventilation flow in a two-zone building with multiple openings, subjected to combined natural forces. To our knowledge, these are the first analytical solutions obtained so far for a building with more than one zone, where each zone may have more than two openings. The analytical solution offers a possibility for validating multi-zone airflow programs. A computer program, MIX, is employed to conduct the numerical simulation, and good agreement is achieved. Different airflow modes are identified and some design recommendations are provided.
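Multi-zone models of this kind typically combine the orifice equation for each opening with stack (buoyancy) pressure differences, then iterate zone pressures until mass flow balances in every zone. The building blocks below are a sketch using standard textbook relations; the discharge coefficient, density, and geometry are illustrative assumptions, not the paper's values:

```python
import math

RHO = 1.2  # air density, kg/m^3 (assumed)
CD = 0.6   # discharge coefficient of a sharp-edged opening (assumed)
G = 9.81   # gravitational acceleration, m/s^2

def opening_flow(area_m2, dp_pa):
    """Volume flow (m^3/s) through an opening under pressure difference dp (Pa),
    signed with the direction of dp (the usual orifice equation)."""
    return math.copysign(CD * area_m2 * math.sqrt(2.0 * abs(dp_pa) / RHO), dp_pa)

def stack_pressure(height_m, t_in_k, t_out_k):
    """Approximate stack (buoyancy) pressure difference across a height,
    for indoor/outdoor temperatures in kelvin (Boussinesq-type estimate)."""
    return RHO * G * height_m * (t_in_k - t_out_k) / t_in_k
```

A full two-zone solver would wrap these in a nonlinear root-find on the zone pressures, which is what a program like MIX automates.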
Abstract:
Abstract. We prove that the vast majority of JC∗-triples satisfy the condition of universal reversibility. Our characterisation is that a JC∗-triple is universally reversible if and only if it has no triple homomorphisms onto Hilbert spaces of dimension greater than two nor onto spin factors of dimension greater than four. We establish corresponding characterisations in the cases of JW∗-triples and of TROs (regarded as JC∗-triples). We show that the distinct natural operator space structures on a universally reversible JC∗-triple E are in bijective correspondence with a distinguished class of ideals in its universal TRO, identify the Shilov boundaries of these operator spaces and prove that E has a unique natural operator space structure precisely when E contains no ideal isometric to a nonabelian TRO. We deduce some decomposition and completely contractive properties of triple homomorphisms on TROs.
Abstract:
Stroke is a medical emergency and can cause neurological damage affecting the motor and sensory systems. Harnessing brain plasticity should make it possible to reconstruct the closed loop between the brain and the body, i.e., associating the generation of the motor command with the somatic sensory feedback might enhance motor recovery. In order to aid reconstruction of this loop with a robotic device, it is necessary to assist the paretic side of the body at the right moment, so that the motor command and the feedback signal to the somatosensory area of the brain arrive simultaneously. To this end, we propose an integrated EEG-driven assistive robotic system for stroke rehabilitation. Depending on the level of motor recovery, it is important to provide adequate stimulation for upper-limb motion. Thus, we propose an assist arm incorporating a Magnetic Levitation Joint that can generate a compliant motion due to its levitation and mechanical redundancy. This paper reports on a feasibility study carried out to verify the validity of the robot sensing, and on EEG measurements conducted with healthy volunteers performing spontaneous arm flexion/extension movements. A characteristic feature was found in the temporal evolution of the EEG signal prior to the executed motion, which can aid in coordinating the onset timing of the robotic arm assistance.
Abstract:
In this paper we study Dirichlet convolution with a given arithmetical function f as a linear mapping φ_f that sends a sequence (a_n) to (b_n), where b_n = Σ_{d|n} f(d) a_{n/d}. We investigate when this is a bounded operator on l² and find the operator norm. Of particular interest is the case f(n) = n^{−α} for its connection to the Riemann zeta function on the line ℜs = α. For α > 1, φ_f is bounded with ‖φ_f‖ = ζ(α). For the unbounded case, we show that φ_f : M₂ → M₂, where M₂ is the subset of l² of multiplicative sequences, for many f ∈ M₂. Consequently, we study the 'quasi'-norm sup_{‖a‖ = T, a ∈ M₂} ‖φ_f a‖/‖a‖ for large T, which measures the 'size' of φ_f on M₂. For the case f(n) = n^{−α}, we show this quasi-norm bears a striking resemblance to the conjectured maximal order of |ζ(α + iT)| for α > 1/2.
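The mapping φ_f can be sketched numerically on truncated sequences. A minimal illustration (finite sections only, so applying it to unit vectors can at best bound the operator norm from below):

```python
import math

def phi_f(f, a):
    """Truncated Dirichlet-convolution operator:
    (phi_f a)_n = sum over d | n of f(d) * a_{n/d}, for n = 1..len(a).
    Sequences are 1-indexed mathematically; Python lists are 0-indexed."""
    N = len(a)
    b = [0.0] * N
    for d in range(1, N + 1):
        fd = f(d)
        for m in range(1, N // d + 1):
            b[d * m - 1] += fd * a[m - 1]
    return b

# For f(n) = n^{-alpha} with alpha > 1 the operator is bounded with norm
# zeta(alpha). Applying it to e1 = (1, 0, 0, ...) simply returns (f(n)):
alpha = 2.0
f = lambda n: n ** (-alpha)
e1 = [1.0] + [0.0] * 99
b = phi_f(f, e1)  # b_n = f(n)
norm_b = math.sqrt(sum(x * x for x in b))  # lower bound for the operator norm
```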
Abstract:
A necessary condition for a good probabilistic forecast is that the forecast system is shown to be reliable: forecast probabilities should equal observed probabilities verified over a large number of cases. As climate change trends are now emerging from the natural variability, we can apply this concept to climate predictions and compute the reliability of simulated local and regional temperature and precipitation trends (1950–2011) in a recent multi-model ensemble of climate model simulations prepared for the Intergovernmental Panel on Climate Change (IPCC) fifth assessment report (AR5). With only a single verification time, the verification is over the spatial dimension. The local temperature trends appear to be reliable. However, when the global mean climate response is factored out, the ensemble is overconfident: the observed trend is outside the range of modelled trends in many more regions than would be expected by the model estimate of natural variability and model spread. Precipitation trends are overconfident for all trend definitions. This implies that for near-term local climate forecasts the CMIP5 ensemble cannot simply be used as a reliable probabilistic forecast.