879 results for Egocentric Constraint
Abstract:
A new Bayesian algorithm for retrieving surface rain rate from the Tropical Rainfall Measuring Mission (TRMM) Microwave Imager (TMI) over the ocean is presented, along with validations against estimates from the TRMM Precipitation Radar (PR). The Bayesian approach offers a rigorous basis for optimally combining multichannel observations with prior knowledge. While other rain-rate algorithms have been published that are based at least partly on Bayesian reasoning, this is believed to be the first self-contained algorithm that fully exploits Bayes’s theorem to yield not just a single rain rate, but rather a continuous posterior probability distribution of rain rate. To advance the understanding of the theoretical benefits of the Bayesian approach, sensitivity analyses have been conducted based on two synthetic datasets for which the “true” conditional and prior distributions are known. Results demonstrate that even when the prior and conditional likelihoods are specified perfectly, biased retrievals may occur at high rain rates. This bias is not the result of a defect in the Bayesian formalism, but rather represents the expected outcome when the physical constraint imposed by the radiometric observations is weak owing to saturation effects. It is also suggested that both the choice of estimator and the prior information are crucial to the retrieval. In addition, the performance of the Bayesian algorithm herein is found to be comparable to that of other benchmark algorithms in real-world applications, while having the additional advantage of providing a complete continuous posterior probability distribution of surface rain rate.
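As an illustrative sketch (not the paper's algorithm) of how Bayes's theorem yields a full posterior distribution of rain rate rather than a single value, the Python fragment below combines an assumed prior with a likelihood built from an invented, saturating forward model Tb(R); every functional form and constant here is hypothetical:

```python
import numpy as np

def forward_model(R):
    """Hypothetical brightness temperature Tb(R) that saturates at high rain rates."""
    return 150.0 + 130.0 * (1.0 - np.exp(-0.15 * R))

def posterior(tb_obs, r_grid, sigma=2.0):
    """Posterior p(R | Tb) on a discrete grid via Bayes's theorem."""
    prior = np.exp(-r_grid / 5.0)          # assumed exponential prior on rain rate
    prior /= prior.sum()
    lik = np.exp(-0.5 * ((tb_obs - forward_model(r_grid)) / sigma) ** 2)
    post = prior * lik
    return post / post.sum()               # normalise to a probability distribution

r_grid = np.linspace(0.0, 50.0, 501)
post = posterior(forward_model(10.0), r_grid)   # observe Tb for a "true" R = 10 mm/h
r_mean = float(np.sum(r_grid * post))           # posterior-mean estimator
```

Because the invented forward model flattens at high R, the likelihood constrains R only weakly there; that is the saturation-induced regime in which the abstract reports biased retrievals even with a perfectly specified prior.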
Abstract:
We consider problems of splitting and connectivity augmentation in hypergraphs. In a hypergraph G = (V + s, E), to split two edges su and sv is to replace them with a single edge uv. We are interested in doing this in such a way as to preserve a defined level of connectivity in V. The splitting technique is often used as a way of adding new edges into a graph or hypergraph, so as to augment the connectivity to some prescribed level. We begin by providing a short history of work done in this area. Then several preliminary results are given in a general form so that they may be used to tackle several problems. We then analyse the hypergraphs G = (V + s, E) for which there is no split preserving the local-edge-connectivity present in V. We provide two structural theorems, one of which implies a slight extension to Mader’s classical splitting theorem. We also provide a characterisation of the hypergraphs for which there is no such “good” split, and a splitting result concerned with a specialisation of the local-connectivity function. We then use our splitting results to provide an upper bound on the smallest number of size-two edges we must add to any given hypergraph to ensure that in the resulting hypergraph we have λ(x, y) ≥ r(x, y) for all x, y in V, where r is an integer-valued, symmetric requirement function on V × V. This is the so-called “local-edge-connectivity augmentation problem” for hypergraphs. We also provide an extension to a theorem of Szigeti about augmenting to satisfy a requirement r, but using hyperedges. Next, in a result born of collaborative work with Zoltán Király from Budapest, we show that the local-connectivity augmentation problem is NP-complete for hypergraphs. Lastly, we concern ourselves with an augmentation problem that includes a locational constraint.
The premise is that we are given a hypergraph H = (V, E) with a bipartition P = {P1, P2} of V and asked to augment it with size-two edges, so that the result is k-edge-connected and has no new edge contained in either P1 or P2. We consider the splitting technique and describe the obstacles that prevent us from forming “good” splits. From this we deduce results about which hypergraphs have a complete Pk-split. This leads to a minimax result on the optimal number of edges required and a polynomial algorithm that provides an optimal augmentation.
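A minimal sketch of the splitting operation the thesis builds on, with size-two edges represented as frozensets; the data structures and names are illustrative, not taken from the thesis:

```python
def split_off(edges, s, u, v):
    """Return a new edge list with edges {s,u} and {s,v} replaced by the edge {u,v}."""
    edges = list(edges)
    edges.remove(frozenset({s, u}))
    edges.remove(frozenset({s, v}))
    edges.append(frozenset({u, v}))
    return edges

def degree(edges, x):
    """Number of edges containing vertex x."""
    return sum(1 for e in edges if x in e)

E = [frozenset({'s', 'a'}), frozenset({'s', 'b'}), frozenset({'a', 'b'})]
E2 = split_off(E, 's', 'a', 'b')
```

Note that the degree of s drops by two while the degrees of u and v are preserved, which is why repeated splits can remove s entirely without disturbing connectivity requirements inside V.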
Abstract:
In this paper, the stability of one-step-ahead predictive controllers based on non-linear models is established. It is shown that, under conditions which can be fulfilled by most industrial plants, the closed-loop system is robustly stable in the presence of plant uncertainties and input–output constraints. There is no requirement that the plant should be open-loop stable, and the analysis is valid for general forms of non-linear system representation, including the case when the problem is constraint-free. The effectiveness of controllers designed according to the algorithm analyzed in this paper is demonstrated on a recognized benchmark problem and on a simulation of a continuous stirred-tank reactor (CSTR). In both examples a radial basis function neural network is employed as the non-linear system model.
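The abstract names a radial basis function (RBF) neural network as the non-linear model. A minimal sketch of the kind of one-step-ahead prediction such a network produces is below; the centres, width, and weights are invented for illustration:

```python
import numpy as np

def rbf_predict(x, centres, width, weights):
    """One-step-ahead prediction y_hat = sum_i w_i * exp(-||x - c_i||^2 / (2 width^2)),
    where x is the regressor of past inputs and outputs."""
    phi = np.exp(-np.sum((centres - x) ** 2, axis=1) / (2.0 * width ** 2))
    return float(weights @ phi)

# two Gaussian basis functions in a 2-D regressor space (illustrative values)
centres = np.array([[0.0, 0.0], [1.0, 1.0]])
weights = np.array([0.5, -0.2])
y_next = rbf_predict(np.array([0.0, 0.0]), centres, 1.0, weights)
```

In a predictive-control loop, a model of this form supplies the one-step-ahead output estimate that the controller optimises against, subject to the input-output constraints.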
Abstract:
The Water Framework Directive has caused a paradigm shift towards the integrated management of recreational water quality through the development of drainage basin-wide programmes of measures. This has increased the need for a cost-effective diagnostic tool capable of accurately predicting riverine faecal indicator organism (FIO) concentrations. This paper outlines the application of models developed to fulfil this need, which represent the first transferable generic FIO models to be developed for the UK to incorporate direct measures of key FIO sources (namely human and livestock population data) as predictor variables. We apply a recently developed transfer methodology, which enables the quantification of geometric mean presumptive faecal coliform and presumptive intestinal enterococci concentrations for base- and high-flow conditions during the summer bathing season in unmonitored UK watercourses, to predict FIO concentrations in the Humber river basin district. Because the FIO models incorporate explanatory variables that allow the effects of policy measures influencing livestock stocking rates to be assessed, we carry out an empirical analysis of the differential effects of seven land use management and policy instruments (fiscal constraint, production constraint, cost intervention, area intervention, demand-side constraint, input constraint, and micro-level land use management), all of which can be used to reduce riverine FIO concentrations. This research provides insights into FIO source apportionment, explores a selection of pollution remediation strategies, and examines the spatial differentiation of land use policies which could be implemented to deliver river quality improvements. All of the policy tools we model reduce FIO concentrations in rivers, but our research suggests that the installation of streamside fencing in intensive milk-producing areas may be the single most effective land management strategy to reduce riverine microbial pollution.
Abstract:
The differential phase (ΦDP) measured by polarimetric radars is recognized to be a very good indicator of the path-integrated attenuation by rain. Moreover, if a linear relationship is assumed between the specific differential phase (KDP) and the specific attenuation (AH) and specific differential attenuation (ADP), then attenuation can easily be corrected. The coefficients of proportionality, γH and γDP, are, however, known to depend in rain upon drop temperature, drop shapes, drop size distribution, and the presence of large drops causing Mie scattering. In this paper, the authors extensively apply a physically based method, often referred to as the “Smyth and Illingworth constraint,” which uses the constraint that the value of the differential reflectivity ZDR on the far side of the storm should be low in order to retrieve the γDP coefficient. More than 30 convective episodes observed by the French operational C-band polarimetric Trappes radar during two summers (2005 and 2006) are used to document the variability of γDP with respect to the intrinsic three-dimensional characteristics of the attenuating cells. The Smyth and Illingworth constraint could be applied to only 20% of all attenuated rays of the 2-yr dataset, so it cannot be considered the unique solution for attenuation correction in an operational setting, but it is useful for characterizing the properties of the strongly attenuating cells. The range of variation of γDP is shown to be extremely large, with minimal, maximal, and mean values equal to 0.01, 0.11, and 0.025 dB °−1, respectively. Coefficient γDP appears to be almost linearly correlated with the horizontal reflectivity (ZH), differential reflectivity (ZDR), specific differential phase (KDP), and correlation coefficient (ρHV) of the attenuating cells. The temperature effect is negligible with respect to that of the microphysical properties of the attenuating cells.
Unusually large values of γDP, above 0.06 dB °−1, often referred to as “hot spots,” are reported for 15%—a nonnegligible figure—of the rays presenting a significant total differential phase shift (ΔϕDP > 30°). The corresponding strongly attenuating cells are shown to have extremely high ZDR (above 4 dB) and ZH (above 55 dBZ), very low ρHV (below 0.94), and high KDP (above 4° km−1). Analysis of 4 yr of observed raindrop spectra does not reproduce such low values of ρHV, suggesting that (wet) ice is likely to be present in the precipitation medium and responsible for the attenuation and high phase shifts. Furthermore, if melting ice is responsible for the high phase shifts, this suggests that KDP may not be uniquely related to rainfall rate but can result from the presence of wet ice. This hypothesis is supported by the analysis of the vertical profiles of horizontal reflectivity and the values of conventional probability of hail indexes.
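The linear correction the abstract assumes can be sketched as follows: with AH = γH·KDP and ADP = γDP·KDP, integrating along the ray makes the path attenuation proportional to ΦDP, so measured ZH and ZDR are corrected additively. γDP below is the mean value quoted in the abstract; γH is an illustrative placeholder:

```python
GAMMA_H = 0.08    # dB per degree of ΦDP (illustrative value, not from the abstract)
GAMMA_DP = 0.025  # dB per degree of ΦDP (mean γDP reported in the abstract)

def correct_attenuation(zh_meas, zdr_meas, phidp):
    """Return (ZH, ZDR) corrected for rain attenuation along the ray,
    assuming the linear A = γ·KDP relationship, i.e. PIA = γ·ΦDP."""
    zh = zh_meas + GAMMA_H * phidp     # add back the attenuation in dB
    zdr = zdr_meas + GAMMA_DP * phidp  # add back the differential attenuation in dB
    return zh, zdr

zh_c, zdr_c = correct_attenuation(38.0, 0.5, 40.0)  # a ray with ΦDP = 40°
```

The "hot spot" finding above is precisely a warning about this sketch: when wet ice contributes to ΦDP, a fixed γDP underestimates or misattributes the attenuation.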
Abstract:
A novel Neuropredictive Teleoperation (NPT) scheme is presented. The design results from two key ideas: the exploitation of the measured or estimated neural input to the human arm, or its electromyograph (EMG), as the system input, and the employment of a predictor of the arm movement, based on this neural signal and an arm model, to compensate for time delays in the system. Although a multitude of such models, as well as measuring devices for the neural signals and the EMG, have been proposed, current telemanipulator research has considered only highly simplified arm models. In the present design, the bilateral constraint that the master and slave are simultaneously compliant to each other's state (equal positions and forces) is abandoned, thus obtaining a simple-to-analyze succession of only locally controlled modules, and robustness to time delays of up to 500 ms. The proposed designs were inspired by well-established physiological evidence that the brain, rather than controlling the movement on-line, programs the arm with an action plan of a complete movement, which is then executed largely in open loop, regulated only by local reflex loops. As a model of the human arm, the well-established Stark model is employed, whose mathematical representation is modified to make it suitable for an engineering application. The proposed scheme is, however, valid for any arm model. BIBO-stability and passivity results for a variety of local control laws are reported. Simulation results and comparisons with traditional designs also highlight the advantages of the proposed design.
Abstract:
The blind minimum output energy (MOE) adaptive detector for code division multiple access (CDMA) signals requires exact knowledge of the received spreading code of the desired user. This requirement can be relaxed by constraining the so-called surplus energy of the adaptive tap-weight vector, but the ideal constraint value is not easily obtained in practice. An algorithm is proposed to adaptively track this value and hence to approach the best possible performance for this class of CDMA detector.
Abstract:
Three potential explanations of past reforms of the Common Agricultural Policy (CAP) can be identified in the literature: a budget constraint, pressure from General Agreement on Tariffs and Trade/World Trade Organization (GATT/WTO) negotiations or commitments, and a paradigm shift emphasising agriculture’s provision of public goods. This discussion of the driving forces of CAP reform links to broader theoretical questions on the role of budgetary politics, globalisation of public policy and paradigm shift in explaining policy change. In this article, the Health Check reforms of 2007/2008 are assessed. The reforms were probably more ambitious than first supposed, although the package agreed by ministers in November 2008 was watered down. We conclude that the Health Check was not primarily driven by budget concerns or by the supposed switch from the state-assisted to the multifunctional policy paradigm. The European Commission’s wish to adopt an offensive negotiating stance in the closing phases of the Doha Round was a more likely explanatory factor. The shape and purpose of the CAP post-2013 remains contested, with divergent views among the Member States.
Abstract:
This paper reports on a survey of 17 value management exercises recently carried out within the UK construction industry. Twelve leading value management practitioners were asked to describe an example of a value management study which ‘worked well’ and one which ‘did not work well’. They were further asked to explain the underlying factors which they considered had influenced the eventual outcome of the value management study. The subsequent analysis of the interview transcripts reveals six recurring themes which were held to have had a significant influence: expectations, implementation, participation, power, time constraint and uncertainty. Whilst caution is necessary in extracting the themes from their individual contexts, they do provide a valuable insight into the factors which influence the outcome of value management studies.
Abstract:
Strokes affect thousands of people worldwide, leaving sufferers with severe disabilities affecting their daily activities. In recent years, new rehabilitation techniques have emerged, such as constraint-induced therapy, biofeedback therapy and robot-aided therapy. In particular, robotic techniques allow precise recording of movements and application of forces to the affected limb, making them a valuable tool for motor rehabilitation. In addition, robot-aided therapy can utilise visual cues conveyed on a computer screen to convert repetitive movement practice into an engaging task such as a game. Visual cues can also be used to control the information sent to the patient about exercise performance and to potentially address psychosomatic variables influencing therapy. This paper overviews the current state of the art in upper limb robot-mediated therapy, with a focal point on the technical requirements of robotic therapy devices, leading to the development of upper limb rehabilitation techniques that facilitate reach-to-touch, fine motor control and whole-arm movements, and promote rehabilitation beyond the hospital stay. The reviewed literature suggests that while there is evidence supporting the use of this technology to reduce functional impairment, besides the technological push, the challenge ahead lies in the provision of effective assessment of outcome and of modalities that have a stronger impact in transferring functional gains into functional independence.
Abstract:
Identifying a periodic time-series model from environmental records, without imposing positivity of the growth rate, does not necessarily respect the time order of the data observations. Consequently, subsequent observations sampled in the environmental archive can be inverted on the time axis, resulting in a non-physical signal model. In this paper, an optimization technique with linear constraints on the signal model parameters is proposed that prevents time inversions. The activation conditions for this constrained optimization are based upon the physical constraint on the growth rate, namely, that it cannot take values smaller than zero. The actual constraints are defined for polynomials and first-order splines as basis functions for the nonlinear contribution in the distance-time relationship. The method is compared with an existing method that eliminates the time inversions, and its noise sensitivity is tested by means of Monte Carlo simulations. Finally, the usefulness of the method is demonstrated on measurements of the vessel density in a mangrove tree, Rhizophora mucronata, and measurements of Mg/Ca ratios in a bivalve, Mytilus trossulus.
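The paper's method imposes linear constraints on the signal model parameters. As a simpler stand-in for the same physical constraint (growth rate ≥ 0, hence no time inversions), the pool-adjacent-violators algorithm below fits a non-decreasing distance-time sequence to noisy observations:

```python
def isotonic_fit(y):
    """Least-squares fit of a non-decreasing sequence to y
    (pool-adjacent-violators algorithm, uniform weights)."""
    blocks = [[v, 1.0] for v in y]          # each block: [level, weight]
    i = 0
    while i < len(blocks) - 1:
        if blocks[i][0] > blocks[i + 1][0]:  # a time inversion: pool the two blocks
            w = blocks[i][1] + blocks[i + 1][1]
            lvl = (blocks[i][0] * blocks[i][1]
                   + blocks[i + 1][0] * blocks[i + 1][1]) / w
            blocks[i] = [lvl, w]
            del blocks[i + 1]
            i = max(i - 1, 0)                # pooling may expose an earlier violation
        else:
            i += 1
    out = []
    for lvl, w in blocks:
        out.extend([lvl] * int(w))
    return out

fitted = isotonic_fit([0.0, 1.0, 0.5, 2.0])  # the inverted pair 1.0, 0.5 gets pooled
```

This is not the paper's polynomial/spline formulation, but it illustrates the effect of the constraint: wherever the raw data would run backwards in time, the fit is flattened rather than inverted.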
Abstract:
Background. With diffusion-tensor imaging (DTi) it is possible to estimate the structural characteristics of fiber bundles in vivo. This study used DTi to infer damage to the corticospinal tract (CST) and relates this parameter to (a) the level of residual motor ability at least 1 year poststroke and (b) the outcome of intensive motor rehabilitation with constraint-induced movement therapy (CIMT). Objective. To explore the role of CST damage in recovery and CIMT efficacy. Methods. Ten patients with low-functioning hemiparesis were scanned and tested at baseline, before and after CIMT. Lesion overlap with the CST was indexed as reduced anisotropy compared with a CST variability map derived from 26 controls. Residual motor ability was measured through the Wolf Motor Function Test (WMFT) and the Motor Activity Log (MAL) acquired at baseline. CIMT benefit was assessed through the pre/post-treatment comparison of WMFT and MAL performance. Results. Lesion overlap with the CST correlated with residual motor ability at baseline, with greater deficits observed in patients with more extended CST damage. Infarct volume showed no systematic association with residual motor ability. CIMT led to significant improvements in motor function, but outcome was not associated with the extent of CST damage or infarct volume. Conclusion. The study gives in vivo support for the proposition that structural CST damage, not infarct volume, is a major predictor of residual functional ability in the chronic state. The results provide initial evidence for positive effects of CIMT in patients with varying, including more severe, CST damage.
Abstract:
Background: Poor diet quality is a major public health concern that has prompted governments to introduce a range of measures to promote healthy eating. For these measures to be effective, they should target segments of the population with messages relevant to their needs, aspirations and circumstances. The present study investigates the extent to which attitudes and constraints influence healthy eating, as well as how these vary by demographic characteristics of the UK population. It further considers how such information may be used in segmented diet and health policy messages. Methods: A survey of 250 UK adults elicited information on conformity to dietary guidelines, attitudes towards healthy eating, constraints to healthy eating and demographic characteristics. Ordered logit regressions were estimated to determine the importance of attitudes and constraints in determining how closely respondents follow healthy eating guidelines. Further regressions explored the demographic characteristics associated with the attitudinal and constraint variables. Results: People who attach high importance to their own health and appearance eat more healthily than those who do not. Risk-averse people and those able to resist temptation also eat more healthily. Shortage of time is considered an important barrier to healthy eating, although the cost of a healthy diet is not. These variables are associated with a number of demographic characteristics of the population; for example, young adults are more motivated to eat healthily by concerns over their appearance than their health. Conclusions: The approach employed in the present study could be used to inform future healthy eating campaigns. For example, messages to encourage the young to eat more healthily could focus on the impact of diets on their appearance rather than health.
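A minimal sketch of the ordered logit model underlying the regressions: with a linear index xβ and ordered cutpoints, the probability of each ordered category (e.g. low/medium/high adherence to the guidelines) is a difference of adjacent logistic CDFs. The coefficients and cutpoints below are invented for illustration:

```python
import math

def logistic(z):
    """Logistic CDF."""
    return 1.0 / (1.0 + math.exp(-z))

def ordered_logit_probs(xb, cutpoints):
    """Category probabilities P(y = j) for an ordered logit:
    P(y <= j) = logistic(tau_j - xb), with the last category absorbing the rest."""
    cdf = [logistic(t - xb) for t in cutpoints] + [1.0]
    return [cdf[0]] + [cdf[j] - cdf[j - 1] for j in range(1, len(cdf))]

# hypothetical respondent with index xβ = 0.4 and cutpoints τ1 = -1, τ2 = 1
p = ordered_logit_probs(xb=0.4, cutpoints=[-1.0, 1.0])
```

In the study's setting, the index xβ would collect the attitudinal, constraint, and demographic covariates, and the fitted cutpoints separate the ordered levels of conformity to dietary guidelines.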
Abstract:
Water vapour modulates energy flows in Earth's climate system through the transfer of latent heat by evaporation and condensation and by modifying the flows of radiative energy in both the longwave and shortwave portions of the electromagnetic spectrum. This article summarizes the role of water vapour in Earth's energy flows, with particular emphasis on (1) the powerful thermodynamic constraint of the Clausius-Clapeyron equation, (2) dynamical controls on humidity above the boundary layer (the free troposphere), (3) uncertainty in continuum absorption in the relatively transparent "window" regions of the radiative spectrum and (4) implications for changes in the atmospheric hydrological cycle.
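The Clausius-Clapeyron constraint in (1) can be made concrete with Bolton's (1980) empirical formula for saturation vapour pressure over water, under which the water-holding capacity of air grows by roughly 6-7% per kelvin near surface temperatures:

```python
import math

def saturation_vapour_pressure(t_kelvin):
    """Saturation vapour pressure over water in Pa (Bolton 1980 approximation:
    e_s = 611.2 * exp(17.67 * T_c / (T_c + 243.5)), T_c in degrees Celsius)."""
    t_c = t_kelvin - 273.15
    return 611.2 * math.exp(17.67 * t_c / (t_c + 243.5))

# fractional growth of e_s for a 1 K warming near typical surface temperatures
growth = saturation_vapour_pressure(281.0) / saturation_vapour_pressure(280.0)
```

This near-exponential growth is the thermodynamic constraint the article refers to: warming the atmosphere by one kelvin raises its saturation vapour pressure, and hence its potential water vapour content, by several percent.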
Abstract:
During the Last Glacial Maximum (LGM, ∼21,000 years ago) the cold climate was strongly tied to a low atmospheric CO2 concentration (∼190 ppm). Although it is generally assumed that this low CO2 was due to an expansion of the oceanic carbon reservoir, simulating the glacial level has remained a challenge, especially with the additional δ13C constraint. Indeed, the LGM carbon cycle was also characterized by a modern-like δ13C in the atmosphere and a higher surface-to-deep Atlantic δ13C gradient, indicating probable changes in the thermohaline circulation. Here we show, with a model of intermediate complexity, that adding three oceanic mechanisms, brine-induced stratification, stratification-dependent diffusion and iron fertilization, to the standard glacial simulation (which includes sea level drop, temperature change, carbonate compensation and terrestrial carbon release) decreases CO2 down to the glacial value of ∼190 ppm and simultaneously matches glacial atmospheric and oceanic δ13C inferred from proxy data. LGM CO2 and δ13C can at last be successfully reconciled.