19 results for Four body problem

in CentAUR: Central Archive at the University of Reading - UK


Relevance: 100.00%

Abstract:

The formulation of four-dimensional variational data assimilation allows the incorporation of constraints into the cost function which need only be weakly satisfied. In this paper we investigate the value of imposing conservation properties as weak constraints. Using the example of the two-body problem of celestial mechanics we compare weak constraints based on conservation laws with a constraint on the background state. We show how the imposition of conservation-based weak constraints changes the nature of the gradient equation. Assimilation experiments demonstrate how this can add extra information to the assimilation process, even when the underlying numerical model is conserving.
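As a sketch of the kind of cost function meant here (a generic weak-constraint form, not necessarily the paper's exact notation), a conserved quantity E of the two-body system, such as total energy or angular momentum, can be appended to the usual background and observation terms as a penalty:

```latex
J(\mathbf{x}_0) =
  (\mathbf{x}_0 - \mathbf{x}_b)^{\mathrm{T}} \mathbf{B}^{-1} (\mathbf{x}_0 - \mathbf{x}_b)
  + \sum_{i=0}^{N} \bigl(\mathbf{y}_i - H_i(\mathbf{x}_i)\bigr)^{\mathrm{T}} \mathbf{R}_i^{-1}
    \bigl(\mathbf{y}_i - H_i(\mathbf{x}_i)\bigr)
  + \lambda \bigl(E(\mathbf{x}_0) - E_{\mathrm{ref}}\bigr)^2
```

The weight λ sets how strongly conservation is enforced; differentiating the penalty contributes a term proportional to the gradient of E to the gradient of J, which is the sense in which a conservation-based weak constraint changes the nature of the gradient equation.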

Relevance: 90.00%

Abstract:

The perspex machine arose from the unification of projective geometry with the Turing machine. It uses a total arithmetic, called transreal arithmetic, that contains real arithmetic and allows division by zero. Transreal arithmetic is redefined here. The new arithmetic has both a positive and a negative infinity which lie at the extremes of the number line, and a number nullity that lies off the number line. We prove that nullity, 0/0, is a number. Hence a number may have one of four signs: negative, zero, positive, or nullity. It is, therefore, impossible to encode the sign of a number in one bit, as floating-point arithmetic attempts to do, resulting in the difficulty of having both positive and negative zeros and NaNs. Transrational arithmetic is consistent with Cantor arithmetic. In an extension to real arithmetic, the product of zero, an infinity, or nullity with its reciprocal is nullity, not unity. This avoids the usual contradictions that follow from allowing division by zero. Transreal arithmetic has a fixed algebraic structure and does not admit options as IEEE floating-point arithmetic does. Most significantly, nullity has a simple semantics that is related to zero. Zero means "no value" and nullity means "no information." We argue that nullity is as useful to a manufactured computer as zero is to a human computer. The perspex machine is intended to offer one solution to the mind-body problem by showing how the computable aspects of mind and, perhaps, the whole of mind relate to the geometrical aspects of body and, perhaps, the whole of body. We review some of Turing's writings and show that he held the view that his machine has spatial properties; in particular, that it has the property of being a 7D lattice of compact spaces. Thus, we read Turing as believing that his machine relates computation to geometrical bodies. We simplify the perspex machine by substituting an augmented Euclidean geometry for projective geometry. This leads to a general-linear perspex machine which is very much easier to program than the original perspex machine. We then show how to map the whole of perspex space into a unit cube. This allows us to construct a fractal of perspex machines with the cardinality of a real-numbered line or space. This fractal is the universal perspex machine. It can solve, in unit time, the halting problem for itself and for all perspex machines instantiated in real-numbered space, including all Turing machines. We cite an experiment that has been proposed to test the physical reality of the perspex machine's model of time, but we make no claim that the physical universe works this way or that it has the cardinality of the perspex machine. We leave it that the perspex machine provides an upper bound on the computational properties of physical things, including manufactured computers and biological organisms, that have a cardinality no greater than the real-number line.
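A minimal sketch of how the transreal rules described above might be encoded, assuming IEEE floats for the reals and the two infinities and a sentinel object for nullity; the names trans_div, trans_mul and NULLITY are illustrative and not from the paper:

```python
# Illustrative sketch of transreal division and multiplication (not code from
# the paper). Infinities use IEEE +inf/-inf; nullity is a distinct sentinel.
import math

NULLITY = object()  # "no information" -- lies off the number line

def trans_div(a, b):
    """Division is total: every pair of operands yields a result."""
    if a is NULLITY or b is NULLITY:
        return NULLITY                              # nullity absorbs
    if b == 0.0:
        if a == 0.0:
            return NULLITY                          # 0/0 = nullity
        return math.inf if a > 0 else -math.inf     # x/0 = +inf or -inf by sign of x
    if math.isinf(a) and math.isinf(b):
        return NULLITY                              # inf/inf carries no information
    return a / b

def trans_mul(a, b):
    """Zero, an infinity or nullity times its reciprocal gives nullity, not unity."""
    if a is NULLITY or b is NULLITY:
        return NULLITY
    if (a == 0.0 and math.isinf(b)) or (b == 0.0 and math.isinf(a)):
        return NULLITY                              # 0 * inf = nullity
    return a * b
```

For example, trans_div(0.0, 0.0) returns the nullity sentinel rather than raising an exception, illustrating why the sign of a transreal number cannot be encoded in a single bit.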

Relevance: 30.00%

Abstract:

Six parameters uniquely describe the orbit of a body about the Sun. Given these parameters, it is possible to make predictions of the body's position by solving its equation of motion. The parameters cannot be directly measured, so they must be inferred indirectly by an inversion method which uses measurements of other quantities in combination with the equation of motion. Inverse techniques are valuable tools in many applications where only noisy, incomplete, and indirect observations are available for estimating parameter values. The methodology of the approach is introduced and the Kepler problem is used as a real-world example. (C) 2003 American Association of Physics Teachers.
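A toy illustration of this inversion idea (not the article's code): synthetic, noisy radial distances are generated from an elliptical orbit, and the orbital parameters are recovered by nonlinear least squares. The parameter values and the use of scipy.optimize.least_squares are illustrative assumptions.

```python
# Toy inversion example: recover orbital parameters from noisy, indirect data.
import numpy as np
from scipy.optimize import least_squares

def kepler_E(M, e, n_iter=50):
    """Solve Kepler's equation M = E - e*sin(E) by Newton iteration."""
    E = M.copy()
    for _ in range(n_iter):
        E = E - (E - e * np.sin(E) - M) / (1.0 - e * np.cos(E))
    return E

def radial_distance(t, a, e, T):
    """Heliocentric distance r(t) = a*(1 - e*cos E) for an elliptical orbit."""
    M = 2.0 * np.pi * t / T
    return a * (1.0 - e * np.cos(kepler_E(M, e)))

# Synthetic "observations": a made-up true orbit plus measurement noise.
rng = np.random.default_rng(0)
t_obs = np.linspace(0.0, 2.0, 40)                       # years
true = dict(a=1.5, e=0.3, T=1.84)                       # AU, -, years (illustrative)
r_obs = radial_distance(t_obs, **true) + 0.01 * rng.standard_normal(t_obs.size)

# Invert: find the parameters whose forward model best matches the data.
def residuals(p):
    a, e, T = p
    return radial_distance(t_obs, a, e, T) - r_obs

fit = least_squares(residuals, x0=[1.0, 0.1, 1.5],
                    bounds=([0.1, 0.0, 0.5], [10.0, 0.9, 10.0]))
print("estimated a, e, T:", fit.x)
```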

Relevance: 30.00%

Abstract:

The first mycetome was discovered more than 340 yr ago in the human louse. Despite the remarkable biology and medical and social importance of human lice, their primary endosymbiont has eluded identification and characterization. Here, we report the host-symbiont interaction of the mycetomic bacterium of the head louse Pediculus humanus capitis and the body louse P. h. humanus. The endosymbiont represents a new bacterial lineage in the γ-Proteobacteria. Its closest sequenced relative is Arsenophonus nasoniae, from which it differs by more than 10%. A. nasoniae is a male-killing endosymbiont of jewel wasps. Using microdissection and multiphoton confocal microscopy, we show the remarkable interaction of this bacterium with its host. This endosymbiont is unique because it occupies sequentially four different mycetomes during the development of its host, undergoes three cycles of proliferation, changes in length from 2–4 µm to more than 100 µm, and has two extracellular migrations, during one of which the endosymbionts have to outrun their host’s immune cells. The host and its symbiont have evolved one of the most complex interactions: two provisional or transitory mycetomes, a main mycetome and a paired filial mycetome. Despite the close relatedness of body and head lice, differences are present in the mycetomic provisioning and the immunological response.—Perotti, M. A., Allen, J. M., Reed, D. L., Braig, H. R. Host-symbiont interactions of the primary endosymbiont of human head and body lice.

Relevance: 30.00%

Abstract:

Background: Soy isoflavones show structural and functional similarities to estradiol. Available data indicate that estradiol and estradiol-like components may interact with gut "satiety hormones" such as peptide YY (PYY) and ghrelin, and thus influence body weight. In a randomized, double-blind, placebo-controlled, cross-over trial with 34 healthy postmenopausal women (59 ± 6 years, BMI: 24.7 ± 2.8 kg/m2), isoflavone-enriched cereal bars (50 mg isoflavones/day; genistein to daidzein ratio 2:1) or non-isoflavone-enriched control bars were consumed for 8 weeks (wash-out period: 8 weeks). Seventeen of the subjects were classified as equol producers. Plasma concentrations of ghrelin and PYY, as well as energy intake and body weight, were measured at baseline and after four and eight weeks of each intervention arm. Results: Body weight increased in both treatment periods (isoflavone: 0.40 ± 0.94 kg, P < 0.001; placebo: 0.66 ± 0.87 kg, P = 0.018), with no significant difference between treatments. No significant differences in energy intake were observed (P = 0.634). PYY significantly increased during isoflavone treatment (51 ± 2 pmol/L vs. 55 ± 2 pmol/L), but not during placebo (52 ± 3 pmol/L vs. 50 ± 2 pmol/L) (P = 0.010 for treatment differences, independent of equol production). Baseline plasma ghrelin was significantly lower in equol producers (110 ± 16 pmol/L) than in equol non-producers (162 ± 17 pmol/L; P = 0.025). Conclusion: Soy isoflavone supplementation for eight weeks did not significantly reduce energy intake or body weight, even though plasma PYY increased during isoflavone treatment. Ghrelin remained unaffected by isoflavone treatment. A larger and more rigorous appetite experiment might detect smaller differences in energy intake after isoflavone consumption. However, the results of the present study do not indicate that increased PYY has a major role in the regulation of body weight, at least in healthy postmenopausal women.

Relevance: 30.00%

Abstract:

Background & aims: This study investigated the influence of four commercial lipid emulsions, Ivelip, ClinOleic, Omegaven and SMOFlipid®, on lipid body formation, fatty acid composition and eicosanoid production by cultured human peripheral blood polymorphonuclear cells (PMN) and mononuclear cells (PBMC). Methods: PMN and PBMC were exposed to emulsions at concentrations ranging from 0.01 to 0.04%. Lipid body formation was assessed by microscopy, fatty acid composition by gas chromatography and eicosanoids by ELISA. Results: Stimulation of inflammatory cells and exposure to lipid emulsions promoted the formation of lipid bodies, but there did not appear to be differential effects of the emulsions tested. In contrast, there were differential effects of lipid emulsions on eicosanoid formation, particularly with regard to LTB4 production by PMN. Omegaven dramatically increased production of eicosanoids compared with the other emulsions in a dose-dependent manner. This effect was associated with a significantly higher level of lipid peroxides in the supernatants of cells exposed to Omegaven. Conclusions: Stimulation of inflammatory cells and exposure to lipid emulsions promotes lipid body formation and eicosanoid production, although the differential effects of different emulsions appear to be largely due to lipid peroxidation of unsaturated fatty acids in some emulsions in this in vitro system. (C) 2009 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.

Relevance: 30.00%

Abstract:

The well-studied link between psychotic traits and creativity is a subject of much debate. The present study investigated the extent to which schizotypic personality traits - as measured by the O-LIFE (Oxford-Liverpool Inventory of Feelings and Experiences) - equip healthy individuals to engage as groups in everyday tasks. From a sample of 69 students, eight groups of four participants - comprising high-, medium-, or low-schizotypy individuals - were assembled to work as a team to complete a creative problem-solving task. Predictably, high scorers on the O-LIFE formulated a greater number of strategies to solve the task, indicative of creative divergent thinking. However, for task success (as measured by time taken to complete the problem) an inverted U-shaped pattern emerged, whereby high- and low-schizotypy groups were consistently faster than medium-schizotypy groups. Intriguing data emerged concerning leadership within the groups, and other tangential findings relating to anxiety, competition and motivation were explored. These findings challenge the traditional cliché that psychotic personality traits are linearly related to creative performance, and suggest that the nature of the problem determines which thinking styles are optimally equipped to solve it. (C) 2009 Elsevier Ltd. All rights reserved.

Relevance: 30.00%

Abstract:

This paper tackles the problem of computing smooth, optimal trajectories on the Euclidean group of motions SE(3). The problem is formulated as an optimal control problem where the cost function to be minimized is equal to the integral of the classical curvature squared. This problem is analogous to the elastic problem from differential geometry, and thus the resulting rigid body motions will trace elastic curves. An application of the Maximum Principle to this optimal control problem shifts the emphasis to the language of symplectic geometry and to the associated Hamiltonian formalism. This results in a system of first-order differential equations that yield coordinate-free necessary conditions for optimality for these curves. From these necessary conditions we identify an integrable case, and this particular set of curves is solved analytically. These analytic solutions provide interpolating curves between an initial given position and orientation and a desired position and orientation that would be useful in motion planning for systems such as robotic manipulators and autonomous oriented vehicles.
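In generic terms (a standard statement of this kind of problem, not necessarily the paper's notation), the task is to find a curve g(t) in SE(3) joining the given start and goal poses while minimizing the integral of squared curvature:

```latex
\min_{g(\cdot)} \; \int_0^T \kappa(t)^2 \,\mathrm{d}t
\qquad \text{subject to} \qquad
\dot{g}(t) = g(t)\,\xi(t), \quad g(t) \in SE(3), \quad
g(0) = g_0, \;\; g(T) = g_T
```

Here ξ(t) ∈ se(3) is the body-frame velocity and κ is the curvature of the resulting motion; applying the Maximum Principle to this constrained problem yields the Hamiltonian necessary conditions described in the abstract.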

Relevance: 30.00%

Abstract:

Research in the last four decades has brought a considerable advance in our understanding of how the brain synthesizes information arising from different sensory modalities. Indeed, many cortical and subcortical areas, beyond those traditionally considered to be ‘associative,’ have been shown to be involved in multisensory interaction and integration (Ghazanfar and Schroeder 2006). Visuo-tactile interaction is of particular interest, because of the prominent role played by vision in guiding our actions and anticipating their tactile consequences in everyday life. In this chapter, we focus on the functional role that visuo-tactile processing may play in driving two types of body-object interactions: avoidance and approach. We will first review some basic features of visuo-tactile interactions, as revealed by electrophysiological studies in monkeys. These will prove to be relevant for interpreting the subsequent evidence arising from human studies. A crucial point that will be stressed is that these visuo-tactile mechanisms have not only sensory, but also motor-related activity that qualifies them as multisensory-motor interfaces. Evidence will then be presented for the existence of functionally homologous processing in the human brain, both from neuropsychological research in brain-damaged patients and in healthy participants. The final part of the chapter will focus on some recent studies in humans showing that the human motor system is provided with a multisensory interface that allows for continuous monitoring of the space near the body (i.e., peripersonal space). We further demonstrate that multisensory processing can be modulated on-line as a consequence of interacting with objects. This indicates that, far from being passive, the monitoring of peripersonal space is an active process subserving actions between our body and objects located in the space around us.

Relevance: 30.00%

Abstract:

Numerical weather prediction can be regarded as an initial value problem whereby the governing atmospheric equations are integrated forward from fully determined initial values of the meteorological parameters. However, in spite of the considerable improvements of the observing systems in recent years, the initial values are known only incompletely and inaccurately and one of the major tasks of any forecasting centre is to determine the best possible initial state from available observations.

Relevance: 30.00%

Abstract:

With the introduction of new observing systems based on asynoptic observations, the analysis problem has changed in character. In the near future we may expect that a considerable part of meteorological observations will be unevenly distributed in four dimensions, i.e. three dimensions in space and one in time. The term analysis, or objective analysis in meteorology, means the process of interpolating meteorological observations from unevenly distributed locations to a network of regularly spaced grid points. Necessitated by the requirement of numerical weather prediction models to solve the governing finite difference equations on such a grid lattice, the objective analysis is a three-dimensional (or mostly two-dimensional) interpolation technique. As a consequence of the structure of the conventional synoptic network with separated data-sparse and data-dense areas, four-dimensional analysis has in fact been intensively used for many years. Weather services have thus based their analysis not only on synoptic data at the time of the analysis and climatology, but also on the fields predicted from the previous observation hour and valid at the time of the analysis. The inclusion of the time dimension in objective analysis will be called four-dimensional data assimilation. From one point of view it seems possible to apply the conventional technique to the new data sources by simply reducing the time interval in the analysis-forecasting cycle. This could in fact be justified also for the conventional observations. We have a fairly good coverage of surface observations 8 times a day, and several upper-air stations are making radiosonde and radiowind observations 4 times a day. If we have a 3-hour step in the analysis-forecasting cycle instead of 12 hours, which is applied most often, we may without any difficulties treat all observations as synoptic. No observation would thus be more than 90 minutes off time, and the observations even during strong transient motion would fall within a horizontal mesh of 500 km × 500 km.
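A minimal sketch of the interpolation step that objective analysis performs, mapping irregularly located observations onto a regular grid. The inverse-distance weighting used here is only one simple scheme, chosen for illustration; it is not the method of the paper, and the station coordinates and values are synthetic.

```python
# Illustration of objective analysis as interpolation to a regular grid.
import numpy as np

def objective_analysis(obs_xy, obs_val, grid_x, grid_y, radius=500.0, power=2.0):
    """Interpolate scattered observations (km coordinates) to a regular grid."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    analysis = np.full(gx.shape, np.nan)
    for i in range(gx.shape[0]):
        for j in range(gx.shape[1]):
            d = np.hypot(obs_xy[:, 0] - gx[i, j], obs_xy[:, 1] - gy[i, j])
            near = d < radius                          # only nearby observations
            if near.any():
                w = 1.0 / np.maximum(d[near], 1e-6) ** power
                analysis[i, j] = np.sum(w * obs_val[near]) / np.sum(w)
    return analysis

# Example: 20 stations scattered over a 2000 km x 2000 km domain.
rng = np.random.default_rng(1)
stations = rng.uniform(0.0, 2000.0, size=(20, 2))
values = 280.0 + 0.01 * stations[:, 0]                 # synthetic temperature field (K)
grid = np.arange(0.0, 2001.0, 500.0)                   # 500 km mesh, as in the text
field = objective_analysis(stations, values, grid, grid)
```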

Relevance: 30.00%

Abstract:

The problem of spurious excitation of gravity waves in the context of four-dimensional data assimilation is investigated using a simple model of balanced dynamics. The model admits a chaotic vortical mode coupled to a comparatively fast gravity wave mode, and can be initialized such that the model evolves on a so-called slow manifold, where the fast motion is suppressed. Identical twin assimilation experiments are performed, comparing the extended and ensemble Kalman filters (EKF and EnKF, respectively). The EKF uses a tangent linear model (TLM) to estimate the evolution of forecast error statistics in time, whereas the EnKF uses the statistics of an ensemble of nonlinear model integrations. Specifically, the case is examined where the true state is balanced, but observation errors project onto all degrees of freedom, including the fast modes. It is shown that the EKF and EnKF will assimilate observations in a balanced way only if certain assumptions hold, and that, outside of ideal cases (i.e., with very frequent observations), dynamical balance can easily be lost in the assimilation. For the EKF, the repeated adjustment of the covariances by the assimilation of observations can easily unbalance the TLM, and destroy the assumptions on which balanced assimilation rests. It is shown that an important factor is the choice of initial forecast error covariance matrix. A balance-constrained EKF is described and compared to the standard EKF, and shown to offer significant improvement for observation frequencies where balance in the standard EKF is lost. The EnKF is advantageous in that balance in the error covariances relies only on a balanced forecast ensemble, and that the analysis step is an ensemble-mean operation. Numerical experiments show that the EnKF may be preferable to the EKF in terms of balance, though its validity is limited by ensemble size. It is also found that overobserving can lead to a more unbalanced forecast ensemble and thus to an unbalanced analysis.
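For reference, a generic, textbook-style sketch of the stochastic EnKF analysis step (not the article's exact implementation), showing how forecast-error statistics come from the ensemble itself rather than from a tangent linear model:

```python
# Sketch of a perturbed-observation EnKF analysis step.
import numpy as np

def enkf_analysis(ensemble, y_obs, H, R, rng):
    """ensemble: (n_state, n_members); y_obs: (n_obs,); H: (n_obs, n_state); R: (n_obs, n_obs)."""
    n_state, n_members = ensemble.shape
    x_mean = ensemble.mean(axis=1, keepdims=True)
    X = ensemble - x_mean                               # ensemble perturbations
    Pf = X @ X.T / (n_members - 1)                      # sample forecast covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)      # Kalman gain
    analysis = np.empty_like(ensemble)
    for m in range(n_members):
        # Each member assimilates an independently perturbed copy of the observations.
        y_pert = y_obs + rng.multivariate_normal(np.zeros(len(y_obs)), R)
        analysis[:, m] = ensemble[:, m] + K @ (y_pert - H @ ensemble[:, m])
    return analysis
```

Because Pf is built from the (nonlinear) forecast ensemble, balance in the analysis error covariances depends only on the ensemble remaining balanced, which is the point contrasted with the EKF's tangent-linear propagation in the abstract.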

Relevance: 30.00%

Abstract:

1 Insects using olfactory stimuli to forage for prey/hosts are proposed to encounter a ‘reliability–detectability problem’, where the usability of a stimulus depends on its reliability as an indicator of herbivore presence and its detectability. 2 We investigated this theory using the responses of female seven-spot ladybirds Coccinella septempunctata (Coleoptera: Coccinellidae) to plant headspace chemicals collected from the peach-potato aphid Myzus persicae and four commercially available Brassica cultivars: Brassica rapa L. cultivar ‘turnip purple top’, Brassica juncea L. cultivar ‘red giant mustard’, Brassica napus L. cultivar ‘Apex’, Brassica napus L. cultivar ‘Courage’, and Arabidopsis thaliana. For each cultivar/species, responses to plants that were undamaged, previously infested by M. persicae and infested with M. persicae were investigated using dual-choice Petri dish bioassays and circular arenas. 3 There was no evidence that ladybirds responded to headspace chemicals from aphids alone. Ladybirds significantly preferred headspace chemicals from B. napus cv. Apex that were undamaged compared with those from plants infested with aphids. For the other four species/cultivars, there was a consistent trend of the predators being recorded more often in the half of the Petri dish containing plant headspace chemicals from previously damaged and infested plants compared with those from undamaged ones. Furthermore, the mean distance ladybirds walked to reach aphid-infested A. thaliana was significantly shorter than to reach undamaged plants. These results suggest that aphid-induced plant chemicals could act as an arrestment or possibly an attractant stimulus to C. septempunctata. However, it is also possible that C. septempunctata could have been responding to aphid products, such as honeydew, transferred to the previously damaged and infested plants. 4 The results provide evidence to support the ‘reliability–detectability’ theory and suggest that the effectiveness of C. septempunctata as a natural enemy of aphids may be strongly affected by which species and cultivar of Brassica are being grown.

Relevance: 30.00%

Abstract:

We extend extreme learning machine (ELM) classifiers to complex Reproducing Kernel Hilbert Spaces (RKHS) where the input/output variables as well as the optimization variables are complex-valued. A new family of classifiers, called complex-valued ELM (CELM), suitable for complex-valued multiple-input–multiple-output processing is introduced. In the proposed method, the associated Lagrangian is computed using induced RKHS kernels, adopting a Wirtinger calculus approach formulated as a constrained optimization problem, similarly to the conventional ELM classifier formulation. When training the CELM, the Karush–Kuhn–Tucker (KKT) theorem is used to solve the dual optimization problem, which consists of simultaneously satisfying the smallest-training-error and smallest-norm-of-output-weights criteria. The proposed formulation also addresses aspects of quaternary classification within a Clifford algebra context. For 2D complex-valued inputs, user-defined complex-coupled hyper-planes divide the classifier input space into four partitions. For 3D complex-valued inputs, the formulation generates three pairs of complex-coupled hyper-planes through orthogonal projections. The six hyper-planes then divide the 3D space into eight partitions. It is shown that the CELM problem formulation is equivalent to solving six real-valued ELM tasks, which are induced by projecting the chosen complex kernel across the different user-defined coordinate planes. A classification example of powdered samples on the basis of their terahertz spectral signatures is used to demonstrate the advantages of the CELM classifiers compared to their SVM counterparts. The proposed classifiers retain the advantages of their ELM counterparts, in that they can perform multiclass classification with lower computational complexity than SVM classifiers. Furthermore, because of their ability to perform classification tasks fast, the proposed formulations are of interest to real-time applications.
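For orientation, a minimal sketch of the basic real-valued ELM classifier that CELM extends: a random hidden layer followed by a least-squares solve for the output weights. This is illustrative only and omits the paper's complex-valued, kernel-based formulation.

```python
# Baseline real-valued ELM classifier sketch (not the paper's CELM).
import numpy as np

class ELMClassifier:
    def __init__(self, n_hidden=100, rng=None):
        self.n_hidden = n_hidden
        self.rng = rng or np.random.default_rng(0)

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)             # random feature map

    def fit(self, X, y):
        """X: (n_samples, n_features); y: integer class labels starting at 0."""
        n_classes = int(y.max()) + 1
        T = np.eye(n_classes)[y]                         # one-hot targets
        self.W = self.rng.standard_normal((X.shape[1], self.n_hidden))
        self.b = self.rng.standard_normal(self.n_hidden)
        H = self._hidden(X)
        self.beta = np.linalg.pinv(H) @ T                # least-squares output weights
        return self

    def predict(self, X):
        return np.argmax(self._hidden(X) @ self.beta, axis=1)
```

The single pseudo-inverse solve for the output weights is what gives ELM-style classifiers their low training cost relative to SVMs, the advantage the abstract says the CELM retains.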

Relevance: 30.00%

Abstract:

This paper presents the mathematical development of a body-centric nonlinear dynamic model of a quadrotor UAV that is suitable for the development of biologically inspired navigation strategies. Analytical approximations are used to find an initial guess of the parameters of the nonlinear model, then parameter estimation methods are used to refine the model parameters using the data obtained from onboard sensors during flight. Due to the unstable nature of the quadrotor model, the identification process is performed with the system in closed-loop control of attitude angles. The obtained model parameters are validated using real unseen experimental data. Based on the identified model, a Linear-Quadratic (LQ) optimal tracker is designed to stabilize the quadrotor and facilitate its translational control by tracking body accelerations. The LQ tracker is tested on an experimental quadrotor UAV and the obtained results are a further means to validate the quality of the estimated model. The unique formulation of the control problem in the body frame makes the controller better suited for bio-inspired navigation and guidance strategies than conventional attitude or position based control systems that can be found in the existing literature.
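A generic LQ synthesis sketch for a linearized model (not the paper's identified quadrotor model), assuming scipy's solve_continuous_are; the double-integrator example stands in for a single translational axis.

```python
# Generic LQR gain computation for a linearized system xdot = A x + B u.
import numpy as np
from scipy.linalg import solve_continuous_are

def lqr_gain(A, B, Q, R):
    """Solve the continuous algebraic Riccati equation and return K = R^{-1} B^T P."""
    P = solve_continuous_are(A, B, Q, R)
    return np.linalg.solve(R, B.T @ P)

# Toy double-integrator standing in for one body-frame translational axis.
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0], [1.0]])
Q = np.diag([10.0, 1.0])                    # penalize position and velocity errors
R = np.array([[0.1]])                       # penalize control effort
K = lqr_gain(A, B, Q, R)                    # state feedback u = -K x
```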