40 results for Simplicity


Relevance:

10.00%

Publisher:

Abstract:

We describe a FORTRAN-90 program that computes scattering t-matrices for a molecule. These can be used in a Low-Energy Electron Diffraction program to solve the molecular structural problem very efficiently. The intramolecular multiple scattering is computed within a Dyson-like approach, using free-space Green propagators in a basis of spherical waves. The advantage of this approach lies in exploiting the chemical identity of the molecule, and in the simplicity of translating and rotating these t-matrices without performing a new multiple-scattering calculation for each configuration. FORTRAN-90 routines for rotating the resulting t-matrices using Wigner matrices are also provided.
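The rotation property that makes the approach efficient can be illustrated in the simplest case. The Python sketch below (a hypothetical single l = 1 block with random entries; the provided routines handle the general case in FORTRAN-90) uses a rotation about the z axis, for which the Wigner D-matrix is diagonal, D_mm' = δ_mm' e^(−imα), so the block transforms as t' = D t D†:

```python
import numpy as np

def rotate_t_block_z(t_block, m_values, alpha):
    # For a rotation by alpha about z, the Wigner D-matrix restricted to one
    # angular momentum l is diagonal: D_mm' = delta_mm' * exp(-i m alpha).
    # The t-matrix block then transforms as t' = D t D^dagger.
    D = np.diag(np.exp(-1j * np.asarray(m_values) * alpha))
    return D @ t_block @ D.conj().T

# hypothetical l = 1 block (m = -1, 0, +1) with random complex entries
rng = np.random.default_rng(0)
t = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
m = [-1, 0, 1]
t_rot = rotate_t_block_z(t, m, 0.3)
# rotating back recovers the original block, with no new scattering calculation
print(np.allclose(rotate_t_block_z(t_rot, m, -0.3), t))  # True
```

General rotations require the full Wigner d^l matrices, but the unitarity argument is the same: each new orientation costs only a matrix similarity transform, not a fresh multiple-scattering calculation.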

Relevance:

10.00%

Publisher:

Abstract:

The climate belongs to the class of non-equilibrium forced and dissipative systems, for which most results of quasi-equilibrium statistical mechanics, including the fluctuation-dissipation theorem, do not apply. In this paper we show for the first time how the Ruelle linear response theory, developed for studying rigorously the impact of perturbations on general observables of non-equilibrium statistical mechanical systems, can be applied with great success to analyze the climatic response to general forcings. The crucial value of the Ruelle theory lies in the fact that it allows one to compute the response of the system in terms of expectation values of explicit and computable functions of the phase space, averaged over the invariant measure of the unperturbed state. We choose as a test bed a classical version of the Lorenz 96 model, which, in spite of its simplicity, has a well-recognized prototypical value: it is a spatially extended one-dimensional model and presents the basic ingredients of the actual atmosphere, such as dissipation, advection and the presence of an external forcing. We recapitulate the main aspects of the general response theory and propose some new general results. We then analyze the frequency dependence of the response of both local and global observables to perturbations having localized as well as global spatial patterns. We derive analytically several properties of the corresponding susceptibilities, such as asymptotic behavior, validity of Kramers-Kronig relations, and sum rules, whose main ingredient is the causality principle. We show that all the coefficients of the leading asymptotic expansions, as well as the integral constraints, can be written as linear functions of parameters that describe the unperturbed properties of the system, such as its average energy.
Some newly obtained empirical closure equations for such parameters allow one to define these properties as explicit functions of the unperturbed forcing parameter alone for a general class of chaotic Lorenz 96 models. We then verify the theoretical predictions from the outputs of the simulations to a high degree of precision. The theory is used to explain differences in the response of local and global observables, to define the intensive properties of the system, which do not depend on the spatial resolution of the Lorenz 96 model, and to generalize the concept of climate sensitivity to all time scales. We also show how to reconstruct the linear Green function, which maps perturbations of general time patterns into changes in the expectation value of the considered observable for finite as well as infinite time. Finally, we propose a simple yet general methodology for studying general climate change problems on virtually any time scale, by resorting only to well-selected simulations and by taking full advantage of ensemble methods. The specific case of the globally averaged surface temperature response to a general pattern of change of the CO2 concentration is discussed. We believe that the proposed approach may constitute a mathematically rigorous and practically very effective way to approach the problems of climate sensitivity, climate prediction, and climate change from a radically new perspective.
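For reference, the Lorenz 96 dynamics used as the test bed take the form dx_i/dt = (x_{i+1} − x_{i−2})x_{i−1} − x_i + F with cyclic indices. The following is a minimal sketch of integrating the unperturbed model (Python; N = 40 sites and F = 8 are illustrative choices for a chaotic regime, not values taken from the paper):

```python
import numpy as np

def lorenz96_rhs(x, F):
    # dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F, with cyclic indices
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def rk4_step(x, dt, F):
    # classical fourth-order Runge-Kutta step
    k1 = lorenz96_rhs(x, F)
    k2 = lorenz96_rhs(x + 0.5 * dt * k1, F)
    k3 = lorenz96_rhs(x + 0.5 * dt * k2, F)
    k4 = lorenz96_rhs(x + dt * k3, F)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

N, F, dt = 40, 8.0, 0.01           # illustrative values
x = F * np.ones(N)
x[0] += 0.01                       # perturb the uniform fixed point
for _ in range(5000):              # integrate to t = 50
    x = rk4_step(x, dt, F)
print(bool(np.isfinite(x).all()))  # trajectory stays bounded: True
```

Response experiments of the kind described in the abstract would then add a small forcing perturbation to `lorenz96_rhs` and average an observable over an ensemble of such trajectories.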

Relevance:

10.00%

Publisher:

Abstract:

In this chapter we described how the inclusion of a model of a human arm, combined with the measurement of its neural input and a predictor, can provide robustness under time delay to a previously proposed teleoperator design. Our trials gave clear indications of the superiority of the NPT scheme over traditional architectures as well as the modified Yokokohji and Yoshikawa architecture. Its fundamental advantages are the time-lead of the slave, its more efficient and more natural-feeling manipulation, and the fact that incorporating an operator arm model leads to more credible stability results. Finally, its simplicity allows local control techniques that are less likely to fail to be employed. However, a significant advantage of the enhanced Yokokohji and Yoshikawa architecture stems from the very fact that it is a conservative modification of current designs. Under large prediction errors, it can provide robustness by directing the master and slave states to their means and, since it relies on the passivity of the mechanical part of the system, it would not confuse the operator. An experimental implementation of the techniques will provide further evidence for the performance of the proposed architectures. The employment of neural networks and fuzzy logic, which will provide an adaptive model of the human arm and robustifying control terms, is scheduled for the near future.

Relevance:

10.00%

Publisher:

Abstract:

Sensitive methods that are currently used to monitor proteolysis by plasmin in milk are limited due to their high cost and lack of standardisation for quality assurance in the various dairy laboratories. In this study, four methods, trinitrobenzene sulphonic acid (TNBS), reverse-phase high-pressure liquid chromatography (RP-HPLC), gel electrophoresis and fluorescamine, were selected to assess their suitability for the detection of proteolysis in milk by plasmin. Commercial UHT milk was incubated with plasmin at 37 °C for one week. Clarification was achieved by isoelectric precipitation (pH 4·6 soluble extracts) or 6% (final concentration) trichloroacetic acid (TCA). The pH 4·6 and 6% TCA soluble extracts of milk showed high correlations (R2 > 0·93) by the TNBS, fluorescamine and RP-HPLC methods, confirming increased proteolysis during storage. For gel electrophoresis, extensive proteolysis was confirmed by the disappearance of α- and β-casein bands on the seventh day, which was most evident at the highest plasmin concentration. This was accompanied by the appearance of α- and β-casein proteolysis products with higher intensities than on previous days, implying that more products had been formed as a result of casein breakdown. The fluorescamine method had a lower detection limit compared with the other methods, whereas gel electrophoresis was the best qualitative method for monitoring β-casein proteolysis products. Although HPLC was the most sensitive, the TNBS method is recommended for routine laboratory analysis on the basis of its accuracy, reliability and simplicity.

Relevance:

10.00%

Publisher:

Abstract:

Pesticide risk indicators provide simple support in the assessment of environmental and health risks from pesticide use, and can therefore inform policies that foster a sustainable interaction of agriculture with the environment. Owing to their relative simplicity, indicators may be particularly useful under conditions of limited data availability and resources, such as in Less Developed Countries (LDCs). However, indicator complexity can vary significantly, in particular between those that rely on an exposure–toxicity ratio (ETR) and those that do not. In addition, pesticide risk indicators are usually developed for Western contexts, which may lead to incorrect estimates in LDCs. This study investigated the appropriateness of seven pesticide risk indicators for use in LDCs, with reference to smallholding agriculture in Colombia. Seven farm-level indicators, of which three relied on an ETR (POCER, EPRIP, PIRI) and four on a non-ETR approach (EIQ, PestScreen, OHRI, Dosemeci et al., 2002), were calculated and then compared by means of the Spearman rank correlation test. Indicators were also compared with respect to key indicator characteristics, i.e. user friendliness and ability to represent the system under study. The comparison of the indicators in terms of total environmental risk suggests that the indicators not relying on an ETR approach cannot be used as a reliable proxy for the more complex, i.e. ETR, indicators. ETR indicators, when user-friendly, show a comparative advantage over non-ETR indicators in combining the need for a relatively simple tool usable in contexts of limited data availability and resources with the need for a reliable estimation of environmental risk. Non-ETR indicators remain useful and accessible tools for discriminating between different pesticides prior to application. Concerning human health, simple algorithms seem more appropriate for assessing risk in LDCs. However, further research on health risk indicators and their validation under LDC conditions is needed.
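As a concrete illustration of the complexity difference, the core of any ETR-based indicator is a simple quotient; the sketch below (Python, with entirely hypothetical concentration and toxicity values, not figures from the study) shows the basic calculation that the POCER/EPRIP/PIRI family builds upon:

```python
def exposure_toxicity_ratio(pec, tox_endpoint):
    # ETR = predicted environmental concentration / toxicity endpoint
    # (e.g. an LC50); values approaching 1 indicate greater concern
    return pec / tox_endpoint

# hypothetical values: PEC of 0.02 mg/L against an aquatic LC50 of 0.5 mg/L
etr = exposure_toxicity_ratio(0.02, 0.5)
print(etr)         # 0.04
print(etr >= 1.0)  # False: below the usual level of concern
```

The complexity of real ETR indicators lies not in this quotient but in estimating the predicted environmental concentration, which is where the data demands that challenge LDC contexts arise.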

Relevance:

10.00%

Publisher:

Abstract:

The discipline now called Solid State Nuclear Track Detection (SSNTD) dates back to 1958 and has its roots in the United Kingdom. Its strength stems chiefly from factors such as its simplicity, small geometry, permanent maintenance of the nuclear record and its diversified applications. A very important field with exciting applications reported recently in conjunction with the nuclear track technique is nanotechnology, which has applications in biology, chemistry, industry, medical care and health, information technology, biotechnology, and metallurgical and chemical technologies. Nanotechnology requires material design followed by the study of quantum effects for the final produced applications in sensors, medical diagnosis and information technology, to name a few. In this article we present a review of past and present applications of SSNTD and suggest ways to apply the technique in nanotechnology, with special reference to the development of nanostructures for applications utilising nanowires, nanofilters and sensors.

Relevance:

10.00%

Publisher:

Abstract:

Gossip (or Epidemic) protocols have emerged as a communication and computation paradigm for large-scale networked systems. These protocols are based on randomised communication, which provides probabilistic guarantees on convergence speed and accuracy. They also provide robustness, scalability, computational and communication efficiency, and high stability under disruption. This work presents a novel Gossip protocol, named the Symmetric Push-Sum Protocol, for the computation of global aggregates (e.g., the average) in decentralised and asynchronous systems. The proposed approach combines the simplicity of the push-based approach with the efficiency of push-pull schemes. Push-pull schemes cannot be directly employed in asynchronous systems, as they require synchronous paired communication operations to guarantee their accuracy. Although push schemes guarantee accuracy even with asynchronous communication, they suffer from slower and less stable convergence. The Symmetric Push-Sum Protocol does not require synchronous communication and achieves a convergence speed similar to that of push-pull schemes, while keeping the accuracy stability of the push scheme. In the experimental analysis, we focus on computing the global average as an important class of node aggregation problems. The results confirm that the proposed method inherits the advantages of both other schemes and outperforms well-known state-of-the-art protocols for decentralised Gossip-based aggregation.
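The push-sum idea underlying the protocol can be sketched in a few lines. The synchronous toy version below (Python; names and parameters are illustrative, and it omits the asynchrony handling that the Symmetric Push-Sum Protocol adds) shows why the scheme converges: the totals of s and w are conserved every round, so every local ratio s_i/w_i is driven toward the true average.

```python
import random

def push_sum_average(values, rounds=100, seed=0):
    # each node i holds a pair (s_i, w_i); its estimate is s_i / w_i
    rng = random.Random(seed)
    n = len(values)
    s, w = list(values), [1.0] * n
    for _ in range(rounds):
        inbox_s, inbox_w = [0.0] * n, [0.0] * n
        for i in range(n):
            j = rng.randrange(n)       # uniformly random peer
            inbox_s[i] += s[i] / 2     # keep half ...
            inbox_w[i] += w[i] / 2
            inbox_s[j] += s[i] / 2     # ... and push half to the peer
            inbox_w[j] += w[i] / 2
        s, w = inbox_s, inbox_w
    return [si / wi for si, wi in zip(s, w)]

est = push_sum_average([10.0, 20.0, 30.0, 40.0])
# every local estimate converges toward the global average 25.0
print(all(abs(e - 25.0) < 0.01 for e in est))
```

In an asynchronous setting the paired send/receive of push-pull breaks down, which is exactly the gap the paper's symmetric variant targets while retaining this mass-conservation property.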

Relevance:

10.00%

Publisher:

Abstract:

This report presents the canonical Hamiltonian formulation of relative satellite motion. The unperturbed Hamiltonian model is shown to be equivalent to the well-known Hill-Clohessy-Wiltshire (HCW) linear formulation. The influence of perturbations from the nonlinear gravitational potential and the oblateness of the Earth (J2 perturbations) is also modelled within the Hamiltonian formulation. The modelling incorporates eccentricity of the reference orbit. The corresponding Hamiltonian vector fields are computed and implemented in Simulink. A numerical method is presented aimed at locating periodic or quasi-periodic relative satellite motion. The numerical method outlined in this paper is applied to the Hamiltonian system. Although the orbits considered here are weakly unstable at best, in the case of eccentricity only, the method finds exact periodic orbits. When other perturbations such as nonlinear gravitational terms are added, drift is significantly reduced, and in the case of the J2 perturbation, with and without the nonlinear gravitational potential term, bounded quasi-periodic solutions are found. Advantages of using Newton's method to search for periodic or quasi-periodic relative satellite motion include simplicity of implementation, repeatability of solutions due to its non-random nature, and fast convergence. Given that the use of bounded or drifting trajectories as control references carries practical difficulties over long-term missions, Principal Component Analysis (PCA) is applied to the quasi-periodic or slowly drifting trajectories to help provide a closed reference trajectory for the implementation of closed-loop control. In order to evaluate the effect of the quality of the model used to generate the periodic reference trajectory, a study involving closed-loop control of a simulated master/follower formation was performed.
The results of the closed-loop control study indicate that the quality of the model employed for generating the reference trajectory used for control purposes has an important influence on the resulting amount of fuel required to track the reference trajectory. The model used to generate LQR controller gains also has an effect on the efficiency of the controller.
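The HCW linear formulation that the unperturbed Hamiltonian model reduces to can be integrated directly. The Python sketch below (illustrative mean motion and initial conditions, not values from the report) integrates the HCW equations ẍ = 3n²x + 2nẏ, ÿ = −2nẋ, z̈ = −n²z, and applies the standard drift-free condition ẏ₀ = −2nx₀ to obtain a closed periodic relative orbit:

```python
import numpy as np

def hcw_rhs(state, n):
    # Hill-Clohessy-Wiltshire linearised relative dynamics
    x, y, z, vx, vy, vz = state
    ax = 3 * n**2 * x + 2 * n * vy
    ay = -2 * n * vx
    az = -n**2 * z
    return np.array([vx, vy, vz, ax, ay, az])

def rk4(state, dt, n):
    # classical fourth-order Runge-Kutta step
    k1 = hcw_rhs(state, n)
    k2 = hcw_rhs(state + 0.5 * dt * k1, n)
    k3 = hcw_rhs(state + 0.5 * dt * k2, n)
    k4 = hcw_rhs(state + dt * k3, n)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

n = 0.001    # mean motion [rad/s], illustrative LEO-like value
x0 = 100.0   # initial radial offset [m]
# vy0 = -2*n*x0 removes the secular along-track drift,
# giving a closed 2:1 relative ellipse
state = np.array([x0, 0.0, 0.0, 0.0, -2 * n * x0, 0.0])
period = 2 * np.pi / n
dt = period / 2000
for _ in range(2000):
    state = rk4(state, dt, n)
# after one full orbit the drift-free state returns near its start
print(np.allclose(state[:3], [x0, 0.0, 0.0], atol=1e-3))  # True
```

Perturbations such as J2 or the nonlinear gravitational terms break this exact closure, which is why the report resorts to a Newton search for periodic or quasi-periodic solutions instead.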

Relevance:

10.00%

Publisher:

Abstract:

By eliminating the short range negative divergence of the Debye–Hückel pair distribution function, but retaining the exponential charge screening known to operate at large interparticle separation, the thermodynamic properties of one-component plasmas of point ions or charged hard spheres can be well represented even in the strong coupling regime. Predicted electrostatic free energies agree within 5% of simulation data for typical Coulomb interactions up to a factor of 10 times the average kinetic energy. Here, this idea is extended to the general case of a uniform ionic mixture, comprising an arbitrary number of components, embedded in a rigid neutralizing background. The new theory is implemented in two ways: (i) by an unambiguous iterative algorithm that requires numerical methods and breaks the symmetry of cross correlation functions; and (ii) by invoking generalized matrix inverses that maintain symmetry and yield completely analytic solutions, but which are not uniquely determined. The extreme computational simplicity of the theory is attractive when considering applications to complex inhomogeneous fluids of charged particles.

Relevance:

10.00%

Publisher:

Abstract:

The criticism of Jack London’s work has been dominated by a reliance upon ideas of the ‘real’, the ‘authentic’ and the ‘archetypal’. One of the figures in London’s work around which these ideas crystallize is that of the ‘wolf’. This article will examine the way the wolf is mobilized both in the criticism of Jack London’s work and in an example of the work: the novel White Fang (1906). This novel, though it has often been read as clearly delimiting and demarcating the realms of nature and culture, can be read conversely as unpicking the deceptive simplicity of such categories, as troubling essentialist notions of identity (human/animal, male/female, white/Indian) and as engaging with the complexity of the journey in which a ‘small animal … becomes human-sexual by crossing the infinite divide that separates life from humanity, the biological from the historical, “nature” from “culture” ’ (Althusser 1971: 206).

Relevance:

10.00%

Publisher:

Abstract:

The Solar TErrestrial RElations Observatory (STEREO) provides high-cadence, high-resolution images of the structure and morphology of coronal mass ejections (CMEs) in the inner heliosphere. CME directions and propagation speeds have often been estimated through the use of time-elongation maps obtained from the STEREO Heliospheric Imager (HI) data. Many of these CMEs have been identified by citizen scientists working within the SolarStormWatch project (www.solarstormwatch.com) as they work towards providing robust real-time identification of Earth-directed CMEs. The wide field of view of HI allows scientists to directly observe the two-dimensional (2D) structures, while the relative simplicity of time-elongation analysis means that it can be easily applied to many such events, thereby enabling a much deeper understanding of how CMEs evolve between the Sun and the Earth. For events with certain orientations, both the rear and front edges of the CME can be monitored at varying heliocentric distances (R) between the Sun and 1 AU. Here we take four example events with measurable position angle widths that were identified by the citizen scientists. These events were chosen for the clarity of their structure within the HI cameras and their long track lengths in the time-elongation maps. We show a linear dependency on R for the growth of the radial width (W) and the 2D aspect ratio (χ) of these CMEs, which are measured out to ≈ 0.7 AU. We estimated the radial width from a linear best fit for the average of the four CMEs, obtaining the relationships W=0.14R+0.04 for the width and χ=2.5R+0.86 for the aspect ratio (W and R in units of AU).
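The two empirical fits can be evaluated directly; for instance, at the outer edge of the measured range, R = 0.7 AU:

```python
def cme_radial_width(R):
    # empirical linear best fit for the four studied CMEs (W, R in AU)
    return 0.14 * R + 0.04

def cme_aspect_ratio(R):
    # empirical linear best fit for the 2D aspect ratio
    return 2.5 * R + 0.86

print(round(cme_radial_width(0.7), 3))  # 0.138 (AU)
print(round(cme_aspect_ratio(0.7), 2))  # 2.61
```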

Relevance:

10.00%

Publisher:

Abstract:

This article applies FIMIX-PLS segmentation methodology to detect and explore unanticipated reactions to organisational strategy among stakeholder segments. For many large organisations today, the tendency to apply a “one-size-fits-all” strategy to members of a stakeholder population, commonly driven by a desire for simplicity, efficiency and fairness, may actually result in unanticipated consequences amongst specific subgroups within the target population. This study argues that it is critical for organisations to understand the varying and potentially harmful effects of strategic actions across differing, and previously unidentified, segments within a stakeholder population. The case of a European revenue service that currently focuses its strategic actions on building trust and compliant behaviour amongst taxpayers is used as the context for this study. FIMIX-PLS analysis is applied to a sample of 501 individual taxpayers, while a novel PLS-based approach for assessing measurement model invariance that can be applied to both reflective and formative measures is also introduced for the purpose of multi-group comparisons. The findings suggest that individual taxpayers can be split into two similarly sized segments with highly differentiated characteristics and reactions to organisational strategy and communications. Compliant behaviour in the first segment (n = 223), labelled “relationships centred on trust,” is mainly driven through positive service experiences and judgements of competence, while judgements of benevolence lead to the unanticipated reaction of increasing distrust among this group. Conversely, compliant behaviour in the second segment (n = 278), labelled “relationships centred on distrust,” is driven by the reduction of fear and scepticism towards the revenue service, which is achieved through signalling benevolence, reduced enforcement and the lower incidence of negative stories.
In this segment, the use of enforcement has the unanticipated and counterproductive effect of ultimately reducing compliant behaviour.

Relevance:

10.00%

Publisher:

Abstract:

In 'Avalanche', an object is lowered, players staying in contact with it throughout. Normally the task is easily accomplished. However, with larger groups counter-intuitive behaviours appear. The paper proposes a formal theory for the underlying causal mechanisms. The aim is not only to provide an explicit, testable hypothesis for the source of the observed modes of behaviour, but also to exemplify the contribution that formal theory building can make to understanding complex social phenomena. Mapping reveals the importance of geometry to the Avalanche game; each player has a pair of balancing loops, one involved in lowering the object, the other ensuring contact. With more players, sets of balancing loops interact, and these can allow dominance by reinforcing loops, causing the system to chase upwards towards an ever-increasing goal. However, a series of other effects concerning human physiology and behaviour (HPB) is posited as playing a role. The hypothesis is therefore rigorously tested using simulation. For simplicity, a 'One Degree of Freedom' case is examined, allowing all of the effects to be included whilst rendering the analysis more transparent. Formulation and experimentation with the model give insight into the behaviours. Multi-dimensional rate/level analysis indicates that there is only a narrow region in which the system is able to move downwards. Model runs reproduce the single 'desired' mode of behaviour and all three of the observed 'problematic' ones. Sensitivity analysis gives further insight into the system's modes and their causes. The behaviours are seen to arise only when the geometric effects apply (number of players greater than the degrees of freedom of the object) in combination with a range of HPB effects. An analogy exists between the co-operative behaviour required here and various examples: conflicting strategic objectives in organizations, the Prisoners' Dilemma, and integrated bargaining situations.
Additionally, the game may be relatable in more direct algebraic terms to situations involving companies in which the resulting behaviours are mediated by market regulations. Finally, comment is offered on the inadequacy of some forms of theory building and the case is made for formal theory building involving the use of models, analysis and plausible explanations to create deep understanding of social phenomena.

Relevance:

10.00%

Publisher:

Abstract:

Polycyclic aromatic hydrocarbons (PAHs) are ubiquitous environmental pollutants that frequently accumulate in soils. There is therefore a requirement to determine their levels in contaminated environments for the purposes of determining impacts on human health. PAHs are a suite of individual chemicals, and there is an ongoing debate as to the most appropriate method for assessing the risk to humans from them. Two methods predominate: the surrogate marker approach and the toxic equivalency factor. The former assumes that all chemicals in a mixture have an equivalent toxicity. The toxic equivalency approach estimates the potency of individual chemicals relative to benzo(a)pyrene (BaP), usually the most toxic. The surrogate marker approach is believed to overestimate risk and the toxic equivalency factor to underestimate it. When analysing the risks from soils, the surrogate marker approach is preferred for its simplicity, but there are concerns because of the potential diversity of the PAH profile across the range of impacted soils. Using two independent data sets containing soils from 274 sites across a diverse range of locations, statistical analysis was undertaken to determine the differences in the composition of carcinogenic PAH between site locations, for example, rural versus industrial. Following principal components analysis, distinct population differences were not seen between site locations in spite of large differences in the total PAH burden between individual sites. Using all data, highly significant correlations were seen between BaP and other carcinogenic PAH, with the majority of r2 values > 0.8. Correlations with the European Food Safety Authority (EFSA) summed groups, that is, EFSA2, EFSA4 and EFSA8, were even higher (r2 > 0.95). We therefore conclude that BaP is a suitable surrogate marker to represent mixtures of PAH in soil during risk assessments.
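The surrogate-marker conclusion rests on squared Pearson correlations between BaP and other carcinogenic PAH. A minimal sketch of that calculation (Python; the concentration values are hypothetical illustrations, not data from the 274-site study):

```python
import numpy as np

def r_squared(x, y):
    # squared Pearson correlation coefficient between two series
    r = np.corrcoef(x, y)[0, 1]
    return r * r

# hypothetical BaP and summed carcinogenic-PAH concentrations (mg/kg)
bap     = np.array([0.1, 0.5, 1.2, 2.0, 3.1])
sum_pah = np.array([0.9, 4.8, 11.5, 19.0, 30.5])
print(r_squared(bap, sum_pah) > 0.95)  # True: strongly co-varying series
```

A high r2 of this kind across sites is what justifies using a single marker compound as a proxy for the whole carcinogenic-PAH mixture.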

Relevance:

10.00%

Publisher:

Abstract:

A quasi-optical interferometric technique capable of measuring antenna phase patterns without the need for a heterodyne receiver is presented. It is particularly suited to the characterization of terahertz antennas feeding power detectors or mixers employing quasi-optical local oscillator injection. Examples of recorded antenna phase patterns at frequencies of 1.4 and 2.5 THz using homodyne detectors are presented. To our knowledge, these are the highest-frequency antenna phase patterns ever recovered. Knowledge of both the amplitude and phase patterns in the far field enables a Gauss-Hermite or Gauss-Laguerre beam-mode analysis to be carried out for the antenna, of importance in performance optimization calculations, such as antenna gain and beam efficiency parameters at the design and prototype stage of antenna development. A full description of the beam would also be required if the antenna is to be used to feed a quasi-optical system in the near-field to far-field transition region. This situation could often arise when the device is fitted directly at the back of telescopes in flying observatories. A further benefit of the proposed technique is its simplicity for characterizing systems in situ, an advantage of considerable importance since, in many situations, components may not be removable for further characterization once assembled. The proposed methodology is generic and should be useful across the wider sensing community, e.g., in single-detector acoustic imaging or in adaptive imaging array applications. Furthermore, it is applicable across other frequencies of the EM spectrum, provided adequate spatial and temporal phase stability of the source can be maintained throughout the measurement process. Phase information retrieval is also of importance to emergent research areas, such as band-gap structure characterization, meta-materials research, electromagnetic cloaking, slow light, super-lens design, as well as near-field and virtual imaging applications.