887 results for Sums of squares
Abstract:
Spink, S., Urquhart, C., Cox, A. & Higher Education Academy - Information and Computer Sciences Subject Centre. (2007). Procurement of electronic content across the UK National Health Service and Higher Education sectors. Report to JISC executive and LKDN executive. Sponsorship: JISC/LKDN
Abstract:
For pt. I see ibid., vol. 44, p. 927-36 (1997). In a digital communications system, data are transmitted from one location to another by mapping bit sequences to symbols, and symbols to sample functions of analog waveforms. The analog waveform passes through a bandlimited (possibly time-varying) analog channel, where the signal is distorted and noise is added. In a conventional system the analog sample functions sent through the channel are weighted sums of one or more sinusoids; in a chaotic communications system the sample functions are segments of chaotic waveforms. At the receiver, the symbol may be recovered by means of coherent detection, where all possible sample functions are known, or by noncoherent detection, where one or more characteristics of the sample functions are estimated. In a coherent receiver, synchronization is the most commonly used technique for recovering the sample functions from the received waveform. These sample functions are then used as reference signals for a correlator. Synchronization-based coherent receivers have advantages over noncoherent receivers in terms of noise performance, bandwidth efficiency (in narrow-band systems) and/or data rate (in chaotic systems). These advantages are lost if synchronization cannot be maintained, for example, under poor propagation conditions. In these circumstances, communication without synchronization may be preferable. The theory of conventional telecommunications is extended to chaotic communications, chaotic modulation techniques and receiver configurations are surveyed, and chaotic synchronization schemes are described.
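A minimal sketch of the correlator-based coherent detection described above, assuming the reference sample functions have already been recovered; the array `refs`, the segment length, and the noise level are illustrative, not from the paper:

```python
import numpy as np

def coherent_detect(received, references):
    """Correlate the received segment with each known sample function
    and return the index of the best-matching symbol."""
    scores = references @ received  # one correlation per candidate symbol
    return int(np.argmax(scores))

# Toy usage: two reference waveforms, noisy transmission of symbol 1.
rng = np.random.default_rng(0)
refs = rng.standard_normal((2, 64))        # stand-ins for the sample functions
rx = refs[1] + 0.5 * rng.standard_normal(64)
print(coherent_detect(rx, refs))           # -> 1 (with high probability)
```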
Abstract:
Correlated electron-ion dynamics (CEID) is an extension of molecular dynamics that correctly incorporates the exchange of energy between electrons and ions. The formalism is based on a systematic approximation, the small-amplitude moment expansion. This formalism is extended here to include the explicit quantum spread of the ions and a generalization of the Hartree-Fock approximation for incoherent sums of Slater determinants. We demonstrate that the resultant dynamical equations reproduce analytically the selection rules for inelastic electron-phonon scattering from perturbation theory, which control the mutually driven excitations of the two interacting subsystems. We then use CEID to make direct numerical simulations of inelastic current-voltage spectroscopy in atomic wires, and to exhibit the crossover from ionic cooling to heating as a function of the relative degree of excitation of the electronic and ionic subsystems.
Abstract:
Substantial sums of money are invested annually in preventative medicine and therapeutic treatment for people with a wide range of physical and psychological health problems, sometimes to no avail. There is now mounting evidence to suggest that companion animals, such as dogs and cats, can enhance the health of their human owners and may thus contribute significantly to reducing national health expenditure. This paper explores the evidence that pets can contribute to human health and well-being. The article initially concentrates on the value of animals for short- and long-term physical health, before exploring the relationship between animals and psychological health, focusing on the ability of dogs, cats, and other species to aid the disabled and serve as a "therapist" to those in institutional settings. The paper also discusses the evidence for the ability of dogs to facilitate the diagnosis and treatment of specific chronic diseases, notably cancer, epilepsy, and diabetes. Mechanisms underlying the ability of animals to promote human health are discussed within a theoretical framework. Whereas the evidence for a direct causal association between human well-being and companion animals is not conclusive, the literature reviewed is largely supportive of the widely held, and long-standing, belief that "pets are good for us." © 2009 The Society for the Psychological Study of Social Issues.
Abstract:
The estimation of animal abundance has a central role in wildlife management and research, including the role of badgers Meles meles in bovine tuberculosis transmission to cattle. This is the first study to examine temporal change in the badger population of Northern Ireland over a medium- to long-term time frame of 14-18 years by repeating a national survey first conducted during 1990-1993. A total of 212 1-km² squares were surveyed during 2007-2008 and the number, type and activity of setts therein recorded. Badgers were widespread, with 75% of squares containing at least one sett. The mean density of active main setts, which was equivalent to badger social group density, was 0.56 (95% CI: 0.46-0.67) active main setts per km² during 2007-2008. Social group density varied significantly among land-class groups and counties. The total number of social groups was estimated at 7,600 (95% CI: 6,200-9,000) and, notwithstanding probable sources of error in estimating social group size, the total abundance of badgers was estimated to be 34,100 (95% CI: 26,200-42,000). There was no significant change in the badger population from that recorded during 1990-1993. A resource selection model provided a relative probability of sett construction at a spatial scale of 25 m. Sett locations were negatively associated with elevation and positively associated with slope, aspect, soil sand content, the presence of cover, and the area of improved grassland and arable agriculture within 300 m.
Abstract:
Bit-level systolic array structures for computing sums of products are studied in detail. It is shown that these can be subdivided into two classes and that, within each class, architectures can be described in terms of a set of constraint equations. It is further demonstrated that high-performance system-level functions with attractive VLSI properties can be constructed by matching data flow geometries in bit-level and word-level architectures.
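A behavioural sketch of the word-level sum-of-products dataflow such an array computes; this toy model ignores bit-level timing and is not the paper's architecture (`systolic_fir` and its parameters are illustrative):

```python
def systolic_fir(weights, x):
    """Toy cycle-by-cycle model of a weight-stationary linear array
    computing the sum of products y[t] = sum_k weights[k] * x[t-k]."""
    n = len(weights)
    regs = [0.0] * n                       # one sample register per cell
    out = []
    for t in range(len(x) + n - 1):
        xin = x[t] if t < len(x) else 0.0  # feed zeros to flush the pipeline
        regs = [xin] + regs[:-1]           # samples advance one cell per cycle
        out.append(sum(w * r for w, r in zip(weights, regs)))
    return out

print(systolic_fir([1, 2, 3], [1, 1, 1]))  # -> [1, 3, 6, 5, 3]
```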
Abstract:
We undertake a detailed study of the sets of multiplicity in a second countable locally compact group G and their operator versions. We establish a symbolic calculus for normal completely bounded maps from the space B(L²(G)) of bounded linear operators on L²(G) into the von Neumann algebra VN(G) of G and use it to show that a closed subset E ⊆ G is a set of multiplicity if and only if the set E* = {(s,t) ∈ G × G : ts⁻¹ ∈ E} is a set of operator multiplicity. Analogous results are established for M₁-sets and M₀-sets. We show that the property of being a set of multiplicity is preserved under various operations, including taking direct products, and establish an Inverse Image Theorem for such sets. We characterise the sets of finite width that are also sets of operator multiplicity, and show that every compact operator supported on a set of finite width can be approximated by sums of rank-one operators supported on the same set. We show that, if G satisfies a mild approximation condition, pointwise multiplication by a given measurable function ψ : G → ℂ defines a closable multiplier on the reduced C*-algebra C*_r(G) of G if and only if Schur multiplication by the function N(ψ) : G × G → ℂ, given by N(ψ)(s,t) = ψ(ts⁻¹), is a closable operator when viewed as a densely defined linear map on the space of compact operators on L²(G). Similar results are obtained for multipliers on VN(G).
Abstract:
A periodic monitoring of the pavement condition facilitates a cost-effective distribution of the resources available for maintenance of the road infrastructure network. The task can be accurately carried out using profilometers, but such an approach is generally expensive. This paper presents a method to collect information on the road profile via accelerometers mounted in a fleet of non-specialist vehicles, such as police cars, that are in use for other purposes. It proposes an optimisation algorithm, based on Cross Entropy theory, to predict road irregularities. The Cross Entropy algorithm estimates the height of the road irregularities from vehicle accelerations at each point in time. To test the algorithm, the crossing of a half-car roll model is simulated over a range of road profiles to obtain accelerations of the vehicle sprung and unsprung masses. Then, the simulated vehicle accelerations are used as input in an iterative procedure that searches for the best solution to the inverse problem of finding road irregularities. In each iteration, a sample of road profiles is generated and an objective function defined as the sum of squares of differences between the ‘measured’ and predicted accelerations is minimized until convergence is reached. The reconstructed profile is classified according to ISO and IRI recommendations and compared to its original class. Results demonstrate that the approach is feasible and that a good estimate of the short-wavelength features of the road profile can be detected, despite the variability between the vehicles used to collect the data.
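A minimal sketch of the cross-entropy search the abstract describes, assuming a stand-in `predict_accels` function in place of the half-car simulation (all names and parameter values here are illustrative):

```python
import numpy as np

def cross_entropy_profile(measured_acc, predict_accels, n_pts,
                          n_samples=200, n_elite=20, n_iters=50, seed=0):
    """Search for the road profile whose predicted accelerations best
    match the measured ones, in the least-squares sense."""
    rng = np.random.default_rng(seed)
    mu = np.zeros(n_pts)           # mean irregularity height at each point
    sigma = np.full(n_pts, 0.05)   # sampling spread, shrinks as we converge
    for _ in range(n_iters):
        candidates = rng.normal(mu, sigma, size=(n_samples, n_pts))
        # objective: sum of squares of acceleration differences
        sse = np.array([np.sum((predict_accels(p) - measured_acc) ** 2)
                        for p in candidates])
        elite = candidates[np.argsort(sse)[:n_elite]]  # best candidates
        mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-6
    return mu
```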
Abstract:
An upper bound for the sum of the squares of the entries of the principal eigenvector corresponding to a vertex subset inducing a k-regular subgraph is introduced and applied to the determination of an upper bound on the order of such induced subgraphs. Furthermore, for some connected graphs we establish a lower bound for the sum of squares of the entries of the principal eigenvector corresponding to the vertices of an independent set. Moreover, a spectral characterization of families of split graphs, involving their index and the entries of the principal eigenvector corresponding to the vertices of the maximum independent set, is given. In particular, the complete split graph case is highlighted.
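A toy computation of the quantity the abstract bounds: the sum of squares of the principal-eigenvector entries over an independent set, here for the 5-cycle (the graph choice is illustrative):

```python
import numpy as np

# Adjacency matrix of the 5-cycle C5.
A = np.zeros((5, 5))
for i in range(5):
    A[i, (i + 1) % 5] = A[(i + 1) % 5, i] = 1

vals, vecs = np.linalg.eigh(A)   # eigenvalues in ascending order
x = np.abs(vecs[:, -1])          # unit-norm principal eigenvector
S = [0, 2]                       # an independent set of C5
print(np.sum(x[S] ** 2))         # sum of squares of the entries over S
```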
Abstract:
In his introduction, Pinna (2010) quoted one of Wertheimer’s observations: “I stand at the window and see a house, trees, sky. Theoretically I might say there were 327 brightnesses and nuances of color. Do I have ‘327’? No. I have sky, house, and trees.” This seems quite remarkable, for Max Wertheimer, together with Kurt Koffka and Wolfgang Koehler, was a pioneer of Gestalt Theory: perceptual organisation was tackled by considering grouping rules for line and edge elements in relation to figure-ground segregation, i.e., a meaningful object (the figure) as perceived against a complex background (the ground). At the lowest level – line and edge elements – Wertheimer (1923) himself formulated grouping principles on the basis of proximity, good continuation, convexity, symmetry and, often forgotten, past experience of the observer. Rubin (1921) formulated rules for figure-ground segregation using surroundedness, size and orientation, but also convexity and symmetry. Almost a century of Gestalt research later, Pinna and Reeves (2006) introduced the notion of figurality, meant to represent the integrated set of properties of visual objects, from the principles of grouping and figure-ground to the colour and volume of objects with shading. Pinna, in 2010, went one important step further and studied perceptual meaning, i.e., the interpretation of complex figures on the basis of the past experience of the observer. Re-establishing a link to Wertheimer’s rule about past experience, he formulated five propositions, three definitions and seven properties on the basis of observations made on graphically manipulated patterns. For example, he introduced the illusion of meaning by comics-like elements suggesting wind, thereby inducing a learned interpretation. His last figure shows a regular array of squares but with irregular positions on the right side. This pile of (ir)regular squares can be interpreted as the result of an earthquake which destroyed part of an apartment block. This is much more intuitive, direct and economical than describing the complexity of the array of squares.
Abstract:
This note investigates the adequacy of the finite-sample approximation provided by the Functional Central Limit Theorem (FCLT) when the errors are allowed to be dependent. We compare the distribution of the scaled partial sums of some data with the distribution of the Wiener process to which it converges. Our setup is purposely very simple in that it considers data generated from an ARMA(1,1) process. Yet, this is sufficient to bring out interesting conclusions about the particular elements which cause the approximations to be inadequate even in quite large sample sizes.
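A short simulation sketch of the comparison the note performs: scaled partial sums of an ARMA(1,1) series, standardized by the long-run standard deviation so that the FCLT limit is a standard Wiener process (parameter values are illustrative):

```python
import numpy as np

def scaled_partial_sums(n, phi=0.5, theta=0.3, seed=0):
    """Return the scaled partial-sum path of a simulated ARMA(1,1),
    which the FCLT approximates by a Wiener process for large n."""
    rng = np.random.default_rng(seed)
    e = rng.standard_normal(n)
    u = np.empty(n)
    u[0] = e[0]
    for t in range(1, n):
        u[t] = phi * u[t - 1] + e[t] + theta * e[t - 1]
    lr_sd = (1 + theta) / (1 - phi)   # long-run std. dev. of ARMA(1,1)
    return np.cumsum(u) / (lr_sd * np.sqrt(n))

# The value at index floor(n*r) approximates W(r) on [0, 1].
path = scaled_partial_sums(10_000)
```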
Abstract:
Many unit root and cointegration tests require an estimate of the spectral density function at frequency zero of some process. Kernel estimators based on weighted sums of autocovariances constructed using estimated residuals from an AR(1) regression are commonly used. However, it is known that with substantially correlated errors, the OLS estimate of the AR(1) parameter is severely biased. In this paper, we first show that this least squares bias induces a significant increase in the bias and mean-squared error of kernel-based estimators.
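A minimal sketch of a kernel estimator of the kind discussed: a Bartlett-weighted sum of sample autocovariances of the residuals from an AR(1) regression. This illustrates the general construction (and the OLS step whose bias the paper analyses), not the paper's specific corrections:

```python
import numpy as np

def lrv_bartlett(y, bandwidth):
    """Bartlett-kernel estimate of the long-run variance (2*pi times the
    spectral density at frequency zero), built from AR(1) residuals."""
    rho = np.dot(y[1:], y[:-1]) / np.dot(y[:-1], y[:-1])   # OLS AR(1) estimate
    u = y[1:] - rho * y[:-1]                               # estimated residuals
    n = len(u)
    gamma = lambda j: np.dot(u[j:], u[:n - j]) / n         # sample autocovariance
    s = gamma(0)
    for j in range(1, bandwidth + 1):
        s += 2.0 * (1.0 - j / (bandwidth + 1)) * gamma(j)  # Bartlett weights
    return s
```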
Abstract:
Introduction. Fractal geometry measures the irregularity of abstract and natural objects through the fractal dimension. Fractal calculations have been applied to structures of the human body and to physiological quantifications based on dynamical systems theory. Material and Methods. The fractal dimension, the number of occupied boxes along the border in the box-counting space, and the area were calculated for two groups of red blood cells: 7 normal (group A) and 7 abnormal (group B), obtained from patients and from transfusion bags, using the box-counting method and software developed for this purpose. The measurements were compared in search of differences between normal and abnormal red blood cells, with the aim of differentiating samples. Results. Abnormality is characterized by a number of occupied squares in the fractal space greater than or equal to 180; area values between 25.117 and 33.548 correspond to normality. When the evaluation by number of squares indicates normality, it must be confirmed with the area value applied to adjacent red blood cells within the sample; values outside the established range and/or 180 or more occupied squares suggest abnormality of the sample. Conclusions. The developed methodology is effective for differentiating red blood cell alterations and probably useful in the analysis of transfusion bags for clinical use.
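A compact box-counting sketch in the spirit of the method described, for a binary image such as a segmented red-blood-cell border (the function name and grid sizes are illustrative, not the authors' software):

```python
import numpy as np

def box_count_dimension(img, sizes=(2, 4, 8, 16, 32)):
    """Estimate the box-counting fractal dimension of a binary image:
    slope of log(occupied boxes) against log(1/box size)."""
    counts = []
    for s in sizes:
        occupied = 0
        for i in range(0, img.shape[0], s):
            for j in range(0, img.shape[1], s):
                if img[i:i + s, j:j + s].any():  # box touches the object
                    occupied += 1
        counts.append(occupied)
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope
```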
Experimental structure determination of the chemisorbed overlayers of chlorine and iodine on Au{111}
Abstract:
We have performed an experimental structure determination of the ordered p(√3 × √3)R30° structures of chlorine and iodine on Au{111} using low-energy electron diffraction (LEED). Despite great similarities in the structure of the underlying substrate, which shows only minor deviations from the bulk positions in both cases, chlorine and iodine are found to adsorb in different adsorption sites, fcc and hcp hollow sites, respectively. The experimental Au-Cl and Au-I bond lengths of 2.56 and 2.84 Å are close to the sums of the covalent radii, supporting the view that the bond is essentially covalent in nature; however, they are significantly shorter than predicted theoretically.
Abstract:
Concerns about potentially misleading reporting of pharmaceutical industry research have surfaced many times. The potential for duality (and thereby conflict) of interest is only too clear when you consider the sums of money required for the discovery, development and commercialization of new medicines. As the ability of major, mid-size and small pharmaceutical companies to innovate has waned, as evidenced by the seemingly relentless year-on-year decline in the numbers of new medicines approved by the Food and Drug Administration and the European Medicines Agency, not only has the cost per new approved medicine risen: so too has the public and media concern about the extent to which the pharmaceutical industry is open and honest about the efficacy, safety and quality of the drugs we manufacture and sell. In 2005, an Editorial in the Journal of the American Medical Association made clear that, so great was their concern about misleading reporting of industry-sponsored studies, henceforth no article would be published that was not also guaranteed by independent statistical analysis. We examine the precursors to this Editorial, as well as its immediate and lasting effects for statisticians, for the manner in which statistical analysis is carried out, and for the industry more generally.