78 results for streaming SIMD extensions


Relevance:

10.00%

Publisher:

Abstract:

The fascinating idea that tools become extensions of our body appears in artistic, literary, philosophical, and scientific works alike. In the last fifteen years, this idea has been re-framed into several related hypotheses, one of which states that tool use extends the neural representation of the multisensory space immediately surrounding the hands (variously termed peripersonal space, peri-hand space, peri-cutaneous space, action space, or near space). This and related hypotheses have been tested extensively in the cognitive neurosciences, with evidence from molecular, neurophysiological, neuroimaging, neuropsychological, and behavioural fields. Here, I briefly review the evidence for and against the hypothesis that tool use extends a neural representation of the space surrounding the hand, concentrating on neurophysiological, neuropsychological, and behavioural evidence. I then provide a re-analysis of data from six published and one unpublished experiment using the crossmodal congruency task to test this hypothesis. While the re-analysis broadly confirms the previously reported finding that tool use does not literally extend peripersonal space, the overall effect sizes are small and statistical power is low. I conclude by questioning whether the crossmodal congruency task can indeed be used to test the hypothesis that tool use modifies peripersonal space.

Relevance:

10.00%

Publisher:

Abstract:

This study analyzes organic adoption decisions using a rich set of time-to-organic durations collected from avocado small-holders in Michoacán, Mexico. We derive robust, intrasample predictions about the profiles of entry and exit within the conventional-versus-organic complex, and we explore the sensitivity of these predictions to the choice of functional form. The dynamic nature of the sample allows us to make retrospective predictions, and we establish, precisely, the profile of organic entry had the respondents been availed of optimal amounts of adoption-restraining resources. A fundamental problem in the dynamic adoption literature, hitherto unrecognized, is discussed and consequent extensions are suggested.

Relevance:

10.00%

Publisher:

Abstract:

The concept of convective quasi-equilibrium (CQE) is a key ingredient for understanding the role of deep moist convection in the atmosphere. It has been used as a guiding principle in developing almost all convective parameterizations and provides a basic theoretical framework for large-scale tropical dynamics. The CQE concept as originally proposed by Arakawa and Schubert [1974] is systematically reviewed from wider perspectives. Various interpretations and extensions of Arakawa and Schubert’s CQE are considered in terms of both a thermodynamic analogy and a dynamical balance. The thermodynamic interpretations can be more emphatically embraced as a homeostasis. The dynamic balance interpretations can be best understood by analogy with the slow manifold. Various criticisms of CQE can be avoided by taking the dynamic balance interpretation. Possible limits of CQE are also discussed, including the importance of triggering in many convective situations, as well as the possible self-organized criticality of tropical convection. However, the most intriguing aspect of the CQE concept is that, in spite of many observational tests supporting and interpreting it in many different senses, it has never been established in a robust manner based on a systematic analysis of the cloud-work function budget by observations, as it was originally defined.
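For orientation, the cloud-work function budget underlying CQE can be sketched in the standard Arakawa–Schubert notation (a sketch, not a quotation from the review: $A_i$ is the cloud-work function of cloud type $i$, $F_i$ the large-scale forcing, $M_B^j$ the cloud-base mass flux of type $j$, and $K_{ij}$ the interaction kernel); quasi-equilibrium asserts that large-scale generation and convective consumption nearly cancel:

```latex
\frac{dA_i}{dt} \;=\; F_i \;+\; \sum_j K_{ij}\, M_B^{\,j} \;\approx\; 0
```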

Relevance:

10.00%

Publisher:

Abstract:

Lorenz’s theory of available potential energy (APE) remains the main framework for studying the atmospheric and oceanic energy cycles. Because the APE generation rate is the volume integral of a thermodynamic efficiency times the local diabatic heating/cooling rate, APE theory is often regarded as an extension of the theory of heat engines. Available energetics in classical thermodynamics, however, usually relies on the concept of exergy, and is usually measured relative to a reference state maximising entropy at constant energy, whereas APE’s reference state minimises potential energy at constant entropy. This review seeks to shed light on the two concepts; it covers local formulations of available energetics, alternative views of the dynamics/thermodynamics coupling, APE theory and the second law, APE production/dissipation, extensions to binary fluids, mean/eddy decompositions, APE in incompressible fluids, APE and irreversible turbulent mixing, and the role of mechanical forcing on APE production.
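The generation statement above can be written compactly; in a common notation (a sketch, with $\dot{Q}$ the local diabatic heating rate per unit volume and $\Upsilon$ the local thermodynamic efficiency, often given the Carnot-like form below with $T_r$ the reference-state temperature):

```latex
G(\mathrm{APE}) \;=\; \int_V \Upsilon\, \dot{Q}\, \mathrm{d}V,
\qquad \Upsilon \;\approx\; \frac{T - T_r}{T}
```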

Relevance:

10.00%

Publisher:

Abstract:

Pardo, Patie, and Savov derived, under mild conditions, a Wiener-Hopf type factorization for the exponential functional of proper Lévy processes. In this paper, we extend this factorization by relaxing a finite moment assumption as well as by considering the exponential functional for killed Lévy processes. As a by-product, we derive some interesting fine distributional properties enjoyed by a large class of these random variables, such as the absolute continuity of the distribution and the smoothness, boundedness or complete monotonicity of the density. Results of this type are then used to derive similar properties for the law of the maxima and first passage times of some stable Lévy processes. Thus, for example, we show that for any stable process with $\rho\in(0,\frac{1}{\alpha}-1]$, where $\rho\in[0,1]$ is the positivity parameter and $\alpha$ is the stable index, the first passage time has a bounded and non-increasing density on $\mathbb{R}_+$. We also generate many instances of integral or power series representations for the law of the exponential functional of Lévy processes with one- or two-sided jumps. The proof of our main results requires different devices from those developed by Pardo, Patie, and Savov. It relies in particular on a generalization of a transform recently introduced by Chazal et al. together with some extensions of Wiener-Hopf techniques to killed Lévy processes. The factorizations developed here also allow for further applications, which we only indicate here.
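For reference, the exponential functional in question is usually defined as follows (a standard sketch, not necessarily the paper's notation): for a Lévy process $\xi$ and an independent exponential killing time $\mathbf{e}_q$ of parameter $q \ge 0$ (with $\mathbf{e}_0 = \infty$),

```latex
I_{\mathbf{e}_q} \;=\; \int_0^{\mathbf{e}_q} e^{-\xi_t}\, \mathrm{d}t
```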

Relevance:

10.00%

Publisher:

Abstract:

Pocket Data Mining (PDM) is our new term describing collaborative mining of streaming data in mobile and distributed computing environments. With sheer amounts of data streams now available for subscription on our smart mobile phones, using this data for decision making with data stream mining techniques has become achievable owing to the increasing power of these handheld devices. Wireless communication among these devices using Bluetooth and WiFi technologies has opened the door wide for collaborative mining among mobile devices within the same range that are running data mining techniques targeting the same application. This paper proposes a new architecture that we have prototyped for realizing significant applications in this area. We propose using mobile software agents in this application for several reasons. Most importantly, the autonomic, intelligent behaviour of agent technology has been the driving force for using it in this application. Other efficiency reasons are discussed in detail in this paper. Experimental results showing the feasibility of the proposed architecture are presented and discussed.

Relevance:

10.00%

Publisher:

Abstract:

Collaborative mining of distributed data streams in a mobile computing environment is referred to as Pocket Data Mining (PDM). Hoeffding tree techniques have been experimentally and analytically validated for data stream classification. In this paper, we propose, develop and evaluate the adoption of distributed Hoeffding trees for classifying streaming data in PDM applications. We identify a realistic scenario in which different users equipped with smart mobile devices run a local Hoeffding tree classifier on a subset of the attributes. Thus, we investigate the mining of vertically partitioned datasets with possible overlap of attributes, which is the more likely case. Our experimental results validate the efficiency of our proposed model, achieving promising accuracy for real deployment.
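The core of a Hoeffding tree is its split decision: a node splits only when the observed advantage of the best split attribute over the second best exceeds the Hoeffding bound for the number of examples seen so far. A minimal sketch of that test (function and parameter names here are illustrative, not taken from the paper):

```python
import math

def hoeffding_bound(value_range, delta, n):
    """Hoeffding bound: with probability 1 - delta, the true mean of a
    random variable spanning `value_range` lies within this epsilon of
    the empirical mean after n independent observations."""
    return math.sqrt((value_range ** 2) * math.log(1.0 / delta) / (2.0 * n))

def should_split(best_gain, second_gain, value_range, delta, n):
    # Split once the observed gain advantage exceeds the bound, i.e. the
    # best attribute is very likely truly better than the runner-up.
    return (best_gain - second_gain) > hoeffding_bound(value_range, delta, n)
```

In a PDM-style deployment, each device would run this test locally on its own attribute subset and the resulting classifiers' predictions would then be combined.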

Relevance:

10.00%

Publisher:

Abstract:

This paper develops a conceptual framework for analyzing emerging agricultural hydrology problems in post-conflict Libya. Libya is one of the most arid regions on the planet. Thus, as well as substantial political and social changes, post-conflict Libyan administrators are confronted with important hydrological issues in Libya’s emerging water-land-use complex. This paper presents a substantial background to the water-land-use problem in Libya; reviews previous work in Libya and elsewhere on water-land-use issues and conflicts in dry and arid zones; outlines a conceptual framework for fruitful research interventions; and details the results of a survey conducted on Libyan farmers’ water usage, their perceptions of emerging water-land-use conflicts, and the appropriate value one should place on agricultural-use hydrological resources in Libya. Extensions are discussed.

Relevance:

10.00%

Publisher:

Abstract:

The formation of complexes appearing in solutions containing oppositely charged polyelectrolytes has been investigated by Monte Carlo simulations using two different models. The polyions are described as flexible chains of 20 connected charged hard spheres immersed in a homogeneous dielectric background representing water. The small ions are either explicitly included or their effect described by using a screened Coulomb potential. The simulated solutions contained 10 positively charged polyions with 0, 2, or 5 negatively charged polyions and the respective counterions. Two different linear charge densities were considered, and structure factors, radial distribution functions, and polyion extensions were determined. A redistribution of positively charged polyions involving strong complexes formed between the oppositely charged polyions appeared as the number of negatively charged polyions was increased. The nature of the complexes was found to depend on the linear charge density of the chains. The simplified model involving the screened Coulomb potential gave qualitatively similar results to the model with explicit small ions. Finally, owing to the complex formation, the sampling in configurational space is nontrivial, and the efficiency of different trial moves was examined.
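The screened Coulomb (Yukawa) pair interaction used in the simplified model can be sketched as follows (reduced units in kT; the function and parameter names are illustrative, not the paper's):

```python
import math

def screened_coulomb(r, z_i, z_j, bjerrum, kappa):
    """Screened Coulomb pair energy (in units of kT) between charges z_i
    and z_j at separation r, with Bjerrum length `bjerrum` and inverse
    Debye screening length `kappa`; kappa = 0 recovers the bare Coulomb
    interaction of the explicit-ion model's dielectric background."""
    return z_i * z_j * bjerrum * math.exp(-kappa * r) / r
```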

Relevance:

10.00%

Publisher:

Abstract:

This paper proposes a method for describing the distribution of observed temperatures on any day of the year such that the distribution, and summary statistics of interest derived from it, vary smoothly through the year. The method removes the noise inherent in calculating summary statistics directly from the data, thus easing comparisons of distributions and summary statistics between different periods. The method is demonstrated using daily effective temperatures (DET) derived from observations of temperature and wind speed at De Bilt, the Netherlands. Distributions and summary statistics are obtained for 1985 to 2009 and compared to the period 1904–1984. A two-stage process first obtains parameters of a theoretical probability distribution, in this case the generalized extreme value (GEV) distribution, which describes the distribution of DET on any day of the year. Second, linear models describe seasonal variation in the parameters. Model predictions provide parameters of the GEV distribution, and therefore summary statistics, that vary smoothly through the year. There is evidence of an increasing mean temperature, a decrease in the variability of temperatures mainly in the winter, and more positive skew (more warm days) in the summer. In the winter, the 2% point (the value below which 2% of observations are expected to fall) has risen by 1.2 °C; in the summer, the 98% point has risen by 0.8 °C. Medians have risen by 1.1 and 0.9 °C in winter and summer, respectively. The method can be used to describe distributions of future climate projections and other climate variables. Further extensions to the methodology are suggested.
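The two-stage process can be sketched with standard scientific Python tools (a minimal illustration with synthetic data, not the authors' implementation: the sample, the harmonic regressors, and all parameter values are assumptions):

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)

# Stage 1 (illustrative): fit a GEV to daily effective temperatures
# observed on (a window around) one calendar day.
sample = genextreme.rvs(c=-0.1, loc=10.0, scale=3.0, size=500, random_state=rng)
shape, loc, scale = genextreme.fit(sample)

# Stage 2 (illustrative): smooth a daily parameter series through the
# year with a first-harmonic linear model, here for the location parameter.
days = np.arange(365)
daily_loc = 10.0 + 5.0 * np.sin(2 * np.pi * days / 365) + rng.normal(0, 0.5, 365)
X = np.column_stack([np.ones(365),
                     np.sin(2 * np.pi * days / 365),
                     np.cos(2 * np.pi * days / 365)])
coef, *_ = np.linalg.lstsq(X, daily_loc, rcond=None)
smooth_loc = X @ coef   # GEV location varying smoothly through the year
```

Summary statistics such as the 2% and 98% points then follow from the smoothed GEV parameters via the quantile function.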

Relevance:

10.00%

Publisher:

Abstract:

The assumption that ‘states' primary goal is survival’ lies at the heart of the neorealist paradigm. A careful examination of the assumption, however, reveals that neorealists draw upon a number of distinct interpretations of the ‘survival assumption’ that are then treated as if they are the same, pointing towards conceptual problems that surround the treatment of state preferences. This article offers a specification that focuses on two questions that highlight the role and function of the survival assumption in the neorealist logic: (i) what do states have to lose if they fail to adopt self-help strategies?; and (ii) how does concern for relevant losses motivate state behaviour and affect international outcomes? Answering these questions through the exploration of governing elites' sensitivity towards regime stability and territorial integrity of the state, in turn, addresses the aforementioned conceptual problems. This specification has further implications for the debates among defensive and offensive realists, potential extensions of the neorealist logic beyond the Westphalian states, and the relationship between neorealist theory and policy analysis.

Relevance:

10.00%

Publisher:

Abstract:

We describe ncWMS, an implementation of the Open Geospatial Consortium’s Web Map Service (WMS) specification for multidimensional gridded environmental data. ncWMS can read data in a large number of common scientific data formats – notably the NetCDF format with the Climate and Forecast conventions – then efficiently generate map imagery in thousands of different coordinate reference systems. It is designed to require minimal configuration from the system administrator and, when used in conjunction with a suitable client tool, provides end users with an interactive means for visualizing data without the need to download large files or interpret complex metadata. It is also used as a “bridging” tool providing interoperability between the environmental science community and users of geographic information systems. ncWMS implements a number of extensions to the WMS standard in order to fulfil some common scientific requirements, including the ability to generate plots representing time series and vertical sections. We discuss these extensions and their impact upon present and future interoperability. We discuss the conceptual mapping between the WMS data model and the data models used by gridded data formats, highlighting areas in which the mapping is incomplete or ambiguous. We discuss the architecture of the system and particular technical innovations of note, including the algorithms used for fast data reading and image generation. ncWMS has been widely adopted within the environmental data community and we discuss some of the ways in which the software is integrated within data infrastructures and portals.
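A WMS GetMap request against a server such as ncWMS is an ordinary HTTP query; the sketch below builds one with the standard WMS 1.3.0 parameters plus the TIME and ELEVATION dimensions relevant to multidimensional data (the server URL and layer name are placeholders, not real endpoints):

```python
from urllib.parse import urlencode

def getmap_url(base_url, layer, bbox, crs="CRS:84",
               width=512, height=512, time=None, elevation=None):
    """Build a WMS 1.3.0 GetMap URL; TIME and ELEVATION are the optional
    extra axes of multidimensional gridded data."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    if time is not None:
        params["TIME"] = time
    if elevation is not None:
        params["ELEVATION"] = elevation
    return base_url + "?" + urlencode(params)
```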

Relevance:

10.00%

Publisher:

Abstract:

Brain activity can be measured non-invasively with functional imaging techniques. Each pixel in such an image represents a neural mass of about 10^5 to 10^7 neurons. Mean field models (MFMs) approximate their activity by averaging out neural variability while retaining salient underlying features, like neurotransmitter kinetics. However, MFMs incorporating the regional variability, realistic geometry and connectivity of cortex have so far appeared intractable. This lack of biological realism has led to a focus on gross temporal features of the EEG. We address these impediments and showcase a "proof of principle" forward prediction of co-registered EEG/fMRI for a full-size human cortex in a realistic head model with anatomical connectivity, see figure 1. MFMs usually assume homogeneous neural masses, isotropic long-range connectivity and simplistic signal expression to allow rapid computation with partial differential equations. But these approximations are insufficient in particular for the high spatial resolution obtained with fMRI, since different cortical areas vary in their architectonic and dynamical properties, have complex connectivity, and can contribute non-trivially to the measured signal. Our code instead supports the local variation of model parameters and freely chosen connectivity for many thousands of triangulation nodes spanning a cortical surface extracted from structural MRI. This allows the introduction of realistic anatomical and physiological parameters for cortical areas and their connectivity, including both intra- and inter-area connections. Proper cortical folding and conduction through a realistic head model is then added to obtain accurate signal expression for a comparison to experimental data. To showcase the synergy of these computational developments, we simultaneously predict EEG and fMRI BOLD responses by adding an established model for neurovascular coupling and convolving "Balloon-Windkessel" hemodynamics.
We also incorporate regional connectivity extracted from the CoCoMac database [1]. Importantly, these extensions can be easily adapted according to future insights and data. Furthermore, while our own simulation is based on one specific MFM [2], the computational framework is general and can be applied to models favored by the user. Finally, we provide a brief outlook on improving the integration of multi-modal imaging data through iterative fits of a single underlying MFM in this realistic simulation framework.

Relevance:

10.00%

Publisher:

Abstract:

We consider the problem of discrete time filtering (intermittent data assimilation) for differential equation models and discuss methods for its numerical approximation. The focus is on methods based on ensemble/particle techniques and on the ensemble Kalman filter technique in particular. We summarize as well as extend recent work on continuous ensemble Kalman filter formulations, which provide a concise dynamical systems formulation of the combined dynamics-assimilation problem. Possible extensions to fully nonlinear ensemble/particle based filters are also outlined using the framework of optimal transportation theory.
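The ensemble Kalman filter analysis step central to such intermittent data assimilation can be sketched in its stochastic (perturbed-observation) form; this is a generic textbook illustration, not the authors' continuous formulation:

```python
import numpy as np

def enkf_analysis(ensemble, obs, H, obs_cov, rng):
    """One perturbed-observation EnKF analysis step.

    ensemble : (n_state, n_members) forecast ensemble
    obs      : (n_obs,) observation vector
    H        : (n_obs, n_state) linear observation operator
    obs_cov  : (n_obs, n_obs) observation error covariance
    """
    n_members = ensemble.shape[1]
    anomalies = ensemble - ensemble.mean(axis=1, keepdims=True)
    P = anomalies @ anomalies.T / (n_members - 1)   # sample forecast covariance
    S = H @ P @ H.T + obs_cov                       # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                  # Kalman gain
    # Perturb the observations so the analysis ensemble keeps the
    # statistically correct spread.
    perturbed = obs[:, None] + rng.multivariate_normal(
        np.zeros(len(obs)), obs_cov, size=n_members).T
    return ensemble + K @ (perturbed - H @ ensemble)
```

Iterating forecast (model propagation) and analysis steps of this kind gives the combined dynamics-assimilation system that the continuous formulations recast as a single dynamical system.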

Relevance:

10.00%

Publisher:

Abstract:

The work in graphic communication carried out by Otto Neurath and his associates – now commonly known simply as Isotype – has been the subject of much interest in recent years. Conceived and developed in the 1920s as ‘the Vienna method of pictorial statistics’, this approach to designing information had from its inception the power to grow and spread internationally. Political developments in Europe played their part in its development, and production moved to the Netherlands (1934) and to England (1940), where the Isotype Institute continued to produce work until 1971. Bringing together the latest research, this book is the first comprehensive, detailed account of its subject. The Austrian, Dutch, and English years of Isotype are described here freshly and extensively. There are chapters on the notable extensions of Isotype to Soviet Russia, the USA, and Africa. Isotype work in film and in designing for children is fully documented and discussed. Between these main chapters the book presents interludes documenting Isotype production visually. Three appendices reprint key documents. In its international coverage and its extensions into the wider terrain of history, this book opens a new vista in graphic design.