790 results for Multi-Dimensional Random Walk
Abstract:
While others have attempted to determine, by way of mathematical formulae, optimal resource duplication strategies for random walk protocols, this paper is concerned with studying the emergent effects of dynamic resource propagation and replication. In particular, we show, via modelling and experimentation, that under any given decay (purge) rate the number of nodes that have knowledge of a particular resource converges to a fixed point or a limit cycle. We also show that even for high rates of decay - that is, when few nodes have knowledge of a particular resource - the number of hops required to find that resource is small.
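The hop-count claim above can be illustrated with a small stdlib-only simulation. Everything below is hypothetical, not taken from the paper: the ring-plus-chords topology, the node count, and the fraction of holders are invented to show the idea that a random walker finds a sparsely replicated resource in few hops.

```python
import random

def random_walk_hops(adj, holders, start, rng, max_steps=10_000):
    """Walk from `start` until a node holding the resource is hit;
    return the number of hops taken (None if max_steps is exceeded)."""
    node = start
    for hop in range(max_steps):
        if node in holders:
            return hop
        node = rng.choice(adj[node])
    return None

def make_random_graph(n, degree, rng):
    """Hypothetical topology: a ring plus random chords up to `degree`."""
    adj = {i: {(i - 1) % n, (i + 1) % n} for i in range(n)}
    for i in range(n):
        while len(adj[i]) < degree:
            j = rng.randrange(n)
            if j != i:
                adj[i].add(j)
                adj[j].add(i)
    return {i: sorted(nbrs) for i, nbrs in adj.items()}

rng = random.Random(42)
n = 500
adj = make_random_graph(n, degree=4, rng=rng)
# "High decay": only 2% of nodes still know the resource
holders = set(rng.sample(range(n), 10))
trials = [random_walk_hops(adj, holders, rng.randrange(n), rng)
          for _ in range(200)]
mean_hops = sum(trials) / len(trials)
print(f"mean hops to locate resource: {mean_hops:.1f}")
```

Even with only 10 of 500 nodes holding the resource, the average search length stays modest, which is the qualitative behaviour the abstract reports.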
Abstract:
Molecular transport in phase space is crucial for chemical reactions because it defines how pre-reactive molecular configurations are found during the time evolution of the system. Using Molecular Dynamics (MD) simulated atomistic trajectories, we test the assumption of normal diffusion in phase space for bulk water at ambient conditions by checking the equivalence of the transport to a random walk model. Contrary to common expectations, we found that some statistical features of the transport in phase space differ from those of normal diffusion models. This implies a non-random character of the path-search process by reacting complexes in aqueous solutions. Our further numerical experiments show that significant long-lasting non-stationarity in the transition probabilities of segments of molecular trajectories can account for the observed non-uniform filling of the phase space. Surprisingly, the characteristic periods of the model non-stationarity amount to hundreds of nanoseconds, time scales much longer than the typical lifetimes of known liquid water molecular structures (several picoseconds).
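A minimal sketch of the kind of equivalence check described, assuming a 1-D unbiased lattice walk as a stand-in for the real MD phase-space trajectory (the trajectory length and lag times are arbitrary): for normal diffusion the mean-squared displacement (MSD) grows linearly with lag time, so doubling the lag should roughly double the MSD.

```python
import random

def msd(traj, lag):
    """Mean-squared displacement at a given lag, averaged over all time origins."""
    disp = [(traj[i + lag] - traj[i]) ** 2 for i in range(len(traj) - lag)]
    return sum(disp) / len(disp)

rng = random.Random(0)
# 1-D unbiased random walk standing in for a phase-space trajectory
traj = [0]
for _ in range(100_000):
    traj.append(traj[-1] + rng.choice((-1, 1)))

# Normal diffusion: MSD(lag) ∝ lag, so the ratio at doubled lag is ≈ 2.
# Anomalous (e.g. non-stationary) transport would deviate from 2.
r = msd(traj, 500) / msd(traj, 250)
print(f"MSD ratio at doubled lag: {r:.2f}")
```

For a real MD trajectory the same ratio test (over many lag pairs) would flag the deviations from normal diffusion that the abstract reports.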
Abstract:
Sentiment analysis has long focused on binary classification of text as either positive or negative. There has been little work on mapping sentiments or emotions into multiple dimensions. This paper studies a Bayesian modeling approach to multi-class sentiment classification and multi-dimensional sentiment distribution prediction. It proposes effective mechanisms to incorporate supervised information, such as labeled feature constraints and document-level sentiment distributions derived from the training data, into model learning. We have evaluated our approach on datasets collected from the confession section of the Experience Project website, where people share their life experiences and personal stories. Our results show that using the latent representation of the training documents derived from our approach as features to build a maximum entropy classifier outperforms other approaches on multi-class sentiment classification. In the more difficult task of multi-dimensional sentiment distribution prediction, our approach gives superior performance compared to a few competitive baselines. © 2012 ACM.
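The final classification step mentioned above (a maximum entropy classifier over latent-representation features) can be sketched as multinomial logistic regression trained by SGD. The toy three-dimensional "latent" features, labels, and hyperparameters below are invented for illustration and do not come from the paper.

```python
import math

def softmax(z):
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def train_maxent(X, y, n_classes, epochs=200, lr=0.5):
    """Multinomial logistic regression (a maximum-entropy classifier) via SGD."""
    n_feats = len(X[0])
    W = [[0.0] * n_feats for _ in range(n_classes)]
    for _ in range(epochs):
        for x, label in zip(X, y):
            p = softmax([sum(w * f for w, f in zip(W[c], x))
                         for c in range(n_classes)])
            for c in range(n_classes):
                g = (1.0 if c == label else 0.0) - p[c]  # gradient of log-likelihood
                W[c] = [w + lr * g * f for w, f in zip(W[c], x)]
    return W

def predict_dist(W, x):
    """Predicted sentiment distribution over the classes."""
    return softmax([sum(w * f for w, f in zip(row, x)) for row in W])

# Hypothetical latent representations for three sentiment classes
X = [[1, 0, 0], [1, 0, 0], [0, 1, 0], [0, 1, 0], [0, 0, 1], [0, 0, 1]]
y = [0, 0, 1, 1, 2, 2]
W = train_maxent(X, y, n_classes=3)
dist = predict_dist(W, [1, 0, 0])
print([round(p, 2) for p in dist])
```

Note the model outputs a full distribution over classes, which is what makes the same machinery usable for the distribution-prediction task the abstract describes, not only for picking the argmax class.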
Abstract:
The concept of data independence designates the techniques that allow data to be changed without affecting the applications that process it. Different structures of information bases require corresponding tools for supporting data independence. One kind of information base, the Multi-dimensional Numbered Information Spaces, is presented in this paper, and data independence in such information bases is discussed.
Abstract:
The article considers the preconditions and main principles of creating virtual laboratories for computer-aided design as tools for interdisciplinary research. The proposed virtual laboratory is best used at the requirements-specification or EFT stage, because it enables fast estimation of the project's feasibility, certain characteristics, and, as a result, the expected benefit of its application. Using these technologies already raises the automation level of the design stages of new devices for different purposes. The proposed computer technology enables specialists from scientific fields such as chemistry, biology, biochemistry, and physics to check the feasibility of creating a device based on the developed sensors. This reduces the time and cost of designing computer devices and systems at the early design stages, for example at the requirements-specification or EFT stage. An important feature of this project is the use of an advanced multi-dimensional access method for organizing the information base of the virtual laboratory.
Abstract:
Dissolved organic matter (DOM) in groundwater and surface water samples from the Florida coastal Everglades was studied using excitation–emission matrix fluorescence modeled through parallel factor analysis (EEM-PARAFAC). DOM in both surface and groundwater from the eastern Everglades S332 basin reflected a terrestrial-derived fingerprint through dominantly higher abundances of humic-like PARAFAC components. In contrast, surface water DOM from northeastern Florida Bay featured a microbial-derived DOM signature based on the higher abundance of microbial humic-like and protein-like components, consistent with its marine source. Surprisingly, groundwater DOM from northeastern Florida Bay reflected a terrestrial-derived source, except for samples from the central Florida Bay well, which mirrored a combination of terrestrial and marine end-member origins. Furthermore, surface water and groundwater displayed effects of different degradation pathways, such as photodegradation and biodegradation, as exemplified by two PARAFAC components seemingly indicative of such degradation processes. Finally, Principal Component Analysis of the EEM-PARAFAC data was able to distinguish and classify most of the samples according to DOM origins and degradation processes experienced, except for a small overlap of S332 surface water and groundwater, implying rather active surface-to-ground water interaction at some sites, particularly during the rainy season. This study highlights that EEM-PARAFAC can be used successfully to trace and differentiate DOM from diverse sources across both horizontal and vertical flow profiles, and as such could be a convenient and useful tool for better understanding hydrological interactions and carbon biogeochemical cycling.
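The PCA classification step can be sketched in a few stdlib-only lines via power iteration for the leading principal component. The sample names and three-component PARAFAC loadings below are invented for illustration; the real study used the full EEM-PARAFAC output.

```python
def mean_center(X):
    """Subtract the per-column mean from each row."""
    n, d = len(X), len(X[0])
    mu = [sum(row[j] for row in X) / n for j in range(d)]
    return [[row[j] - mu[j] for j in range(d)] for row in X]

def first_pc(X, iters=200):
    """Leading principal component via power iteration on X^T X."""
    d = len(X[0])
    v = [1.0] + [0.0] * (d - 1)  # arbitrary non-degenerate start vector
    for _ in range(iters):
        Xv = [sum(row[j] * v[j] for j in range(d)) for row in X]
        w = [sum(X[i][j] * Xv[i] for i in range(len(X))) for j in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Hypothetical loadings: [terrestrial humic-like, microbial humic-like, protein-like]
samples = {
    "S332 surface":     [0.8, 0.1, 0.1],
    "S332 ground":      [0.7, 0.2, 0.1],
    "Florida Bay surf": [0.2, 0.4, 0.4],
}
Xc = mean_center(list(samples.values()))
v = first_pc(Xc)
# Score each sample on PC1; terrestrial vs. marine samples separate along it
scores = {name: sum(a * b for a, b in zip(row, v))
          for name, row in zip(samples, Xc)}
print(scores)
```

With loadings like these, the two S332 samples score close together on PC1 while the Florida Bay sample sits apart, mirroring the source-based grouping the abstract describes.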
Abstract:
Text summarization has been studied for over half a century, but traditional methods process texts empirically and neglect the fundamental characteristics and principles of language use and understanding. Automatic summarization is a desirable technique for processing big data. This reference summarizes previous text summarization approaches in a multi-dimensional category space, introduces a multi-dimensional methodology for research and development, unveils the basic characteristics and principles of language use and understanding, investigates some fundamental mechanisms of summarization, studies dimensions of representations, and proposes a multi-dimensional evaluation mechanism. The investigation extends to incorporating pictures into summaries and to the summarization of videos, graphs, and pictures, and converges to a general summarization method. Further, some basic behaviors of summarization are studied in the complex cyber-physical-social space. Finally, a creative summarization mechanism is proposed as an effort toward the creative summarization of things, which is an open process of interactions among physical objects, data, people, and systems in cyber-physical-social space through a multi-dimensional lens of semantic computing. The insights can inspire research and development in many computing areas.
Abstract:
This symposium describes a multi-dimensional strategy to examine fidelity of implementation in an authentic school district context. An existing large-district peer mentoring program provides an example. The presentation will address development of a logic model to articulate a theory of change; collaborative creation of a data set aligned with essential concepts and research questions; identification of independent, dependent, and covariate variables; issues related to use of big data that include conditioning and transformation of data prior to analysis; operationalization of a strategy to capture fidelity of implementation data from all stakeholders; and ways in which fidelity indicators might be used.
Abstract:
Models of neutrino-driven core-collapse supernova explosions have matured considerably in recent years. Explosions of low-mass progenitors can routinely be simulated in 1D, 2D, and 3D. Nucleosynthesis calculations indicate that these supernovae could be contributors of some lighter neutron-rich elements beyond iron. The explosion mechanism of more massive stars remains under investigation, although first 3D models of neutrino-driven explosions employing multi-group neutrino transport have become available. Together with earlier 2D models and more simplified 3D simulations, these have elucidated the interplay between neutrino heating and hydrodynamic instabilities in the post-shock region that is essential for shock revival. However, some physical ingredients may still need to be added/improved before simulations can robustly explain supernova explosions over a wide range of progenitors. Solutions recently suggested in the literature include uncertainties in the neutrino rates, rotation, and seed perturbations from convective shell burning. We review the implications of 3D simulations of shell burning in supernova progenitors for the ‘perturbations-aided neutrino-driven mechanism,’ whose efficacy is illustrated by the first successful multi-group neutrino hydrodynamics simulation of an 18 solar mass progenitor with 3D initial conditions. We conclude with speculations about the impact of 3D effects on the structure of massive stars through convective boundary mixing.
Abstract:
US suburbs have often been characterized by their relatively low walk accessibility compared to more urban environments, and US urban environments have been characterized by low walk accessibility compared to cities in other countries. Lower overall density in the suburbs implies that activities, if spread out, would have a greater distance between them. But why should activities be spread out instead of developed contiguously? This brief research note builds a positive model for the emergence of contiguous development along “Main Street” to illustrate the trade-offs that result in the built environment we observe. It then suggests some policy interventions to place a “thumb on the scale” to choose which parcels will develop in which sequence to achieve socially preferred outcomes.
Abstract:
Before the rise of Multidimensional Protein Identification Technology (MudPIT), protein and peptide mixtures were resolved using traditional proteomic technologies like gel-based 2D chromatography, which separates proteins by isoelectric point and molecular weight. This technique was tedious and limited, since the characterization of single proteins required isolation of protein gel spots, their subsequent proteolysis, and analysis using matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) mass spectrometry.
Abstract:
Master's degree in Finance