47 results for random graphs
at Universidade Federal do Rio Grande do Norte (UFRN)
Abstract:
This dissertation briefly presents random graphs and the main quantities computed from them. Basic thermodynamic quantities, such as energy and temperature, are associated with some of their characteristics. Approaches commonly used in Statistical Mechanics are employed, and rules describing a time evolution for the graphs are proposed in order to study their ergodicity and a possible thermal equilibrium between them.
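The abstract does not spell out the proposed evolution rules. As a purely illustrative sketch of the kind of dynamics it describes, the following assigns a graph a hypothetical energy (minus its triangle count) and evolves it by Metropolis edge flips at temperature T; the energy function and parameters are assumptions, not the thesis model.

```python
import itertools
import math
import random

def energy(adj, n):
    """Hypothetical energy: minus the number of triangles in the graph."""
    tri = 0
    for i, j, k in itertools.combinations(range(n), 3):
        if adj[i][j] and adj[j][k] and adj[i][k]:
            tri += 1
    return -tri

def metropolis_step(adj, n, T):
    """Toggle a random edge; accept the move with the Boltzmann probability."""
    i, j = random.sample(range(n), 2)
    e_old = energy(adj, n)
    adj[i][j] = adj[j][i] = 1 - adj[i][j]          # propose the flip
    de = energy(adj, n) - e_old
    if de > 0 and random.random() >= math.exp(-de / T):
        adj[i][j] = adj[j][i] = 1 - adj[i][j]      # reject: flip back

# Start from an Erdos-Renyi G(n, p) graph, then run a short thermal evolution.
n, p, T = 20, 0.2, 1.0
adj = [[0] * n for _ in range(n)]
for a in range(n):
    for b in range(a + 1, n):
        if random.random() < p:
            adj[a][b] = adj[b][a] = 1
for _ in range(1000):
    metropolis_step(adj, n, T)
print("final energy:", energy(adj, n))
```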
Abstract:
Various physical systems have dynamics that can be modeled by percolation processes. Percolation is used to study issues ranging from fluid diffusion through disordered media to the fragmentation of a computer network caused by hacker attacks. A common feature of all these systems is the presence of two non-coexistent regimes associated with certain properties of the system. For example, a disordered medium may or may not allow a fluid to flow through it, depending on its porosity. The change from one regime to the other characterizes the percolation phase transition. The standard way of analyzing this transition uses the order parameter, a variable related to some characteristic of the system that is zero in one regime and nonzero in the other. The proposal introduced in this thesis is that this phase transition can be investigated without explicit use of the order parameter, through the Shannon entropy instead. This entropy is a measure of the degree of uncertainty in the information content of a probability distribution. The proposal is evaluated in the context of cluster formation in random graphs, and the method is applied to both classical (Erdős–Rényi) and explosive percolation. It is based on computing the entropy of the cluster-size probability distribution, and the results show that the critical point of the transition is related to the derivatives of the entropy. Furthermore, the difference between the smooth and abrupt character of the classical and explosive percolation transitions, respectively, is reinforced by the observation that the entropy reaches a maximum at the critical point of the classical transition, whereas no such correspondence occurs in explosive percolation.
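As an illustrative sketch, not the thesis code: Erdős–Rényi percolation can be simulated with a union-find structure, tracking the entropy as edges are added. The cluster-size distribution here is taken as the fraction of clusters of each size, one plausible reading of the abstract.

```python
import math
import random

def find(parent, x):
    """Union-find root lookup with path halving."""
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def cluster_entropy(parent, n):
    """Shannon entropy of the cluster-size probability distribution."""
    sizes = {}
    for v in range(n):
        r = find(parent, v)
        sizes[r] = sizes.get(r, 0) + 1
    counts = {}                      # how many clusters have each size
    for s in sizes.values():
        counts[s] = counts.get(s, 0) + 1
    total = sum(counts.values())
    return -sum(c / total * math.log(c / total) for c in counts.values())

n = 10_000
parent = list(range(n))
for m in range(1, 2 * n):            # add random edges one at a time
    a, b = random.randrange(n), random.randrange(n)
    ra, rb = find(parent, a), find(parent, b)
    if ra != rb:
        parent[ra] = rb
    if m % 500 == 0:                 # entropy as a function of edge density m/n
        print(f"{m / n:.2f}  {cluster_entropy(parent, n):.4f}")
```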
Abstract:
A new method to perform TCP/IP fingerprinting is proposed. TCP/IP fingerprinting is the process of identifying a remote machine across a TCP/IP-based computer network. It has many applications related to network security; both intrusion and defense procedures may use it to achieve their objectives. Many known methods perform this process under favorable conditions, but nowadays there are many adversities that reduce identification performance. This work aims at creating a new OS fingerprinting tool that bypasses these current problems. The proposed method is based on attractor reconstruction and neural networks to characterize and classify pseudo-random number generators.
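A minimal sketch of the attractor-reconstruction step named in the abstract, using standard time-delay (Takens) embedding; the embedding dimension and delay are illustrative choices, not parameters from the thesis.

```python
import random

def delay_embed(series, dim=3, tau=1):
    """Time-delay embedding: map a scalar series to points in R^dim."""
    n = len(series) - (dim - 1) * tau
    return [tuple(series[i + k * tau] for k in range(dim)) for i in range(n)]

# Example: embed the output of Python's Mersenne Twister PRNG. A classifier
# would then be trained on features of the reconstructed point cloud.
rng = random.Random(42)
seq = [rng.random() for _ in range(1000)]
points = delay_embed(seq, dim=3, tau=1)
print(points[0])   # (x_0, x_1, x_2): one point of the reconstructed attractor
```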
Abstract:
Ising and m-vector spin-glass models are studied, in the limit of infinite-range interactions, through the replica method. First, the m-vector spin glass is considered in the presence of an external uniform magnetic field, as well as of uniaxial anisotropy fields. The effects of the anisotropies on the phase diagrams, and in particular on the Gabay–Toulouse line, which signals the transverse spin-glass ordering, are investigated. The changes in the Gabay–Toulouse line due to the presence of anisotropy fields that favor spin orientations along the Cartesian axes (m = 2: planar anisotropy; m = 3: cubic anisotropy) are also studied. The antiferromagnetic Ising spin glass, in the presence of uniform and Gaussian random magnetic fields, is investigated through a two-sublattice generalization of the Sherrington–Kirkpatrick model. The effects of the magnetic-field randomness on the phase diagrams of the model are analyzed. Some comparisons of the present results with experimental observations available in the literature are discussed.
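For orientation only (a standard textbook form, not quoted from the thesis), the infinite-range Sherrington–Kirkpatrick model that these generalizations build on reads:

```latex
% Standard infinite-range (Sherrington-Kirkpatrick) Hamiltonian, Ising case;
% the m-vector version replaces S_i = +-1 by m-component spins.
% Added for orientation only -- not quoted from the thesis.
\mathcal{H} = -\sum_{i<j} J_{ij}\, S_i S_j - h \sum_i S_i ,
\qquad
P(J_{ij}) = \sqrt{\frac{N}{2\pi J^{2}}}\,
            \exp\!\left(-\frac{N J_{ij}^{2}}{2 J^{2}}\right)
```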
Abstract:
In this work we study the effects of random biquadratic couplings and random fields in spin-glass models using the replica method. The effect of a random biquadratic coupling is studied in two spin-1 spin-glass models: in the first, the interactions occur between pairs of spins, whereas in the second they occur between p spins and the limit p → ∞ is considered. Both couplings (spin-glass and biquadratic) follow zero-mean Gaussian probability distributions. In the first model, the replica-symmetric assumption reveals that the system presents two phases, namely paramagnetic and spin-glass, separated by a continuous transition line. The stability analysis of the replica-symmetric solution yields, besides the usual instability associated with spin-glass ordering, a new phase due to the random biquadratic couplings between the spins. For the case p → ∞, the replica-symmetric assumption again yields only two phases, namely paramagnetic and quadrupolar. In both phases the spin-glass parameter is zero; moreover, both are shown to be stable under the Almeida–Thouless analysis, although one of them presents negative entropy at low temperatures. Implementing one step of replica-symmetry breaking, we find that a new phase, the biquadratic glass, emerges. In this way we obtain the correct phase diagram, with three first-order transition lines that merge at a common triple point. The effects of random fields are studied in the Sherrington–Kirkpatrick model in the presence of an external random magnetic field following a trimodal distribution, P(hᵢ) = p₊ δ(hᵢ − h₀) + p₀ δ(hᵢ) + p₋ δ(hᵢ + h₀). It is shown that the border of the ferromagnetic phase may present, for conveniently chosen values of p₀ and h₀, first-order phase transitions, as well as tricritical points at finite temperatures. The first-order phase transitions are found to be directly related to the dilution in the fields: their extent is reduced for increasing values of p₀. In fact, the threshold value of p₀ above which all phase transitions are continuous is calculated analytically. The stability analysis of the replica-symmetric solution is performed and the regions of validity of that solution are identified.
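For orientation, an assumed standard form (not quoted from the thesis) of a spin-1 Hamiltonian with random bilinear and biquadratic couplings and a random field is:

```latex
% Assumed standard form of a spin-1 model with random bilinear (J_ij) and
% biquadratic (K_ij) couplings plus a random field h_i, with S_i in {-1,0,+1};
% for orientation only -- not quoted from the thesis.
\mathcal{H} = -\sum_{i<j} J_{ij}\, S_i S_j
              -\sum_{i<j} K_{ij}\, (S_i S_j)^{2}
              -\sum_i h_i S_i
```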
Abstract:
Web services are software accessible via the Internet that provide functionality to be used by applications. Today, it is natural to reuse third-party services to compose new services. This composition process can occur in two styles, called orchestration and choreography. A choreography represents a collaboration between services, each of which knows its partners in the composition, to achieve the desired functionality. An orchestration, on the other hand, has a central process (the orchestrator) that coordinates all application operations. Our work is placed in this latter context, proposing an abstract model for running service orchestrations. For this purpose, a graph reduction machine is defined for the execution of service orchestrations specified in a variant of the PEWS composition language. Moreover, a prototype of this machine (in Java) is built as a proof of concept.
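The abstract does not give PEWS-AM's reduction rules; the following is a generic sketch of graph reduction over a tiny composition algebra with sequence and parallel nodes and hypothetical service names, just to illustrate the evaluation style.

```python
# Illustrative only: generic graph reduction for a toy composition algebra
# with SEQ (sequence) and PAR (parallel) nodes over hypothetical services.

def reduce_node(node):
    """Reduce a composition graph to the list of invocations it performs."""
    kind = node["kind"]
    if kind == "CALL":                      # leaf: invoke one service
        return [node["service"]]
    if kind == "SEQ":                       # reduce children left to right
        out = []
        for child in node["children"]:
            out += reduce_node(child)
        return out
    if kind == "PAR":                       # branch order is free; we simply
        out = []                            # flatten the branches here
        for child in node["children"]:
            out += reduce_node(child)
        return out
    raise ValueError(f"unknown node kind: {kind}")

# seq(callA, par(callB, callC)) -- hypothetical service names
graph = {"kind": "SEQ", "children": [
    {"kind": "CALL", "service": "A"},
    {"kind": "PAR", "children": [
        {"kind": "CALL", "service": "B"},
        {"kind": "CALL", "service": "C"},
    ]},
]}
print(reduce_node(graph))   # ['A', 'B', 'C']
```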
Abstract:
Web services are software units that provide access to one or more resources, supporting the deployment of business processes on the Web. They use well-defined interfaces based on standard Web protocols, making communication possible between entities implemented on different platforms. Owing to these features, Web services can be integrated into service compositions to form more robust, loosely coupled applications. Web services are subject to failures, unwanted situations that may compromise the business process partially or completely. Failures can occur both in the design of compositions and in their execution. It is therefore essential to create mechanisms that make the execution of service compositions more robust and able to handle failures. Specifically, we propose support for fault recovery in service compositions described in the PEWS language and executed on PEWS-AM, a graph reduction machine. To support fault recovery in PEWS-AM, we extend the PEWS language specification and adapt the machine's graph translation and reduction rules. These contributions are made both in the abstract machine model and at the implementation level.
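Again a sketch under assumed semantics, not PEWS-AM's actual rules: one simple way a reduction machine can support fault recovery is a try/handler node whose reduction falls back to a compensation branch when the body faults.

```python
# Illustrative only: a TRY node runs a body and, on failure, reduces a
# handler branch instead. Names and semantics are assumptions.

class ServiceFault(Exception):
    pass

def invoke(service):
    """Stand-in for a real service call; 'bad' always faults."""
    if service == "bad":
        raise ServiceFault(service)
    return service

def reduce_node(node):
    kind = node["kind"]
    if kind == "CALL":
        return [invoke(node["service"])]
    if kind == "SEQ":
        out = []
        for child in node["children"]:
            out += reduce_node(child)
        return out
    if kind == "TRY":                       # the fault-recovery rule
        try:
            return reduce_node(node["body"])
        except ServiceFault:
            return reduce_node(node["handler"])
    raise ValueError(f"unknown node kind: {kind}")

graph = {"kind": "TRY",
         "body": {"kind": "CALL", "service": "bad"},
         "handler": {"kind": "CALL", "service": "fallback"}}
print(reduce_node(graph))   # ['fallback']
```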
Abstract:
One of the mechanisms responsible for anomalous diffusion is the existence of long-range temporal correlations, as in fractional Brownian motion and in walk models with Elephant- and Alzheimer-style memory profiles; in the latter two cases the walker can always "remember" its first steps. The question to be elucidated, and the main motivation of our work, is whether memory of the initial history is a necessary condition for the observation of anomalous diffusion (in this case, superdiffusion). We give a conclusive answer by studying a non-Markovian model in which the walker's memory of the past, at time t, is given by a Gaussian centered at time t/2 whose standard deviation grows linearly as the walker ages. For large widths of the memory profile we find that the model behaves similarly to the Elephant model; in the opposite limit (width → 0), although the walker forgets its early steps, we observe results similar to those of the Alzheimer walk model, in particular the presence of amnestically induced persistence, characterized by certain log-periodic oscillations. We conclude that memory of early times is a necessary condition neither for superdiffusion nor for amnestically induced persistence, which can appear even for memory profiles that forget the initial steps, like the Gaussian profile investigated here.
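A sketch of the model class described, with an assumed parameterization (the repeat probability p and the relative memory width are illustrative, not the thesis values): at each step the walker recalls a past step drawn from a Gaussian centered at t/2 and repeats it with probability p, otherwise reverses it.

```python
import random

def gaussian_memory_walk(steps, p=0.8, rel_width=0.1):
    """Non-Markovian walk: at time t, recall a step near t/2 (Gaussian memory
    profile whose width grows linearly with t) and repeat it with probability
    p, otherwise reverse it. Parameter values are illustrative assumptions."""
    history = [1]                            # step at t = 1 (convention: +1)
    x, traj = 1, [1]
    for t in range(2, steps + 1):
        mu, sd = t / 2, max(rel_width * t, 1e-9)
        k = min(max(int(round(random.gauss(mu, sd))), 1), t - 1)
        recalled = history[k - 1]            # step taken at remembered time k
        step = recalled if random.random() < p else -recalled
        history.append(step)
        x += step
        traj.append(x)
    return traj

traj = gaussian_memory_walk(10_000)
print(traj[-1])   # final displacement; superdiffusion shows in <x^2> vs t
```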
Abstract:
Hebb proposed that synapses between neurons that fire synchronously are strengthened, forming cell assemblies and phase sequences. The former, on a shorter scale, are ensembles of synchronized cells that function transiently as a closed processing system; the latter, on a larger scale, correspond to the sequential activation of cell assemblies able to represent percepts and behaviors. Nowadays, the recording of large neuronal populations allows for the detection of multiple cell assemblies. Within Hebb's theory, the next logical step is the analysis of phase sequences. Here we detected phase sequences as consecutive assembly activation patterns, and then analyzed their graph attributes in relation to behavior. We investigated action potentials recorded from the adult rat hippocampus and neocortex before, during and after novel object exploration (experimental periods). Within assembly graphs, each assembly corresponded to a node, and each edge corresponded to the temporal sequence of consecutive node activations. The sum of all assembly activations was proportional to firing rates, but the activity of individual assemblies was not. The assembly repertoire was stable across experimental periods, suggesting that novel experience does not create new assemblies in the adult rat. Assembly graph attributes, on the other hand, varied significantly across behavioral states and experimental periods, and were separable enough to correctly classify experimental periods (Naïve Bayes classifier; maximum AUROCs ranging from 0.55 to 0.99) and behavioral states (waking, slow wave sleep, and rapid eye movement sleep; maximum AUROCs ranging from 0.64 to 0.98). Our findings agree with Hebb's view that neuronal assemblies correspond to primitive building blocks of representation, nearly unchanged in the adult, while phase sequences are labile across behavioral states and change after novel experience. The results are compatible with a role for phase sequences in behavior and cognition.
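An illustrative sketch, not the study's pipeline: a phase sequence can be turned into a directed graph of consecutive assembly activations, from which simple graph attributes are computed (assembly labels are hypothetical).

```python
# Build a directed assembly graph from a sequence of activations and compute
# a few simple attributes of it. Labels and data are hypothetical.
from collections import defaultdict

activations = ["A1", "A3", "A2", "A1", "A3", "A1", "A2"]

edges = defaultdict(int)
for src, dst in zip(activations, activations[1:]):   # consecutive activations
    edges[(src, dst)] += 1

nodes = set(activations)
n, m = len(nodes), len(edges)
out_deg = defaultdict(int)
for (src, _dst), _w in edges.items():
    out_deg[src] += 1

print("nodes:", n, "edges:", m)
print("density:", m / (n * (n - 1)))     # fraction of possible directed edges
print("mean out-degree:", sum(out_deg.values()) / n)
```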
Abstract:
Graph reduction machines are a traditional technique for implementing functional programming languages: they run programs by transforming graphs through the successive application of reduction rules. Web service composition enables the creation of new Web services from existing ones. BPEL is a workflow-based language for creating Web service compositions, and the industrial and academic standard for this kind of language. Because it is designed to compose Web services, the use of BPEL in a scenario where multiple technologies are needed is problematic: when operations other than Web services must be performed to implement a company's business logic, part of the work is done on an ad hoc basis. Allowing heterogeneous operations to be part of the same workflow may help to implement business processes in a principled way. This work uses a simple variation of the BPEL language to create compositions containing not only Web service operations but also big data tasks or user-defined operations. We define an extensible graph reduction machine that allows the evaluation of BPEL programs and implement this machine as a proof of concept. We present some experimental results.
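One plausible extensibility mechanism, assumed for illustration rather than taken from the thesis: dispatch leaf reduction through a registry, so that new operation kinds (big data tasks, user-defined operations) can be plugged in without changing the machine.

```python
# Hypothetical extensibility mechanism: leaf nodes are reduced by handlers
# looked up in a registry; new operation kinds are added by registration.
HANDLERS = {}

def handler(kind):
    def register(fn):
        HANDLERS[kind] = fn
        return fn
    return register

@handler("invoke")                  # a web service call (stubbed)
def run_invoke(node):
    return f"invoked {node['service']}"

@handler("bigdata")                 # a non-BPEL, user-registered operation
def run_bigdata(node):
    return f"ran job {node['job']}"

def reduce_leaf(node):
    return HANDLERS[node["kind"]](node)

print(reduce_leaf({"kind": "invoke", "service": "quote"}))
print(reduce_leaf({"kind": "bigdata", "job": "wordcount"}))
```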
Abstract:
The objective is to analyze the relationship between risk and the number of stocks in a portfolio for an individual investor when stocks are chosen by the "naive strategy". To this end, we carried out an experiment in which individuals select stocks so as to reproduce this relationship. 126 participants were informed that the risk of the first choice would be the average, across participants, of the standard deviations of portfolios consisting of a single asset, and that the same procedure would be used for portfolios composed of two, three, and so on, up to 30 stocks. They selected the assets they wanted in their portfolios without the support of financial analysis. For comparison, we also ran a hypothetical simulation of 126 investors who selected stocks from the same universe by means of a random number generator. Thus, each real participant is matched with a random hypothetical investor facing the same opportunity. Patterns were observed in the portfolios of individual participants, characterizing the risk curves for subgroups of the sample. Because such groupings are somewhat arbitrary, a more objective measure of behavior was used: a simple linear regression for each participant, predicting the variance of the portfolio as a function of the number of assets. In addition, we conducted a pooled cross-section regression on all observations. The expected pattern occurs on average, but not for most individuals, many of whom effectively "de-diversify" when adding seemingly random securities. Moreover, the results are slightly worse when stocks are selected by a random number generator. This finding challenges the belief that only a small number of securities is necessary for diversification, and shows that it only applies to large samples. The implications are important, since many individual investors hold few stocks in their portfolios.
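A minimal sketch of the random-selection benchmark described, on synthetic returns (the experiment used a real asset universe; all numbers here are illustrative assumptions).

```python
import random
import statistics

random.seed(0)
n_assets, n_obs = 50, 250
# Synthetic daily returns for a hypothetical asset universe.
returns = [[random.gauss(0.0005, 0.02) for _ in range(n_obs)]
           for _ in range(n_assets)]

def portfolio_sd(ids):
    """Std. dev. of an equally weighted portfolio of the chosen assets."""
    port = [sum(returns[i][t] for i in ids) / len(ids) for t in range(n_obs)]
    return statistics.stdev(port)

for k in (1, 2, 5, 10, 30):              # portfolio size
    sds = [portfolio_sd(random.sample(range(n_assets), k)) for _ in range(200)]
    print(k, round(statistics.mean(sds), 5))   # average risk falls as k grows
```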
Abstract:
This work investigates the relationship between entrepreneurship and the incidence of bureaucratic corruption in the Brazilian states and the Federal District. The main hypothesis of this study is that the opening of a business in the Brazilian states is negatively affected by the incidence of corruption. The theoretical framework is divided into entrepreneurship and bureaucratic corruption, with an emphasis on the materialistic (objectivist) perspective of entrepreneurship and on the effects of bureaucratic corruption on entrepreneurial activity. Using regression methods with panel data, we estimated models with pooled data and with fixed and random effects. To measure corruption, we used the General Index of Corruption for the Brazilian states (BOLL, 2010), and to represent entrepreneurship, firm entry per capita by state. Tests (Chow, Hausman and Breusch-Pagan) indicate that the random-effects model is more appropriate, and the preliminary results indicate a positive impact of bureaucratic corruption on entrepreneurial activity, contradicting the expected hypothesis and previous articles on Brazil, while corroborating the proposition of Dreher and Gassebner (2011) that, in countries with heavy regulation, bureaucratic corruption can grease the wheels of entrepreneurship.
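A toy sketch of the pooled-regression setup, with synthetic data and hypothetical variable names (the thesis uses real state-level panel data and also estimates fixed- and random-effects models):

```python
import numpy as np

rng = np.random.default_rng(0)
n_states, n_years = 27, 10               # 26 Brazilian states + DF, yearly panel
corruption = rng.normal(size=(n_states, n_years))            # synthetic index
entry = 0.3 * corruption + rng.normal(scale=0.5, size=(n_states, n_years))

# Pooled OLS: stack all state-year observations and regress firm entry on the
# corruption index (an illustrative positive coefficient is built in above).
y = entry.ravel()
X = np.column_stack([np.ones(y.size), corruption.ravel()])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("intercept, corruption coefficient:", beta)             # ~ [0, 0.3]
```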