926 results for Approximate Bayesian computation


Relevance:

20.00%

Publisher:

Abstract:

Ichthyosporea is a recently recognized group of morphologically simple eukaryotes, many of which cause disease in aquatic organisms. Ribosomal RNA sequence analyses place Ichthyosporea near the divergence of the animal and fungal lineages, but do not resolve its exact phylogenetic position. Some of the best evidence for a specific grouping of animals and fungi (Opisthokonta) has come from elongation factor 1-alpha (EF-1alpha), not only from phylogenetic analysis of sequences but also from the presence or absence of short insertions and deletions. We sequenced the EF-1alpha gene from the ichthyosporean parasite Ichthyophonus irregularis and determined its phylogenetic position using neighbor-joining, parsimony and Bayesian methods. We also sequenced EF-1alpha genes from four chytrids to provide broader representation within fungi. Sequence analyses and the presence of a characteristic 12-amino-acid insertion strongly indicate that I. irregularis is a member of Opisthokonta, but do not resolve whether it is a specific relative of animals or of fungi. However, the EF-1alpha of I. irregularis exhibits a two-amino-acid deletion heretofore reported only among fungi.

Relevance:

20.00%

Publisher:

Abstract:

A new high-throughput and scalable architecture for unified transform coding in H.264/AVC is proposed in this paper. This flexible structure is capable of computing all the 4x4 and 2x2 transforms for Ultra High Definition Video (UHDV) applications (7680x4320 @ 30fps) in real time and at low hardware cost. These significantly high performance levels were demonstrated by implementing several different configurations of the proposed structure in both FPGA and 90 nm ASIC technologies. This experimental evaluation also demonstrated the high area efficiency of the proposed architecture, which, in terms of Data Throughput per Unit of Area (DTUA), is at least 1.5 times more efficient than its most prominent related designs.
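
For context, the arithmetic of the H.264/AVC transforms that this architecture computes is simple to state in software. The sketch below (plain NumPy, with the standard's quantization and scaling stages omitted) models the 4x4 forward integer transform and the 2x2 chroma-DC Hadamard transform; it illustrates the computation only, not the proposed hardware structure:

    import numpy as np

    # Forward 4x4 integer core transform of H.264/AVC: Y = Cf . X . Cf^T
    Cf = np.array([[1,  1,  1,  1],
                   [2,  1, -1, -2],
                   [1, -1, -1,  1],
                   [1, -2,  2, -1]])

    # 2x2 Hadamard transform applied to the chroma DC coefficients
    H2 = np.array([[1,  1],
                   [1, -1]])

    def forward_4x4(block):
        """Forward 4x4 integer transform (quantization/scaling omitted)."""
        return Cf @ block @ Cf.T

    def hadamard_2x2(dc):
        """2x2 Hadamard transform of a chroma DC block."""
        return H2 @ dc @ H2.T

    x = np.arange(16).reshape(4, 4)
    print(forward_4x4(x))
    print(hadamard_2x2(np.array([[1, 2], [3, 4]])))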

Relevance:

20.00%

Publisher:

Abstract:

Master's in Informatics Engineering. Specialization area: Knowledge and Decision Technologies.

Relevance:

20.00%

Publisher:

Abstract:

A new high-performance architecture for computing all the DCT operations adopted in the H.264/AVC and HEVC standards is proposed in this paper. In contrast to other dedicated transform cores, the presented multi-standard transform architecture is based on a completely configurable, scalable and unified structure that is able to compute not only the forward and inverse 8×8 and 4×4 integer DCTs and the 4×4 and 2×2 Hadamard transforms defined in the H.264/AVC standard, but also the 4×4, 8×8, 16×16 and 32×32 integer transforms adopted in HEVC. Experimental results obtained using a Xilinx Virtex-7 FPGA demonstrate the superior performance and hardware-efficiency levels provided by the proposed structure, which outperforms its most prominent related designs by at least 1.8 times. When integrated in a multi-core embedded system, this architecture allows the real-time computation of all the transforms mentioned above for resolutions as high as 8K Ultra High Definition Television (UHDTV) (7680×4320 @ 30fps).
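
For comparison with the H.264/AVC kernels above, the HEVC side of the unified structure can be illustrated with the 4x4 integer core transform; the sketch below is a minimal software model (normalisation shifts omitted) and, again, says nothing about the hardware design itself:

    import numpy as np

    # 4x4 integer core transform matrix of HEVC, a scaled approximation of
    # the DCT-II; the 8x8, 16x16 and 32x32 HEVC matrices are built from the
    # same family of integer basis coefficients.
    T4 = np.array([[64,  64,  64,  64],
                   [83,  36, -36, -83],
                   [64, -64, -64,  64],
                   [36, -83,  83, -36]])

    def hevc_forward_4x4(block):
        """Forward 4x4 HEVC transform: Y = T4 . X . T4^T (the bit-shift
        normalisation stages of the standard are omitted here)."""
        return T4 @ block @ T4.T

    def hevc_inverse_4x4(coeff):
        """Inverse transform via the transpose; T4 is only approximately
        orthogonal, so this recovers the input up to scaling/rounding."""
        return T4.T @ coeff @ T4

    x = np.arange(16).reshape(4, 4)
    print(hevc_forward_4x4(x))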

Relevance:

20.00%

Publisher:

Abstract:

Catastrophic events, such as wars and terrorist attacks, tornadoes and hurricanes, earthquakes, tsunamis, floods and landslides, are always accompanied by a large number of casualties. The size distributions of these casualties have separately been shown to follow approximate power law (PL) distributions. In this paper, we analyze the statistical distributions of the number of victims of catastrophic phenomena, in particular terrorism, and find double PL behavior. This means that the data sets are better approximated by two PLs instead of a single one. We plot the PL parameters corresponding to several events and observe an interesting pattern in the charts, where the lines that connect each pair of points defining the double PLs are almost parallel to each other. A complementary data analysis is performed by computing the entropy. The results reveal relationships hidden in the data that may trigger a future comprehensive explanation of this type of phenomenon.
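
A minimal sketch of the double-PL idea: fit two straight lines to the empirical complementary cumulative distribution in log-log coordinates, split at a break point, and compute a histogram-based Shannon entropy. The function names, the fixed break point x_break and the synthetic Pareto data are illustrative assumptions, not the estimation procedure used in the paper:

    import numpy as np

    def fit_double_power_law(x, x_break):
        """Fit two power laws (straight lines in log-log space) to the
        empirical CCDF of x, split at an assumed break point x_break."""
        xs = np.sort(x)
        ccdf = 1.0 - np.arange(1, len(xs) + 1) / len(xs)
        xs, ccdf = xs[:-1], ccdf[:-1]      # drop last point, where CCDF = 0
        fits = []
        for mask in (xs <= x_break, xs > x_break):
            slope, intercept = np.polyfit(np.log(xs[mask]),
                                          np.log(ccdf[mask]), 1)
            fits.append((slope, intercept))
        return fits    # [(slope_below, intercept), (slope_above, intercept)]

    def shannon_entropy(x, bins=50):
        """Shannon entropy of a histogram estimate of the distribution of x."""
        counts, _ = np.histogram(x, bins=bins)
        p = counts[counts > 0] / counts.sum()
        return -np.sum(p * np.log(p))

    # Toy usage with synthetic heavy-tailed data and a guessed break point.
    rng = np.random.default_rng(0)
    data = rng.pareto(1.5, 5000) + 1.0
    print(fit_double_power_law(data, x_break=5.0))
    print(shannon_entropy(data))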

Relevance:

20.00%

Publisher:

Abstract:

Graphics processors were originally developed for rendering graphics but have recently evolved into an architecture for general-purpose computation. They are also expected to become important parts of embedded systems hardware -- not just for graphics. However, this necessitates the development of appropriate timing analysis techniques, because the techniques developed for CPU scheduling are not applicable. The reason is that we are not interested in how long it takes for any given GPU thread to complete, but rather how long it takes for all of them to complete. We therefore develop a simple method for finding an upper bound on the makespan of a group of GPU threads executing the same program and competing for the resources of a single streaming multiprocessor (whose architecture is based on NVIDIA Fermi, with some simplifying assumptions). We then build upon this method to formulate the derivation of the exact worst-case makespan (and the corresponding schedule) as an optimization problem. Addressing the issue of tractability, we also present a technique for efficiently computing a safe estimate of the worst-case makespan with minimal pessimism, which may be used when finding an exact value would take too long.
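
As a toy illustration of what an upper bound on such a makespan can look like, the sketch below assumes threads are grouped into warps of 32, at most warp_slots warps are resident on the streaming multiprocessor at once (48 on Fermi), and non-resident warps run strictly afterwards. This is a deliberately crude model of the problem, not the analysis developed in the paper:

    import math

    def makespan_upper_bound(n_threads, thread_wcet,
                             warp_size=32, warp_slots=48):
        """Coarse upper bound on the makespan of n_threads identical GPU
        threads on one streaming multiprocessor: warps beyond the number of
        resident warp slots are assumed to run strictly after the resident
        ones finish. A toy model, not the bound derived in the paper."""
        n_warps = math.ceil(n_threads / warp_size)
        batches = math.ceil(n_warps / warp_slots)
        return batches * thread_wcet

    # Toy usage: 4096 threads, each with a worst-case execution time of 1000
    # cycles, yields 3 sequential batches of resident warps.
    print(makespan_upper_bound(n_threads=4096, thread_wcet=1000))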

Relevance:

20.00%

Publisher:

Abstract:

Consider a wireless sensor network (WSN) where a broadcast from a sensor node does not reach all sensor nodes in the network; such networks are often called multihop networks. Sensor nodes take individual sensor readings; in many cases, however, it is relevant to compute aggregated quantities of these readings. In fact, the minimum and maximum of all sensor readings at an instant are often interesting because they indicate abnormal behavior; for example, a very high maximum temperature may indicate that a fire has broken out. In this context, we propose an algorithm for computing the minimum or maximum of the sensor readings in a multihop network. This algorithm has the particularly interesting property that its time complexity does not depend on the number of sensor nodes; only the network diameter and the range of the value domain of the sensor readings matter.
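
The complexity claim can be made concrete with a sketch: binary search over the value domain, where each probe is a flooded yes/no question that the network answers in time proportional to its diameter. The helper name has_reading_geq is an illustrative abstraction, and the sketch is one natural reading of the stated complexity, not necessarily the paper's algorithm:

    def network_max(has_reading_geq, v_min, v_max):
        """Maximum of all sensor readings via binary search over the value
        domain. has_reading_geq(m) stands for a flooded yes/no question
        ("does any node hold a reading >= m?") that a multihop WSN can
        answer in time proportional to its diameter, so the whole search
        takes O(diameter * log2(v_max - v_min)) time, independently of the
        number of nodes."""
        lo, hi = v_min, v_max
        while lo < hi:
            mid = (lo + hi + 1) // 2
            if has_reading_geq(mid):    # some node reports a reading >= mid
                lo = mid
            else:
                hi = mid - 1
        return lo

    # Toy usage: simulate the flooded query against a list of local readings.
    readings = [17, 42, 23, 8, 35]
    print(network_max(lambda m: any(r >= m for r in readings), 0, 100))  # 42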

Relevance:

20.00%

Publisher:

Abstract:

Consider the problem of deciding whether a set of n sporadic message streams meets its deadlines on a Controller Area Network (CAN) bus for a specified priority assignment. It is assumed that the message streams have implicit deadlines and no release jitter. An algorithm to solve this problem is well known, but unfortunately its time complexity is non-polynomial. We present an algorithm with polynomial time complexity for computing an upper bound on the response times. Clearly, if the upper bound on the response time does not exceed the deadline, then all deadlines are met. The pessimism of our approach is proven to be bounded: if the upper bound on the response time exceeds the deadline, then the response time also exceeds the deadline on a CAN network of half the speed.
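
One way to obtain such a polynomial-time (indeed, closed-form) upper bound is the linear response-time bound known from the fixed-priority scheduling literature (in the style of Bini et al.), sketched below with a blocking term added for CAN's non-preemptive messages. This is shown as a plausible instance of the idea; the bound derived in the paper may differ:

    def response_time_upper_bound(C, B, hp):
        """Closed-form upper bound on the worst-case response time of a
        message with transmission time C and blocking time B, where hp is a
        list of (C_j, T_j) pairs for the higher-priority message streams.
        This is the linear-time bound known from fixed-priority analysis
        (Bini et al. style), not necessarily the paper's own bound."""
        U = sum(Cj / Tj for Cj, Tj in hp)   # higher-priority utilisation
        assert U < 1.0, "bound requires higher-priority utilisation < 1"
        numer = C + B + sum(Cj * (1.0 - Cj / Tj) for Cj, Tj in hp)
        return numer / (1.0 - U)

    # Toy usage: three higher-priority streams on a CAN bus (times in ms).
    print(response_time_upper_bound(C=0.5, B=0.5,
                                    hp=[(0.5, 10), (0.5, 20), (1.0, 50)]))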

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a brief history of Western music, from its genesis to serialism and the Darmstadt school. Some mathematical aspects of music are then presented and confronted with music as a form of art. The question is: are these two distinct aspects compatible? Can computers be of real help in automatic composition? The most appealing algorithmic approach is evolutionary computation, as it offers creative potential. Evolutionary Algorithms are therefore introduced, and some results of applying GAs and GPs to music generation are analysed.
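
As a flavor of how a GA can be applied to music generation, the toy sketch below evolves a 16-note melody toward a fitness that simply rewards notes of a major scale; the representation, fitness function and operators are all illustrative choices, not those analysed in the paper:

    import random

    SCALE = {0, 2, 4, 5, 7, 9, 11}      # pitch classes of a major scale

    def fitness(melody):
        """Toy fitness: fraction of notes that fall in the chosen scale."""
        return sum(1 for n in melody if n % 12 in SCALE) / len(melody)

    def evolve(pop_size=30, length=16, generations=100):
        pop = [[random.randrange(48, 72) for _ in range(length)]
               for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            parents = pop[: pop_size // 2]          # truncation selection
            children = []
            for _ in range(pop_size - len(parents)):
                a, b = random.sample(parents, 2)
                cut = random.randrange(1, length)   # one-point crossover
                child = a[:cut] + b[cut:]
                if random.random() < 0.2:           # random-note mutation
                    child[random.randrange(length)] = random.randrange(48, 72)
                children.append(child)
            pop = parents + children
        return max(pop, key=fitness)

    print(evolve())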

Relevance:

20.00%

Publisher:

Abstract:

When considering time-series data of variables describing agent interactions in social neurobiological systems, measures of regularity can provide a global understanding of system behavior. Approximate entropy (ApEn) was introduced as a nonlinear measure for assessing the complexity of a system's behavior by quantifying the regularity of the time series it generates. However, ApEn is not reliable for assessing and comparing the regularity of data series with short or inconsistent lengths, which often occur in studies of social neurobiological systems, particularly in dyadic human movement systems. Here, the authors present two normalized, non-modified measures of regularity derived from the original ApEn that are less dependent on time-series length. The validity of the suggested measures was tested on well-established series (random and sine) prior to their empirical application describing the dyadic behavior of athletes in team games. The authors use one of the normalized ApEn measures to generate 95th-percentile envelopes that can be used to test whether a particular social neurobiological system is highly complex (i.e., generates highly unpredictable time series). The results demonstrate that the suggested measures may be considered valid instruments for measuring and comparing complexity in systems that produce time series of inconsistent lengths.
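
For reference, the original ApEn(m, r) of Pincus, on which the proposed normalized measures are built, can be computed as follows (the normalized variants themselves are not reproduced here):

    import numpy as np

    def approximate_entropy(x, m=2, r=None):
        """Classic approximate entropy ApEn(m, r) of a 1-D series x:
        ApEn = Phi(m) - Phi(m + 1), where Phi(m) is the mean log fraction
        of length-m templates within Chebyshev distance r of each other."""
        x = np.asarray(x, dtype=float)
        N = len(x)
        if r is None:
            r = 0.2 * np.std(x)    # common default tolerance

        def phi(m):
            templates = np.array([x[i:i + m] for i in range(N - m + 1)])
            # Chebyshev distance between every pair of templates
            dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]),
                          axis=2)
            C = np.mean(dist <= r, axis=1)   # self-matches keep C > 0
            return np.mean(np.log(C))

        return phi(m) - phi(m + 1)

    rng = np.random.default_rng(0)
    print(approximate_entropy(np.sin(np.arange(300) * 0.1)))  # regular: low
    print(approximate_entropy(rng.standard_normal(300)))      # random: high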

Relevance:

20.00%

Publisher:

Abstract:

This paper presents the applicability of a reinforcement learning algorithm based on Bayes' theorem. The proposed reinforcement learning algorithm is an advantageous and indispensable tool for ALBidS (Adaptive Learning strategic Bidding System), a multi-agent system whose purpose is to provide decision support to electricity market negotiating players. ALBidS uses a set of different strategies for providing decision support to market players; these strategies are used according to their probability of success in each different context. The approach proposed in this paper uses a Bayesian network to decide on the action most likely to succeed at each time, depending on past events. The performance of the proposed methodology is tested using electricity market simulations in MASCEM (Multi-Agent Simulator of Competitive Electricity Markets), which provides the means for simulating a realistic electricity market environment based on real data from electricity market operators.
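
The idea of picking the strategy with the highest probability of success per context can be illustrated with a toy Beta-Bernoulli model; the class below is a deliberately simplified stand-in and does not reproduce ALBidS's actual Bayesian-network machinery:

    class BayesianStrategySelector:
        """Toy Beta-Bernoulli model: per context, track successes and
        failures of each strategy and pick the one with the highest
        posterior mean success probability. Illustrative only."""

        def __init__(self, strategies, contexts):
            # Beta(1, 1) prior: one pseudo-success and one pseudo-failure
            self.params = {(c, s): [1, 1] for c in contexts
                           for s in strategies}
            self.strategies = strategies

        def choose(self, context):
            def posterior_mean(s):
                wins, losses = self.params[(context, s)]
                return wins / (wins + losses)
            return max(self.strategies, key=posterior_mean)

        def update(self, context, strategy, success):
            self.params[(context, strategy)][0 if success else 1] += 1

    # Toy usage with hypothetical strategy and context names.
    sel = BayesianStrategySelector(["trend", "regression"],
                                   ["peak", "offpeak"])
    sel.update("peak", "trend", success=True)
    print(sel.choose("peak"))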

Relevance:

20.00%

Publisher:

Abstract:

Paper presented at Geo-Spatial Crossroad GI_Forum, Salzburg, Austria.

Relevance:

20.00%

Publisher:

Abstract:

We consider two Cournot firms, one located in the home country and the other in the foreign country, producing substitute goods for consumption in a third country. We suppose that neither the home government nor the foreign firm knows the costs of the home firm, while the foreign firm's cost is common knowledge. We determine the separating sequential equilibrium outputs.
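
As a point of reference for the model, the complete-information Cournot equilibrium with linear inverse demand p = a - b*(q1 + q2) and constant marginal costs c1, c2 has the familiar closed form computed below; the incomplete-information separating equilibrium studied in the paper is not derived here:

    def cournot_equilibrium(a, b, c1, c2):
        """Complete-information Cournot-Nash outputs for linear inverse
        demand p = a - b*(q1 + q2) and constant marginal costs c1, c2,
        obtained by solving the two first-order conditions
        q_i = (a - c_i - b*q_j) / (2*b). Baseline illustration only."""
        q1 = (a - 2 * c1 + c2) / (3 * b)
        q2 = (a - 2 * c2 + c1) / (3 * b)
        return q1, q2

    print(cournot_equilibrium(a=100, b=1, c1=10, c2=20))  # (33.33.., 23.33..)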

Relevance:

20.00%

Publisher:

Abstract:

The paper revisits the convolution operator and addresses its generalization from the perspective of fractional calculus. Two examples demonstrate the feasibility of the concept using analytical expressions and the inverse Fourier transform, for real and complex orders. Two approximate calculation schemes in the time domain are also tested.
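
The Fourier route mentioned above can be illustrated with a standard frequency-domain scheme: multiply the spectrum by (i*omega)**alpha and invert the FFT to obtain a fractional derivative of order alpha. This is a generic textbook construction, not the paper's specific generalization of the convolution operator:

    import numpy as np

    def fractional_derivative_fft(x, alpha, dt=1.0):
        """Fractional derivative of order alpha of a sampled signal x,
        computed by scaling the spectrum with (i*omega)**alpha and
        inverting the FFT (principal branch of the complex power)."""
        n = len(x)
        omega = 2.0 * np.pi * np.fft.fftfreq(n, d=dt)
        multiplier = (1j * omega) ** alpha
        multiplier[0] = 0.0            # suppress the DC singularity
        return np.fft.ifft(multiplier * np.fft.fft(x)).real

    # Half-derivative of a sine wave over one full period.
    t = np.linspace(0, 2 * np.pi, 256, endpoint=False)
    half_derivative = fractional_derivative_fft(np.sin(t), alpha=0.5,
                                                dt=t[1] - t[0])
    print(half_derivative[:4])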