40 results for FEC using Reed-Solomon-like codes


Relevance: 30.00%

Abstract:

One of the major problems when using non-dedicated volunteer resources in a distributed network is the high volatility of these hosts, since they can go offline or become unavailable at any time without control. Furthermore, the use of volunteer resources implies some security issues due to the fact that they are generally anonymous entities about which we know nothing. So, how can we trust someone we do not know? Over the last years an important number of reputation-based trust solutions have been designed to evaluate the participants' behavior in a system. However, most of these solutions are addressed to P2P and ad-hoc mobile networks and may not fit well with other kinds of distributed systems that could take advantage of volunteer resources, such as recent cloud computing infrastructures. In this paper we propose a first approach to the design of an anonymous reputation mechanism for CoDeS [1], a middleware for building fogs where services are deployed using volunteer resources. The participants are reputation clients (RC), a reputation authority (RA) and a certification authority (CA). Users need a valid public key certificate from the CA to register with the RA and obtain the data needed to participate in the system, namely an opaque identifier that we call here a pseudonym, and an initial reputation value that users provide to other users when interacting with each other. The mechanism prevents not only the manipulation of the provided reputation values but also any disclosure of the users' identities to other users or authorities, so anonymity is guaranteed.
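The registration flow described above (certificate from the CA, registration with the RA, pseudonym plus initial reputation) can be sketched as follows. This is a minimal, hypothetical Python illustration: the class and field names, the token-based "certificate" and the absence of real cryptography are assumptions, not the paper's actual protocol.

# Minimal sketch of the registration flow described above (hypothetical names and
# data structures; the paper's actual message formats and crypto are not given here).
import secrets

class CertificationAuthority:
    def issue_certificate(self, user_id):
        # Stands in for a real public key certificate.
        return {"subject": user_id, "serial": secrets.token_hex(8)}

class ReputationAuthority:
    def __init__(self, initial_reputation=0.5):
        self.initial_reputation = initial_reputation
        self.registered = {}            # pseudonym -> reputation value

    def register(self, certificate):
        # The RA only checks that a certificate is present; in the full scheme it
        # never learns which pseudonym is linked to which real identity.
        if "serial" not in certificate:
            raise ValueError("invalid certificate")
        pseudonym = secrets.token_hex(16)      # opaque identifier
        self.registered[pseudonym] = self.initial_reputation
        return pseudonym, self.initial_reputation

ca, ra = CertificationAuthority(), ReputationAuthority()
cert = ca.issue_certificate("alice@example.org")
pseudonym, reputation = ra.register(cert)
print(pseudonym, reputation)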

Relevance: 30.00%

Abstract:

Nowadays, there are several services and applications that allow users to locate and move to different tourist areas using a mobile device. These systems can be used either via the Internet or by downloading an application at specific places such as a visitors' centre. Although such applications are able to facilitate the location of and the search for points of interest, in most cases these services and applications do not meet the needs of each user. This paper aims to provide a solution by studying the main projects, services and applications, their routing algorithms and their treatment of real geographical data on Android mobile devices, focusing on data acquisition and treatment to improve routing searches in off-line environments.
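The abstract does not name a specific routing algorithm, so the following is only a generic baseline: Dijkstra's shortest-path search over a locally stored road graph, the kind of search an off-line mobile application might run. The function name and the toy graph are assumptions for illustration.

# Illustrative baseline only: the abstract surveys routing approaches but does not
# name one. Dijkstra's algorithm over a locally stored road graph is a common
# starting point for off-line route search on a mobile device.
import heapq

def shortest_path(graph, start, goal):
    """graph: dict node -> list of (neighbour, edge_cost)."""
    dist = {start: 0.0}
    prev = {}
    queue = [(0.0, start)]
    while queue:
        d, node = heapq.heappop(queue)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue
        for neighbour, cost in graph.get(node, []):
            nd = d + cost
            if nd < dist.get(neighbour, float("inf")):
                dist[neighbour], prev[neighbour] = nd, node
                heapq.heappush(queue, (nd, neighbour))
    # Reconstruct the route from the predecessor map.
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return list(reversed(path)), dist[goal]

roads = {"A": [("B", 2.0), ("C", 5.0)], "B": [("C", 1.0)], "C": []}
print(shortest_path(roads, "A", "C"))   # (['A', 'B', 'C'], 3.0)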

Relevance: 30.00%

Abstract:

HEMOLIA (a project under the European Community's 7th Framework Programme) is a new-generation Anti-Money Laundering (AML) intelligent multi-agent alert and investigation system which, in addition to traditional financial data, makes extensive use of modern society's huge telecom data source, thereby opening up a new dimension of capabilities to all money laundering fighters (FIUs, LEAs) and Financial Institutes (Banks, Insurance Companies, etc.). This Master's thesis project was carried out at AIA, one of the partners of the HEMOLIA project in Barcelona. The objective of this thesis is to find the clusters in a network built from the financial data. An extensive literature survey has been carried out, and several standard network algorithms have been studied and implemented. The clustering problem is NP-hard, and algorithms such as K-Means and hierarchical clustering have been applied to problems in sociology, evolution, anthropology, etc. However, these algorithms have certain drawbacks which make them very difficult to implement. The thesis suggests (a) a possible improvement to the K-Means algorithm, (b) a novel approach to the clustering problem using genetic algorithms, and (c) a new algorithm for finding the cluster of a node using a genetic algorithm.
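The abstract does not describe the thesis's actual genetic-algorithm design, so the sketch below is only a generic illustration of GA-based clustering: each chromosome assigns every point to one of k clusters, and fitness is the negative within-cluster sum of squared distances. All parameter values and operators here are assumptions.

# Generic illustration of a genetic-algorithm approach to clustering (not the
# thesis's actual encoding or operators).
import random

def fitness(points, assignment, k):
    total = 0.0
    for c in range(k):
        members = [p for p, a in zip(points, assignment) if a == c]
        if not members:
            continue
        cx = sum(x for x, _ in members) / len(members)
        cy = sum(y for _, y in members) / len(members)
        total += sum((x - cx) ** 2 + (y - cy) ** 2 for x, y in members)
    return -total   # higher is better

def ga_cluster(points, k=2, pop_size=30, generations=100, mutation_rate=0.1):
    population = [[random.randrange(k) for _ in points] for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=lambda a: fitness(points, a, k), reverse=True)
        parents = population[: pop_size // 2]          # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = random.sample(parents, 2)
            cut = random.randrange(1, len(points))
            child = p1[:cut] + p2[cut:]                # one-point crossover
            for i in range(len(child)):                # mutation
                if random.random() < mutation_rate:
                    child[i] = random.randrange(k)
            children.append(child)
        population = parents + children
    return max(population, key=lambda a: fitness(points, a, k))

data = [(0, 0), (0, 1), (1, 0), (9, 9), (9, 10), (10, 9)]
print(ga_cluster(data))   # e.g. [0, 0, 0, 1, 1, 1] (cluster labels may be permuted)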

Relevance: 30.00%

Abstract:

Business process designers take into account the resources that the processes will need but, due to the variable cost of certain parameters (such as energy) or other circumstances, this scheduling must be done at business process enactment time. In this report we formalize the energy-aware resource cost, including time- and usage-dependent rates. We also present a constraint programming approach and an auction-based approach to solve the problem, including a comparison of the two approaches and of the proposed algorithms for solving them.
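One natural reading of "time and usage dependent rates" is the cost form below; the report's exact formalization is not reproduced in the abstract, so the symbols are generic assumptions.

\[
  C(a) \;=\; \sum_{r \in R_a} \Bigl( \int_{t_a}^{t_a + d_{a,r}} p_r^{\mathrm{time}}(t)\,\mathrm{d}t
  \;+\; p_r^{\mathrm{usage}} \cdot u_{a,r} \Bigr),
\]

where \(R_a\) is the set of resources allocated to activity \(a\), \(d_{a,r}\) the time resource \(r\) is held, \(p_r^{\mathrm{time}}(t)\) a possibly energy-driven, time-varying rate per unit time, \(p_r^{\mathrm{usage}}\) a rate per unit of usage, and \(u_{a,r}\) the amount of usage.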

Relevance: 30.00%

Abstract:

Purpose: To evaluate the suitability of an improved version of an automatic segmentation method based on geodesic active regions (GAR) for segmenting cerebral vasculature with aneurysms from 3D X-ray reconstruction angiography (3DRA) and time-of-flight magnetic resonance angiography (TOF-MRA) images available in the clinical routine.

Methods: Three aspects of the GAR method have been improved: execution time, robustness to variability in imaging protocols, and robustness to variability in image spatial resolutions. The improved GAR was retrospectively evaluated on images from patients with intracranial aneurysms in the area of the Circle of Willis, imaged with two modalities: 3DRA and TOF-MRA. Images were obtained from two clinical centers, each using different imaging equipment. Evaluation included qualitative and quantitative analyses of the segmentation results on 20 images from 10 patients. The gold standard was built from 660 cross-sections (33 per image) of vessels and aneurysms, manually measured by interventional neuroradiologists. GAR has also been compared to an interactive segmentation method: iso-intensity surface extraction (ISE). In addition, since patients had been imaged with the two modalities, we performed an inter-modality agreement analysis with respect to both the manual measurements and each of the two segmentation methods.

Results: Both GAR and ISE differed from the gold standard within acceptable limits compared to the imaging resolution. GAR (respectively ISE) had an average accuracy of 0.20 (0.24) mm for 3DRA and 0.27 (0.30) mm for TOF-MRA, and a repeatability of 0.05 (0.20) mm. Compared to ISE, GAR had a lower qualitative error in the vessel region and a lower quantitative error in the aneurysm region. The repeatability of GAR was superior to that of manual measurements and ISE. The inter-modality agreement was similar between GAR and the manual measurements.

Conclusions: The improved GAR method outperformed ISE both qualitatively and quantitatively and is suitable for segmenting 3DRA and TOF-MRA images from the clinical routine.

Relevance: 30.00%

Abstract:

Error-correcting codes and matroids have been widely used in the study of ordinary secret sharing schemes. In this paper, the connections between codes, matroids, and a special class of secret sharing schemes, namely multiplicative linear secret sharing schemes (LSSSs), are studied. Such schemes are known to enable multiparty computation protocols secure against general (nonthreshold) adversaries. Two open problems related to the complexity of multiplicative LSSSs are considered in this paper. The first one deals with strongly multiplicative LSSSs. As opposed to the case of multiplicative LSSSs, it is not known whether there is an efficient method to transform an LSSS into a strongly multiplicative LSSS for the same access structure with a polynomial increase of the complexity. A property of strongly multiplicative LSSSs that could be useful in solving this problem is proved. Namely, using a suitable generalization of the well-known Berlekamp–Welch decoder, it is shown that all strongly multiplicative LSSSs enable efficient reconstruction of a shared secret in the presence of malicious faults. The second one is to characterize the access structures of ideal multiplicative LSSSs. Specifically, the open problem considered is to determine whether all self-dual vector space access structures are in this situation. By the aforementioned connection, this in fact constitutes an open problem about matroid theory, since it can be restated in terms of the representability of identically self-dual matroids by self-dual codes. A new concept is introduced, the flat-partition, which provides a useful classification of identically self-dual matroids. Uniform identically self-dual matroids, which are known to be representable by self-dual codes, form one of the classes. It is proved that this property also holds for the family of matroids that, in a natural way, forms the next class in the above classification: the identically self-dual bipartite matroids.
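For reference, the standard notion of multiplicativity from the LSSS literature (generic notation, not reproduced from the paper itself) is the following: an LSSS in which player \(i\) holds share vector \(x_i\) for secret \(s\) and \(y_i\) for secret \(s'\) is multiplicative if there is a fixed recombination vector \(\mathbf{r}\) such that

\[
  s \cdot s' \;=\; \langle \mathbf{r},\, x \diamond y \rangle,
  \qquad x \diamond y := (x_1 \otimes y_1, \ldots, x_n \otimes y_n),
\]

where \(x_i \otimes y_i\) collects all pairwise products of the coordinates of player \(i\)'s two share vectors. The scheme is strongly multiplicative when, for every player set rejected by the adversary structure, the product \(s \cdot s'\) can already be recovered from the local products of the remaining (honest) players alone.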

Relevance: 30.00%

Abstract:

Multiple-input multiple-output (MIMO) techniques have become an essential part of broadband wireless communications systems. For example, the recently developed IEEE 802.16e specifications for broadband wireless access include three MIMO profiles employing 2×2 space-time codes (STCs), and two of these MIMO schemes are mandatory on the downlink of Mobile WiMAX systems. One of these has full rate and the other has full diversity, but neither of them has both of the desired features. The third profile, namely Matrix C, which is not mandatory, is both a full-rate and a full-diversity code, but it has a high decoder complexity. Recently, attention has turned to the decoder complexity issue and, including this in the design criteria, several full-rate STCs have been proposed as alternatives to Matrix C. In this paper, we review these different alternatives and compare them to Matrix C in terms of performance and the corresponding receiver complexities.
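As a point of reference (standard background, not taken from the paper itself), the classical full-diversity 2×2 STC commonly associated with the full-diversity downlink profile is the Alamouti code; with one common convention, rows index the two symbol periods and columns the two transmit antennas:

\[
  \mathbf{X}_{\text{Alamouti}} =
  \begin{pmatrix}
    s_1 & s_2 \\
    -s_2^{*} & s_1^{*}
  \end{pmatrix}.
\]

It transmits two symbols over two channel uses (one symbol per channel use, hence not "full rate" in the 2×2 sense of two symbols per channel use) but achieves full transmit diversity with very simple symbol-by-symbol maximum-likelihood decoding, which illustrates the rate/diversity/complexity trade-off discussed above.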

Relevance: 30.00%

Abstract:

We have reported in a variety of mammalian cells the reversible formation of a filamentous actin (F-actin)-enriched aggresome generated by the actin toxin jasplakinolide (Lázaro-Diéguez et al., J Cell Sci 2008; 121:1415-25). Notably, this F-actin aggresome (FAG) resembles in many aspects the pathological Hirano body, which frequently appears in some diseases such as Alzheimer's and alcoholism. Using selective inhibitors, we examined the molecular and subcellular mechanisms that participate in the clearance of the FAG. Chaperones, microtubules, proteasomes and autophagosomes all actively participate to eliminate the FAG. Here we compile and compare these results and discuss the involvement of each process. Because of its simplicity and high reproducibility, our cellular model could help to test pharmacological agents designed to interfere with the mechanisms involved in the clearance of intracellular bodies and, in particular, of those enriched in F-actin.

Relevance: 30.00%

Abstract:

We present a heuristic method for learning error-correcting output code (ECOC) matrices based on a hierarchical partition of the class space that maximizes a discriminative criterion. To achieve this goal, optimal codeword separation is sacrificed in favor of maximum class discrimination in the partitions. The creation of the hierarchical partition set is performed using a binary tree. As a result, a compact matrix with high discrimination power is obtained. Our method is validated using the UCI database and applied to a real problem, the classification of traffic sign images.
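To make the tree-to-matrix relationship concrete, the sketch below shows how a binary tree over the class set induces a compact ternary ECOC matrix: each internal node contributes one column, with +1 for classes sent to the left child, -1 for classes sent to the right child, and 0 for classes outside that node. The split used here is a placeholder; the paper's discriminative criterion is not reproduced.

# Sketch: an ECOC matrix induced by a binary tree over the classes.
def build_ecoc(classes, columns=None, matrix=None):
    if columns is None:
        columns, matrix = [], {c: [] for c in classes}
    if len(classes) > 1:
        mid = len(classes) // 2                 # placeholder split criterion
        left, right = classes[:mid], classes[mid:]
        col = len(columns)                      # index of the new column
        columns.append((left, right))
        for c in matrix:
            matrix[c].extend([0] * (col + 1 - len(matrix[c])))
            matrix[c][col] = 1 if c in left else (-1 if c in right else 0)
        build_ecoc(left, columns, matrix)       # recurse into both children
        build_ecoc(right, columns, matrix)
    return matrix

print(build_ecoc(["car", "bike", "stop", "yield"]))
# {'car': [1, 1, 0], 'bike': [1, -1, 0], 'stop': [-1, 0, 1], 'yield': [-1, 0, -1]}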

Relevance: 30.00%

Abstract:

A common way to model multiclass classification problems is by means of Error-Correcting Output Codes (ECOCs). Given a multiclass problem, the ECOC technique designs a codeword for each class, where each position of the code identifies the membership of the class in a given binary problem. A classification decision is obtained by assigning the label of the class with the closest codeword. One of the main requirements of the ECOC design is that the base classifier be capable of splitting each subgroup of classes in each binary problem. However, we cannot guarantee that a linear classifier models convex regions. Furthermore, nonlinear classifiers also fail to manage some types of surfaces. In this paper, we present a novel strategy to model multiclass classification problems using subclass information in the ECOC framework. Complex problems are solved by splitting the original set of classes into subclasses and embedding the binary problems in a problem-dependent ECOC design. Experimental results show that the proposed splitting procedure yields better performance when the class overlap or the distribution of the training objects conceals the decision boundaries for the base classifier. The results are even more significant when one has a sufficiently large training set.
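The "closest codeword" decision rule mentioned above can be illustrated with a minimal Hamming-style decoder; the codebook below is just an example (it reuses the tree-induced matrix from the earlier sketch) and other decoding measures exist.

# Minimal ECOC decoding: assign the label whose codeword is closest to the
# vector of binary-classifier outputs (zero entries mean "class not used").
def ecoc_decode(prediction, codebook):
    """prediction: list of +/-1 outputs; codebook: dict label -> codeword."""
    def distance(code):
        return sum(1 for p, c in zip(prediction, code) if c != 0 and p != c)
    return min(codebook, key=lambda label: distance(codebook[label]))

codebook = {"car": [1, 1, 0], "bike": [1, -1, 0],
            "stop": [-1, 0, 1], "yield": [-1, 0, -1]}
print(ecoc_decode([1, -1, 1], codebook))   # -> "bike"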

Relevance: 30.00%

Abstract:

In luminescence dating, the potassium concentration contributes significantly to the dose rate value in the age estimation. Within this study, fine-grain thermoluminescence dating has been applied to sherds of calcareous pottery of known age, excavated at a Roman site in Mallorca, Spain. For those samples that showed signs of severe potassium leaching, according to chemical and mineralogical examination, the thermoluminescence analysis provided overestimated dates. By using the known archaeological age of the samples, a corrected dose rate value can be estimated which provides the potassium concentration averaged over the burial period. Finally, a step-like model can then be used to estimate the fraction of the burial period after which most of the alteration effects took place.
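Schematically, the correction follows from the usual luminescence age equation (generic symbols; the paper's exact dose-rate partition is not given in the abstract):

\[
  \mathrm{Age} = \frac{D_e}{\dot{D}}
  \quad\Longrightarrow\quad
  \dot{D}_{\mathrm{corrected}} = \frac{D_e}{\mathrm{Age}_{\mathrm{known}}},
\]

where \(D_e\) is the equivalent dose measured by TL and \(\dot{D}\) the annual dose rate. The gap between \(\dot{D}_{\mathrm{corrected}}\) and the dose rate computed from the present-day potassium content reflects the average effect of potassium leaching over the burial period.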

Relevance: 30.00%

Abstract:

This paper highlights the role of non-functional information when reusing from a component library. We describe a method for selecting appropriate implementations of Ada packages taking non-functional constraints into account; these constraints model the context of reuse. Constraints take the form of queries using an interface description language called NoFun, which is also used to state non-functional information in Ada packages; query results are trees of implementations, following the import relationships between components. We define two different situations when reusing components, depending on whether we take the library being searched as closed or extendible. The resulting tree of implementations can be manipulated by the user to solve ambiguities, to state default behaviours, and the like. As part of the proposal, we face the problem of computing from code the non-functional information that determines the selection process.
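The sketch below only conveys the general idea of filtering a component library by non-functional attributes that model the context of reuse. NoFun's actual query syntax, the Ada packages, and the tree of implementations along import relationships are not reproduced; the library contents and attribute names here are invented for illustration.

# Purely illustrative filtering of candidate implementations by
# non-functional attributes (hypothetical data, not NoFun syntax).
library = {
    "Stack": [
        {"impl": "StackArray",  "time": "O(1)", "space_kb": 4,  "reentrant": True},
        {"impl": "StackLinked", "time": "O(1)", "space_kb": 16, "reentrant": False},
    ],
}

def select(package, constraints, library):
    """constraints: dict attribute -> predicate over the attribute's value."""
    candidates = library.get(package, [])
    return [c for c in candidates
            if all(pred(c.get(attr)) for attr, pred in constraints.items())]

# "Context of reuse": we need a reentrant implementation using at most 8 KB.
query = {"reentrant": lambda v: v is True, "space_kb": lambda v: v <= 8}
print(select("Stack", query, library))   # -> [{'impl': 'StackArray', ...}]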

Relevance: 30.00%

Abstract:

The computer simulation of reaction dynamics has nowadays reached a remarkable degree of accuracy. Triatomic elementary reactions are rigorously studied in great detail using a considerable variety of quantum dynamics computational tools available to the scientific community. In our contribution we compare the performance of two quantum scattering codes in the computation of reaction cross sections of a triatomic benchmark reaction, the gas-phase reaction Ne + H2+ → NeH+ + H. The computational codes are selected as representative of time-dependent (Real Wave Packet [ ]) and time-independent (ABC [ ]) methodologies. The main conclusion to be drawn from our study is that the two strategies are, to a great extent, not competing but rather complementary. While time-dependent calculations have advantages with respect to the energy range that can be covered in a single simulation, time-independent approaches offer much more detailed information from each single-energy calculation. Further details, such as the calculation of reactivity at very low collision energies or the computational effort needed to account for the Coriolis couplings, are analyzed in this paper.

Relevance: 30.00%

Abstract:

In recent years, a number of zoonotic flaviviruses have emerged worldwide, and wild birds serve as their major reservoirs. Epidemiological surveys of bird populations at various geographical scales can clarify key aspects of the eco-epidemiology of these viruses. In this study, we aimed to explore the presence of flaviviruses in the western Mediterranean by sampling breeding populations of the yellow-legged gull (Larus michahellis), a widely distributed, anthropophilic, and abundant seabird species. For 3 years, we sampled eggs from 19 breeding colonies in Spain, France, Algeria, and Tunisia. First, ELISAs were used to determine whether the eggs contained antibodies against flaviviruses. Second, neutralization assays were used to identify the specific flaviviruses present. Finally, for colonies in which ELISA-positive eggs had been found, chick serum samples and potential vectors, culicid mosquitoes and soft ticks (Ornithodoros maritimus), were collected and analyzed using serology and PCR, respectively. The prevalence of flavivirus-specific antibodies in eggs was highly spatially heterogeneous. In northeastern Spain, on the Medes Islands and in the nearby village of L'Escala, 56% of eggs had antibodies against the flavivirus envelope protein, but were negative for neutralizing antibodies against three common flaviviruses: West Nile, Usutu, and tick-borne encephalitis viruses. Furthermore, little evidence of past flavivirus exposure was obtained for the other colonies. A subset of the Ornithodoros ticks from Medes screened for flaviviral RNA tested positive for a virus whose NS5 gene was 95% similar to that of Meaban virus, a flavivirus previously isolated from ticks of Larus argentatus in western France. All ELISA-positive samples subsequently tested positive for Meaban virus neutralizing antibodies. This study shows that gulls in the western Mediterranean Basin are exposed to a tick-borne Meaban-like virus, which underscores the need to explore the spatial and temporal distribution of this flavivirus as well as its potential pathogenicity for animals and humans.

Relevance: 30.00%

Abstract:

Standard practice in wave-height hazard analysis often pays little attention to the uncertainty of the assessed return periods and occurrence probabilities. This fact favors the opinion that, when large events happen, the hazard assessment should change accordingly. However, the uncertainty of the hazard estimates is normally able to hide the effect of those large events. This is illustrated using data from the Mediterranean coast of Spain, where the last years have been extremely disastrous. Thus, it is possible to compare the hazard assessment based on data prior to those years with the analysis that includes them. With our approach, no significant change is detected when the statistical uncertainty is taken into account. The hazard analysis is carried out with a standard model. Time-occurrence of events is assumed to be Poisson distributed. The wave height of each event is modelled as a random variable whose upper tail follows a Generalized Pareto Distribution (GPD). Moreover, wave heights are assumed independent from event to event and also independent of their occurrence in time. A threshold for excesses is assessed empirically. The other three parameters (the Poisson rate and the shape and scale parameters of the GPD) are jointly estimated using Bayes' theorem. The prior distribution accounts for physical features of ocean waves in the Mediterranean Sea and for experience with these phenomena. The posterior distribution of the parameters allows one to obtain posterior distributions of other derived parameters, such as occurrence probabilities and return periods. Predictive distributions are also available. Computations are carried out using the program BGPE v2.0.
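In compact form, the standard Poisson–GPD model described above links exceedance probabilities and return periods as follows (generic notation, not taken from the report):

\[
  P(H > h \mid H > u) = \Bigl(1 + \xi\,\frac{h-u}{\sigma}\Bigr)^{-1/\xi},
  \qquad
  T(h) = \frac{1}{\lambda\,P(H > h \mid H > u)},
\]

where \(u\) is the empirically assessed threshold, \(\sigma > 0\) and \(\xi\) are the GPD scale and shape parameters, \(\lambda\) is the Poisson rate of events exceeding \(u\), and \(T(h)\) is the return period of wave height \(h\). Bayesian inference places a joint prior on \((\lambda, \sigma, \xi)\) and propagates the posterior to \(T(h)\) and to the occurrence probabilities, which is how the statistical uncertainty enters the hazard assessment.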