628 results for "Algorismes paral·lels" (parallel algorithms)
Abstract:
The major objective of this thesis is to describe and analyse how a rail carrier is engaged in an intermodal freight transportation network through its role and position. Because the role as a conceptualisation has many parallels with the position, both phenomena are evaluated theoretically and empirically. VR Cargo (a strategic business unit of the Finnish railway company VR Ltd.) was chosen as the focal firm, surrounded by the actors of the focal net. Because networks are sets of relationships rather than sets of actors, it is essential to describe the dimensions of the relationships created over time, thus having a past, a present and a future. Roles are created during the long common history shared by the actors, especially where IM networks are concerned. The presence of roles is embedded in the tasks, and the future is anchored to the expectations. Furthermore, in this study role refers to network dynamics and to incremental and radical changes in the network, in a similar way as position refers to stability and to the influences of bonded structures. The main purpose of the first part of the study was to examine how the two distinctive views that hold a dominant position in modern logistics, the network view (particularly the IMP-based network approach) and the managerial view (represented by Supply Chain Management), differ, especially when intermodalism is under consideration. In this study intermodalism was defined as a form of interorganisational behaviour characterized by the physical movement of unitized goods in Intermodal Transport Units, using more than one mode, as performed by the net of operators. At this stage the study relies mainly on theoretical evaluation, broadened by some discussions with practitioners. This is essential, because the continuous dialogue between theory and practice is highly emphasized. Some managerial implications are discussed on the basis of the theoretical examination, and a tentative model for empirical analysis in subsequent research is suggested. The empirical investigation, which relies on interviews with the members of the focal net, shows that the major role of the focal company in the network is that of a common carrier. This role has behavioural and functional characteristics, such as an executive's disclosure expressing strategic will, coupled with stable and predictable managerial and organisational behaviour. Most important is the notion that the focal company is neutral towards all the other operators and willing to enhance and strengthen collaboration with all the members of the IM network. This also means that all accounts are intended to be treated equally in terms of customer satisfaction. Moreover, the adjustments intensify the adopted role. However, the focal company is also obliged to sustain its role, as it still holds a government-granted exclusive right to railway operations on domestic tracks. In addition, the roles of dominator, principal, partner, subcontractor and integrator were present, appearing either in a dyadic relationship or in a net(work) context. To reveal the different roles, a dualistic interpretation of the concept of role/position was employed.
Abstract:
Neuronal dynamics are fundamentally constrained by the underlying structural network architecture, yet many of the details of this synaptic connectivity are still unknown, even in neuronal cultures in vitro. Here we extend a previous approach based on information theory, the Generalized Transfer Entropy, to the reconstruction of connectivity in simulated neuronal networks of both excitatory and inhibitory neurons. We show that, owing to the model-free nature of the developed measure, both kinds of connections can be reliably inferred if the average firing rate between synchronous burst events exceeds a small minimum frequency. Furthermore, we suggest, based on systematic simulations, that even lower spontaneous inter-burst rates could be raised to meet the requirements of our reconstruction algorithm by applying a weak, spatially homogeneous stimulation to the entire network. By combining multiple recordings of the same in silico network before and after pharmacologically blocking inhibitory synaptic transmission, we then show how it becomes possible to infer with high confidence the excitatory or inhibitory nature of each individual neuron.
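As a rough illustration of the kind of measure involved, the sketch below estimates plain pairwise transfer entropy between two binarized spike trains with history length 1. It is not the paper's Generalized Transfer Entropy (which additionally conditions on the network burst state); the binning scheme, signal model and all names are illustrative assumptions.

```python
import numpy as np

def transfer_entropy(x, y):
    """Plain pairwise transfer entropy TE(X -> Y), in bits, for binary
    spike trains with history length 1. A didactic sketch only."""
    x, y = np.asarray(x, int), np.asarray(y, int)
    # Joint counts over (y_{t+1}, y_t, x_t).
    counts = np.zeros((2, 2, 2))
    for yn, yp, xp in zip(y[1:], y[:-1], x[:-1]):
        counts[yn, yp, xp] += 1
    p = counts / counts.sum()
    p_yp_xp = p.sum(axis=0)      # p(y_t, x_t)
    p_yn_yp = p.sum(axis=2)      # p(y_{t+1}, y_t)
    p_yp = p.sum(axis=(0, 2))    # p(y_t)
    te = 0.0
    for yn in (0, 1):
        for yp in (0, 1):
            for xp in (0, 1):
                if p[yn, yp, xp] > 0:
                    num = p[yn, yp, xp] / p_yp_xp[yp, xp]   # p(y_{t+1} | y_t, x_t)
                    den = p_yn_yp[yn, yp] / p_yp[yp]        # p(y_{t+1} | y_t)
                    te += p[yn, yp, xp] * np.log2(num / den)
    return te

# A driven pair should show higher TE in the causal direction.
rng = np.random.default_rng(0)
x = rng.random(10000) < 0.1
y = np.roll(x, 1) | (rng.random(10000) < 0.02)  # y follows x with a 1-step lag
print(transfer_entropy(x, y), transfer_entropy(y, x))
```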
Abstract:
In recent years, elliptic curve cryptography has acquired growing importance, to the point of now forming part of several industrial standards. Although elliptic-curve variants of classical cryptosystems such as RSA have been designed, their greatest interest lies in their application to cryptosystems based on the Discrete Logarithm Problem, such as those of ElGamal type. In this setting, elliptic cryptosystems guarantee the same security as those built over the multiplicative group of a prime finite field, but with much shorter key lengths. We therefore present the good properties of these cryptosystems, as well as the basic requirements for a curve to be cryptographically useful, which are closely related to its cardinality. We review some methods for discarding curves that are not cryptographically useful, as well as others for obtaining good curves from a given one. Finally, we describe some applications, such as their use in smart cards and RFID systems, and conclude with some recent advances in this field.
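To make the discrete-logarithm setting concrete, here is a toy sketch of the group law and scalar multiplication on an elliptic curve over a small prime field, the primitive underlying ElGamal-type elliptic cryptosystems. The curve parameters, base point and key below are illustrative assumptions only; real deployments use standardized curves of roughly 256 bits.

```python
# Toy short-Weierstrass curve y^2 = x^3 + a*x + b over GF(p).
p, a, b = 97, 2, 3
O = None  # point at infinity (group identity)

def add(P, Q):
    """Group law on the curve (affine coordinates)."""
    if P is O: return Q
    if Q is O: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return O                                            # P + (-P) = O
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p    # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p           # chord slope
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def mul(k, P):
    """Scalar multiplication k*P by double-and-add; recovering k from
    (P, k*P) is the elliptic-curve discrete logarithm problem."""
    R = O
    while k:
        if k & 1:
            R = add(R, P)
        P = add(P, P)
        k >>= 1
    return R

G = (3, 6)          # a point on y^2 = x^3 + 2x + 3 over GF(97)
priv = 20           # private key (illustrative)
pub = mul(priv, G)  # public key; the ECDLP protects priv
print(pub)
```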
Abstract:
Background: Parallel T-Coffee (PTC) was the first parallel implementation of the T-Coffee multiple sequence alignment tool. It is based on MPI and RMA mechanisms. Its purpose is to reduce the execution time of large-scale sequence alignments. It can be run on distributed-memory clusters, allowing users to align data sets consisting of hundreds of proteins within a reasonable time. However, most of the potential users of this tool are not familiar with the use of grids or supercomputers. Results: In this paper we show how PTC can be easily deployed and controlled on a supercomputer architecture using a web portal developed with Rapid. Rapid is a tool for efficiently generating standardized portlets for a wide range of applications, and the approach described here is generic enough to be applied to other applications or to deploy PTC on different HPC environments. Conclusions: The PTC portal allows users to upload a large number of sequences to be aligned by the parallel version of T-Coffee that could not be aligned on a single machine due to memory and execution time constraints. The web portal provides a user-friendly solution.
Abstract:
This article introduces a new interface for T-Coffee, a consistency-based multiple sequence alignment program. This interface provides easy and intuitive access to the most popular functionality of the package, including the default T-Coffee mode for protein and nucleic acid sequences, the M-Coffee mode that combines the output of other aligners, and the template-based modes of T-Coffee that deliver high-accuracy alignments using structural or homology-derived templates. The three available template modes are Expresso, for the alignment of proteins with known 3D structures; R-Coffee, to align RNA sequences with conserved secondary structures; and PSI-Coffee, to accurately align distantly related sequences using homology extension. The new server benefits from recent improvements to the T-Coffee algorithm, can align up to 150 sequences of up to 10,000 residues each, and is available from both http://www.tcoffee.org and its main mirror http://tcoffee.crg.cat.
Abstract:
We present a new branch-and-bound algorithm for weighted Max-SAT, called Lazy, which incorporates original data structures and inference rules, as well as a lower bound of better quality. We provide experimental evidence that our solver is very competitive and outperforms some of the best-performing Max-SAT and weighted Max-SAT solvers on a wide range of instances.
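The abstract does not detail Lazy's data structures, inference rules or lower bound; as a generic illustration of the underlying scheme, here is a minimal branch-and-bound for weighted Max-SAT using the simplest lower bound (the weight of clauses already falsified by the partial assignment). All names and the clause encoding are illustrative.

```python
# Minimal branch-and-bound for weighted Max-SAT: minimise the total
# weight of falsified clauses. A generic sketch only; the Lazy solver
# adds dedicated data structures, inference rules and a stronger bound.
def solve(clauses, n_vars):
    """clauses: list of (weight, clause); a clause is a tuple of
    non-zero ints, positive = variable, negative = its negation."""
    best = sum(w for w, _ in clauses)  # trivial upper bound

    def lower_bound(assign):
        # Weight of clauses already falsified: every literal assigned false.
        lb = 0
        for w, cl in clauses:
            if all(abs(l) in assign and assign[abs(l)] != (l > 0) for l in cl):
                lb += w
        return lb

    def branch(var, assign):
        nonlocal best
        lb = lower_bound(assign)
        if lb >= best:
            return                     # prune this subtree
        if var > n_vars:
            best = lb                  # complete assignment: lb is exact
            return
        for value in (True, False):
            assign[var] = value
            branch(var + 1, assign)
            del assign[var]

    branch(1, {})
    return best

# (x1 or x2) w=3, (-x1) w=2, (-x2) w=2, (x1 or -x2) w=1 -> optimum 2
print(solve([(3, (1, 2)), (2, (-1,)), (2, (-2,)), (1, (1, -2))], 2))
```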
Abstract:
The goal of this work is to create a statistical model, based only on easily computable parameters of a CSP instance, that predicts the runtime behaviour of the solving algorithms and lets us choose the best algorithm for the problem. Although it seems that the obvious choice should be MAC, experimental results obtained so far show that, with large numbers of variables, other algorithms perform much better, especially for hard problems in the phase transition region.
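A hedged sketch of the kind of per-algorithm runtime model the abstract describes: fit a simple regression from cheap instance features to observed runtimes, then pick the algorithm with the smallest prediction. The feature set, model form, solver names and all numbers are illustrative assumptions, not the work's actual model.

```python
import numpy as np

# Illustrative features of a binary CSP instance (an assumption; the
# actual parameters used in the work may differ).
def features(n_vars, n_constraints, domain_size, tightness):
    density = n_constraints / (n_vars * (n_vars - 1) / 2)
    return np.array([1.0, n_vars, domain_size, density, tightness])

def fit(X, runtimes):
    """Least-squares fit of log-runtime against the features."""
    coef, *_ = np.linalg.lstsq(X, np.log(runtimes), rcond=None)
    return coef

def pick_solver(models, x):
    """Choose the algorithm with the smallest predicted runtime."""
    return min(models, key=lambda name: x @ models[name])

# Toy training data: feature rows and measured runtimes per solver.
X = np.array([features(50, 120, 8, 0.3), features(200, 900, 8, 0.5),
              features(400, 3000, 8, 0.6)])
models = {"MAC": fit(X, np.array([0.2, 5.0, 300.0])),
          "FC":  fit(X, np.array([0.4, 3.0, 60.0]))}
print(pick_solver(models, features(300, 2000, 8, 0.55)))
```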
Abstract:
The analysis of the shape of excitation-emission matrices (EEMs) is a relevant tool for exploring the origin, transport and fate of dissolved organic matter (DOM) in aquatic ecosystems. Within this context, the decomposition of EEMs is acquiring notable relevance. A simple mathematical algorithm that automatically deconvolves individual EEMs is described, creating new possibilities for the comparison of DOM fluorescence properties and of EEMs that are very different from each other. A mixture-model approach is adopted to decompose complex surfaces into sub-peaks. The Laplacian operator and the Nelder-Mead optimisation algorithm are implemented to identify and automatically locate potential peaks in the EEM landscape. The EEMs of a simple artificial mixture of fluorophores and of DOM samples collected in a Mediterranean river are used to describe the model application and to illustrate a strategy that optimises the search for the best output.
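A minimal sketch of the pipeline the abstract outlines: seed candidate peaks where the Laplacian of the surface is strongly negative, then refine a mixture of sub-peaks by Nelder-Mead on the residual. Gaussian sub-peaks, the synthetic two-peak EEM and all names are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np
from scipy.ndimage import laplace, minimum_filter
from scipy.optimize import minimize

def gauss2d(ex, em, x0, y0, sx, sy, amp):
    return amp * np.exp(-((ex - x0) ** 2 / (2 * sx ** 2)
                          + (em - y0) ** 2 / (2 * sy ** 2)))

# Synthetic two-fluorophore EEM (illustrative stand-in for a measured one).
ex, em = np.meshgrid(np.arange(240, 451, 5.0), np.arange(300, 601, 5.0))
eem = (gauss2d(ex, em, 275, 340, 15, 25, 1.0)
       + gauss2d(ex, em, 340, 440, 20, 30, 0.6))

# Seed candidate peaks where the Laplacian is most negative: surface
# maxima show up as strong negative-curvature spots.
lap = laplace(eem)
mask = (lap == minimum_filter(lap, size=9)) & (lap < 0.2 * lap.min())
seeds = [(ex[i, j], em[i, j], eem[i, j]) for i, j in zip(*np.where(mask))]

# Refine the mixture by Nelder-Mead on the residual sum of squares.
def residual(theta):
    model = sum(gauss2d(ex, em, *theta[k:k + 5])
                for k in range(0, len(theta), 5))
    return ((eem - model) ** 2).sum()

theta0 = np.concatenate([[x0, y0, 15.0, 25.0, a] for x0, y0, a in seeds])
fit = minimize(residual, theta0, method="Nelder-Mead",
               options={"maxiter": 20000})
print(fit.x.reshape(-1, 5))  # one (ex, em, sx, sy, amp) row per sub-peak
```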
Abstract:
Langevin equations of Ginzburg-Landau form, with multiplicative noise, are proposed to study the effects of fluctuations in domain growth. These equations are derived from a coarse-grained methodology. A Cahn-Hilliard-Cook linear stability analysis predicts some effects in the transient regime. We also derive numerical algorithms for the computer simulation of these equations. The numerical results corroborate the analytical predictions of the linear analysis. We also present simulation results for spinodal decomposition at large times.
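A minimal sketch of the type of numerical scheme involved: explicit Euler-Maruyama integration (Ito interpretation) of a 1-D Ginzburg-Landau-style Langevin equation, dphi/dt = phi - phi^3 + D d2phi/dx2 + g(phi) xi(t). The discretization, the noise coupling g(phi) and all parameters are illustrative assumptions, not the algorithms derived in the paper.

```python
import numpy as np

# Euler-Maruyama integration of a 1-D Ginzburg-Landau Langevin equation
# with multiplicative noise, as an illustrative sketch; the paper derives
# its own coarse-grained algorithms for this class of equations.
rng = np.random.default_rng(1)
N, dx, dt, D, eps = 256, 1.0, 0.01, 1.0, 0.1
phi = 0.01 * rng.standard_normal(N)       # quench near phi = 0

def laplacian(f):
    # Periodic finite-difference Laplacian.
    return (np.roll(f, 1) + np.roll(f, -1) - 2 * f) / dx**2

def g(f):
    return eps * np.sqrt(1.0 + f**2)      # example multiplicative coupling

for step in range(20000):                 # integrate to t = 200
    noise = rng.standard_normal(N) * np.sqrt(dt / dx)
    phi = phi + dt * (phi - phi**3 + D * laplacian(phi)) + g(phi) * noise

# Domains of phi near +/-1 signal spinodal decomposition at large times.
print(phi.min(), phi.max(), np.mean(np.abs(phi)))
```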
Abstract:
In this paper we design and develop several filtering strategies for the analysis of data generated by a resonant-bar gravitational wave (GW) antenna, with the goal of assessing the presence (or absence) therein of long-duration monochromatic GW signals, as well as the amplitude and frequency of any such signals, within the sensitivity band of the detector. Such signals are most likely generated in the fast rotation of slightly asymmetric spinning stars. We develop practical procedures, together with a study of their statistical properties, which provide useful information on the performance of each technique. The selection of candidate events is then established according to threshold-crossing probabilities, based on the Neyman-Pearson criterion. In particular, we show that our approach, based on phase estimation, presents a better signal-to-noise ratio than does pure spectral analysis, the most common approach.
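As a toy illustration of threshold selection in the Neyman-Pearson spirit (fix the false-alarm probability, then flag frequency bins whose power exceeds the implied threshold), applied to plain spectral analysis of a monochromatic signal in white noise. This illustrates the spectral-analysis baseline, not the paper's better-performing phase-estimation filter; all parameters are illustrative.

```python
import numpy as np

# Detect a long-duration monochromatic signal via the periodogram,
# thresholding at a chosen false-alarm probability.
rng = np.random.default_rng(2)
fs, T, f0, A = 1024.0, 64.0, 137.0, 0.05
n = int(fs * T)
t = np.arange(n) / fs
x = A * np.sin(2 * np.pi * f0 * t) + rng.standard_normal(n)

# For unit-variance white Gaussian noise, 2|X_k|^2 / n is approximately
# chi-squared with 2 degrees of freedom in each bin.
power = 2.0 * np.abs(np.fft.rfft(x)) ** 2 / n
freqs = np.fft.rfftfreq(n, 1 / fs)

p_fa = 1e-3                        # chosen overall false-alarm probability
n_bins = len(power)
alpha = 1 - (1 - p_fa) ** (1 / n_bins)   # per-bin false-alarm probability
threshold = -2.0 * np.log(alpha)         # chi-squared(2) tail: P(u>th)=alpha
candidates = freqs[power > threshold]
print(threshold, candidates)       # expect a crossing near f0 = 137 Hz
```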
Abstract:
In this paper, we show how business model modelling can be connected to IT infrastructure, drawing parallels from enterprise architecture models such as ArchiMate. We then show how the proposed visualization based on enterprise architecture, with a strong focus on business model strategy, can help IT alignment, at both the business model and the IT infrastructure level.
Abstract:
BACKGROUND: Hybridization between incipient species is expected to become progressively limited as their genetic divergence increases and reproductive isolation proceeds. Amphibian radiations and their secondary contact zones are useful models to infer the timeframes of speciation, but empirical data from natural systems remains extremely scarce. Here we follow this approach in the European radiation of tree frogs (Hyla arborea group). We investigated a natural hybrid zone between two lineages (Hyla arborea and Hyla orientalis) of Mio-Pliocene divergence (~5 My) for comparison with other hybrid systems from this group. RESULTS: We found concordant geographic distributions of nuclear and mitochondrial gene pools, and replicated narrow transitions (~30 km) across two independent transects, indicating an advanced state of reproductive isolation and potential local barriers to dispersal. This result parallels the situation between H. arborea and H. intermedia, which share the same amount of divergence with H. orientalis. In contrast, younger lineages show much stronger admixture at secondary contacts. CONCLUSIONS: Our findings corroborate the negative relationship between hybridizability and divergence time in European tree frogs, where 5 My are necessary to achieve almost complete reproductive isolation. Speciation seems to progress homogeneously in this radiation, and might thus be driven by gradual genome-wide changes rather than single speciation genes. However, the timescale differs greatly from that of other well-studied amphibians. General assumptions on the time necessary for speciation based on evidence from unrelated taxa may thus be unreliable. In contrast, comparative hybrid zone analyses within single radiations such as our case study are useful to appreciate the advance of speciation in space and time.
Abstract:
Dissolved organic matter (DOM) is a complex mixture of organic compounds, ubiquitous in marine and freshwater systems. Fluorescence spectroscopy, by means of excitation-emission matrices (EEMs), has become an indispensable tool to study DOM sources, transport and fate in aquatic ecosystems. However, the statistical treatment of large and heterogeneous EEM data sets still represents an important challenge for biogeochemists. Recently, the Self-Organising Map (SOM) has been proposed as a tool to explore patterns in large EEM data sets. SOM is a pattern recognition method that clusters input EEMs and reduces their dimensionality without relying on any assumption about the data structure. In this paper, we show how SOM, coupled with a correlation analysis of the component planes, can be used both to explore patterns among samples and to identify individual fluorescence components. We analysed a large and heterogeneous EEM data set, including samples from a river catchment collected under a range of hydrological conditions, along a 60-km downstream gradient, and under the influence of different degrees of anthropogenic impact. According to our results, chemical industry effluents appeared to have unique and distinctive spectral characteristics. On the other hand, river samples collected under flash-flood conditions showed homogeneous EEM shapes. The correlation analysis of the component planes suggested the presence of four fluorescence components, consistent with DOM components previously described in the literature. A remarkable strength of this methodology is that outlier samples appeared naturally integrated in the analysis. We conclude that SOM coupled with a correlation analysis procedure is a promising tool for studying large and heterogeneous EEM data sets.
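A compact sketch of the SOM step itself: online training of a small codebook grid in pure NumPy. Treating unfolded EEMs as input vectors follows the abstract, but the grid size, learning schedule and all concrete values are illustrative assumptions.

```python
import numpy as np

def train_som(data, grid=(8, 8), iters=5000, lr0=0.5, sigma0=3.0, seed=0):
    """Online SOM training on row vectors (e.g. EEMs unfolded into 1-D
    vectors). Returns the codebook of shape (gx, gy, n_features)."""
    rng = np.random.default_rng(seed)
    gx, gy = grid
    W = rng.random((gx, gy, data.shape[1]))
    # Grid coordinates, used by the neighbourhood function.
    coords = np.stack(np.meshgrid(np.arange(gx), np.arange(gy),
                                  indexing="ij"), axis=-1).astype(float)
    for t in range(iters):
        x = data[rng.integers(len(data))]
        # Best-matching unit: codebook vector closest to the sample.
        d = ((W - x) ** 2).sum(axis=2)
        bmu = np.unravel_index(d.argmin(), d.shape)
        # Exponentially decaying learning rate and neighbourhood radius.
        lr = lr0 * np.exp(-t / iters)
        sigma = sigma0 * np.exp(-t / iters)
        h = np.exp(-((coords - bmu) ** 2).sum(axis=2) / (2 * sigma ** 2))
        W += lr * h[..., None] * (x - W)
    return W

# Toy input: 200 "samples" of 30-dimensional vectors standing in for
# unfolded EEMs (purely illustrative).
data = np.random.default_rng(1).random((200, 30))
codebook = train_som(data)
print(codebook.shape)   # (8, 8, 30)
```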
Abstract:
This paper describes an evaluation framework that allows a standardized and quantitative comparison of IVUS lumen and media segmentation algorithms. The framework was introduced at the MICCAI 2011 Computing and Visualization for (Intra)Vascular Imaging (CVII) workshop, comparing the results of the eight teams that participated. We describe the available database, comprising multi-center, multi-vendor and multi-frequency IVUS datasets, their acquisition, the creation of the reference standard and the evaluation measures. The approaches address segmentation of the lumen, the media, or both borders; semi- or fully-automatic operation; and 2-D vs. 3-D methodology. Three performance measures for quantitative analysis are proposed. The results of the evaluation indicate that segmentation of the vessel lumen and media is possible with an accuracy comparable to manual annotation when semi-automatic methods are used, and that encouraging results can also be obtained with fully-automatic segmentation. The analysis performed in this paper also highlights the challenges in IVUS segmentation that remain to be solved.
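The abstract does not spell out the three measures; as a generic illustration of segmentation-versus-reference evaluation, the sketch below computes two measures commonly used for this purpose (Jaccard overlap and the Hausdorff distance between binary masks). The choice of measures and all names are assumptions, not the paper's definitions.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def jaccard(seg, ref):
    """Area overlap: |seg & ref| / |seg | ref|, in [0, 1]."""
    seg, ref = seg.astype(bool), ref.astype(bool)
    return (seg & ref).sum() / (seg | ref).sum()

def hausdorff(seg, ref, spacing=1.0):
    """Symmetric Hausdorff distance between the two masks, via
    Euclidean distance transforms of their complements."""
    d_to_ref = distance_transform_edt(~ref, sampling=spacing)
    d_to_seg = distance_transform_edt(~seg, sampling=spacing)
    return max(d_to_ref[seg].max(), d_to_seg[ref].max())

# Toy example: two offset discs standing in for lumen masks.
yy, xx = np.mgrid[:128, :128]
ref = (yy - 64) ** 2 + (xx - 64) ** 2 < 40 ** 2
seg = (yy - 66) ** 2 + (xx - 66) ** 2 < 38 ** 2
print(jaccard(seg, ref), hausdorff(seg, ref))
```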
Abstract:
The objective of these programs is to create a tool that lets us understand source separation and channel deconvolution more easily. To this end we present the design, in Java, of a web page [1], http://www.uvic.es/projectes/SeparationSources, with a markedly didactic character, for the study and evaluation of different algorithms proposed in the literature.