14 results for Complex Processes
at the Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
The rapid growth of multicore systems, and the diverse approaches they have taken, allow complex processes that could previously only be run on supercomputers to be executed today on low-cost solutions, also known as "commodity hardware". Such solutions can be built with the processors most in demand in the mass consumer market (Intel and AMD). When scaling these solutions to scientific computing requirements, it becomes essential to have methods for measuring the performance they offer and the way they behave under different workloads. Given the wide variety of workload types on the market, and even within scientific computing, it is necessary to establish "typical" measurements that can support the evaluation and procurement of solutions with a high degree of confidence in their operation. This study proposes a practical approach to such evaluation and presents the results of tests run on AMD and Intel multicore architectures.
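The kind of "typical" measurement the abstract argues for can be sketched as a minimal, reproducible micro-benchmark. The kernel and problem sizes below are illustrative only, not the study's actual workloads; the key idea shown is repeating each run and keeping the best time to reduce scheduler noise.

```python
import time

# Minimal micro-benchmark sketch (illustrative kernel, not the study's
# workloads): time a CPU-bound kernel at several sizes, repeating each
# measurement and keeping the best time to reduce scheduling noise.
def kernel(n):
    s = 0
    for i in range(n):
        s += i * i
    return s

def best_time(fn, arg, repeats=5):
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        fn(arg)
        best = min(best, time.perf_counter() - t0)
    return best

for n in (10_000, 100_000, 1_000_000):
    t = best_time(kernel, n)
    print(f"n={n:>9,}: {t * 1e3:8.2f} ms  ({n / t / 1e6:5.1f} Mops/s)")
```

Comparing such per-core throughput figures across machines (and, with a parallel driver, across core counts) is one practical way to rank commodity AMD and Intel systems under a fixed workload.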
Abstract:
It has recently been found that a number of systems displaying crackling noise also show a remarkable behavior regarding the temporal occurrence of successive events versus their size: a scaling law for the probability distributions of waiting times as a function of a minimum size is fulfilled, signaling the existence in those systems of self-similarity in time and size. This property is also present in some non-crackling systems. Here, the uncommon character of the scaling law is illustrated with simple marked renewal processes, built by definition with no correlations. Whereas processes with a finite mean waiting time do not fulfill a scaling law in general and tend towards a Poisson process in the limit of very high sizes, processes without a finite mean tend to another class of distributions, characterized by double power-law waiting-time densities. This is somewhat reminiscent of the generalized central limit theorem. A model with short-range correlations is not able to escape the attraction of those limit distributions. A discussion of open problems in the modeling of these properties is provided.
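The marked renewal construction can be sketched in a few lines. This is a toy instance with exponential waiting times and Pareto-distributed marks (all parameter values arbitrary): thresholding events on a minimum size thins the process, and for exponential waiting times the surviving events again form a Poisson-like process with a mean waiting time stretched by 1 / P(size ≥ s_min).

```python
import random

random.seed(42)

# Marked renewal process: i.i.d. exponential waiting times with i.i.d.
# Pareto-distributed marks ("sizes"); by construction, no correlations.
def marked_renewal(n, rate=1.0, tail=1.5):
    events, t = [], 0.0
    for _ in range(n):
        t += random.expovariate(rate)                   # waiting time
        events.append((t, random.paretovariate(tail)))  # (time, size)
    return events

events = marked_renewal(100_000)

# Waiting times between events whose size is at least s_min: thinning
# stretches the mean waiting time by the inverse survival probability.
def thinned_waits(events, s_min):
    kept = [t for t, s in events if s >= s_min]
    return [b - a for a, b in zip(kept, kept[1:])]

for s_min in (1.0, 2.0, 4.0):
    w = thinned_waits(events, s_min)
    print(f"s_min={s_min}: mean waiting time {sum(w) / len(w):.2f}")
```

For the Pareto tail exponent 1.5 used here, the mean waiting time grows as s_min^1.5, so the waiting-time density merely rescales rather than changing shape, which is the Poisson-limit behavior the abstract contrasts with the infinite-mean case.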
Abstract:
This report summarizes a final-year project of the Enginyeria Superior d'Informàtica degree. It explains the main reasons that motivated the project, together with examples illustrating the resulting application. In this case, the software aims to address the current need for Ground Truth data for text-segmentation algorithms on complex colour images. All the processes are explained in the various chapters, from the definition of the problem, the planning, the requirements and the design, through to the illustration of the program's results and the resulting Ground Truth data.
Abstract:
The peace process in Northern Ireland demonstrates that new sovereignty formulas need to be explored in order to meet the demands of the populations and territories in conflict. The profound transformation of the classic symbolic elements of the nation-state within the context of the European Union has greatly contributed to the prospects for a resolution of this old conflict. Today’s discussions are focused on the search for instruments of shared sovereignty that are adapted to a complex and plural social reality. This new approach for finding a solution to the Irish conflict is particularly relevant to the Basque debate about formulating creative and modern solutions to similar conflicts over identity and sovereignty. The notion of shared sovereignty implemented in Northern Ireland (a formula for complex interdependent relations) is of significant relevance to the broader international community and is likely to become an increasingly potent and transcendent model for conflict resolution and peace building.
Abstract:
The increasing volume of data describing human disease processes and the growing complexity of understanding, managing, and sharing such data present a huge challenge for clinicians and medical researchers. This paper presents the @neurIST system, which provides an infrastructure for biomedical research while aiding clinical care, by bringing together heterogeneous data and complex processing and computing services. Although @neurIST targets the investigation and treatment of cerebral aneurysms, the system’s architecture is generic enough that it could be adapted to the treatment of other diseases. Innovations in @neurIST include confining the patient data pertaining to aneurysms inside a single environment that offers clinicians the tools to analyze and interpret patient data and make use of knowledge-based guidance in planning their treatment. Medical researchers gain access to a critical mass of aneurysm-related data due to the system’s ability to federate distributed information sources. A semantically mediated grid infrastructure ensures that both clinicians and researchers are able to seamlessly access and work on data distributed across multiple sites in a secure way, in addition to providing computing resources on demand for performing computationally intensive simulations for treatment planning and research.
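The federation idea described above can be sketched as a tiny mediator that answers one query over heterogeneous sources. The schema, source names and field names below are invented for illustration and are not the @neurIST API: each source declares a mapping from its local field names to a shared vocabulary, and the mediator translates the query per source.

```python
# Toy sketch (invented schema, not the @neurIST system): a mediator that
# federates heterogeneous patient-data sources behind one query interface
# by mapping local field names to a shared vocabulary.
SOURCES = [
    {"name": "hospital_a",
     "mapping": {"patient": "pid", "aneurysm_size_mm": "size"},
     "rows": [{"pid": 1, "size": 7.2}, {"pid": 2, "size": 3.1}]},
    {"name": "hospital_b",
     "mapping": {"patient": "id", "aneurysm_size_mm": "diam_mm"},
     "rows": [{"id": 9, "diam_mm": 11.5}]},
]

def federated_query(field, predicate):
    """Return (source, patient, value) for rows matching a predicate on a shared field."""
    hits = []
    for src in SOURCES:
        local = src["mapping"][field]          # translate shared name -> local name
        pid_field = src["mapping"]["patient"]
        for row in src["rows"]:
            if predicate(row[local]):
                hits.append((src["name"], row[pid_field], row[local]))
    return hits

# all aneurysms larger than 5 mm, across both sources
print(federated_query("aneurysm_size_mm", lambda v: v > 5))
```

A real semantic mediator adds an ontology, access control and distributed execution on top, but the mapping-then-query shape is the same.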
Abstract:
Most studies analysing the impact of infrastructure on regional growth show a positive relationship between the two variables. However, the public capital elasticity estimated in a Cobb-Douglas function, the most common specification in these works, is sometimes too large to be credible, so the results have been partly discounted. In the present paper, we offer some new evidence on the real link between public capital and productivity for the Spanish regions in the period 1964-1991. First, we find that the association between the two variables is smaller when controlling for regional effects, with industry being the sector that benefits most from an increase in the infrastructure endowment. Second, the rigidity of the Cobb-Douglas function is overcome by using the variable expansion method. The expanded functional form reveals both the absence of a direct effect of infrastructure and the fact that the link between infrastructure and growth depends on the level of the existing stock (threshold level) and on how infrastructure is articulated in its location relative to other factors. Finally, we analyse the importance of the spatial dimension of the infrastructure impact, due to spillover effects. In this sense, the paper provides evidence of the existence of spatial autocorrelation processes that may invalidate previous results.
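The baseline specification the paper starts from can be illustrated with synthetic data (the numbers below are invented, not the Spanish regional panel): a Cobb-Douglas function Y = A·K^α·L^β·G^γ estimated in logs by OLS, where the coefficient on log G is the public-capital elasticity in question.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic illustration (invented data, not the Spanish regional panel):
# Cobb-Douglas production with private capital K, labour L and public
# capital G, Y = A * K^alpha * L^beta * G^gamma, estimated in logs by OLS.
n = 500
lnK = rng.normal(5.0, 1.0, n)
lnL = rng.normal(4.0, 1.0, n)
lnG = rng.normal(3.0, 1.0, n)
alpha, beta, gamma = 0.35, 0.55, 0.10   # "true" elasticities
lnY = 1.0 + alpha * lnK + beta * lnL + gamma * lnG + rng.normal(0, 0.05, n)

X = np.column_stack([np.ones(n), lnK, lnL, lnG])
coef, *_ = np.linalg.lstsq(X, lnY, rcond=None)
print("estimated (const, alpha, beta, gamma):", np.round(coef, 3))
```

The recovery is clean here only because the regressors are exogenous by construction; omitted regional fixed effects correlated with lnG would bias the γ estimate upward, which is the paper's first point.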
Abstract:
Extreme-times techniques, generally applied to nonequilibrium statistical mechanical processes, are also useful for a better understanding of financial markets. We present a detailed study of the mean first-passage time for the volatility of return time series. The empirical results extracted from daily data of major indices seem to follow the same law regardless of the kind of index, thus suggesting a universal pattern. The empirical mean first-passage time to a certain level L is fairly different from that of the Wiener process, showing a dissimilar behavior depending on whether L is higher or lower than the average volatility. All of this indicates a more complex dynamics in which a reverting force drives volatility toward its mean value. We thus present the mean first-passage time expressions of the most common stochastic volatility models, whose approach is comparable to the random diffusion description. We discuss asymptotic approximations of these models and confront them with empirical results, finding good agreement with the exponential Ornstein-Uhlenbeck model.
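The mean-reverting picture can be illustrated with a toy Euler-Maruyama simulation (parameters invented, not fitted to any index): the mean first-passage time of an Ornstein-Uhlenbeck process to a level L grows sharply once L lies well above the stationary mean, unlike a free Wiener process.

```python
import math
import random

random.seed(1)

# Toy illustration (invented parameters, not a fitted volatility model):
# mean first-passage time (MFPT) to a level L for a mean-reverting
# Ornstein-Uhlenbeck process dx = -k (x - mu) dt + sigma dW, simulated
# by Euler-Maruyama. Setting k = 0 recovers a driftless Wiener process.
def mfpt(level, k, mu=1.0, sigma=0.5, x0=1.0, dt=0.01, paths=200, t_max=20.0):
    total = 0.0
    for _ in range(paths):
        x, t = x0, 0.0
        while t < t_max and x < level:   # stop at first up-crossing (or cap)
            x += -k * (x - mu) * dt + sigma * math.sqrt(dt) * random.gauss(0, 1)
            t += dt
        total += t
    return total / paths

for L in (1.25, 1.5):
    print(f"L={L}: OU MFPT ~ {mfpt(L, k=2.0):.2f}  vs  Wiener ~ {mfpt(L, k=0.0):.2f}")
```

Note that the estimates are capped at t_max, since the driftless Wiener MFPT has an infinite mean; the comparison is qualitative only.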
Abstract:
Background: Plant hormones play a pivotal role in several physiological processes during a plant's life cycle, from germination to senescence, and the determination of endogenous concentrations of hormones is essential to elucidate the role of a particular hormone in any physiological process. The availability of a sensitive and rapid method to quantify multiple classes of hormones simultaneously will greatly facilitate the investigation of signaling networks controlling specific developmental pathways and physiological responses. Due to the presence of hormones at very low concentrations in plant tissues (10^-9 M to 10^-6 M) and their different chemistries, the development of a high-throughput and comprehensive method for the determination of hormones is challenging. Results: The present work reports a rapid, specific and sensitive method using ultrahigh-performance liquid chromatography coupled to electrospray ionization tandem mass spectrometry (UPLC/ESI-MS/MS) to quantitatively analyze the major hormones found in plant tissues within six minutes, including auxins, cytokinins, gibberellins, abscisic acid, 1-aminocyclopropane-1-carboxylic acid (the ethylene precursor), jasmonic acid and salicylic acid. Sample preparation, extraction procedures and UPLC-MS/MS conditions were optimized for the determination of all plant hormones and are summarized in a schematic extraction diagram for the analysis of small amounts of plant material without time-consuming additional steps such as purification, sample drying or re-suspension. Conclusions: This new method is applicable to the analysis of dynamic changes in endogenous concentrations of hormones to study plant developmental processes or plant responses to biotic and abiotic stresses in complex tissues. An example is shown in which a hormone profile is obtained from leaves of the aromatic plant Rosmarinus officinalis exposed to salt stress.
Abstract:
BACKGROUND: Scientists have long been trying to understand the molecular mechanisms of diseases in order to design preventive and therapeutic strategies. For some diseases, it has become evident that it is not enough to obtain a catalogue of the disease-related genes; one must also uncover how disruptions of molecular networks in the cell give rise to disease phenotypes. Moreover, with the unprecedented wealth of information available, even obtaining such a catalogue is extremely difficult. PRINCIPAL FINDINGS: We developed a comprehensive gene-disease association database by integrating associations from several sources that cover different biomedical aspects of diseases. In particular, we focus on the current knowledge of human genetic diseases, including Mendelian, complex and environmental diseases. To assess the concept of modularity of human diseases, we performed a systematic study of the emergent properties of human gene-disease networks by means of network topology and functional annotation analysis. The results indicate a highly shared genetic origin of human diseases and show that for most diseases, including Mendelian, complex and environmental diseases, functional modules exist. Moreover, a core set of biological pathways is found to be associated with most human diseases. We obtained similar results when studying clusters of diseases, suggesting that related diseases might arise due to dysfunction of common biological processes in the cell. CONCLUSIONS: For the first time, we include Mendelian, complex and environmental diseases in an integrated gene-disease association database and show that the concept of modularity applies to all of them. We furthermore provide a functional analysis of disease-related modules, providing important new biological insights which might not be discovered when considering each of the gene-disease association repositories independently.
Hence, we present a suitable framework for the study of how genetic and environmental factors, such as drugs, contribute to diseases. AVAILABILITY: The gene-disease networks used in this study and part of the analysis are available at http://ibi.imim.es/DisGeNET/DisGeNETweb.html#Download
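The "shared genetic origin" analysis described above rests on projecting a gene-disease bipartite map onto a disease-disease network. A toy sketch with invented associations (not DisGeNET data) shows the mechanics: two diseases are linked when they share at least one associated gene, with the number of shared genes as the edge weight.

```python
from itertools import combinations

# Toy sketch (invented associations, not DisGeNET data): project a
# gene -> diseases map onto a disease-disease network where two diseases
# are linked iff they share at least one associated gene.
gene_disease = {
    "TSC1":  {"tuberous sclerosis"},
    "TSC2":  {"tuberous sclerosis", "lymphangioleiomyomatosis"},
    "TP53":  {"breast cancer", "li-fraumeni syndrome"},
    "BRCA1": {"breast cancer", "ovarian cancer"},
    "APOE":  {"alzheimer disease", "hyperlipidemia"},
}

# invert: disease -> set of associated genes
disease_genes = {}
for gene, diseases in gene_disease.items():
    for d in diseases:
        disease_genes.setdefault(d, set()).add(gene)

# disease-disease projection: edge weight = number of shared genes
edges = {}
for d1, d2 in combinations(sorted(disease_genes), 2):
    shared = disease_genes[d1] & disease_genes[d2]
    if shared:
        edges[(d1, d2)] = len(shared)

for (d1, d2), w in sorted(edges.items()):
    print(f"{d1} -- {d2}: {w} shared gene(s)")
```

Clusters in this projected network are the "functional modules" the abstract examines; at real scale the edge weights would be tested against a null model before interpretation.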
Abstract:
We uncover the global organization of clustering in real complex networks. To this end, we ask whether triangles in real networks organize as in maximally random graphs with given degree and clustering distributions, or as in maximally ordered graph models where triangles are forced into modules. The answer comes by way of exploring m-core landscapes, where the m-core is defined, akin to the k-core, as the maximal subgraph with edges participating in at least m triangles. This property defines a set of nested subgraphs that, contrary to k-cores, is able to distinguish between hierarchical and modular architectures. We find that the clustering organization in real networks is neither completely random nor ordered, although, surprisingly, it is more random than modular. This supports the idea that the structure of real networks may in fact be the outcome of self-organized processes based on local optimization rules, in contrast to global optimization principles.
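The m-core defined in the abstract can be computed by iterated pruning, analogously to the standard k-core peeling: repeatedly delete every edge lying in fewer than m triangles until the subgraph is stable. A minimal sketch on a small example graph:

```python
from itertools import combinations

# Sketch of the m-core as defined in the abstract: the maximal subgraph
# whose every edge participates in at least m triangles. We repeatedly
# prune edges lying in fewer than m triangles until nothing changes.
def triangles_per_edge(edges):
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    # an edge (u, v) lies in one triangle per common neighbour of u and v
    return {(u, v): len(adj[u] & adj[v]) for u, v in edges}

def m_core(edges, m):
    edges = {tuple(sorted(e)) for e in edges}
    while True:
        t = triangles_per_edge(edges)
        weak = {e for e in edges if t[e] < m}
        if not weak:
            return edges
        edges -= weak   # pruning can destroy triangles, so iterate

# K4 (every edge lies in 2 triangles) plus a pendant edge (no triangles)
g = list(combinations(range(4), 2)) + [(3, 4)]
for m in (1, 2, 3):
    print(f"m={m}: {len(m_core(g, m))} edges survive")
```

The nesting is visible even here: the pendant edge drops out of the 1-core, the K4 survives up to m = 2, and the 3-core is empty.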
Abstract:
In recent years, new analytical tools have allowed researchers to extract historical information contained in molecular data, which has fundamentally transformed our understanding of the processes ruling biological invasions. However, the use of these new analytical tools has been largely restricted to studies of terrestrial organisms, despite the growing recognition that the sea contains ecosystems that are amongst the most heavily affected by biological invasions, and that marine invasion histories are often remarkably complex. Here, we studied the routes of invasion and colonisation histories of the invasive marine invertebrate Microcosmus squamiger (Ascidiacea) using microsatellite loci and mitochondrial DNA sequence data from 11 worldwide populations. Discriminant analysis of principal components, clustering methods and approximate Bayesian computation (ABC) methods showed that the most likely source of the introduced populations was a single admixture event involving populations from two genetically differentiated ancestral regions: the western and eastern coasts of Australia. The ABC analyses revealed that colonisation of the introduced range of M. squamiger consisted of a series of non-independent introductions along the coastlines of Africa, North America and Europe. Furthermore, we inferred that the sequence of colonisation across continents was in line with historical taxonomic records: first the Mediterranean Sea and South Africa from an unsampled ancestral population, followed by sequential introductions in California and, more recently, the NE Atlantic Ocean. We revealed the most likely invasion history for world populations of M. squamiger, which is broadly characterized by the presence of multiple ancestral sources and non-independent introductions within the introduced range.
The results presented here illustrate the complexity of marine invasion routes and identify a cause-effect relationship between human-mediated transport and the success of widespread marine non-indigenous species, which benefit from stepping-stone invasions and admixture processes involving different sources for the spread and expansion of their range.
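The ABC machinery used above can be reduced to its simplest form, rejection ABC, in a toy model (invented, far simpler than the paper's demographic scenarios): simulate data under candidate parameter values drawn from the prior and keep only the draws whose summary statistic is close to the observed one.

```python
import random

random.seed(7)

# Minimal rejection-ABC sketch (toy model, not the paper's analysis):
# infer a parameter p by simulating data under candidate values and
# keeping those whose summary statistic is close to the observed one.
def simulate(p, n=200):
    # toy "genetic" data: n binary markers, each present with probability p
    return sum(random.random() < p for _ in range(n)) / n

observed = 0.30  # observed summary statistic (an allele frequency, say)

accepted = []
for _ in range(20_000):
    p = random.random()                     # draw from a uniform prior
    if abs(simulate(p) - observed) < 0.02:  # tolerance on the summary
        accepted.append(p)                  # keep: simulation matches data

est = sum(accepted) / len(accepted)
print(f"accepted {len(accepted)} draws, posterior mean ~ {est:.2f}")
```

Real ABC model-choice studies, like the one described above, compare competing demographic scenarios by the fraction of accepted simulations each scenario contributes, rather than estimating a single parameter.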
Abstract:
Introduction: Tuberous sclerosis complex (TSC) is a neurocutaneous syndrome produced by a number of genetic mutations. The disease is characterized by the development of benign tumors affecting different body systems. The most common oral manifestations of TSC are fibromas, gingival hyperplasia and enamel hypoplasia. Clinical Case: A 35-year-old woman diagnosed with TSC presented with a reactive fibroma of considerable size and rapid growth in the region of the right lower third molar. Discussion: In the present case the association of TSC with dental malpositioning gave rise to a rapidly evolving reactive fibroma of considerable diameter. Few similar cases can be found in the literature. Patients with TSC present mutations of the TSC1 and TSC2 genes, which intervene in cell cycle regulation and are important for avoiding neoplastic processes. No studies have been found associating TSC with an increased risk of oral cancer, though it has been shown that the over-expression of TSC2 could exert an antitumor effect. Careful oral and dental hygiene, together with regular visits to the dentist, are needed for the prevention and early detection of any type of oral lesion. The renal, pulmonary and cardiac alterations often seen in TSC must be taken into account for the correct management of these patients.