940 results for 010300 NUMERICAL AND COMPUTATIONAL MATHEMATICS


Relevance: 100.00%

Abstract:

Low-copy-number molecules are involved in many functions in cells. The intrinsic fluctuations of these numbers can enable stochastic switching between multiple steady states, inducing phenotypic variability. Here we present a theoretical and computational study, based on Master Equations and on Fokker-Planck and Langevin descriptions, of stochastic switching in a genetic autoactivation circuit. We show that in this circuit the intrinsic fluctuations arising from low copy numbers, which are inherently state-dependent, drive asymmetric switching. These theoretical results are consistent with experimental data reported for the bistable galactose signaling network in yeast. Our study reveals that intrinsic fluctuations, while not required to describe bistability, are fundamental for understanding stochastic switching and the relative dynamical stability of multiple states.
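A minimal sketch of the kind of stochastic simulation such a study relies on: an exact Gillespie simulation of a one-species autoactivation circuit with Hill-type production and linear degradation. All parameter values here are illustrative placeholders, not the ones used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters for a one-species autoactivation circuit
# (not the paper's values): basal and activated production rates,
# Hill coefficient, activation threshold, degradation rate.
k0, k1, n, K, gamma = 4.0, 40.0, 4.0, 25.0, 1.0

def propensities(x):
    production = k0 + k1 * x**n / (K**n + x**n)  # autoactivation (Hill)
    degradation = gamma * x                      # linear degradation
    return np.array([production, degradation])

def gillespie(x0, t_end):
    """Exact stochastic simulation (SSA) of the birth-death circuit."""
    t, x, traj = 0.0, x0, [(0.0, x0)]
    while t < t_end:
        a = propensities(x)
        a_total = a.sum()
        t += rng.exponential(1.0 / a_total)   # waiting time to next reaction
        x += 1 if rng.random() < a[0] / a_total else -1
        traj.append((t, x))
    return traj

# At low copy numbers the trajectory switches between the low and high
# steady states; the state-dependent noise makes the switching asymmetric.
trajectory = gillespie(x0=5, t_end=500.0)
print("final copy number:", trajectory[-1][1])
```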

Relevance: 100.00%

Abstract:

The purpose of this work is to compile information on all test facilities worldwide that have been used to study the blowdown phase of a large-break LOCA. The work is also intended to provide a basis for deciding whether it is necessary to build a new test facility for validating the computation of fluid-structure interaction codes. Before building the actual test facility, it would also be appropriate to build a smaller pilot facility for testing the measurement methods to be used. Suitable measurement data are needed to validate the coupled computation of new CFD codes and structural analysis codes. These codes can be used, for example, to assess the structural integrity of reactor internals during the blowdown phase of a large-break LOCA. The report focuses on the test facilities found around the world, on the design bases of a new facility, and on general issues related to the topic. The report does not replace existing validation matrices, but it can be used as an aid in finding a large-break LOCA blowdown test facility suitable for validation purposes.

Relevance: 100.00%

Abstract:

The detailed in-vivo characterization of subcortical brain structures is essential not only for understanding the basic organizational principles of the healthy brain but also for studying the involvement of the basal ganglia in brain disorders. The particular tissue properties of the basal ganglia, most importantly their high iron content, strongly affect the contrast of magnetic resonance imaging (MRI) images, hampering the accurate automated assessment of these regions. This technical challenge explains the substantial controversy in the literature about the magnitude, directionality and neurobiological interpretation of basal ganglia structural changes estimated from MRI and computational anatomy techniques. My scientific project addresses the pertinent need for accurate automated delineation of the basal ganglia using two complementary strategies: (i) empirical testing of the utility of novel imaging protocols to provide superior contrast in the basal ganglia and to quantify brain tissue properties; and (ii) improvement of the algorithms for reliable automated detection of the basal ganglia and thalamus. Previous research demonstrated that MRI protocols based on magnetization transfer (MT) saturation maps provide optimal grey-white matter contrast in subcortical structures compared with the widely used T1-weighted (T1w) images (Helms et al., 2009). Under the assumption that brain tissue properties directly affect MR contrast, my first study addressed the mechanisms underlying the regionally specific contrast behaviour of the basal ganglia. I used established whole-brain voxel-based methods to test for grey matter volume differences between MT and T1w imaging protocols, with an emphasis on subcortical structures, and applied a regression model to explain the observed grey matter differences by the regionally specific impact of brain tissue properties on MR contrast. The results of my first project prompted further methodological developments to create adequate priors for the basal ganglia and thalamus, allowing optimal automated delineation of these structures in a probabilistic tissue classification framework. I established a standardized workflow for manual labelling of the basal ganglia, thalamus and cerebellar dentate to create new tissue probability maps from quantitative MR maps featuring optimal grey-white matter contrast in subcortical areas. The validation step for the new tissue priors included a comparison of classification performance with the existing probability maps. In my third project I continued investigating the factors impacting automated brain tissue classification that result in interpretational shortcomings when T1w MRI data are used in the framework of computational anatomy. While the intensity in T1w images is predominantly
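A minimal sketch of the voxel-wise regression idea described above, using synthetic stand-ins for the grey matter difference map and the quantitative parameter maps (MT, R1, R2*); none of the data or coefficients come from the study.

```python
import numpy as np

rng = np.random.default_rng(1)
n_voxels = 10_000

# Hypothetical quantitative maps per voxel (stand-ins for MT, R1, R2*);
# in the study these would come from the quantitative MRI protocol.
mt  = rng.normal(1.0, 0.1, n_voxels)
r1  = rng.normal(0.7, 0.05, n_voxels)
r2s = rng.normal(20.0, 4.0, n_voxels)   # iron-sensitive map

# Stand-in for the observed grey matter difference (MT- vs T1w-based),
# generated here so the regression has something to recover.
gm_diff = 0.3 * r2s - 0.5 * mt + rng.normal(0.0, 1.0, n_voxels)

# Ordinary least squares: explain the GM difference from tissue properties.
X = np.column_stack([np.ones(n_voxels), mt, r1, r2s])
beta, *_ = np.linalg.lstsq(X, gm_diff, rcond=None)
print("intercept, MT, R1, R2* coefficients:", beta.round(3))
```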

Relevance: 100.00%

Abstract:

In this thesis we study the field of opinion mining by giving a comprehensive review of the available research on this topic. Using this knowledge, we also present a case study of a multilevel opinion mining system for a student organization's sales management system. We describe the field of opinion mining by discussing its historical roots, its motivations and applications, and the different scientific approaches that have been used to solve the challenging problem of mining opinions. To deal with this large subfield of natural language processing, we first give an abstraction of the opinion mining problem and describe the theoretical frameworks available for dealing with appraisal language. We then discuss the relation between opinion mining and computational linguistics, which provides a crucial pre-processing step for the accuracy of the subsequent stages of opinion mining. The second part of the thesis deals with the semantics of opinions: we describe the different ways of collecting lists of opinion words, as well as the methods and techniques available for extracting knowledge from opinions present in unstructured textual data. Regarding the collection of opinion-word lists, we describe manual, semi-manual and automatic approaches and review the available lists that are used as gold standards in opinion mining research. For the methods and techniques of opinion mining, we divide the task into three levels: the document, sentence and feature level. The techniques presented at the document and sentence levels are divided into supervised and unsupervised approaches, used to determine the subjectivity and polarity of texts and sentences at these levels of analysis. At the feature level, we describe the techniques available for finding the opinion targets, the polarity of the opinions about these targets, and the opinion holders, and we discuss the various ways of summarizing and visualizing the results of this level of analysis. In the third part of the thesis we present a case study of a sales management system that uses free-form text and that can benefit from an opinion mining system. Using the knowledge gathered in the review, we propose a theoretical multilevel opinion mining system (MLOM) that can perform most of the tasks expected of an opinion mining system. Based on previous research, we indicate how such a system could ease many of the laborious market research tasks done by the sales force, improving their insight into their partners and thereby increasing the quality of their sales services and their overall results.
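A minimal sketch of the unsupervised, lexicon-based approach at the sentence level mentioned above; the tiny lexicon and the negation handling are illustrative stand-ins for the gold-standard opinion-word lists reviewed in the thesis.

```python
# Toy opinion lexicon with signed strengths; real systems use the
# gold-standard lists surveyed in the thesis.
LEXICON = {"good": 1, "great": 2, "helpful": 1,
           "bad": -1, "poor": -2, "slow": -1}
NEGATORS = {"not", "no", "never"}

def sentence_polarity(sentence: str) -> int:
    """Sum signed lexicon scores over tokens, flipping after a negator."""
    score, negate = 0, False
    for tok in sentence.lower().split():
        tok = tok.strip(".,!?")
        if tok in NEGATORS:
            negate = True            # flip the next opinion word
            continue
        if tok in LEXICON:
            score += -LEXICON[tok] if negate else LEXICON[tok]
            negate = False
    return score

for s in ["The support was not helpful.", "Great product, slow delivery."]:
    print(s, "->", sentence_polarity(s))   # -> -1 and +1
```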

Relevance: 100.00%

Abstract:

The proposed transdisciplinary field of ‘complexics’ would bring together all contemporary efforts in any specific disciplines or by any researchers specifically devoted to constructing tools, procedures, models and concepts intended for transversal application that are aimed at understanding and explaining the most interwoven and dynamic phenomena of reality. Our aim needs to be, as Morin says, not “to reduce complexity to simplicity, [but] to translate complexity into theory”. New tools for the conception, apprehension and treatment of the data of experience will need to be devised to complement existing ones and to enable us to make headway toward practices that better fit complexic theories. New mathematical and computational contributions have already continued to grow in number, thanks primarily to scholars in statistical physics and computer science, who are now taking an interest in social and economic phenomena. Certainly, these methodological innovations put into question, and again make us take note of, the excessive separation between the training received by researchers in the ‘sciences’ and in the ‘arts’. Closer collaboration between these two subsets would, in all likelihood, be much more energising and creative than their current mutual distance. Human complexics must be seen as multi-methodological, insofar as it combines, as necessary, quantitative-computational methodologies with more qualitative methodologies aimed at understanding the mental and emotional world of people. In the final analysis, however, models always have a narrative running behind them that reflects the attempts of a human being to understand the world, and models are always interpreted on that basis.

Relevance: 100.00%

Abstract:

Neural signal processing is a discipline within neuroengineering. This interdisciplinary approach combines principles from machine learning, signal processing theory, and computational neuroscience applied to problems in basic and clinical neuroscience. The ultimate goal of neuroengineering is a technological revolution, where machines would interact in real time with the brain. Machines and brains could interface, enabling normal function in cases of injury or disease, brain monitoring, and/or medical rehabilitation of brain disorders. Much current research in neuroengineering is focused on understanding the coding and processing of information in the sensory and motor systems, quantifying how this processing is altered in the pathological state, and how it can be manipulated through interactions with artificial devices including brain–computer interfaces and neuroprosthetics.
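A minimal sketch of one core neural-signal-processing task mentioned above, decoding a movement variable from neural activity, using ridge regression on synthetic firing rates; this illustrates the decoding problem generically and is not a model of any specific brain-computer interface.

```python
import numpy as np

rng = np.random.default_rng(2)
n_samples, n_neurons = 2000, 50

# Synthetic "motor cortex" data: firing rates linearly tuned to a
# one-dimensional hand velocity, plus noise (illustrative only).
velocity = rng.normal(0.0, 1.0, n_samples)
tuning = rng.normal(0.0, 1.0, n_neurons)
rates = np.outer(velocity, tuning) + rng.normal(0.0, 0.5, (n_samples, n_neurons))

# Ridge-regression decoder: w = (X^T X + lambda*I)^{-1} X^T y.
lam = 1.0
XtX = rates.T @ rates + lam * np.eye(n_neurons)
weights = np.linalg.solve(XtX, rates.T @ velocity)

decoded = rates @ weights
corr = np.corrcoef(velocity, decoded)[0, 1]
print(f"decoding correlation: {corr:.3f}")
```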

Relevance: 100.00%

Abstract:

The nitrogen content of natural gas was studied in experimental and computational investigations to identify its influence on the emission levels of exhaust gases from combustion facilities. Changes in natural gas composition with different N2 concentrations may result from introducing a new source gas into the system. An industrial burner fired at 75 kW, housed in a laboratory-scale furnace, was employed for runs in which the natural gas/N2 proportion was varied. Exhaust and in-furnace measurements of temperature and gas concentrations were performed for different combustion scenarios, varying the N2 content from 1 to 10 vol%. The results show that diluting natural gas with nitrogen reduced the peak flame temperature, the concentration of unstable species, the NOx emission level and the heat transfer rate to the furnace walls, as a consequence of recombination reactions.
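To see the dilution effect in numbers, a simple sketch: approximating the natural gas as pure methane with a standard lower heating value of roughly 35.8 MJ/Nm3 and treating N2 as inert, the volumetric heating value of the blend falls linearly with N2 content. These are textbook approximations, not measurements from this study.

```python
# Inert-dilution estimate: N2 releases no heat, so the blend's volumetric
# lower heating value scales with the methane fraction.
LHV_CH4 = 35.8  # MJ per normal cubic metre of methane, approximate

for n2_pct in range(0, 11):
    x_n2 = n2_pct / 100.0
    lhv_mix = (1.0 - x_n2) * LHV_CH4
    print(f"N2 = {n2_pct:2d} vol%  ->  LHV ~ {lhv_mix:5.2f} MJ/Nm^3 "
          f"({100 * (1 - lhv_mix / LHV_CH4):4.1f}% reduction)")
```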

Relevance: 100.00%

Abstract:

Prodigiosin and obatoclax, members of the prodiginine family, are small molecules with anti-cancer properties that are currently in preclinical and clinical trials. The molecular target(s) of these agents, however, remain an open question. Combining experimental and computational techniques, we find that prodigiosin binds to the BH3 domain of some BCL-2 family proteins, which play an important role in apoptotic programmed cell death. In particular, our results indicate a high affinity of prodigiosin for MCL-1, an anti-apoptotic member of the BCL-2 family. In melanoma cells, we demonstrate that prodigiosin activates the mitochondrial apoptotic pathway by disrupting MCL-1/BAK complexes. Computer simulations with the PELE software allow the description of the induced-fit process, providing a detailed atomic view of the molecular interactions. These results provide new data for understanding the mechanism of action of these molecules and assist in the development of more specific inhibitors of anti-apoptotic BCL-2 proteins.

Relevance: 100.00%

Abstract:

Robotic platforms have advanced greatly in terms of their remote sensing capabilities, including obtaining optical information using cameras. Alongside these advances, visual mapping has become a very active research area, facilitating the mapping of areas inaccessible to humans. This requires efficient data processing to increase both the final mosaic quality and the computational efficiency. In this paper, we propose an efficient image mosaicing algorithm for large-area visual mapping in underwater environments using multiple underwater robots. Our method identifies overlapping image pairs across the trajectories carried out by the different robots during the topology estimation process, which is a cornerstone for efficiently mapping large areas of the seafloor. We present comparative results based on challenging real underwater datasets that simulate multi-robot mapping.
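A minimal sketch of one common way to identify overlapping image pairs, ORB feature matching with a ratio test in OpenCV; this is a generic stand-in, not the paper's actual topology-estimation algorithm, and the file names are hypothetical.

```python
import cv2

def overlap_score(path_a: str, path_b: str, ratio: float = 0.75) -> int:
    """Count ratio-test-surviving ORB matches between two images; a high
    count suggests the pair overlaps and is worth registering."""
    img_a = cv2.imread(path_a, cv2.IMREAD_GRAYSCALE)
    img_b = cv2.imread(path_b, cv2.IMREAD_GRAYSCALE)
    if img_a is None or img_b is None:
        return 0
    orb = cv2.ORB_create(nfeatures=2000)
    _, des_a = orb.detectAndCompute(img_a, None)
    _, des_b = orb.detectAndCompute(img_b, None)
    if des_a is None or des_b is None:
        return 0
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    matches = matcher.knnMatch(des_a, des_b, k=2)
    good = [p[0] for p in matches
            if len(p) == 2 and p[0].distance < ratio * p[1].distance]
    return len(good)

# Hypothetical file names; in a multi-robot survey, every cross-trajectory
# pair above a threshold would become an edge in the mosaic topology graph.
print(overlap_score("robot1_frame042.png", "robot2_frame017.png"))
```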

Relevance: 100.00%

Abstract:

Environmental concern is growing, and there is global agreement to ban the production and use of persistent organic pollutants (POPs). The synthetic insecticides chlordecone and mirex, classified as POPs, have similar structures and are potentially toxic. This work uses physicochemical properties and constants of these pesticides, together with computational simulation, to evaluate their leaching and persistence in soil. Both compounds show the greatest tendency to persist at the soil surface, but even low concentrations in water represent a high risk owing to bioaccumulation in adipose tissue.
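One standard screening metric for the leaching tendency discussed above is the groundwater ubiquity score (GUS) of Gustafson (1989), computed from the soil half-life and the sorption coefficient Koc. The abstract does not state which constants were used, so the inputs below are order-of-magnitude illustrations, not measured values.

```python
import math

def gus_index(half_life_days: float, koc: float) -> float:
    """Gustafson (1989) groundwater ubiquity score:
    GUS = log10(t_1/2) * (4 - log10(Koc)).
    GUS > 2.8 suggests a likely leacher; GUS < 1.8 an unlikely one."""
    return math.log10(half_life_days) * (4.0 - math.log10(koc))

# Illustrative inputs (long half-lives, strong sorption), not measured
# values for the two insecticides: strong sorption drives GUS below zero,
# consistent with persistence at the soil surface rather than leaching.
for name, t_half, koc in [("chlordecone-like", 3000.0, 2.0e4),
                          ("mirex-like", 3000.0, 5.0e4)]:
    print(f"{name}: GUS = {gus_index(t_half, koc):.2f}")
```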

Relevance: 100.00%

Abstract:

Recently, it has been shown that the speed of virus infections can be explained by time-delayed reaction-diffusion [J. Fort and V. Méndez, Phys. Rev. Lett. 89, 178101 (2002)], but no analytical solutions were found. Here we derive formulas for the front speed, valid in appropriate limits. We also integrate the evolution equations of the system numerically. There is good agreement with both numerical and experimental speeds.
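A minimal numerical sketch of the kind of integration mentioned above, applied to the classical (non-delayed) Fisher-KPP equation, whose asymptotic front speed 2*sqrt(aD) is known exactly; the paper's time-delayed model modifies the equation and lowers the speed, but the front-speed measurement procedure is the same.

```python
import numpy as np

# Explicit finite differences for u_t = D u_xx + a u (1 - u).
D, a = 1.0, 1.0
L, nx = 400.0, 4000
dx = L / nx
dt = 0.2 * dx**2 / D               # well inside the explicit stability limit
x = np.linspace(0.0, L, nx)
u = np.where(x < 10.0, 1.0, 0.0)   # infected region on the left

def front_position(u, level=0.5):
    return x[np.argmax(u < level)]  # first grid point below the level set

t, times, positions = 0.0, [], []
while t < 60.0:
    lap = np.zeros_like(u)
    lap[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
    u = u + dt * (D * lap + a * u * (1.0 - u))
    t += dt
    times.append(t)
    positions.append(front_position(u))

# Fit the late-time front trajectory; classical prediction is 2*sqrt(a*D).
speed = np.polyfit(times[len(times)//2:], positions[len(positions)//2:], 1)[0]
print(f"measured speed: {speed:.3f}  (classical 2*sqrt(a*D) = {2*np.sqrt(a*D):.3f})")
```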

Relevance: 100.00%

Abstract:

We study the relative equilibria of the limit case of the planar Newtonian 4-body problem when three masses tend to zero, the so-called (1+3)-body problem. Depending on the values of the infinitesimal masses, the number of relative equilibria varies from ten to fourteen. Six of these relative equilibria are always convex and the others are concave. Each convex relative equilibrium of the (1+3)-body problem can be continued to a unique family of relative equilibria of the general 4-body problem when three of the masses are sufficiently small, and every convex relative equilibrium for these masses belongs to one of these six families.

Relevance: 100.00%

Abstract:

In this paper, a detailed guide to the application of computational electrochemistry is presented. The basic framework of the electrochemical models and their computational solutions are described. We highlight that the availability of commercial software allows experimentalists with minimal mathematical and computational expertise to apply the technique, and we indicate the most widely used packages. Simulations of typical examples are presented, and references are cited to illustrate the wide applicability of computational electrochemistry.
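A minimal sketch of the kind of simulation such packages perform: explicit finite differences for semi-infinite diffusion to a planar electrode after a potential step, checked against the analytical Cottrell flux. This is the textbook test case, not the workflow of any particular commercial package, and the parameter values are illustrative.

```python
import numpy as np

# Chronoamperometry: at t = 0 the potential step drives the surface
# concentration to zero; the diffusion-limited flux should follow
# the Cottrell expression c_bulk * sqrt(D / (pi * t)).
D, c_bulk = 1.0e-5, 1.0      # cm^2/s and mol/cm^3, illustrative values
nx, dx = 400, 1.0e-4         # grid resolving the diffusion layer
dt = 0.4 * dx**2 / D         # explicit stability limit is 0.5 * dx^2 / D
c = np.full(nx, c_bulk)
c[0] = 0.0                   # surface boundary condition after the step

t = 0.0
for step in range(1, 20001):
    c[1:-1] += dt * D * (c[2:] - 2.0 * c[1:-1] + c[:-2]) / dx**2
    c[0], c[-1] = 0.0, c_bulk          # re-pin both boundaries
    t += dt
    if step % 5000 == 0:
        flux_numeric = D * (c[1] - c[0]) / dx
        flux_cottrell = c_bulk * np.sqrt(D / (np.pi * t))
        print(f"t = {t:.3f} s: flux {flux_numeric:.4e} vs Cottrell {flux_cottrell:.4e}")
```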

Relevance: 100.00%

Abstract:

The objective of this work is to demonstrate the efficient use of Principal Component Analysis (PCA) as a method to pre-process the original multivariate data, rewriting it as a new matrix of principal components sorted by accumulated variance. An artificial neural network (ANN) with the backpropagation algorithm is trained using this pre-processed data set, which retains 90.02% of the accumulated variance of the original data, as input. The training goal is to model dissolved oxygen using information from other physical and chemical parameters. The water samples used in the experiments were gathered from the Paraíba do Sul River in São Paulo State, Brazil. The smallest mean square error (MSE) is used to compare the results of the different architectures and to choose the best one. This method allowed a reduction of more than 20% in the input data, which directly shortened the time and computational effort of the ANN training.
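A minimal sketch of the described pipeline, assuming scikit-learn: PCA keeping components up to 90% accumulated variance feeds an MLP regressor trained by backpropagation. The data here are synthetic stand-ins for the river water-quality parameters, not the study's measurements.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)

# Synthetic stand-in for water-quality parameters (temperature, pH,
# conductivity, ...) and a "dissolved oxygen" target to model.
n_samples, n_params = 300, 10
X = rng.normal(size=(n_samples, n_params))
y = X[:, 0] - 0.5 * X[:, 1] + 0.2 * rng.normal(size=n_samples)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# PCA(n_components=0.90) keeps the leading components up to 90%
# accumulated variance (the paper reports 90.02%).
model = make_pipeline(
    StandardScaler(),
    PCA(n_components=0.90),
    MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0),
)
model.fit(X_tr, y_tr)
print("MSE:", mean_squared_error(y_te, model.predict(X_te)))
print("components kept:", model.named_steps["pca"].n_components_)
```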