970 results for Computational methods
Abstract:
While news stories are an important traditional medium to broadcast and consume news, microblogging has recently emerged as a place where people can discuss, disseminate, collect or report information about news. However, the massive information in the microblogosphere makes it hard for readers to keep up with these real-time updates. This is especially a problem when it comes to breaking news, where people are more eager to know “what is happening”. Therefore, this dissertation is intended as an exploratory effort to investigate computational methods to augment human effort when monitoring the development of breaking news on a given topic from a microblog stream by extractively summarizing the updates in a timely manner. More specifically, given an interest in a topic, either entered as a query or presented as an initial news report, a microblog temporal summarization system is proposed to filter microblog posts from a stream with three primary concerns: topical relevance, novelty, and salience. Considering the relatively high arrival rate of microblog streams, a cascade framework consisting of three stages is proposed to progressively reduce the quantity of posts. For each step in the cascade, this dissertation studies methods that improve over current baselines. In the relevance filtering stage, query and document expansion techniques are applied to mitigate sparsity and vocabulary mismatch issues. The use of word embedding as a basis for filtering is also explored, using unsupervised and supervised modeling to characterize lexical and semantic similarity. In the novelty filtering stage, several statistical ways of characterizing novelty are investigated and ensemble learning techniques are used to integrate results from these diverse techniques. These results are compared with a baseline clustering approach using both standard and delay-discounted measures. In the salience filtering stage, because of the real-time prediction requirement, a method of learning verb phrase usage from past relevant news reports is used in conjunction with some standard measures for characterizing writing quality. Following a Cranfield-like evaluation paradigm, this dissertation includes a series of experiments to evaluate the proposed methods for each step, and for the end-to-end system. New microblog novelty and salience judgments are created, building on existing relevance judgments from the TREC Microblog track. The results point to future research directions at the intersection of social media, computational journalism, information retrieval, automatic summarization, and machine learning.
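To make the cascade concrete, the following toy sketch shows the overall control flow of such a three-stage filter: a relevance check against the query, a novelty check against updates already emitted, and a crude salience check. The vectorizer, thresholds, and the length-based salience proxy are illustrative assumptions, not the dissertation's actual features or models.

```python
# Toy three-stage cascade filter over a microblog stream (illustrative only:
# vectorizer, thresholds, and the length-based salience proxy are assumptions).
from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.metrics.pairwise import cosine_similarity

vec = HashingVectorizer(n_features=2**16, alternate_sign=False)

def cascade_filter(query, stream, rel_thr=0.2, nov_thr=0.6, min_len=6):
    q = vec.transform([query])
    pushed = []                                   # updates already shown to the user
    for post in stream:
        p = vec.transform([post])
        # Stage 1: topical relevance to the query.
        if cosine_similarity(q, p)[0, 0] < rel_thr:
            continue
        # Stage 2: novelty with respect to everything already pushed.
        if pushed and max(cosine_similarity(p, v)[0, 0] for v in pushed) > nov_thr:
            continue
        # Stage 3: crude salience proxy (enough content to be worth reading).
        if len(post.split()) < min_len:
            continue
        pushed.append(p)
        yield post

stream = ["earthquake hits the coast, buildings shaking",
          "lol earthquake",
          "officials confirm 6.1 magnitude earthquake near the coast"]
print(list(cascade_filter("earthquake coast", stream)))
```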
Abstract:
Game theory models strategies among agents (players), who receive payoffs at the end of the game according to their actions. The best pair of strategies for the players constitutes an equilibrium solution. However, it is not always possible to estimate the data of the problem. For this reason, the uncertain parameters present in game models are formalized by fuzzy theory. Fuzzy theory thus supports game theory, giving rise to fuzzy games. In this way, parameters such as the payoffs become fuzzy numbers. Moreover, when there is uncertainty in the representation of these fuzzy numbers, interval fuzzy numbers are used. In this work, interval fuzzy game models are therefore analyzed and computational methods are developed for solving these games. Finally, linear programming simulations are carried out to better observe the application of the theories studied and to evaluate the proposal.
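As a minimal illustration of the linear programming ingredient, the sketch below brackets the value of a zero-sum matrix game whose payoffs are only known as intervals by solving the classical game-value LP at the lower and upper endpoints. This is a simplification for illustration (no fuzzy membership functions); the matrices and bounds are made up.

```python
# Bracket the value of a zero-sum matrix game with interval payoffs by solving
# the classical game-value LP at the lower and upper interval endpoints.
# Simplified illustration of interval (fuzzy) games; matrices are made up.
import numpy as np
from scipy.optimize import linprog

def game_value(A):
    """Value and optimal mixed strategy of the row player for payoff matrix A."""
    m, n = A.shape
    c = np.zeros(m + 1)
    c[-1] = -1.0                                  # maximize v  <=>  minimize -v
    A_ub = np.hstack([-A.T, np.ones((n, 1))])     # v - sum_i x_i * A[i, j] <= 0
    b_ub = np.zeros(n)
    A_eq = np.zeros((1, m + 1))
    A_eq[0, :m] = 1.0                             # probabilities sum to one
    b_eq = np.array([1.0])
    bounds = [(0, None)] * m + [(None, None)]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    return res.x[-1], res.x[:m]

A_low = np.array([[1.0, -1.0], [-2.0, 3.0]])      # lower payoff endpoints
A_high = np.array([[2.0, 0.0], [-1.0, 4.0]])      # upper payoff endpoints
v_low, _ = game_value(A_low)
v_high, _ = game_value(A_high)
print(f"game value lies in [{v_low:.3f}, {v_high:.3f}]")
```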
Abstract:
The production of artistic prints in the sixteenth- and seventeenth-century Netherlands was an inherently social process. Turning out prints at any reasonable scale depended on the fluid coordination between designers, plate cutters, and publishers; roles that, by the sixteenth century, were considered distinguished enough to merit distinct credits engraved on the plates themselves: invenit, fecit/sculpsit, and excudit. While any one designer, plate cutter, and publisher could potentially exercise a great deal of influence over the production of a single print, their individual decisions (Whom to select as an engraver? What subjects to create for a print design? What market to sell to?) would have been variously constrained or encouraged by their position in this larger network (Who do they already know? And who, in turn, do their contacts know?). This dissertation addresses the impact of these constraints and affordances through the novel application of computational social network analysis to major databases of surviving prints from this period. This approach is used to evaluate several questions about trends in early modern print production practices that have not been satisfactorily addressed by traditional literature based on case studies alone: Did the social capital demanded by print production result in centralized or distributed production of prints? When, and to what extent, did printmakers and publishers in the Low Countries favor international versus domestic collaborators? And were printmakers under the same pressure as painters to specialize in particular artistic genres? This dissertation ultimately suggests how simple professional incentives endemic to the practice of printmaking may, at large scales, have resulted in quite complex patterns of collaboration and production. The framework of network analysis surfaces the role of certain printmakers who tend to be neglected in aesthetically-focused histories of art. This approach also highlights important issues concerning art historians’ balancing of individual influence versus the impact of longue durée trends. Finally, this dissertation also raises questions about the current limitations and future possibilities of combining computational methods with cultural heritage datasets in the pursuit of historical research.
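The kind of question asked here (centralized versus distributed production) maps onto standard network measures. The toy sketch below builds a collaboration graph from invenit/sculpsit/excudit credits and computes Freeman degree centralization; the records are generic placeholders, not the databases analyzed in the dissertation.

```python
# Toy collaboration network built from (designer, engraver, publisher) credits,
# with Freeman degree centralization as a "centralized vs distributed" indicator.
# The records below are generic placeholders, not real print-database entries.
import itertools
import networkx as nx

prints = [
    ("designer_A", "engraver_B", "publisher_C"),
    ("designer_A", "engraver_D", "publisher_C"),
    ("designer_E", "engraver_B", "publisher_C"),
]

G = nx.Graph()
for credits in prints:
    for a, b in itertools.combinations(set(credits), 2):
        G.add_edge(a, b)                          # collaborators on the same print

n = G.number_of_nodes()
degrees = dict(G.degree())
d_max = max(degrees.values())
# Freeman degree centralization: 1.0 for a perfect star, 0.0 for an even network.
centralization = sum(d_max - d for d in degrees.values()) / ((n - 1) * (n - 2))
print(f"{n} printmakers, degree centralization = {centralization:.2f}")
```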
Abstract:
The evaluation of the mesh opening stiffness of fishing nets is an important issue in assessing the selectivity of trawls. It appeared that a larger bending rigidity of the twines decreases the mesh opening and could reduce the escapement of fish. Nevertheless, the netting structure is complex. A netting is made up of braided twines of polyethylene or polyamide, tied with non-symmetrical knots, so these assemblies develop contact-friction interactions. Moreover, the netting can be subject to large deformation. In this study, we investigate the response of netting samples to different types of loading. Samples are loaded and unloaded, with creep and relaxation stages, under different boundary conditions. Two models have then been developed: an analytical model and a finite element model. The latter was used, together with an inverse identification algorithm, to assess the bending stiffness of the twines. In this paper, experimental results and a model for netting structures made up of braided twines are presented. During dry forming of a composite, for example, the matrix is not present or not active, and relative sliding can occur between constitutive fibres, so accurate modelling of the mechanical behaviour of fibrous materials is necessary. This study offers experimental data which could help to improve current models of contact-friction interactions [4], to validate models for the large deformation analysis of fibrous materials [1] on a new experimental case, and thereby to improve the evaluation of the mesh opening stiffness of a fishing net.
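The inverse identification step can be pictured as fitting a stiffness parameter so that a forward model reproduces the measured response. In the sketch below a trivial placeholder stands in for the finite element model and the "experimental" data are synthetic; only the overall fitting loop is meant to be illustrative.

```python
# Inverse identification sketch: adjust a bending stiffness EI until a forward
# model reproduces a measured force-displacement curve.  The forward model is a
# trivial placeholder for the finite element simulation; the data are synthetic.
import numpy as np
from scipy.optimize import least_squares

displacements = np.linspace(0.0, 0.05, 10)              # imposed mesh opening (m)
measured_force = 1.8e-3 * displacements + 2e-5           # synthetic "experiment" (N)

def forward_model(EI, d):
    # Placeholder response: force grows with opening, slope proportional to EI.
    return (EI / 2.5e-5) * 1.5e-3 * d + 2e-5

def residuals(params):
    (EI,) = params
    return forward_model(EI, displacements) - measured_force

fit = least_squares(residuals, x0=[1e-5], bounds=([1e-7], [1e-3]))
print(f"identified bending stiffness EI = {fit.x[0]:.2e} N*m^2")
```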
Abstract:
An intraneural ganglion cyst is a disorder observed in nerve injury; its propagation in the human body is still poorly understood and very difficult to predict, so it is often referred to as an unsolved mystery. The treatment for this disorder is to remove the cystic substance from the nerve by surgery. However, such treatment may result in neuropathic pain and recurrence of the cyst. The articular theory proposed by Spinner et al. (Spinner et al. 2003) considers the neurological deficit in the Common Peroneal Nerve (CPN) branch of the sciatic nerve and adds that, in addition to the treatment, ligation of the articular branch results in complete eradication of the deficit. Mechanical modeling of the affected nerve cross section will reinforce the articular theory (Spinner et al. 2003). As the cyst propagates, it compresses the neighboring fascicles and the nerve cross section appears like a signet ring. Hence, in order to mechanically model the affected nerve cross section, computational methods capable of modeling excessively large deformations are required. Traditional FEM produces distorted elements while modeling such deformations, resulting in inaccuracies and premature termination of the analysis. The methods described in this research report have the capability to simulate large deformation. The results obtained from this research show significant deformation compared to the deformation observed in conventional finite element models. The report elaborates on the neurological deficit, followed by a detailed explanation of the Smoothed Particle Hydrodynamics approach. Finally, the results show the large deformation in stages, as well as the successful implementation of the SPH method for the large deformation of a biological structure such as the intraneural ganglion cyst.
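The core SPH ingredient the report relies on is that field quantities are kernel-weighted sums over neighbouring particles, so there is no mesh to distort under large deformation. A minimal, self-contained density-summation example (illustrative constants and particle layout, not the report's model) follows:

```python
# Minimal 2-D SPH density estimate: rho_i = sum_j m_j * W(|x_i - x_j|, h).
# Particle layout, smoothing length and mass are illustrative values only.
import numpy as np

def cubic_spline_kernel(r, h):
    """Standard 2-D cubic spline smoothing kernel W(r, h)."""
    sigma = 10.0 / (7.0 * np.pi * h**2)
    q = r / h
    w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
        np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))
    return sigma * w

h, mass = 0.12, 0.01                               # smoothing length, particle mass
xs, ys = np.meshgrid(np.linspace(0, 1, 11), np.linspace(0, 1, 11))
pos = np.column_stack([xs.ravel(), ys.ravel()])    # 121 particles on a unit square

dists = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
rho = (mass * cubic_spline_kernel(dists, h)).sum(axis=1)
print(f"density estimate at the centre particle: {rho[60]:.3f}")
```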
Abstract:
Synthetic chemists constantly strive to develop new methodologies to access complex molecules more sustainably. The recently developed photocatalytic approach represents a valid and greener alternative to the classical synthetic methods. Here we present three protocols to furnish five-membered rings exploiting photoredox catalysis. We first obtained 4,5-dihydrofurans (4,5-DHFs) from readily available olefins and α-haloketones employing fac-Ir(ppy)3 as a photocatalyst under blue-light irradiation (Figure 1, top). This transformation proved to be very broad in scope, thanks to its mild conditions and the avoidance of stoichiometric amounts of oxidants or reductants. Moreover, similar conditions could lead to β,γ-unsaturated ketones or highly substituted tetrahydrofurans (THFs) by carefully differentiating the substitution pattern on the starting materials and properly adjusting the reaction parameters. We then turned our attention to the reactivity of allenamides, employing analogous photocatalytic conditions to access 2-aminofurans (Figure 1, bottom). α-Haloketones again provided the radical generated by fac-Ir(ppy)3 under visible-light irradiation, which added to the π-system and furnished the cyclic molecule. The addition of a second molecule of the α-haloketone moiety led to the formation of the final highly functionalized furan, which might be further elaborated to afford more complex products. Both studies were accompanied by mechanistic investigations supported by experimental and computational methods. As our last project, we developed a methodology to achieve cyclopentanonyl-fused N-methylpyrrolidines (Figure 2), exploiting N,N-dimethylamines and carboxylic acids as radical sources. In two separate photocatalytic steps, both functionalities are manipulated through photoredox catalysis by 4CzIPN to add to an α,β-enone system, furnishing the bicyclic product.
Abstract:
The study of the spectroscopic phenomena in organic solids, in combination with other techniques, is an effective tool for the understanding of the structural properties of materials based on these compounds. This Ph.D. work was dedicated to the spectroscopic investigation of some relevant processes occurring in organic molecular crystals, with the goal of expanding the knowledge on the relationship between structure, dynamics and photoreactivity of these systems. Vibrational spectroscopy has been the technique of choice, always in combination with X-ray diffraction structural studies and often with the support of computational methods. The vibrational study of the molecular solid state reaches its full potential when it includes the low-wavenumber region of the lattice-phonon modes, which probe the weak intermolecular interactions and are the fingerprints of the lattice itself. Microscopy is an invaluable addition in the investigation of processes that take place on the micrometer scale of the crystal micro-domains. In chemical and phase transitions, as well as in polymorph screening and identification, the combination of Raman microscopy and lattice-phonon detection has provided useful information. Research on the fascinating class of single-crystal-to-single-crystal photoreactions has shown how the homogeneous mechanism of these transformations can be identified by lattice-phonon microscopy, in agreement with the continuous evolution of their XRD patterns. In describing the photodimerization mechanism of vitamin K3, the focus was instead on the influence of its polymorphism in governing the product isomerism. Polymorphism is the additional degree of freedom of molecular functional materials, and by advancing its control and the understanding of its properties, functionalities can be promoted for useful applications. Its investigation focused on thin-film phases, widely employed in organic electronics. The ambiguities in phase identification that often emerge with other experimental methods were successfully resolved by vibrational measurements.
Abstract:
In recent decades, two prominent trends have influenced the data modeling field, namely network analysis and machine learning. This thesis explores the practical applications of these techniques within the domain of drug research, unveiling their multifaceted potential for advancing our comprehension of complex biological systems. The research undertaken during this PhD program is situated at the intersection of network theory, computational methods, and drug research. Across six projects presented herein, there is a gradual increase in model complexity. These projects traverse a diverse range of topics, with a specific emphasis on drug repurposing and safety in the context of neurological diseases. The aim of these projects is to leverage existing biomedical knowledge to develop innovative approaches that bolster drug research. The investigations have produced practical solutions, not only providing insights into the intricacies of biological systems, but also allowing the creation of valuable tools for their analysis. In short, the achievements are:
• A novel computational algorithm to identify adverse events specific to fixed-dose drug combinations.
• A web application that tracks the clinical drug research response to SARS-CoV-2.
• A Python package for differential gene expression analysis and the identification of key regulatory "switch genes".
• The identification of pivotal events causing drug-induced impulse control disorders linked to specific medications.
• An automated pipeline for discovering potential drug repurposing opportunities.
• The creation of a comprehensive knowledge graph and development of a graph machine learning model for predictions.
Collectively, these projects illustrate diverse applications of data science and network-based methodologies, highlighting the profound impact they can have in supporting drug research activities.
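As a toy illustration of the knowledge-graph idea behind the repurposing work, the sketch below assembles a tiny drug-gene-disease graph and ranks candidate drug-disease links by shared neighbours. The thesis's graph machine learning model is far richer; common-neighbour scoring and all entity names here are placeholder assumptions.

```python
# Toy drug-gene-disease knowledge graph with common-neighbour scoring of candidate
# drug-disease links, standing in for a much richer graph machine learning model.
# All entity names and edges are fabricated placeholders.
import networkx as nx

KG = nx.Graph()
KG.add_edges_from([
    ("drug_A", "gene_1"), ("drug_A", "gene_2"),
    ("drug_B", "gene_2"), ("drug_B", "gene_3"),
    ("disease_X", "gene_1"), ("disease_X", "gene_2"),
    ("disease_Y", "gene_3"),
])

def repurposing_score(drug, disease):
    """Number of genes shared between a drug's targets and a disease's gene set."""
    return len(set(KG[drug]) & set(KG[disease]))

candidates = [("drug_A", "disease_Y"), ("drug_B", "disease_X")]
for drug, disease in sorted(candidates, key=lambda p: -repurposing_score(*p)):
    print(drug, "->", disease, "score:", repurposing_score(drug, disease))
```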
Abstract:
Benzoquinone was found to be an effective co-catalyst in the ruthenium/NaOEt-catalyzed Guerbet reaction. The co-catalyst behavior has therefore been investigated through experimental and computational methods. The distribution of reaction products shows that the benzoquinone additive improves the reaction rate from the beginning of the process, while having a minimal effect on the selectivity toward alcoholic species. DFT calculations were performed to investigate two hypotheses for the kinetic effects: i) a hydrogen storage mechanism, or ii) basic co-catalysis by 4-hydroxyphenolate. The most promising results were found for the latter hypothesis, in which a new mixed mechanism for the aldol condensation step of the Guerbet process involves hydroquinone (i.e. the reduced form of benzoquinone) as the proton source instead of ethanol. This mechanism was found to be energetically more favorable than the aldol condensation in the absence of the additive, suggesting that the hydroquinone derived from benzoquinone could be the key species affecting the kinetics of the overall process. To verify this theoretical hypothesis, new phenol derivatives were tested as additives in the Guerbet reaction. The outcomes confirmed that an aromatic acid (stronger than ethanol) could improve the reaction kinetics. Lastly, theoretical product distributions were simulated and compared to the experimental ones, using the DFT computations to build the kinetic models.
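The final step, building a kinetic model from DFT results, can be sketched schematically: activation free energies are converted into rate constants via the Eyring equation and a small reaction network is integrated to obtain a simulated product distribution. The two-step A -> B -> C scheme and the barrier values below are placeholders, not the actual Guerbet network studied.

```python
# Schematic kinetic model: Eyring rate constants from (made-up) activation free
# energies, integrated for a placeholder A -> B -> C scheme to give a simulated
# product distribution.  Not the actual Guerbet reaction network.
import numpy as np
from scipy.integrate import solve_ivp

kB, h, R, T = 1.380649e-23, 6.62607015e-34, 8.314462618, 423.15   # SI units, 150 C

def eyring(dG_kJ_per_mol):
    """Rate constant (1/s) from an activation free energy via the Eyring equation."""
    return (kB * T / h) * np.exp(-dG_kJ_per_mol * 1e3 / (R * T))

k1, k2 = eyring(135.0), eyring(142.0)       # placeholder barriers for A->B and B->C

def rhs(t, y):
    A, B, C = y
    return [-k1 * A, k1 * A - k2 * B, k2 * B]

sol = solve_ivp(rhs, (0.0, 6 * 3600.0), [1.0, 0.0, 0.0])   # 6 h of reaction time
A, B, C = sol.y[:, -1]
print(f"simulated distribution after 6 h:  A {A:.2f}  B {B:.2f}  C {C:.2f}")
```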
Abstract:
This paper presents an automated optimization framework able to provide network administrators with resilient routing configurations for link-state protocols, such as OSPF or IS-IS. In order to deal with the formulated NP-hard optimization problems, the devised framework is underpinned by the use of computational intelligence optimization engines, such as Multi-objective Evolutionary Algorithms (MOEAs). To demonstrate the framework's capabilities, two illustrative Traffic Engineering methods are described, making it possible to attain routing configurations that are robust to changes in the traffic demands and keep the network stable even in the presence of link failure events. The presented illustrative results clearly corroborate the usefulness of the proposed automated framework along with the devised optimization methods.
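A greatly simplified, single-objective stand-in for the kind of search such a framework automates is sketched below: evolve OSPF-style integer link weights so that shortest-path routing of a small demand matrix minimizes the maximum link utilization. The topology, demands, and the (1+1) mutation loop are toy assumptions; the paper's framework uses full multi-objective evolutionary algorithms.

```python
# Single-objective toy stand-in for evolutionary weight optimization: evolve
# OSPF-style integer link weights so shortest-path routing of a small demand
# matrix minimizes the maximum link utilization.  Topology and demands are made up.
import random
import networkx as nx

edges = [("A", "B"), ("B", "C"), ("A", "C"), ("C", "D"), ("B", "D")]
capacity = 10.0
demands = [("A", "D", 8.0), ("A", "C", 6.0), ("B", "D", 5.0)]

def max_utilization(weights):
    G = nx.Graph()
    for (u, v), w in zip(edges, weights):
        G.add_edge(u, v, weight=w)
    load = {e: 0.0 for e in edges}
    for src, dst, vol in demands:
        path = nx.shortest_path(G, src, dst, weight="weight")
        for u, v in zip(path, path[1:]):
            load[(u, v) if (u, v) in load else (v, u)] += vol
    return max(l / capacity for l in load.values())

random.seed(0)
best = [random.randint(1, 20) for _ in edges]
for _ in range(500):                                  # simple (1+1) mutation loop
    child = [max(1, w + random.choice([-3, 0, 3])) for w in best]
    if max_utilization(child) <= max_utilization(best):
        best = child
print("weights:", best, "max link utilization:", round(max_utilization(best), 2))
```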
Abstract:
Systems biology is a new, emerging and rapidly developing multidisciplinary research field that aims to study biochemical and biological systems from a holistic perspective, with the goal of providing a comprehensive, system-level understanding of cellular behaviour. In this way, it addresses one of the greatest challenges faced by contemporary biology, which is to comprehend the function of complex biological systems. Systems biology combines various methods that originate from scientific disciplines such as molecular biology, chemistry, engineering sciences, mathematics, computer science and systems theory. Systems biology, unlike “traditional” biology, focuses on high-level concepts such as: network, component, robustness, efficiency, control, regulation, hierarchical design, synchronization, concurrency, and many others. The very terminology of systems biology is “foreign” to “traditional” biology; it marks a drastic shift in the research paradigm and indicates the close linkage of systems biology to computer science. One of the basic tools utilized in systems biology is the mathematical modelling of life processes tightly linked to experimental practice. The studies contained in this thesis revolve around a number of challenges commonly encountered in computational modelling in systems biology. The research comprises the development and application of a broad range of methods originating in the fields of computer science and mathematics for the construction and analysis of computational models in systems biology. In particular, the performed research is set up in the context of two biological phenomena chosen as modelling case studies: 1) the eukaryotic heat shock response and 2) the in vitro self-assembly of intermediate filaments, one of the main constituents of the cytoskeleton. The range of presented approaches spans from heuristic, through numerical and statistical, to analytical methods applied in the effort to formally describe and analyse the two biological processes. We note, however, that although applied to certain case studies, the presented methods are not limited to them and can be utilized in the analysis of other biological mechanisms as well as complex systems in general. The full range of developed and applied modelling techniques as well as model analysis methodologies constitutes a rich modelling framework. Moreover, the presentation of the developed methods, their application to the two case studies and the discussions concerning their potentials and limitations point to the difficulties and challenges one encounters in computational modelling of biological systems. The problems of model identifiability, model comparison, model refinement, model integration and extension, the choice of the proper modelling framework and level of abstraction, and the choice of the proper scope of the model run through this thesis.
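As a pocket-sized example of the ODE-based, mass-action style of model such case studies build on, the sketch below integrates an invented two-variable feedback loop in which misfolded proteins induce chaperone synthesis and chaperones refold them. Its structure and rate constants are illustrative assumptions, not the thesis's heat shock response model.

```python
# Invented two-variable mass-action model: misfolded proteins (M) induce chaperone
# synthesis (C), and chaperones refold them.  Structure and rate constants are
# illustrative only, not the thesis's heat shock response model.
import numpy as np
from scipy.integrate import solve_ivp

k_mis, k_syn, k_fold, k_deg = 0.05, 0.4, 0.8, 0.1     # made-up rate constants

def rhs(t, y):
    M, C = y
    dM = k_mis - k_fold * C * M        # constant misfolding vs chaperone refolding
    dC = k_syn * M - k_deg * C         # induction by misfolded protein vs turnover
    return [dM, dC]

sol = solve_ivp(rhs, (0.0, 200.0), [0.0, 0.1], t_eval=np.linspace(0, 200, 5))
for t, M, C in zip(sol.t, *sol.y):
    print(f"t = {t:5.1f}   misfolded = {M:.3f}   chaperone = {C:.3f}")
```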
Abstract:
This paper is a tutorial introduction to pseudospectral optimal control. With pseudospectral methods, a function is approximated as a linear combination of smooth basis functions, which are often chosen to be Legendre or Chebyshev polynomials. Collocation of the differential-algebraic equations is performed at orthogonal collocation points, which are selected to yield interpolation of high accuracy. Pseudospectral methods directly discretize the original optimal control problem to recast it into a nonlinear programming format. A numerical optimizer is then employed to find approximate local optimal solutions. The paper also briefly describes the functionality and implementation of PSOPT, an open source software package written in C++ that employs pseudospectral discretization methods to solve multi-phase optimal control problems. The software implements the Legendre and Chebyshev pseudospectral methods, and it has useful features such as automatic differentiation, sparsity detection, and automatic scaling. The use of pseudospectral methods is illustrated in two problems taken from the literature on computational optimal control.
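The core building block of such methods, differentiation at Chebyshev-Gauss-Lobatto collocation points, can be illustrated independently of PSOPT (which is a C++ package). The short numpy sketch below constructs the standard Chebyshev differentiation matrix and checks its accuracy on a smooth test function; it is a didactic fragment, not part of PSOPT.

```python
# Chebyshev differentiation matrix on Gauss-Lobatto points (after Trefethen, 2000),
# checked on a smooth test function.  A didactic fragment, not part of PSOPT.
import numpy as np

def cheb(N):
    """Differentiation matrix D and Chebyshev-Gauss-Lobatto points x (N >= 1)."""
    x = np.cos(np.pi * np.arange(N + 1) / N)
    c = np.hstack([2.0, np.ones(N - 1), 2.0]) * (-1.0) ** np.arange(N + 1)
    X = np.tile(x, (N + 1, 1)).T
    dX = X - X.T
    D = np.outer(c, 1.0 / c) / (dX + np.eye(N + 1))
    D -= np.diag(D.sum(axis=1))        # negative-sum trick for the diagonal
    return D, x

D, x = cheb(24)
u = np.exp(x) * np.sin(5 * x)
du_exact = np.exp(x) * (np.sin(5 * x) + 5 * np.cos(5 * x))
print("max derivative error:", np.max(np.abs(D @ u - du_exact)))
```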
Abstract:
This article is dedicated to harmonic wavelet Galerkin methods for the solution of partial differential equations. Several variants of the method are proposed and analyzed, using the Burgers equation as a test model. The computational complexity can be reduced when the localization properties of the wavelets and restricted interactions between different scales are exploited. The resulting variants of the method have computational complexities ranging from O(N^3) to O(N) (N being the space dimension) per time step. A pseudo-spectral wavelet scheme is also described and compared to the methods based on connection coefficients. The harmonic wavelet Galerkin scheme is applied to a nonlinear model for the propagation of precipitation fronts, with the front locations being revealed by the sizes of the localized wavelet coefficients.
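For orientation, the familiar Fourier pseudo-spectral treatment of the viscous Burgers test model is sketched below; it is not the harmonic wavelet Galerkin scheme of the article, only the standard spectral baseline against which such methods are usually compared, with made-up resolution and viscosity.

```python
# Fourier pseudo-spectral discretization of the viscous Burgers test model
# u_t + u u_x = nu * u_xx on a periodic domain; resolution and viscosity are
# arbitrary choices for illustration.
import numpy as np
from scipy.integrate import solve_ivp

N, nu = 256, 0.05
x = 2.0 * np.pi * np.arange(N) / N
ik = 1j * np.fft.fftfreq(N, d=1.0 / N)         # spectral differentiation factor

def rhs(t, u):
    u_hat = np.fft.fft(u)
    u_x = np.real(np.fft.ifft(ik * u_hat))
    u_xx = np.real(np.fft.ifft(ik**2 * u_hat))
    return -u * u_x + nu * u_xx

sol = solve_ivp(rhs, (0.0, 1.0), np.sin(x), rtol=1e-8, atol=1e-10)
print("max |u| at t = 1:", np.abs(sol.y[:, -1]).max())
```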