940 results for Images - Computational methods
Abstract:
Continuum-mechanics formulations, although accurate up to a point, sometimes cannot be used, or are not conceptually correct, for understanding phenomena at reduced scales. These limitations can appear in the study of tribological phenomena at the nanometre scale, which therefore require new experimental, theoretical and computational methods capable of exploring these phenomena at the necessary resolution. Atomistic simulations can describe small-scale phenomena, but the number of atoms that must be modelled, and hence the computational cost, usually becomes quite high. Simulation methods based on continuum mechanics, on the other hand, are more attractive in terms of computational cost, but are not accurate at the atomic scale. Combining the two approaches can therefore allow a more realistic understanding of tribological phenomena. This work discusses the basic concepts and models of atomic-scale friction and presents numerical-simulation studies for the analysis and understanding of friction and wear mechanisms in the contact between materials. The problem is addressed at different scales, and a joint approach combining Continuum Mechanics and Molecular Dynamics is proposed. To that end, numerical simulations of the contact between surfaces were carried out with increasing complexity, starting from a first model that uses Molecular Dynamics to simulate the effect of crystalline defects on pure sliding. Subsequently, considerations on the phenomenon of adhesion were introduced into the continuum-mechanics models. The results are validated by comparison between the two approaches and with the literature.
Abstract:
Using a combination of experimental and computational methods, mainly FTIR and DFT calculations, new insights are provided here in order to better understand the cleavage of the C–C bond taking place during the complete oxidation of ethanol on platinum stepped surfaces. First, new experimental results pointing out that platinum stepped surfaces having (111) terraces promote the C–C bond breaking are presented. Second, it is computationally shown that the special adsorption properties of the atoms in the step are able to promote the C–C scission, provided that no other adsorbed species are present on the step, which is in agreement with the experimental results. In comparison with the (111) terrace, the cleavage of the C–C bond on the step has a significantly lower activation energy, which would provide an explanation for the observed experimental results. Finally, reactivity differences under acidic and alkaline conditions are discussed using the new experimental and theoretical evidence.
Abstract:
Numerical modelling methodologies are important for their application to engineering and scientific problems, because there are processes for which analytical mathematical expressions cannot be obtained. When the only available information is a set of experimental values for the variables that determine the state of the system, the modelling problem is equivalent to determining the hyper-surface that best fits the data. This paper presents a methodology based on the Galerkin formulation of the finite element method to obtain representations of relationships, defined a priori, between a set of variables: y = z(x1, x2, ..., xd). These representations are generated from the values of the variables in the experimental data. The piecewise approximation is an element of a Sobolev space and has derivatives defined in a generalized sense within this space. This approach requires inverting a linear system whose structure allows a fast solver algorithm. The algorithm can be used in a variety of fields, making it a multidisciplinary tool. The validity of the methodology is studied in two real applications: a problem in hydrodynamics and an engineering problem involving fluids, heat and transport in an energy generation plant. The predictive capacity of the methodology is also tested using a cross-validation method.
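As a purely illustrative sketch (not the paper's implementation), the idea of fitting scattered data with piecewise finite-element basis functions can be shown in one dimension: the data are projected onto the span of piecewise-linear "hat" functions on a mesh, and the resulting least-squares system is banded, which is the structural property that enables fast solvers. All names and mesh choices below are assumptions for illustration.

```python
import numpy as np

def hat_basis(x, nodes):
    """Evaluate piecewise-linear (hat) basis functions on a uniform mesh.

    Returns a matrix B with B[i, j] = phi_j(x[i])."""
    h = nodes[1] - nodes[0]  # uniform spacing assumed
    return np.maximum(0.0, 1.0 - np.abs(x[:, None] - nodes[None, :]) / h)

def fit_galerkin_1d(x, y, n_nodes=8):
    """Least-squares fit of scattered data (x, y) in the hat-function span.

    The normal matrix B^T B is banded (each hat overlaps only its
    neighbours), which is what allows a fast solver in higher dimensions."""
    nodes = np.linspace(x.min(), x.max(), n_nodes)
    B = hat_basis(x, nodes)
    coeffs, *_ = np.linalg.lstsq(B, y, rcond=None)
    return nodes, coeffs

def evaluate(nodes, coeffs, x):
    """Evaluate the fitted piecewise-linear surrogate at new points."""
    return hat_basis(np.atleast_1d(np.asarray(x, dtype=float)), nodes) @ coeffs
```

Because hat functions reproduce linear functions exactly, fitting data sampled from a straight line recovers it to machine precision, which makes a convenient sanity check.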
Abstract:
The international perspectives on these issues are especially valuable in an increasingly connected, but still institutionally and administratively diverse world. The research addressed in several chapters in this volume includes issues around technical standards bodies like EpiDoc and the TEI, engaging with ways these standards are implemented, documented, taught, used in the process of transcribing and annotating texts, and used to generate publications and as the basis for advanced textual or corpus research. Other chapters focus on various aspects of philological research and content creation, including collaborative or community driven efforts, and the issues surrounding editorial oversight, curation, maintenance and sustainability of these resources. Research into the ancient languages and linguistics, in particular Greek, and the language teaching that is a staple of our discipline, are also discussed in several chapters, in particular for ways in which advanced research methods can lead into language technologies and vice versa and ways in which the skills around teaching can be used for public engagement, and vice versa. A common thread through much of the volume is the importance of open access publication or open source development and distribution of texts, materials, tools and standards, both because of the public good provided by such models (circulating materials often already paid for out of the public purse), and the ability to reach non-standard audiences, those who cannot access rich university libraries or afford expensive print volumes. Linked Open Data is another technology that results in wide and free distribution of structured information both within and outside academic circles, and several chapters present academic work that includes ontologies and RDF, either as a direct research output or as essential part of the communication and knowledge representation. 
Several chapters focus not on the literary and philological side of classics, but on the study of cultural heritage, archaeology, and the material supports on which original textual and artistic material are engraved or otherwise inscribed, addressing both the capture and analysis of artefacts in both 2D and 3D, the representation of data through archaeological standards, and the importance of sharing information and expertise between the several domains both within and without academia that study, record and conserve ancient objects. Almost without exception, the authors reflect on the issues of interdisciplinarity and collaboration, the relationship between their research practice and teaching and/or communication with a wider public, and the importance of the role of the academic researcher in contemporary society and in the context of cutting edge technologies. How research is communicated in a world of instant-access blogging and 140-character micromessaging, and how our expectations of the media affect not only how we publish but how we conduct our research, are questions about which all scholars need to be aware and self-critical.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
Manual curation has long been held to be the gold standard for functional annotation of DNA sequence. Our experience with the annotation of more than 20,000 full-length cDNA sequences revealed problems with this approach, including inaccurate and inconsistent assignment of gene names, as well as many good assignments that were difficult to reproduce using only computational methods. For the FANTOM2 annotation of more than 60,000 cDNA clones, we developed a number of methods and tools to circumvent some of these problems, including an automated annotation pipeline that provides high-quality preliminary annotation for each sequence by introducing an uninformative filter that eliminates uninformative annotations, controlled vocabularies to accurately reflect both the functional assignments and the evidence supporting them, and a highly refined, Web-based manual annotation tool that allows users to view a wide array of sequence analyses and to assign gene names and putative functions using a consistent nomenclature. The ultimate utility of our approach is reflected in the low rate of reassignment of automated assignments by manual curation. Based on these results, we propose a new standard for large-scale annotation, in which the initial automated annotations are manually investigated and then computational methods are iteratively modified and improved based on the results of manual curation.
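The "uninformative filter" step described above can be sketched in a few lines: hit descriptions that carry no functional information are dropped before the top surviving hit seeds the preliminary annotation. The vocabulary and function names below are hypothetical illustrations, not the FANTOM2 pipeline's actual terms or code.

```python
# Hypothetical list of descriptions that convey no functional information;
# the real pipeline used a curated controlled vocabulary.
UNINFORMATIVE = {
    "hypothetical protein",
    "unknown",
    "unnamed protein product",
    "cdna clone",
}

def informative_hits(hits):
    """Keep only hit descriptions that carry functional information."""
    return [h for h in hits if h.strip().lower() not in UNINFORMATIVE]

def preliminary_annotation(hits):
    """Seed the annotation with the best informative hit, if any."""
    kept = informative_hits(hits)
    return kept[0] if kept else "unclassifiable"
```

The point of the filter is that an uninformative top hit no longer masks an informative lower-ranked one, so the automated assignment needs manual reassignment far less often.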
Abstract:
Changes in arterial distensibility have been widely used to identify the presence of cardiovascular abnormalities such as hypertension. Pulse wave velocity (PWV) has been shown to be related to arterial distensibility. However, the lack of suitable techniques to measure PWV nonintrusively has impeded its clinical usefulness. Pulse transit time (PTT) is a noninvasive technique derived from the principle of PWV. PTT has shown its capabilities in cardiovascular and cardiorespiratory studies in adults. However, no known study has been conducted to understand the suitability and utility of PTT for estimating PWV in children. Two computational methods were used to derive PWV from PTT values obtained from the finger and toe of 23 normotensive Caucasian children (19 males, aged 5-12 years). Furthermore, the effects of adopting different postures on the PWV derivations were investigated. Statistical analyses were performed in comparison with two previous PWV studies conducted on children. Results revealed that PWV derived from the upper limb correlated significantly (P
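The relation underlying any such derivation is simply that velocity is propagation distance over transit time. A minimal sketch, with hypothetical path length and timing values rather than the study's measurement protocol:

```python
def pwv_from_ptt(path_length_m, ptt_s):
    """Pulse wave velocity (m/s) as arterial path length divided by
    pulse transit time. Inputs are illustrative, not study data."""
    if ptt_s <= 0:
        raise ValueError("pulse transit time must be positive")
    return path_length_m / ptt_s
```

For example, a 0.7 m heart-to-toe path traversed in 100 ms gives a PWV of 7 m/s; the two computational methods in the study differ in how they estimate the path length and timing references, not in this basic relation.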
Abstract:
Epitopes mediated by T cells lie at the heart of the adaptive immune response and form the essential nucleus of anti-tumour peptide or epitope-based vaccines. Antigenic T cell epitopes are mediated by major histocompatibility complex (MHC) molecules, which present them to T cell receptors. Calculating the affinity between a given MHC molecule and an antigenic peptide using experimental approaches is both difficult and time consuming, thus various computational methods have been developed for this purpose. A server has been developed to allow a structural approach to the problem by generating specific MHC:peptide complex structures and providing configuration files to run molecular modelling simulations upon them. A system has been produced which allows the automated construction of MHC:peptide structure files and the corresponding configuration files required to execute a molecular dynamics simulation using NAMD. The system has been made available through a web-based front end and stand-alone scripts. Previous attempts at structural prediction of MHC:peptide affinity have been limited due to the paucity of structures and the computational expense in running large scale molecular dynamics simulations. The MHCsim server (http://igrid-ext.cryst.bbk.ac.uk/MHCsim) allows the user to rapidly generate any desired MHC:peptide complex and will facilitate molecular modelling simulation of MHC complexes on an unprecedented scale.
Abstract:
The papers resulting from the recent Biochemical Society Focused Meeting 'G-Protein-Coupled Receptors: from Structural Insights to Functional Mechanisms' held in Prato in September 2012 are introduced in the present overview. A number of future goals for GPCR (G-protein-coupled receptor) research are considered, including the need to develop biophysical and computational methods to explore the full range of GPCR conformations and their dynamics, the need to develop methods to take this into account for drug discovery and the importance of relating observations on isolated receptors or receptors expressed in model systems to receptor function in vivo. © 2013 Biochemical Society.
Abstract:
We consider a Cauchy problem for the Laplace equation in a bounded region containing a cut, where the region is formed by removing a sufficiently smooth arc (the cut) from a bounded simply connected domain D. The aim is to reconstruct the solution on the cut from the values of the solution and its normal derivative on the boundary of the domain D. We propose an alternating iterative method which involves solving direct mixed problems for the Laplace operator in the same region. These mixed problems have either a Dirichlet or a Neumann boundary condition imposed on the cut and are solved by a potential approach. Each of these mixed problems is reduced to a system of integral equations of the first kind with logarithmic and hypersingular kernels and at most a square root singularity in the densities at the endpoints of the cut. The full discretization of the direct problems is realized by a trigonometric quadrature method which has super-algebraic convergence. The numerical examples presented illustrate the feasibility of the proposed method.
Abstract:
The twin arginine translocation (TAT) system ferries folded proteins across the bacterial membrane. Proteins are directed into this system by the TAT signal peptide present at the amino terminus of the precursor protein, which contains the twin arginine residues that give the system its name. There are currently only two computational methods for the prediction of TAT translocated proteins from sequence. Both methods have limitations that make the creation of a new algorithm for TAT-translocated protein prediction desirable. We have developed TATPred, a new sequence-model method, based on a Naïve-Bayesian network, for the prediction of TAT signal peptides. In this approach, a comprehensive range of models was tested to identify the most reliable and robust predictor. The best model comprised 12 residues: three residues prior to the twin arginines and the seven residues that follow them. We found a prediction sensitivity of 0.979 and a specificity of 0.942.
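A position-specific scorer of the general kind described, built over a fixed-length window around the twin arginines, can be sketched as follows. The toy four-residue windows, the pseudocount, and the uniform amino-acid background are assumptions for illustration only, not TATPred's trained model or its 12-residue window.

```python
import math
from collections import Counter

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def train_position_model(windows, pseudocount=1.0):
    """Estimate per-position residue frequencies over fixed-length
    windows (e.g. the residues flanking the twin arginines)."""
    length = len(windows[0])
    counts = [Counter(w[i] for w in windows) for i in range(length)]
    total = len(windows) + len(AMINO_ACIDS) * pseudocount
    return [{aa: (c.get(aa, 0) + pseudocount) / total for aa in AMINO_ACIDS}
            for c in counts]

def log_odds(window, pos_model, background=0.05):
    """Naïve (position-independent) sum of per-position log-likelihood
    ratios against a uniform amino-acid background."""
    return sum(math.log(pos_model[i][aa] / background)
               for i, aa in enumerate(window))
```

A window resembling the training examples scores above zero, while an unrelated window scores below it; thresholding this score yields a binary signal-peptide call.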
Abstract:
This thesis describes advances in the characterisation, calibration and data processing of optical coherence tomography (OCT) systems. Femtosecond (fs) laser inscription was used to produce OCT-phantoms. Transparent materials are generally inert to infra-red radiation, but with fs lasers material modification occurs via non-linear processes when the highly focused light source interacts with the material. This modification is confined to the focal volume and is highly reproducible. In order to select the best inscription parameters, combinations of different inscription parameters were tested, using three fs laser systems with different operating properties, on a variety of materials. This facilitated an understanding of the key characteristics of the produced structures with the aim of producing viable OCT-phantoms. Finally, OCT-phantoms were successfully designed and fabricated in fused silica. The use of these phantoms to characterise many properties (resolution, distortion, sensitivity decay, scan linearity) of an OCT system was demonstrated. Quantitative methods were developed to support the characterisation of an OCT system collecting images from phantoms and also to improve the quality of the OCT images. Characterisation methods include the measurement of the spatially variant resolution (point spread function (PSF) and modulation transfer function (MTF)), sensitivity and distortion. Processing of OCT data is computationally intensive. Standard central processing unit (CPU) based processing may take several minutes to a few hours to process acquired data, so data processing is a significant bottleneck. An alternative is to use expensive hardware-based processing such as field programmable gate arrays (FPGAs). However, graphics processing unit (GPU) based data processing methods have recently been developed to minimize this data processing and rendering time. 
These processing techniques include standard-processing methods, which comprise a set of algorithms to process the raw interference data obtained by the detector and generate A-scans. The work presented here describes accelerated data processing and post-processing techniques for OCT systems. The GPU-based processing developed during the PhD was later implemented into a custom-built Fourier domain optical coherence tomography (FD-OCT) system. This system currently processes and renders data in real time; its processing throughput is currently limited by the camera capture rate. OCT-phantoms have been heavily used for the qualitative characterisation and adjustment/fine tuning of the operating conditions of the OCT system. Currently, investigations are under way to characterise OCT systems using our phantoms. The work presented in this thesis demonstrates several novel techniques for fabricating OCT-phantoms and accelerating OCT data processing using GPUs. In the process of developing the phantoms and quantitative methods, a thorough understanding and practical knowledge of OCT and fs laser processing systems was developed. This understanding led to several novel pieces of research that are not only relevant to OCT but have broader importance. For example, an extensive understanding of the properties of fs-inscribed structures will be useful in other photonic applications such as the making of phase masks, waveguides and microfluidic channels. Acceleration of data processing with GPUs is also useful in other fields.
Abstract:
A survey of crystal structures containing hydantoin, dihydrouracil and uracil derivatives in the Cambridge Structural Database revealed four main types of hydrogen bond motifs when derivatives with extra substituents able to interfere with the main motif are excluded. All these molecules contain two hydrogen bond donors and two hydrogen bond acceptors in the sequence of NH, C=O, NH, and C=O groups within a 5-membered ring (hydantoin) and two 6-membered rings (dihydrouracil and uracil). In all cases, both ring NH groups act as donors in the main hydrogen bond motif, but there is an excess of hydrogen bond acceptors (two C=O groups, each able to accept twice), and so two possibilities are found: (i) each carbonyl O atom may accept one hydrogen bond, or (ii) one carbonyl O atom may accept two hydrogen bonds while the other does not participate in the hydrogen bonding. We observed different preferences in the type and symmetry of the motifs adopted by the different derivatives, and good agreement is found between motifs observed experimentally and those predicted using computational methods. We identified certain molecular factors such as chirality, substituent size and the possibility of C-H⋯O interactions as important factors influencing the motif observed. © 2012 The Royal Society of Chemistry and the Centre National de la Recherche Scientifique.
Abstract:
MSC 2010: 33C47, 42C05, 41A55, 65D30, 65D32