940 results for Images - Computational methods


Relevance:

80.00%

Publisher:

Abstract:

The production of artistic prints in the sixteenth- and seventeenth-century Netherlands was an inherently social process. Turning out prints at any reasonable scale depended on fluid coordination between designers, plate cutters, and publishers, roles that, by the sixteenth century, were considered distinguished enough to merit distinct credits engraved on the plates themselves: invenit, fecit/sculpsit, and excudit. While any one designer, plate cutter, or publisher could potentially exercise a great deal of influence over the production of a single print, their individual decisions (Whom to select as an engraver? What subjects to create for a print design? What market to sell to?) would have been variously constrained or encouraged by their position in this larger network (Whom do they already know? And whom, in turn, do their contacts know?). This dissertation addresses the impact of these constraints and affordances through the novel application of computational social network analysis to major databases of surviving prints from this period. This approach is used to evaluate several questions about trends in early modern print production practices that have not been satisfactorily addressed by traditional literature based on case studies alone: Did the social capital demanded by print production result in centralized or distributed production of prints? When, and to what extent, did printmakers and publishers in the Low Countries favor international versus domestic collaborators? And were printmakers under the same pressure as painters to specialize in particular artistic genres? This dissertation ultimately suggests how simple professional incentives endemic to the practice of printmaking may, at large scales, have resulted in quite complex patterns of collaboration and production. The framework of network analysis surfaces the role of certain printmakers who tend to be neglected in aesthetically focused histories of art.
This approach also highlights important issues concerning art historians’ balancing of individual influence versus the impact of longue durée trends. Finally, this dissertation also raises questions about the current limitations and future possibilities of combining computational methods with cultural heritage datasets in the pursuit of historical research.
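As a minimal sketch of the kind of measure social network analysis brings to such questions, the snippet below computes degree centrality over a toy collaboration graph. The names and edges are purely illustrative assumptions, not records drawn from the dissertation's print databases:

```python
from collections import Counter

# Hypothetical designer/engraver/publisher collaborations (illustrative only).
collaborations = [
    ("Goltzius", "Matham"), ("Goltzius", "Saenredam"),
    ("Matham", "Visscher"), ("Saenredam", "Visscher"),
    ("Goltzius", "Visscher"),
]

# Degree centrality: the fraction of all other figures each one collaborated with.
degree = Counter()
for a, b in collaborations:
    degree[a] += 1
    degree[b] += 1
n = len(degree)
centrality = {name: d / (n - 1) for name, d in degree.items()}
print(centrality["Goltzius"])  # 1.0: connected to every other figure in the toy graph
```

A central figure in such a graph is one positioned to constrain or enable many others' choices, which is exactly the kind of structural role the dissertation argues aesthetic histories tend to overlook.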

Relevance:

80.00%

Publisher:

Abstract:

The evaluation of the mesh opening stiffness of fishing nets is an important issue in assessing the selectivity of trawls. It appears that a larger bending rigidity of the twines decreases the mesh opening and could reduce the escapement of fish. Nevertheless, the netting structure is complex: it is made up of braided twines of polyethylene or polyamide, tied with non-symmetrical knots, so these assemblies develop contact-friction interactions. Moreover, the netting can be subject to large deformation. In this study, we investigate the responses of netting samples to different types of loading. Samples are loaded and unloaded with creep and relaxation stages, under different boundary conditions. Two models were then developed: an analytical model and a finite element model. The latter was used to assess, with an inverse identification algorithm, the bending stiffness of the twines. In this paper, experimental results and a model for netting structures made up of braided twines are presented. During dry forming of a composite, for example, the matrix is not present or not active, and relative sliding can occur between constitutive fibres, so accurate modelling of the mechanical behaviour of fibrous materials is necessary. This study offers experimental data that could help to improve current models of contact-friction interactions [4], to validate models for large-deformation analysis of fibrous materials [1] on a new experimental case, and ultimately to improve the evaluation of the mesh opening stiffness of a fishing net.
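The principle behind inverse identification of a bending stiffness can be sketched with a toy example. The study itself fits a finite element model; here a closed-form cantilever relation stands in for it, and the length, stiffness, and noise level are all invented for illustration:

```python
import numpy as np

# Toy inverse identification of twine bending stiffness EI, assuming a
# cantilever model delta = F * L**3 / (3 * EI). Hypothetical values throughout.
L = 0.05                       # twine length (m), assumed
EI_true = 2.0e-6               # bending stiffness (N*m^2), assumed
forces = np.linspace(0.01, 0.1, 10)
rng = np.random.default_rng(1)
# Simulated "measured" deflections with 2% multiplicative noise.
deflections = forces * L**3 / (3 * EI_true) * (1 + 0.02 * rng.standard_normal(10))

# Least-squares slope of deflection vs force gives L^3 / (3*EI), hence EI.
slope = np.sum(forces * deflections) / np.sum(forces**2)
EI_est = L**3 / (3 * slope)
print(EI_est)  # close to the assumed EI_true of 2.0e-6
```

An inverse identification against a finite element model works the same way in spirit: the stiffness parameter is adjusted until the simulated response best matches the measured one.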

Relevance:

80.00%

Publisher:

Abstract:

An intraneural ganglion cyst is a disorder observed in nerve injury; its propagation in the human body is still poorly understood and very difficult to predict, so it is often referred to as an unsolved mystery. The usual treatment is to remove the cystic substance from the nerve by surgery; however, this may result in neuropathic pain and recurrence of the cyst. The articular theory proposed by Spinner et al. (Spinner et al. 2003) considers the neurological deficit in the common peroneal nerve (CPN) branch of the sciatic nerve and adds that ligation of the articular branch, in addition to the treatment, results in reliable eradication of the deficit. Mechanical modeling of the affected nerve cross section will reinforce the articular theory (Spinner et al. 2003). As the cyst propagates, it compresses the neighboring fascicles, and the nerve cross section comes to resemble a signet ring. Hence, in order to mechanically model the affected nerve cross section, computational methods capable of modeling excessively large deformations are required. Traditional FEM produces distorted elements when modeling such deformations, resulting in inaccuracies and premature termination of the analysis. The methods described in this research report have the capability to simulate large deformation. The results obtained from this research show significantly greater deformation than that observed in conventional finite element models. The report elaborates on the neurological deficit, followed by a detailed explanation of the Smoothed Particle Hydrodynamics (SPH) approach. Finally, the results show the large deformation in stages, demonstrating the successful application of the SPH method to the large deformation of a biological structure such as the intraneural ganglion cyst.
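A minimal sketch of the SPH machinery such a mesh-free analysis relies on: the standard cubic spline kernel (here in 1D) and a density summation over evenly spaced particles. The actual work models a 3D nerve cross section; the particle spacing and counts below are illustrative assumptions:

```python
def cubic_spline_kernel(r, h):
    """Standard 1D cubic spline SPH kernel (Monaghan form)."""
    q = abs(r) / h
    sigma = 2.0 / (3.0 * h)  # 1D normalization constant
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q**2 + 0.75 * q**3)
    if q < 2.0:
        return sigma * 0.25 * (2.0 - q)**3
    return 0.0

# Density summation rho_i = sum_j m_j * W(x_i - x_j, h) for evenly spaced
# 1D particles of unit reference density.
h = 0.1
xs = [i * h for i in range(41)]   # particle positions
m = 1.0 * h                       # particle mass = density * spacing
rho_mid = sum(m * cubic_spline_kernel(xs[20] - xj, h) for xj in xs)
print(rho_mid)  # ~1.0: the summation recovers the reference density
```

Because fields are carried by particles rather than mesh elements, no element can distort, which is why SPH remains stable at deformations that terminate a traditional FEM run.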

Relevance:

80.00%

Publisher:

Abstract:

Synthetic chemists constantly strive to develop new methodologies to access complex molecules more sustainably. The recently developed photocatalytic approach offers a valid and greener alternative to classical synthetic methods. Here we present three protocols to furnish five-membered rings exploiting photoredox catalysis. We first obtained 4,5-dihydrofurans (4,5-DHFs) from readily available olefins and α-haloketones, employing fac-Ir(ppy)3 as a photocatalyst under blue-light irradiation (Figure 1, top). This transformation proved very broad in scope, thanks to its mild conditions and the avoidance of stoichiometric amounts of oxidants or reductants. Moreover, similar conditions could lead to β,γ-unsaturated ketones or highly substituted tetrahydrofurans (THFs) by carefully differentiating the substitution pattern on the starting materials and properly adjusting the reaction parameters. We then turned our attention to the reactivity of allenamides, employing analogous photocatalytic conditions to access 2-aminofurans (Figure 1, bottom). α-Haloketones again provided the radical, generated by fac-Ir(ppy)3 under visible-light irradiation, which added to the π-system and furnished the cyclic molecule. The addition of a second molecule of the α-haloketone led to the formation of the final, highly functionalized furan, which might be further elaborated to afford more complex products. Both works were accompanied by mechanistic investigations supported by experimental and computational methods. As our last project, we developed a methodology to access cyclopentanone-fused N-methylpyrrolidines (Figure 2), exploiting N,N-dimethylamines and carboxylic acids as radical sources. In two separate photocatalytic steps, both functionalities are manipulated through photoredox catalysis by 4CzIPN to add to an α,β-enone system, furnishing the bicyclic product.

Relevance:

80.00%

Publisher:

Abstract:

The study of spectroscopic phenomena in organic solids, in combination with other techniques, is an effective tool for understanding the structural properties of materials based on these compounds. This Ph.D. work was dedicated to the spectroscopic investigation of some relevant processes occurring in organic molecular crystals, with the goal of expanding our knowledge of the relationship between structure, dynamics, and photoreactivity in these systems. Vibrational spectroscopy has been the technique of choice, always in combination with X-ray diffraction structural studies and often with the support of computational methods. The vibrational study of the molecular solid state reaches its full potential when it includes the low-wavenumber region of the lattice-phonon modes, which probe the weak intermolecular interactions and are the fingerprints of the lattice itself. Microscopy is an invaluable addition in the investigation of processes that take place on the micrometer scale of crystal micro-domains. In chemical and phase transitions, as well as in polymorph screening and identification, the combination of Raman microscopy and lattice-phonon detection has provided useful information. Research on the fascinating class of single-crystal-to-single-crystal photoreactions has shown how the homogeneous mechanism of these transformations can be identified by lattice-phonon microscopy, in agreement with the continuous evolution of their XRD patterns. In describing the photodimerization mechanism of vitamin K3, the focus was instead on the influence of its polymorphism in governing product isomerism. Polymorphism is the additional degree of freedom of molecular functional materials, and by advancing its control and the understanding of its properties, functionalities can be promoted for useful applications. Its investigation focused on thin-film phases, widely employed in organic electronics. The ambiguities in phase identification that often emerge from other experimental methods were successfully resolved by vibrational measurements.

Relevance:

80.00%

Publisher:

Abstract:

In recent decades, two prominent trends have influenced the data modeling field, namely network analysis and machine learning. This thesis explores the practical applications of these techniques within the domain of drug research, unveiling their multifaceted potential for advancing our comprehension of complex biological systems. The research undertaken during this PhD program is situated at the intersection of network theory, computational methods, and drug research. Across the six projects presented herein, there is a gradual increase in model complexity. These projects traverse a diverse range of topics, with a specific emphasis on drug repurposing and safety in the context of neurological diseases. The aim of these projects is to leverage existing biomedical knowledge to develop innovative approaches that bolster drug research. The investigations have produced practical solutions, not only providing insights into the intricacies of biological systems, but also allowing the creation of valuable tools for their analysis. In short, the achievements are:

• A novel computational algorithm to identify adverse events specific to fixed-dose drug combinations.
• A web application that tracks the clinical drug research response to SARS-CoV-2.
• A Python package for differential gene expression analysis and the identification of key regulatory "switch genes".
• The identification of pivotal events causing drug-induced impulse control disorders linked to specific medications.
• An automated pipeline for discovering potential drug repurposing opportunities.
• The creation of a comprehensive knowledge graph and the development of a graph machine learning model for predictions.

Collectively, these projects illustrate diverse applications of data science and network-based methodologies, highlighting the profound impact they can have in supporting drug research activities.

Relevance:

80.00%

Publisher:

Abstract:

Benzoquinone was found to be an effective co-catalyst in the ruthenium/NaOEt-catalyzed Guerbet reaction. Its behavior as a co-catalyst was therefore investigated through experimental and computational methods. The product distribution shows that benzoquinone addition improves the reaction rate from the very beginning of the process while having a minimal effect on the selectivity toward alcoholic species. DFT calculations were performed to investigate two hypotheses for the kinetic effects: i) a hydrogen storage mechanism, or ii) basic co-catalysis by 4-hydroxyphenolate. The most promising results were found for the latter hypothesis, in which a new mixed mechanism for the aldol condensation step of the Guerbet process involves hydroquinone (i.e., the reduced form of benzoquinone) as the proton source instead of ethanol. This mechanism was found to be energetically more favorable than aldol condensation in the absence of the additive, suggesting that the hydroquinone derived from benzoquinone could be the key species affecting the kinetics of the overall process. To verify this theoretical hypothesis, new phenol derivatives were tested as additives in the Guerbet reaction. The outcomes confirmed that an aromatic acid (stronger than ethanol) can improve the reaction kinetics. Lastly, theoretical product distributions were simulated and compared to the experimental ones, using the DFT computations to build the kinetic models.

Relevance:

50.00%

Publisher:

Abstract:

Some fundamental biological processes, such as embryonic development, have been preserved during evolution and are common to species at different phylogenetic positions, yet remain largely unexplained. An understanding of the cell morphodynamics leading to organized spatial distributions of cells, such as tissues and organs, can be achieved by reconstructing cell shapes and positions during the development of a live animal embryo. In this work we design a chain of image processing methods to automatically segment and track cell nuclei and membranes during the development of a zebrafish embryo, which has been widely validated as a model organism for understanding vertebrate development, gene function, and healing and repair mechanisms in vertebrates. The embryo is first labeled through the ubiquitous expression of fluorescent proteins addressed to cell nuclei and membranes, and temporal sequences of volumetric images are acquired with laser scanning microscopy. Cell positions are detected by processing the nuclei images either through the generalized form of the Hough transform or by identifying nuclei positions as local maxima after a smoothing preprocessing step. Membrane and nuclei shapes are reconstructed using PDE-based variational techniques such as the Subjective Surfaces and Chan-Vese methods. Cell tracking is performed by combining the previously detected information on cell shapes and positions with biological regularization constraints. Our results are manually validated and reconstruct the formation of the zebrafish brain at the 7-8 somite stage, with all cells tracked from the late sphere stage onward, with less than 2% error, for at least 6 hours. Our reconstruction opens the way to a systematic investigation of cellular behaviors, of the clonal origin and clonal complexity of brain organs, and of the contribution of cell proliferation modes and cell movements to the formation of local patterns and morphogenetic fields.
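The nucleus-detection step based on local maxima after smoothing can be sketched as follows. This toy 2D example with two synthetic blobs stands in for the thesis's 3D confocal volumes; image size, blob shape, and threshold are all assumptions:

```python
import numpy as np

# Toy intensity image with two bright "nuclei" (synthetic data).
img = np.zeros((20, 20))
for cy, cx in [(5, 5), (14, 12)]:
    img[cy - 1:cy + 2, cx - 1:cx + 2] = 0.5  # dim halo
    img[cy, cx] = 1.0                        # bright centre

# Smooth with a 3x3 box filter to suppress noise before peak detection.
kernel = np.ones((3, 3)) / 9.0
pad = np.pad(img, 1, mode="edge")
smooth = sum(
    pad[i:i + 20, j:j + 20] * kernel[i, j] for i in range(3) for j in range(3)
)

# A pixel is a nucleus candidate if it exceeds a threshold and equals the
# maximum of its 3x3 neighbourhood.
peaks = []
for y in range(1, 19):
    for x in range(1, 19):
        patch = smooth[y - 1:y + 2, x - 1:x + 2]
        if smooth[y, x] > 0.05 and smooth[y, x] >= patch.max():
            peaks.append((y, x))
print(peaks)  # [(5, 5), (14, 12)]: one detection per blob
```

In the real 3D case the same logic runs over volumetric neighbourhoods, with the smoothing step preventing noise pixels from registering as spurious nuclei.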

Relevance:

50.00%

Publisher:

Abstract:

Fluoroscopic images exhibit severe signal-dependent quantum noise, generally modelled as Poisson-distributed, due to the reduced X-ray dose involved in image formation. However, image gray-level transformations, commonly applied by fluoroscopic devices to enhance contrast, modify the noise statistics and the relationship between image noise variance and expected pixel intensity. Image denoising is essential to improve the quality of fluoroscopic images and their clinical information content. Simple average filters are commonly employed in real-time processing, but they tend to blur edges and details. An extensive comparison of advanced denoising algorithms specifically designed for signal-dependent noise (AAS, BM3Dc, HHM, TLS) and for independent additive noise (AV, BM3D, K-SVD) is presented. Simulated test images degraded by various levels of Poisson quantum noise, as well as real clinical fluoroscopic images, were considered. Typical gray-level transformations (e.g. white compression) were also applied in order to evaluate their effect on the denoising algorithms. The performance of the algorithms was evaluated in terms of peak signal-to-noise ratio (PSNR), signal-to-noise ratio (SNR), mean square error (MSE), structural similarity index (SSIM), and computational time. On average, the filters designed for signal-dependent noise provided better image restorations than those assuming additive white Gaussian noise (AWGN). The collaborative denoising strategy was found to be the most effective in denoising both simulated and real data, including in the presence of image gray-level transformations. White compression, by inherently reducing the greater noise variance of brighter pixels, appeared to help the denoising algorithms perform more effectively. © 2012 Elsevier Ltd. All rights reserved.
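A minimal illustration of one of the evaluation metrics (PSNR) together with the signal-dependent character of Poisson quantum noise. The image size, intensity level, and the trivial "denoiser" are arbitrary assumptions made for the sketch, assuming an 8-bit intensity range:

```python
import numpy as np

def psnr(reference, test, peak=255.0):
    """Peak signal-to-noise ratio in dB between two images (8-bit peak assumed)."""
    mse = np.mean((reference.astype(float) - test.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak**2 / mse)

rng = np.random.default_rng(0)
clean = np.full((64, 64), 100.0)
# Poisson quantum noise is signal-dependent: the variance equals the mean count,
# so brighter regions are intrinsically noisier.
noisy = rng.poisson(clean).astype(float)
denoised = np.full((64, 64), noisy.mean())  # trivial averaging "denoiser"
print(psnr(clean, noisy) < psnr(clean, denoised))  # True: averaging raises PSNR here
```

A gray-level transformation applied after acquisition would change the variance-vs-intensity relationship, which is exactly why the comparison in the paper evaluates the algorithms both with and without such transformations.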

Relevance:

40.00%

Publisher:

Abstract:

We evaluated the performance of a novel procedure for segmenting mammograms and detecting clustered microcalcifications in two types of image sets obtained from the digitization of mammograms using either a laser scanner or a conventional "optical" scanner. Specific regions forming the digital mammograms, in which clustered microcalcifications did or did not appear, were identified and selected. A remarkable increase in image intensity was noticed in the images from the optical scanner compared with the original mammograms. A procedure based on a polynomial correction was developed to compensate for the differences between the characteristic curves of the scanners and those of the films. The processing scheme was applied to both sets, before and after the polynomial correction. The results clearly indicated the influence of mammogram digitization on the performance of processing schemes intended to detect microcalcifications. The image processing techniques applied to mammograms digitized by both scanners, without the polynomial intensity correction, showed better sensitivity in detecting microcalcifications in the images from the laser scanner. However, when the polynomial correction was applied to the images from the optical scanner, no differences in performance were observed between the two types of images. (C) 2008 SPIE and IS&T [DOI: 10.1117/1.3013544]
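The polynomial correction idea can be sketched as fitting a polynomial that maps scanner readings back onto film values. The distorted scanner response simulated below is a hypothetical stand-in for the measured characteristic curves, which in practice would come from a calibration target:

```python
import numpy as np

# Hypothetical calibration pairs: film values vs. the scanner's shifted,
# slightly nonlinear readings (illustrative numbers only).
film = np.array([0.2, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
scanner = film * 0.8 + 0.4 + 0.05 * film**2  # simulated distorted response

# Fit a polynomial mapping scanner readings back to film values.
coeffs = np.polyfit(scanner, film, deg=3)
correct = np.poly1d(coeffs)

# Applying the correction to a new reading recovers the film value.
print(float(correct(2.2)))  # close to 2.0, the film value that produced a 2.2 reading
```

Once the two scanners' readings are mapped onto a common film-referenced scale, the same detection scheme sees comparable intensities from both, consistent with the equalized performance reported after correction.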

Relevance:

40.00%

Publisher:

Abstract:

Background: Fetal rat lung explants have been an essential tool for molecular research on the regulating mechanisms of branching morphogenesis. This work presents a new methodology to accurately quantify the epithelium, the outer contour, and the peripheral airway buds of lung explants during cellular development from microscopic images. Methods: The outer contour was defined using an adaptive, multi-scale threshold algorithm whose level was automatically calculated based on an entropy maximization criterion. The inner lung epithelium was defined by a clustering procedure that groups small image regions according to the minimum description length principle and local statistical properties. Finally, the number of peripheral buds was counted as the branch ends of a skeletonized image of the inner lung epithelium. Results: The time required for lung branching morphometric analysis was reduced by 98% compared with the manual method. The best results were obtained in the first two days of cellular development, with smaller standard deviations. Non-significant differences were found between the automatic and manual results on all culture days. Conclusions: The proposed method introduces a series of advantages related to its intuitive use and accuracy, making the technique suitable for images with different lighting characteristics and allowing a reliable comparison between different researchers.
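One common entropy-maximization criterion for threshold selection is Kapur's method, sketched below on a toy bimodal histogram. The paper's actual algorithm is adaptive and multi-scale, so this single-scale version is only a simplified stand-in; the histogram values are invented:

```python
import numpy as np

def kapur_threshold(hist):
    """Pick the threshold maximizing the summed entropies of the two classes
    (Kapur's criterion) - a simplified stand-in for the paper's adaptive,
    multi-scale entropy-maximization step."""
    p = hist / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, len(p)):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0 or w1 == 0:
            continue  # one class empty: entropy undefined
        p0, p1 = p[:t] / w0, p[t:] / w1
        h = -sum(q * np.log(q) for q in p0 if q > 0) \
            - sum(q * np.log(q) for q in p1 if q > 0)
        if h > best_h:
            best_t, best_h = t, h
    return best_t

# Bimodal toy histogram: dark background around bin 2, bright explant around bin 10.
hist = np.array([1, 5, 20, 5, 1, 0, 0, 0, 1, 4, 18, 4, 1, 0, 0, 0], dtype=float)
t = kapur_threshold(hist)
print(t)  # a bin separating the two modes
```

The returned bin index separates background from explant; in the paper this entropy criterion drives the automatic selection of the outer-contour threshold level.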

Relevance:

40.00%

Publisher:

Abstract:

Dissertation presented at the Faculty of Science and Technology of the New University of Lisbon in fulfillment of the requirements for the Master's degree in Electrical and Computer Engineering

Relevance:

40.00%

Publisher:

Abstract:

Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.

Relevance:

40.00%

Publisher:

Abstract:

This paper presents an automated optimization framework able to provide network administrators with resilient routing configurations for link-state protocols, such as OSPF or IS-IS. In order to deal with the formulated NP-hard optimization problems, the devised framework is underpinned by computational intelligence optimization engines, such as Multi-objective Evolutionary Algorithms (MOEAs). To demonstrate the framework's capabilities, two illustrative Traffic Engineering methods are described, allowing routing configurations to be attained that are robust to changes in the traffic demands and that keep the network stable even in the presence of link failure events. The presented illustrative results clearly corroborate the usefulness of the proposed automated framework along with the devised optimization methods.
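At the heart of any MOEA is the notion of Pareto dominance between candidate configurations. The sketch below filters a set of hypothetical (normal-load cost, failure-load cost) objective pairs down to the non-dominated front; the objective values are invented for illustration and do not come from the paper:

```python
def pareto_front(solutions):
    """Keep the non-dominated solutions (minimization on all objectives).
    A solution is dominated if some other solution is at least as good on
    every objective and is not identical to it."""
    front = []
    for s in solutions:
        dominated = any(
            all(o <= p for o, p in zip(other, s)) and other != s
            for other in solutions
        )
        if not dominated:
            front.append(s)
    return front

# Hypothetical (normal-load cost, failure-load cost) pairs for candidate
# OSPF link-weight settings produced by an evolutionary search.
candidates = [(1.2, 3.0), (1.5, 2.0), (2.0, 1.8), (1.6, 2.5)]
print(pareto_front(candidates))  # [(1.2, 3.0), (1.5, 2.0), (2.0, 1.8)]
```

An MOEA evolves a population toward this front, leaving the administrator a set of trade-off configurations (e.g. favoring normal-operation performance versus failure resilience) rather than a single compromise.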