935 results for super-dense computation


Relevance: 20.00%

Abstract:

We prove the existence and local uniqueness of invariant tori on the verge of breakdown for two systems: the quasi-periodically driven logistic map and the quasi-periodically forced standard map. These systems exemplify two scenarios: the Heagy-Hammel route for the creation of strange non-chaotic attractors and the nonsmooth bifurcation of saddle invariant tori. Our proofs are computer-assisted and are based on a tailored version of the Newton-Kantorovich theorem. The proofs cannot be performed using classical perturbation theory because the two scenarios are very far from the perturbative regime, and fundamental hypotheses such as reducibility or hyperbolicity either do not hold or are very close to failing. Our proofs are based on a reliable computation of the invariant tori and a careful study of their dynamical properties, leading to the rigorous validation of the numerical results with our novel computational techniques.
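As an illustration of the kind of invariance equation involved, the following is a minimal sketch (not the paper's validated algorithm, and far from any rigorous Newton-Kantorovich bound): a plain Newton solver for an attracting invariant curve K(theta) of the quasi-periodically driven logistic map x_{n+1} = a(theta_n) x_n (1 - x_n), theta_{n+1} = theta_n + omega. The parameter values (c, eps) are hypothetical and chosen deep in the perturbative regime so that the naive scheme converges.

```python
import numpy as np

N = 64                                      # Fourier grid points
omega = 2 * np.pi * (np.sqrt(5) - 1) / 2    # golden-mean rotation number
theta = 2 * np.pi * np.arange(N) / N
c, eps = 2.5, 0.1                           # hypothetical, well inside the perturbative regime
a = c * (1.0 + eps * np.cos(theta))         # quasi-periodically driven logistic parameter

k = np.fft.fftfreq(N) * N                   # integer Fourier wavenumbers

def shift(v):
    """Evaluate v(theta + omega) by a Fourier phase shift."""
    return np.real(np.fft.ifft(np.fft.fft(v) * np.exp(1j * k * omega)))

# Dense matrix of the shift operator, used in the Newton linear solves.
S = np.column_stack([shift(col) for col in np.eye(N)])

K = np.full(N, 0.6)                         # initial guess: unforced fixed point 1 - 1/c
for _ in range(20):
    F = shift(K) - a * K * (1.0 - K)        # residual of the invariance equation
    if np.max(np.abs(F)) < 1e-12:
        break
    J = S - np.diag(a * (1.0 - 2.0 * K))    # linearized invariance equation
    K -= np.linalg.solve(J, F)

residual = np.max(np.abs(shift(K) - a * K * (1.0 - K)))
```

The invariance equation K(theta + omega) = f(theta, K(theta)) is solved on a uniform grid, with the phase shift evaluated exactly in Fourier space; a validated proof would additionally bound the truncation and floating-point errors.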

Relevance: 20.00%

Abstract:

Although fetal anatomy can be adequately viewed in new multi-slice MR images, many critical limitations remain for quantitative data analysis. To this end, several research groups have recently developed advanced image processing methods, often denoted super-resolution (SR) techniques, to reconstruct a high-resolution (HR) motion-free volume from a set of clinical low-resolution (LR) images. The reconstruction is usually modeled as an inverse problem in which the regularization term plays a central role in the reconstruction quality. The literature has favored Total Variation energies because of their edge-preserving ability, but only standard explicit steepest-gradient techniques have been applied for optimization. In a preliminary work, it was shown that novel fast convex optimization techniques can be successfully applied to design an efficient Total Variation optimization algorithm for the super-resolution problem. In this work, two major contributions are presented. First, we briefly review the Bayesian and variational dual formulations of current state-of-the-art methods dedicated to fetal MRI reconstruction. Second, we present an extensive quantitative evaluation of our previously introduced SR algorithm on both simulated fetal and real clinical data (with both normal and pathological subjects). Specifically, we study the robustness of regularization terms against residual registration errors, and we also present a novel strategy for automatically selecting the weight of the regularization with respect to the data fidelity term. Our results show that our TV implementation is highly robust against motion artifacts and that it offers the best trade-off between speed and accuracy for fetal MRI recovery in comparison with state-of-the-art methods.
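The inverse-problem formulation can be sketched in one dimension (a hypothetical toy setup, not the paper's fetal-MRI algorithm): recover a high-resolution signal x from a blurred, downsampled observation y = D H x + noise by explicit gradient descent on the energy ||DHx - y||^2 + lam * TV_eps(x), where TV_eps is a smoothed total-variation penalty. This is the baseline steepest-gradient approach the abstract contrasts with faster convex optimization.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 64, 32
x_true = np.where(np.arange(n) < n // 2, 0.0, 1.0)   # step edge

# H: 3-tap circular moving-average blur; D: keep every other sample.
H = np.zeros((n, n))
for i in range(n):
    H[i, [(i - 1) % n, i, (i + 1) % n]] = 1.0 / 3.0
D = np.eye(n)[::2]
A = D @ H
y = A @ x_true + 0.01 * rng.standard_normal(m)

lam, eps_tv = 0.02, 0.01                             # hypothetical weights

def tv_and_grad(x):
    """Smoothed TV value sum sqrt(dx^2 + eps^2) and its gradient."""
    d = np.diff(x)
    s = np.sqrt(d**2 + eps_tv**2)
    w = d / s
    g = np.concatenate([[-w[0]], w[:-1] - w[1:], [w[-1]]])
    return s.sum(), g

def objective(x):
    return np.sum((A @ x - y) ** 2) + lam * tv_and_grad(x)[0]

x = np.repeat(y, 2)                                  # crude upsampled initial guess
obj0 = objective(x)
for _ in range(600):
    g_data = 2.0 * A.T @ (A @ x - y)                 # data-fidelity gradient
    x -= 0.05 * (g_data + lam * tv_and_grad(x)[1])   # explicit gradient step
obj1 = objective(x)
```

The step size is kept below the inverse Lipschitz constant of the smoothed energy so the descent is monotone; primal-dual schemes of Chambolle type would handle the non-smooth TV term directly and converge much faster.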

Relevance: 20.00%

Abstract:

Objective: We propose and validate a computer-aided system to measure three different mandibular indexes: cortical width, panoramic mandibular index, and mandibular alveolar bone resorption index. Study Design: Repeatability and reproducibility of the measurements are analyzed and compared with the manual estimation of the same indexes. Results: The proposed computerized system exhibits superior repeatability and reproducibility rates compared with standard manual methods. Moreover, the time required to perform the measurements with the proposed method is negligible compared with performing them manually. Conclusions: We have proposed a very user-friendly computerized method to measure three different morphometric mandibular indexes. From the results we can conclude that the system provides a practical way to perform these measurements. It does not require an expert examiner and takes no more than 16 seconds per analysis. Thus, it may be suitable for diagnosing osteoporosis using dental panoramic radiographs.

Relevance: 20.00%

Abstract:

We present an algorithm for the computation of reducible invariant tori of discrete dynamical systems that is suitable for tori of dimensions larger than 1. It is based on a quadratically convergent scheme that approximates, at the same time, the Fourier series of the torus, its Floquet transformation, and its Floquet matrix. The Floquet matrix describes the linearization of the dynamics around the torus and, hence, its linear stability. The algorithm presents a high degree of parallelism, and the computational effort grows linearly with the number of Fourier modes needed to represent the solution. For these reasons it is a very good option to compute quasi-periodic solutions with several basic frequencies. The paper includes some examples (flows) to show the efficiency of the method in a parallel computer. In these flows we compute invariant tori of dimensions up to 5, by taking suitable sections.
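The Fourier bookkeeping behind such schemes can be sketched briefly (illustrative, not the paper's implementation): the torus is stored as a Fourier series K(theta), and the rotated torus K(theta + omega) required by the invariance equation is obtained by multiplying each coefficient by a phase factor, so the cost stays quasi-linear in the number of Fourier modes.

```python
import numpy as np

N = 128
omega = 2 * np.pi * (np.sqrt(2) - 1)                  # rotation number
theta = 2 * np.pi * np.arange(N) / N
K = 0.3 * np.cos(theta) + 0.1 * np.sin(2 * theta)     # toy 1-D torus parametrization

# Rotation by omega in Fourier space: K_hat_k -> K_hat_k * exp(i k omega).
k = np.fft.fftfreq(N) * N                             # integer wavenumbers
K_shift = np.real(np.fft.ifft(np.fft.fft(K) * np.exp(1j * k * omega)))

# For a trigonometric polynomial on the grid, the phase shift reproduces
# direct evaluation at theta + omega exactly (up to roundoff).
K_direct = 0.3 * np.cos(theta + omega) + 0.1 * np.sin(2 * (theta + omega))
err = np.max(np.abs(K_shift - K_direct))
```

For a d-dimensional torus the same phase-shift trick applies mode by mode to a d-dimensional FFT, which is also why the scheme parallelizes well over Fourier coefficients.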

Relevance: 20.00%

Abstract:

Integrating single nucleotide polymorphism (SNP) p-values from genome-wide association studies (GWAS) across genes and pathways is a strategy to improve statistical power and gain biological insight. Here, we present Pascal (Pathway scoring algorithm), a powerful tool for computing gene and pathway scores from SNP-phenotype association summary statistics. For gene score computation, we implemented analytic and efficient numerical solutions to calculate test statistics. We examined in particular the sum and the maximum of chi-squared statistics, which measure the average and the strongest association signals per gene, respectively. For pathway scoring, we use a modified Fisher method, which offers not only a significant power improvement over more traditional enrichment strategies, but also eliminates the problem of arbitrary threshold selection inherent in any binary-membership-based pathway enrichment approach. We demonstrate the marked increase in power by analyzing summary statistics from dozens of large meta-studies for various traits. Our extensive testing indicates that our method not only excels in rigorous type I error control, but also results in more biologically meaningful discoveries.
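A simplified stand-in for the sum-based gene score can illustrate the idea (all numbers hypothetical; Pascal itself uses analytic and efficient numerical solutions rather than Monte Carlo): under the null, the sum of squared SNP z-scores T = sum_i z_i^2 for SNPs with LD correlation matrix R is distributed as a weighted sum of chi-squared variables, sum_k lambda_k * chi2_1, where lambda_k are the eigenvalues of R.

```python
import numpy as np

rng = np.random.default_rng(42)

def gene_pvalue(z, R, draws=200_000):
    """Monte Carlo p-value for the sum-of-chi-squared gene statistic."""
    T = np.sum(z**2)                          # observed sum statistic
    lam = np.linalg.eigvalsh(R)               # eigenvalues of the LD matrix
    # Null draws of sum_k lambda_k * chi2_1.
    null = rng.chisquare(1, size=(draws, lam.size)) @ lam
    return np.mean(null >= T)

# Toy gene: 4 SNPs with uniform LD r = 0.5 (hypothetical numbers).
R = 0.5 * np.ones((4, 4)) + 0.5 * np.eye(4)
p_null = gene_pvalue(np.array([0.5, -0.3, 0.2, 0.1]), R)    # null-like z-scores
p_assoc = gene_pvalue(np.array([4.0, 3.5, 3.8, 3.2]), R)    # strongly associated
```

The eigenvalue decomposition is what makes the correlated-SNP null tractable; replacing the simulation with an analytic weighted-chi-squared tail evaluation recovers the efficient route described in the abstract.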

Relevance: 20.00%

Abstract:

Abstract Objective: To derive filtered tungsten X-ray spectra used in digital mammography systems by means of Monte Carlo simulations. Materials and Methods: Filtered spectra for a rhodium filter were obtained for tube potentials between 26 and 32 kV. The half-value layers (HVLs) of the simulated filtered spectra were compared with those obtained experimentally with a solid-state detector (Unfors model 8202031-H Xi R/F & MAM Detector Platinum and 8201023-C Xi Base unit Platinum Plus w mAs) in a Hologic Selenia Dimensions system using a direct radiography mode. Results: The calculated HVL values showed good agreement with those obtained experimentally. The greatest relative difference between the Monte Carlo calculated and experimental HVL values was 4%. Conclusion: The results show that the filtered tungsten anode X-ray spectra and the EGSnrc Monte Carlo code can be used for mean glandular dose determination in mammography.
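The HVL extraction step can be sketched generically (all numbers hypothetical, not the Unfors/Hologic data): the HVL is the absorber thickness t at which the transmitted beam quantity falls to half its unattenuated value, found here by bisection on k(t) = sum_E phi(E) * E * exp(-mu(E) t). Energy fluence is used as a simple proxy for air kerma; a full calculation would weight by the mass energy-absorption coefficient of air.

```python
import numpy as np

def hvl(energies_keV, fluence, mu_per_mm, lo=0.0, hi=10.0):
    """Half-value layer (mm) of a spectrum via bisection."""
    def kerma_proxy(t):
        # Energy fluence transmitted through thickness t of absorber.
        return np.sum(fluence * energies_keV * np.exp(-mu_per_mm * t))
    target = 0.5 * kerma_proxy(0.0)
    for _ in range(60):                       # bisection to sub-nm precision
        mid = 0.5 * (lo + hi)
        if kerma_proxy(mid) > target:
            lo = mid                          # still above half: go thicker
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Sanity check: a monoenergetic beam with mu = 0.5 / mm must give
# HVL = ln(2) / mu exactly.
hvl_mono = hvl(np.array([20.0]), np.array([1.0]), np.array([0.5]))
```

For a polychromatic spectrum the same bisection works unchanged; beam hardening simply makes k(t) fall more slowly than a single exponential.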

Relevance: 20.00%

Abstract:

In order to shed light on the main physical processes controlling fragmentation of massive dense cores, we present a uniform study of the density structure of 19 massive dense cores, selected to be at similar evolutionary stages, whose relative fragmentation level was assessed in a previous work. We inferred the density structure of the 19 cores through a simultaneous fit of the radial intensity profiles at 450 and 850 μm (or 1.2 mm in two cases) and the spectral energy distribution, assuming spherical symmetry and that the density and temperature of the cores decrease with radius following power laws. Even though the estimated fragmentation level is, strictly speaking, a lower limit, its relative value is significant, and several trends could be explored with our data. We find a weak (inverse) trend between fragmentation level and density power-law index, with steeper density profiles tending to show lower fragmentation, and vice versa. In addition, we find a trend of fragmentation increasing with density within a given radius, which arises from a combination of a flat density profile and a high central density and is consistent with Jeans fragmentation. We considered the effects of the rotational-to-gravitational energy ratio, non-thermal velocity dispersion, and turbulence mode on the density structure of the cores, and found that compressive turbulence seems to yield higher central densities. Finally, a possible explanation for the origin of cores with concentrated density profiles, which are the cores showing no fragmentation, could be related to a strong magnetic field, consistent with the outcome of radiation magnetohydrodynamic simulations.
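The power-law inference step can be illustrated with synthetic data (not the 450/850 μm fits themselves): given a radial density profile rho(r) = rho0 (r/r0)^(-p), the index p is recovered as minus the slope of a straight-line fit in log-log space.

```python
import numpy as np

rng = np.random.default_rng(1)
r = np.logspace(-2, 0, 30)                 # radii, hypothetical units (pc)
p_true, rho0 = 1.8, 1.0e6                  # hypothetical index and central scale
# Synthetic profile with 5% lognormal scatter mimicking measurement noise.
rho = rho0 * r**(-p_true) * np.exp(0.05 * rng.standard_normal(r.size))

# In log-log space the power law becomes a straight line:
# log10(rho) = log10(rho0) - p * log10(r).
slope, intercept = np.polyfit(np.log10(r), np.log10(rho), 1)
p_fit = -slope
```

The actual study fits intensity profiles and the SED simultaneously with a temperature power law as well, but the log-log linearization above is the core of why the density index is well constrained.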

Relevance: 20.00%

Abstract:

The objective of this thesis is to shed light on the vertical vibration of granular materials, which is of potential interest to the power generation industry. The main focus is on investigating the drag force and frictional resistance that influence the movement of a granular material (in the form of glass beads) contained in a vessel subjected to sinusoidal oscillation. The thesis is divided into three parts: theoretical analysis, experiments, and computer simulations. The theoretical part of this study presents the underlying physical phenomena of the vibration of granular materials. Experiments are designed to determine fundamental parameters that contribute to the behavior of vibrating granular media. Numerical simulations include the use of three different software applications: FLUENT, LS-DYNA and ANSYS Workbench. The goal of these simulations is to test theoretical and semi-empirical models for granular materials in order to validate their compatibility with the experimental findings, to assist in predicting their behavior, and to estimate quantities that are hard to measure in the laboratory.

Relevance: 20.00%

Abstract:

Invocatio: [Hebrew].

Relevance: 20.00%

Abstract:

Separations using supercritical fluid chromatography (SFC) with packed columns have been re-discovered and explored in recent years. SFC enables fast and efficient separations and, in some cases, gives better results than high performance liquid chromatography (HPLC). This paper provides an overview of recent advances in SFC separations using packed columns for both achiral and chiral separations. The most important types of stationary phases used in SFC are discussed as well as the most critical parameters involved in the separations and some recent applications.

Relevance: 20.00%

Abstract:

Forest inventories are used to estimate forest characteristics and the condition of forests for many different applications: operational tree logging for the forest industry, forest health estimation, carbon balance estimation, land-cover and land-use analysis to avoid forest degradation, and so on. Recent inventory methods rely heavily on remote sensing data combined with field sample measurements, which are used to produce estimates covering the whole area of interest. Remote sensing data from satellites, aerial photographs, or airborne laser scanning are used, depending on the scale of the inventory. To be applicable in operational use, forest inventory methods need to be easily adjusted to the local conditions of the study area at hand. All the data handling and parameter tuning should be objective and automated as much as possible. The methods also need to be robust when applied to different forest types. Since there generally are no extensive direct physical models connecting the remote sensing data from different sources to the forest parameters being estimated, the mathematical estimation models are of "black-box" type, connecting the independent auxiliary data to the dependent response data with arbitrary linear or nonlinear models. To avoid redundant complexity and over-fitting of the model, which may be based on up to hundreds of possibly collinear variables extracted from the auxiliary data, variable selection is needed. To connect the auxiliary data to the inventory parameters being estimated, field work must be performed. In large study areas with dense forests, field work is expensive and should therefore be minimized. For cost-efficient inventories, field work could partly be replaced with information from formerly measured sites stored in databases. The work in this thesis is devoted to the development of automated, adaptive computation methods for aerial forest inventory. The mathematical model parameter definition steps are automated, and cost-efficiency is improved by setting up a procedure that utilizes databases in the estimation of the characteristics of new areas.
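The variable-selection step can be sketched with a greedy forward search on synthetic data (hypothetical features, not an actual inventory dataset): candidate remote-sensing features are added one at a time, and a feature is kept only if it lowers the RMSE on a held-out validation set, which guards against over-fitting to collinear predictors.

```python
import numpy as np

rng = np.random.default_rng(7)
n, p = 200, 12
X = rng.standard_normal((n, p))                      # candidate features
X[:, 3] = X[:, 0] + 0.01 * rng.standard_normal(n)    # deliberately collinear copy
y = 3.0 * X[:, 0] - 2.0 * X[:, 2] + 0.1 * rng.standard_normal(n)

train, val = np.arange(0, 150), np.arange(150, 200)  # simple holdout split

def val_rmse(cols):
    """Fit OLS on the training rows, score RMSE on the validation rows."""
    A = X[np.ix_(train, cols)]
    coef, *_ = np.linalg.lstsq(A, y[train], rcond=None)
    resid = X[np.ix_(val, cols)] @ coef - y[val]
    return np.sqrt(np.mean(resid**2))

selected = []
best = np.inf
while len(selected) < p:
    scores = {j: val_rmse(selected + [j])
              for j in range(p) if j not in selected}
    j_best = min(scores, key=scores.get)
    if scores[j_best] >= best:                       # stop when no improvement
        break
    best = scores[j_best]
    selected.append(j_best)
```

With the collinear copy present, the search picks one of the two near-duplicate informative columns rather than both, which is exactly the behavior one wants from selection over hundreds of correlated image features.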

Relevance: 20.00%

Abstract:

Combining near-field and far-field microscopy: a new high-resolution method for nanoimaging. Osteoporosis is a disease in which the bone remodeling process is no longer in balance. The formation of new bone is slower owing to the decreased activity of osteoblasts. One way to prevent the onset of osteoporosis is to prevent osteoclasts from attaching to the bone surface, so that they do not begin the bone resorption process. The purpose of this Master's thesis is to create a new tool for studying osteoclast adhesion simultaneously with fluorescence and atomic force microscopy. For this purpose, an atomic force microscope was combined with a STED microscope. The literature review covers the theories of both microscopy techniques in detail. The experimental section presents the methods used and preliminary results obtained with the new system. Image analysis with the ImageJ program is also briefly discussed. The combination of a confocal fluorescence microscope and an atomic force microscope has been realized before, but the resolution limit of a conventional confocal microscope is about 200 nanometers owing to the diffractive nature of light. Details cannot be resolved if they are smaller than half the wavelength used. A STED microscope makes it possible to record fluorescence images of intracellular processes with a lateral resolution of 50 nanometers, and the atomic force microscope provides topographic information about the sample with nanometer resolution. When imaging biological samples, however, the resolution of the atomic force microscope deteriorates, and a resolution of 30-50 nanometers is usually achieved. Overlaying the images requires reference points; for this purpose, detection of the atomic force microscope tip and reference particles were used. Image analysis was performed with the ImageJ image-processing program. The results show that reference particles work well, but placing them precisely on a specific target area is difficult at the nanoscale.

For this reason, detecting the tip in the fluorescence image is a better method. The tip of the atomic force microscope can be coated with a fluorescent substance, but this adds tip-induced convolution to the measurement data. Backscattering of light from the tip can also be exploited, in which case the convolution does not increase. For the first image overlays, a tip coated with a fluorescent substance was used, and the final result showed only a 50-nanometer mismatch between the fluorescence and topography images. With the STED microscope, the exact locations of labeled proteins can be seen at a given moment inside a living cell. At the same time, the physical shape of the cell can be imaged, or adhesion forces can be measured with the atomic force microscope. In addition, a functionalized tip can be used to trigger signaling events between the cell and the extracellular matrix. Binding to the extracellular matrix can be registered, as can the locations of adhesion mediators in the binding area. These dynamic observations yield new information about cell signaling when an osteoclast attaches to the bone surface. This technology offers a new perspective on complex signaling processes at the nanoscale and will help solve countless biological problems.
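As a small aside on the image-overlay problem, translation between two image channels can also be estimated without fiducials by phase correlation (an illustrative alternative to the reference-particle and tip-detection approaches described above, not part of the thesis itself): the shift appears as the peak of the inverse FFT of the normalized cross-power spectrum.

```python
import numpy as np

def phase_correlation_shift(a, b):
    """Return the (dy, dx) translation by which image a is shifted relative to b."""
    Fa, Fb = np.fft.fft2(a), np.fft.fft2(b)
    cps = Fa * np.conj(Fb)
    cps /= np.abs(cps) + 1e-12          # normalize: keep only phase information
    corr = np.real(np.fft.ifft2(cps))   # delta-like peak at the shift
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    n, m = a.shape
    # Wrap indices past the midpoint to negative shifts.
    return (dy if dy <= n // 2 else dy - n,
            dx if dx <= m // 2 else dx - m)

# Synthetic test pattern shifted circularly by a known amount.
rng = np.random.default_rng(3)
img = rng.standard_normal((64, 64))
shifted = np.roll(img, shift=(5, -7), axis=(0, 1))
est = phase_correlation_shift(shifted, img)
```

Phase correlation is robust to intensity differences between modalities, which is relevant when the two channels (fluorescence and topography) report very different contrasts of the same structure.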