23 results for Euler number, Irreducible symplectic manifold, Lagrangian fibration, Moduli space
in Repositório Científico do Instituto Politécnico de Lisboa - Portugal
Abstract:
Conference: IEEE 24th International Conference on Application-Specific Systems, Architectures and Processors (ASAP), Jun 05-07, 2013
Abstract:
Facing the lateral vibration problem of a machine rotor as a beam on elastic supports in bending, the authors deal with the free vibration of elastically restrained Bernoulli-Euler beams carrying a finite number of concentrated elements along their length. Based on Rayleigh's quotient, an iterative strategy is developed to find the approximate torsional stiffness coefficients, which allows the reconciliation of the theoretical model results with the experimental ones obtained through impact tests. The algorithm treats the vibration of continuous beams under a given set of boundary and continuity conditions, including different torsional stiffness coefficients and the effect of attached concentrated masses and rotational inertias, not only in the energy terms of Rayleigh's quotient but also in the mode shapes, with the shape functions defined in branches. Several loading cases are examined and examples are given to illustrate the validity of the model and the accuracy of the obtained natural frequencies.
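As an illustration of the Rayleigh-quotient machinery involved (a generic sketch, not the authors' branch-wise algorithm), the following applies Rayleigh quotient iteration to a hypothetical 2-DOF stiffness matrix with unit masses; the matrix values are illustrative only:

```python
import math

def rayleigh_quotient_iteration(K, v, tol=1e-12, max_iter=50):
    """Refine an eigenvalue/mode estimate for a symmetric 2x2 stiffness
    matrix K (unit masses assumed, so the mass matrix is the identity)."""
    n = math.hypot(v[0], v[1])
    v = [v[0] / n, v[1] / n]
    lam = 0.0
    for _ in range(max_iter):
        # Rayleigh quotient: current eigenvalue estimate (v has unit length).
        Kv = [K[0][0] * v[0] + K[0][1] * v[1],
              K[1][0] * v[0] + K[1][1] * v[1]]
        lam = v[0] * Kv[0] + v[1] * Kv[1]
        # Inverse-iteration step: solve (K - lam*I) w = v via the 2x2 inverse.
        a, b = K[0][0] - lam, K[0][1]
        c, d = K[1][0], K[1][1] - lam
        det = a * d - b * c
        if det == 0.0:                    # shift hit an eigenvalue exactly
            break
        w = [(d * v[0] - b * v[1]) / det, (-c * v[0] + a * v[1]) / det]
        n = math.hypot(w[0], w[1])
        w = [w[0] / n, w[1] / n]
        converged = abs(abs(w[0] * v[0] + w[1] * v[1]) - 1.0) < tol
        v = w
        if converged:
            break
    return lam, v

# Toy 2-DOF system; the exact lowest eigenvalue is (3 - sqrt(5)) / 2.
K = [[2.0, -1.0], [-1.0, 1.0]]
lam, mode = rayleigh_quotient_iteration(K, [1.0, 1.0])
```

The quotient supplies the eigenvalue estimate and the inverse-iteration step refines the mode shape, mirroring the update/refine loop the abstract describes.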
Abstract:
In this paper, a novel ROM-less RNS-to-binary converter is proposed, using a new balanced moduli set {2^(2n) - 1, 2^(2n) + 1, 2^n - 3, 2^n + 3} for n even. The proposed converter is implemented with a two-stage ROM-less approach, which computes the value of X based only on arithmetic operations, without using lookup tables. Experimental results for 24 to 120 bits of dynamic range show that the proposed converter structure allows a balanced system with 20% faster arithmetic channels than the related state of the art, while requiring similar area resources. This improvement in the channels' performance is enough to offset the higher conversion costs of the proposed converter. Furthermore, up to 20% better Power-Delay-Product efficiency can be achieved for the full RNS architecture using the proposed moduli set. © 2014 IEEE.
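A minimal sketch of forward and reverse conversion for this moduli set (read as {2^(2n)-1, 2^(2n)+1, 2^n-3, 2^n+3}); the reverse step below uses the textbook Chinese Remainder Theorem, not the paper's two-stage ROM-less datapath:

```python
import math
from functools import reduce

def to_rns(x, moduli):
    # Forward conversion: one residue per arithmetic channel.
    return [x % m for m in moduli]

def from_rns(residues, moduli):
    # Reverse (RNS-to-binary) conversion by the textbook CRT.
    M = reduce(lambda a, b: a * b, moduli)       # dynamic range
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)             # pow(.., -1, m): modular inverse
    return x % M

n = 4                                            # the set requires n even
moduli = [2**(2 * n) - 1, 2**(2 * n) + 1, 2**n - 3, 2**n + 3]  # {255, 257, 13, 19}
assert all(math.gcd(a, b) == 1
           for i, a in enumerate(moduli) for b in moduli[i + 1:])  # pairwise coprime
x = 123456
assert from_rns(to_rns(x, moduli), moduli) == x
```

The pairwise-coprimality check confirms that the set spans its full dynamic range (the product of the moduli).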
Abstract:
Formaldehyde (FA), the simplest and most reactive of all aldehydes, is a colorless, readily polymerizing gas at normal temperature. It has a pungent, suffocating odour that is recognized by most human subjects at concentrations below 1 ppm. According to the Report on Carcinogens, FA ranks 25th in overall U.S. chemical production, with more than 11 billion pounds (5 million tons) produced each year. It is an important industrial compound used in the manufacture of synthetic resins and chemical compounds such as lubricants and adhesives. It also has applications as a disinfectant and preservative, and is used in cosmetics. Estimates of the number of persons occupationally exposed to FA indicate that exposure, at least at low levels, may occur in a wide variety of industries. The occupational settings with the most extensive use of formaldehyde are the production of resins and anatomy and pathology laboratories. Several studies reported a carcinogenic effect in humans after inhalation of FA, in particular an increased risk of nasopharyngeal cancer. Nowadays, the International Agency for Research on Cancer (IARC) classifies FA as carcinogenic to humans (group 1), on the basis of sufficient evidence in humans and in experimental animals. Manifold in vitro studies clearly indicated that FA is genotoxic, inducing various genotoxic effects in proliferating cultured mammalian cells. A variety of evidence suggests that the primary DNA alterations after FA exposure are DNA-protein crosslinks (DPX). Incomplete repair of DPX can lead to the formation of mutations.
Abstract:
Formaldehyde (CH2O), the simplest and most reactive of all aldehydes, is a colorless, readily polymerizing gas at normal temperature. It has a pungent, suffocating odour that is recognized by most human subjects at concentrations below 1 ppm. According to the Report on Carcinogens, formaldehyde (FA) ranks 25th in overall U.S. chemical production, with more than 11 billion pounds (5 million tons) produced each year. It is an important industrial compound used in the manufacture of synthetic resins and chemical compounds such as lubricants and adhesives. It also has applications as a disinfectant and preservative, and is used in cosmetics. Estimates of the number of persons occupationally exposed to FA indicate that exposure, at least at low levels, may occur in a wide variety of industries. The occupational settings with the most extensive use of formaldehyde are the production of resins and anatomy and pathology laboratories. Several studies reported a carcinogenic effect in humans after inhalation of FA, in particular an increased risk of nasopharyngeal cancer. Nowadays, the International Agency for Research on Cancer (IARC) classifies FA as carcinogenic to humans (group 1), on the basis of sufficient evidence in humans and in experimental animals. Manifold in vitro studies clearly indicated that FA is genotoxic, inducing various genotoxic effects in proliferating cultured mammalian cells. A variety of evidence suggests that the primary DNA alterations after FA exposure are DNA-protein crosslinks (DPX). Incomplete repair of DPX can lead to the formation of mutations.
Abstract:
This paper is an elaboration of the DECA algorithm [1] to blindly unmix hyperspectral data. The underlying mixing model is linear, meaning that each pixel is a linear mixture of the endmember signatures weighted by the corresponding abundance fractions. The proposed method, like DECA, is tailored to highly mixed data in which geometry-based approaches fail to identify the simplex of minimum volume enclosing the observed spectral vectors. We resort instead to a statistical framework, where the abundance fractions are modeled as mixtures of Dirichlet densities, thus enforcing the constraints on abundance fractions imposed by the acquisition process, namely non-negativity and constant sum. With respect to DECA, we introduce two improvements: 1) the number of Dirichlet modes is inferred based on the minimum description length (MDL) principle; 2) the generalized expectation maximization (GEM) algorithm we adopt to infer the model parameters is improved by using alternating minimization and augmented Lagrangian methods to compute the mixing matrix. The effectiveness of the proposed algorithm is illustrated with simulated and real data.
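The linear mixing model underlying DECA can be sketched in a few lines; the band count, signatures and abundances below are hypothetical illustrative numbers, not data from the paper:

```python
# Hypothetical 3-band endmember signatures (illustrative numbers only).
endmembers = [[0.9, 0.4, 0.1],   # endmember A
              [0.2, 0.5, 0.8]]   # endmember B
abundances = [0.3, 0.7]          # abundance fractions of the two endmembers

# Constraints imposed by the acquisition process:
assert all(a >= 0 for a in abundances)            # non-negativity
assert abs(sum(abundances) - 1.0) < 1e-12         # constant (unit) sum

# Linear mixing model: each pixel is the abundance-weighted sum of signatures.
pixel = [sum(a * e[band] for a, e in zip(abundances, endmembers))
         for band in range(3)]
```

Unmixing inverts this map: given many such pixels, recover the signatures and the constrained abundance fractions.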
Abstract:
We have performed Surface Evolver simulations of two-dimensional hexagonal bubble clusters consisting of a central bubble of area lambda surrounded by s shells or layers of bubbles of unit area. Clusters of up to twenty layers have been simulated, with lambda varying between 0.01 and 100. In monodisperse clusters (i.e., for lambda = 1) [M.A. Fortes, F. Morgan, M. Fatima Vaz, Philos. Mag. Lett. 87 (2007) 561] both the average pressure of the entire cluster and the pressure in the central bubble are decreasing functions of s and approach 0.9306 for very large s, which is the pressure in a bubble of an infinite monodisperse honeycomb foam. Here we address the effect of changing the central bubble area lambda. For small lambda the pressure in the central bubble and the average pressure were both found to decrease with s, as in monodisperse clusters. However, for large lambda, the pressure in the central bubble and the average pressure increase with s. The average pressure of large clusters was found to be independent of lambda and to approach 0.9306 asymptotically. We have also determined the cluster surface energies given by the equation of equilibrium for the total energy in terms of the area and the pressure in each bubble. When the pressures in the bubbles are not available, an approximate equation derived by Vaz et al. [M. Fatima Vaz, M.A. Fortes, F. Graner, Philos. Mag. Lett. 82 (2002) 575] was shown to provide good estimates of the cluster energy provided the bubble area distribution is narrow. This approach does not take cluster topology into account. Using this approximate equation, we find a good correlation between Surface Evolver simulations and the estimated values of energies and pressures. (C) 2008 Elsevier B.V. All rights reserved.
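The quoted honeycomb limit 0.9306 can be checked with a back-of-envelope computation, assuming the standard 2D-foam definition of bubble pressure as the derivative of per-bubble edge energy with respect to bubble area:

```python
import math

# For a regular hexagon of side a and area A: A = (3*sqrt(3)/2) * a**2.
# With unit line tension, the edge energy attributable to one bubble of a
# honeycomb is E = 3a (each of the six edges is shared by two bubbles), so
# E(A) = 3 * sqrt(2*A / (3*sqrt(3))) and the bubble pressure is p = dE/dA.
def honeycomb_pressure(A=1.0):
    return 1.5 * math.sqrt(2.0 / (3.0 * math.sqrt(3.0))) / math.sqrt(A)

p = honeycomb_pressure(1.0)   # pressure in a unit-area honeycomb bubble
```

For unit-area bubbles this evaluates to 3^(1/4)/sqrt(2) ≈ 0.9306, matching the asymptotic value cited from Fortes, Morgan and Vaz.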
Abstract:
Microbial adhesion is a field of recognized relevance and, as such, an impressive array of tools has been developed to understand its molecular mechanisms and ultimately to quantify it. Some of the major limitations of these methodologies concern the incubation time, the small number of cells analyzed, and the operator's subjectivity. To overcome these aspects, we have developed a quantitative method to measure yeast cell adhesion through flow cytometry. In this methodology, a suspension of yeast cells is mixed with green fluorescent polystyrene microspheres (uncoated or coated with host proteins). Within 2 h, an adhesion profile is obtained based on two parameters: percentage and the cell-microsphere populations' distribution pattern. This flow cytometry protocol represents a useful tool to quantify yeast adhesion to different substrata on a large scale, providing manifold data in a speedy and informative manner.
Abstract:
Myocardial Perfusion Gated Single Photon Emission Tomography (Gated-SPET) imaging is used for the combined evaluation of myocardial perfusion and left ventricular (LV) function. However, standard protocols for Gated-SPET studies require long acquisition times for each study, so it is important to reduce the total duration of image acquisition as much as possible. It is known, though, that this reduction decreases the count statistics per projection and raises doubts about the validity of the functional parameters determined by Gated-SPET. For ethical, logistical and economic reasons it is difficult to carry out this analysis in real patients, so simulated studies are required. Objective: evaluate the influence of the total number of counts acquired from the myocardium on the calculation of myocardial functional parameters (LVEF - left ventricular ejection fraction, EDV - end-diastolic volume, ESV - end-systolic volume) using routine software procedures.
Abstract:
Industrial rotating machines may be exposed to severe dynamic excitations due to resonant working regimes. Dealing with the bending vibration problem of a machine rotor, the shaft - and attached discs - can be simply modelled using Bernoulli-Euler beam theory, as a continuous beam subjected to a specific set of boundary conditions. In this study, the authors recall Rayleigh's method to propose an iterative strategy, which allows for the determination of natural frequencies and mode shapes of continuous beams, taking into account the effect of attached concentrated masses and rotational inertias and including different stiffness coefficients at the right and left end sides. The algorithm starts with the exact solutions from Bernoulli-Euler beam theory, which are then updated through Rayleigh's quotient parameters. Several loading cases are examined in comparison with experimental data and examples are presented to illustrate the validity of the model and the accuracy of the obtained values.
Abstract:
Cluster analysis for categorical data has been an active area of research. A well-known problem in this area is the determination of the number of clusters, which is unknown and must be inferred from the data. In order to estimate the number of clusters, one often resorts to information criteria, such as BIC (Bayesian information criterion), MML (minimum message length, proposed by Wallace and Boulton, 1968), and ICL (integrated classification likelihood). In this work, we adopt the approach developed by Figueiredo and Jain (2002) for clustering continuous data. They use an MML criterion to select the number of clusters and a variant of the EM algorithm to estimate the model parameters. This EM variant seamlessly integrates model estimation and selection in a single algorithm. For clustering categorical data, we assume a finite mixture of multinomial distributions and implement a new EM algorithm, following a previous version (Silvestre et al., 2008). Results obtained with synthetic datasets are encouraging. The main advantage of the proposed approach, when compared to the above referred criteria, is the speed of execution, which is especially relevant when dealing with large data sets.
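Selecting the number of clusters by an information criterion can be sketched generically; the code below uses a BIC-style description length (not the Figueiredo-Jain MML criterion integrated into EM), and the log-likelihood and parameter-count values are hypothetical:

```python
import math

def description_length(loglik, n_params, n_samples):
    # BIC-style two-part code length: data cost plus model cost.
    return -2.0 * loglik + n_params * math.log(n_samples)

# Hypothetical fitted mixtures: log-likelihood improves with more components,
# but the parameter penalty grows; the criterion picks the trade-off.
N = 500                      # number of categorical observations
logliks = {1: -1450.0, 2: -1300.0, 3: -1290.0, 4: -1286.0}
params_per_component = 9     # hypothetical free parameters per component
scores = {k: description_length(ll, k * params_per_component, N)
          for k, ll in logliks.items()}
best_k = min(scores, key=scores.get)
```

The approach adopted in the paper differs in that model selection happens inside the EM iterations (components can be annihilated on the fly) rather than by refitting for each candidate k as above.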
Abstract:
Myocardial Perfusion Gated Single Photon Emission Tomography (Gated-SPET) imaging is used for the combined evaluation of myocardial perfusion and left ventricular (LV) function. The purpose of this study is to evaluate the influence of the total number of counts acquired from the myocardium on the calculation of myocardial functional parameters using routine software procedures. Methods: Gated-SPET studies were simulated using the Monte Carlo GATE package and the NURBS phantom. Simulated data were reconstructed and processed using the commercial software package Quantitative Gated-SPECT. The Bland-Altman and Mann-Whitney-Wilcoxon tests were used to analyze the influence of the number of total counts on the calculation of LV myocardium functional parameters. Results: In studies simulated with 3 MBq in the myocardium there were significant differences in the functional parameters left ventricular ejection fraction (LVEF), end-systolic volume (ESV), motility and thickness between studies acquired with 15 s/projection and 30 s/projection. Simulations with 4.2 MBq show significant differences in LVEF, end-diastolic volume (EDV) and thickness, while in the simulations with 5.4 MBq and 8.4 MBq the differences were statistically significant only for motility and thickness. Conclusion: The total number of counts per simulation does not significantly interfere with the determination of Gated-SPET functional parameters when using an administered average activity of 450 MBq, corresponding to 5.4 MBq in the myocardium.
Abstract:
Environment monitoring has an important role in occupational exposure assessment. However, due to several factors it is performed with insufficient frequency and normally does not give the necessary information to choose the most adequate safety measures to avoid or control exposure. Identifying all the tasks developed in each workplace and conducting a task-based exposure assessment help to refine the exposure characterization and reduce assessment errors. A task-based assessment can also provide a better evaluation of exposure variability, instead of assessing personal exposures using continuous 8-hour time-weighted average measurements. Health effects related to exposure to particles have mainly been investigated with mass-measuring instruments or gravimetric analysis. More recently, however, some studies support that size distribution and particle number concentration may have advantages over particle mass concentration for assessing the health effects of airborne particles. Several exposure assessments were performed in different occupational settings (bakery, grill house, cork industry and horse stable) applying these two approaches: task-based exposure assessment and particle number concentration by size. The task-based approach made it possible to identify the tasks with higher exposure to the smallest particles (0.3 μm) in the different occupational settings. The data obtained allow a more concrete and effective risk assessment and the identification of priorities for safety investments.
Abstract:
Myocardial perfusion gated single photon emission computed tomography (gated-SPECT) imaging is used for the combined evaluation of myocardial perfusion and left ventricular (LV) function. The aim of this study is to analyze the influence of counts/pixel, and concomitantly the total counts in the myocardium, on the calculation of myocardial functional parameters. Material and methods: Gated-SPECT studies were performed using the Monte Carlo GATE simulation package and the NCAT phantom. The simulations use 99mTc-labeled tracers (250, 350, 450 and 680 MBq) for standard patient types, effectively corresponding to myocardial activities of 3, 4.2, 5.4 and 8.2 MBq. All studies were simulated using 15 and 30 s/projection. The simulated data were reconstructed and processed by quantitative gated-SPECT software, and the functional parameters in gated-SPECT images were analyzed using the Bland-Altman and Mann-Whitney-Wilcoxon tests. Results: In studies simulated with different times (15 and 30 s/projection), for whole-body activities of 250 and 350 MBq there were statistically significant differences in the parameters motility and thickness. Differences in left ventricular ejection fraction (LVEF) and end-systolic volume (ESV) occurred only for 250 MBq, and in end-diastolic volume (EDV) only for 350 MBq, while the studies simulated with 450 and 680 MBq showed no statistically significant differences in the global functional parameters LVEF, EDV and ESV. Conclusion: The number of counts/pixel and, concomitantly, the total counts per simulation do not significantly interfere with the determination of gated-SPECT functional parameters when using an administered average activity of 450 MBq, corresponding to 5.4 MBq in the myocardium, for standard patient types.
Abstract:
This paper proposes an efficient scalable Residue Number System (RNS) architecture supporting moduli sets with an arbitrary number of channels, making it possible to achieve a larger dynamic range and a higher level of parallelism. The proposed architecture supports forward and reverse RNS conversion by reusing the arithmetic channel units. The arithmetic operations supported at the channel level include addition, subtraction, and multiplication with accumulation capability. For the reverse conversion two algorithms are considered, one based on the Chinese Remainder Theorem and the other on Mixed-Radix Conversion, leading to implementations optimized for delay and for required circuit area. With the proposed architecture a complete and compact RNS platform is achieved. Experimental results suggest gains of 17% in the delay of the arithmetic operations, with an area reduction of 23% with regard to the RNS state of the art. When compared with a binary system, the proposed architecture performs the same computation 20 times faster while using only 10% of the circuit area resources.
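Mixed-Radix Conversion, one of the two reverse-conversion algorithms mentioned, can be sketched generically; this is a textbook MRC on a small illustrative moduli set, not the paper's hardware datapath:

```python
def mixed_radix_digits(residues, moduli):
    # Mixed-Radix Conversion: sequential alternative to the CRT.  The digits
    # d satisfy x = d[0] + d[1]*m[0] + d[2]*m[0]*m[1] + ...,  0 <= d[i] < m[i].
    digits = list(residues)
    k = len(moduli)
    for i in range(k):
        for j in range(i + 1, k):
            # Subtract the settled digit, then divide by m[i] modulo m[j].
            inv = pow(moduli[i], -1, moduli[j])
            digits[j] = (digits[j] - digits[i]) * inv % moduli[j]
    return digits

def mrc_to_int(digits, moduli):
    # Weighted value recovered from the mixed-radix digits.
    x, w = 0, 1
    for d, m in zip(digits, moduli):
        x += d * w
        w *= m
    return x

moduli = [3, 5, 7]           # small illustrative pairwise-coprime set
x = 52
digits = mixed_radix_digits([x % m for m in moduli], moduli)
assert mrc_to_int(digits, moduli) == x
```

Unlike the CRT, MRC needs no large modulo-M reduction, which is one reason it can lead to different delay/area trade-offs in hardware.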