245 results for RESEARCH SCIENTIFIC


Relevance: 20.00%

Abstract:

Compliant mechanisms achieve a specified motion as a mechanism without relying on joints and pins. They have broad application in precision mechanical devices and Micro-Electro-Mechanical Systems (MEMS), but may lose accuracy and produce undesirable displacements when subjected to temperature changes. These undesirable effects can be reduced by using sensors in combination with control techniques and/or by applying special design techniques that address them at the design stage, a process generally termed "design for precision". This paper describes a design-for-precision method based on a topology optimization method (TOM) for compliant mechanisms that includes thermal compensation features. The optimization problem emphasizes actuator accuracy and is formulated to yield optimal compliant mechanism configurations that maximize the desired output displacement when a force is applied, while minimizing undesirable thermal effects. To demonstrate the effectiveness of the method, two-dimensional compliant mechanisms are designed considering thermal compensation, and their performance is compared with compliant mechanism designs that do not consider thermal compensation. (C) 2010 Elsevier B.V. All rights reserved.
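
Read schematically, the optimization problem weighs the mechanical output against the thermally induced one. A minimal sketch of such a weighted formulation, in our own notation (the weight w, the output displacement u_out, the volume bound V*, and the constraint set are assumptions, not the paper's exact statement):

```latex
% Schematic multi-criteria statement over element pseudo-densities rho_e:
% maximize the output displacement under the force load while penalizing
% the output displacement produced by a uniform temperature change Delta T.
\max_{\rho}\; J(\rho) \;=\; w\, u_{\mathrm{out}}(\rho;\, F) \;-\; (1-w)\,\bigl|u_{\mathrm{out}}(\rho;\, \Delta T)\bigr|
\qquad \text{s.t.}\quad \sum_{e} \rho_e\, v_e \le V^{*},\qquad 0 < \rho_{\min} \le \rho_e \le 1 .
```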

Relevance: 20.00%

Abstract:

Sensors and actuators based on piezoelectric plates are in increasing demand in the field of smart structures, including actuators for cooling and fluid-pumping applications and transducers for novel energy-harvesting devices. This project develops a topology optimization formulation for the dynamic design of piezoelectric laminated plates, aimed at piezoelectric sensor, actuator, and energy-harvesting applications. It distributes piezoelectric material over a metallic plate in order to achieve a desired dynamic behavior with specified resonance frequencies and modes, and an enhanced electromechanical coupling factor (EMCC). The finite element model employs a piezoelectric plate element based on the MITC formulation, which is reliable, efficient, and free of shear locking. The topology optimization formulation is based on the PEMAP-P model combined with the RAMP model, where the design variables are pseudo-densities that describe the amount of piezoelectric material in each finite element and its polarization sign. The design problem aims at simultaneously designing an eigenshape, i.e., maximizing and minimizing vibration amplitudes at certain points of the structure in a given eigenmode, while tuning the eigenvalue to a desired value and maximizing its EMCC, so that the energy conversion is maximized for that mode. The optimization problem is solved using sequential linear programming. Through this formulation, a design with enhanced energy conversion in the low-frequency spectrum is obtained by minimizing a set of first eigenvalues, enhancing their corresponding eigenshapes, and maximizing their EMCCs, which can be considered an approach to the design of energy-harvesting devices. The implementation of the topology optimization algorithm and some results are presented to illustrate the method.
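
The two material models mentioned are interpolation schemes over the pseudo-densities. A minimal Python sketch of their flavor (the RAMP form is standard; the PEMAP-P form, exponent, and variable names here are simplified assumptions, not the paper's exact model):

```python
def ramp(rho, q=8.0):
    """RAMP interpolation: maps a pseudo-density rho in [0, 1] to a
    stiffness scale factor; q > 0 penalizes intermediate densities."""
    return rho / (1.0 + q * (1.0 - rho))

def pemap_p(rho, pol, e0, p_rho=3.0):
    """PEMAP-P-style interpolation of a piezoelectric coefficient
    (schematic): rho scales the magnitude and pol in [0, 1] sets the
    polarization sign, with pol = 0.5 giving no effective coupling."""
    return (rho ** p_rho) * (2.0 * pol - 1.0) * e0
```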

Relevance: 20.00%

Abstract:

Electrical impedance tomography (EIT) captures images of internal features of a body. Electrodes are attached to the boundary of the body, low-intensity alternating currents are applied, and the resulting electric potentials are measured. Based on these measurements, an estimation algorithm obtains the three-dimensional internal admittivity distribution that corresponds to the image. One of the main goals of medical EIT is to achieve high resolution and accurate results at low computational cost. However, when the finite element method (FEM) is employed and the corresponding mesh is refined to increase resolution and accuracy, the computational cost increases substantially, especially in the estimation of absolute admittivity distributions. We therefore consider in this work a fast iterative solver for the forward problem, previously reported in the context of structural optimization, and propose several improvements to increase its performance in the EIT context. The solver is based on the recycling of approximate invariant subspaces, and it is applied to reduce the EIT computation time for a constant, high-resolution finite element mesh. In addition, we consider a powerful preconditioner and provide detailed pseudocode for the improved iterative solver. The numerical results show the effectiveness of our approach: the proposed algorithm is faster than the preconditioned conjugate gradient (CG) algorithm. The results also show that even on a standard PC without parallelization, a high mesh resolution (more than 150,000 degrees of freedom) can be used for image estimation at relatively low computational cost. (C) 2010 Elsevier B.V. All rights reserved.
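
For reference, the baseline against which the recycling solver is compared is preconditioned CG. A minimal NumPy version, assuming a symmetric positive definite system matrix `A` and a callable `M_inv` standing in for the preconditioner (the recycling solver additionally retains approximate invariant subspaces across the many solves of an estimation run):

```python
import numpy as np

def pcg(A, b, M_inv, x0=None, tol=1e-8, maxit=500):
    """Preconditioned conjugate gradients for A x = b."""
    x = np.zeros_like(b) if x0 is None else x0.copy()
    r = b - A @ x
    z = M_inv(r)
    p = z.copy()
    rz = r @ z
    for _ in range(maxit):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break  # converged relative to the right-hand side
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x
```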

Relevance: 20.00%

Abstract:

Piezoresistive materials, whose resistivity changes when they are subjected to mechanical stress, are widely used in industry as sensors, including pressure sensors, accelerometers, inclinometers, and load cells. A basic piezoresistive sensor consists of piezoresistive devices bonded to a flexible structure, such as a cantilever or a membrane; the flexible structure transmits a pressure, force, or inertial load due to acceleration, causing a stress that changes the resistivity of the piezoresistive devices. When a voltage is applied to a piezoresistive device, its resistivity can be measured and correlated with the amplitude of the applied pressure or force. The performance of a piezoresistive sensor is closely related to the design of its flexible structure. In this research, we propose a generic topology optimization formulation for the design of piezoresistive sensors whose primary aim is a high response. First, the concept of topology optimization is briefly discussed. Next, the design requirements are clarified, and the corresponding objective functions and optimization problem are formulated. An optimization algorithm is constructed based on these formulations. Finally, several design examples of piezoresistive sensors are presented to confirm the usefulness of the proposed method.
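
The underlying transduction is first-order piezoresistance: the relative resistance change is a linear combination of the stresses seen by the device. A small Python illustration using typical literature coefficients for p-type silicon (common textbook numbers, not parameters from the paper):

```python
def delta_r_over_r(sigma_l, sigma_t, pi_l=71.8e-11, pi_t=-66.3e-11):
    """First-order piezoresistive response: relative resistance change
    from longitudinal and transverse stress (Pa). Default coefficients
    are typical values for p-type silicon along <110>."""
    return pi_l * sigma_l + pi_t * sigma_t

# 10 MPa of longitudinal stress gives roughly a 0.7% resistance change:
print(delta_r_over_r(10e6, 0.0))  # ~7.2e-3
```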

Relevance: 20.00%

Abstract:

A series of new phenyl-based conjugated copolymers has been synthesized and investigated by vibrational and photoluminescence (PL) spectroscopy. The materials are poly(1,4-phenylene-alt-3,6-pyridazine) (COP-PIR), poly(9,9-dioctylfluorene)-co-quaterphenylene (COP-PPP), and poly[(1,4-phenylene-alt-3,6-pyridazine)-co-(1,4-phenylene-alt-9,9-dioctylfluorene)] (COP-PIR-FLUOR), with 3.5% fluorene. COP-PPP and COP-PIR-FLUOR have high fluorescence quantum yields in solution. Infrared and Raman spectra were used to check the chemical structure of the compounds. The copolymers exhibit blue emission ranging from 2.8 to 3.6 eV when excited at E(exc) = 4.13 eV. Stokes-shift values were estimated on pristine samples in their condensed state from steady-state PL-emission and PL-excitation spectra. They suggest a difference in the torsional angle between the molecular configurations of the polymer blocks at the absorption and PL transitions, and also in the photoexcitation diffusion. Additionally, the time-resolved PL of these materials was investigated using 100 fs laser pulses at E(exc) = 4.64 eV and a streak camera. The results show very fast biexponential kinetics for the two fluorene-based polymers, with decay times below 300 ps, indicating both fast intramolecular radiative recombination and migration of photogenerated electron-hole pairs. By contrast, the PL of COP-PIR is less intense and longer lived, indicating that excitons are confined to the chains in this polymer. (C) 2008 Elsevier B.V. All rights reserved.
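
The "biexponential kinetics" refers to fitting the transient PL intensity with a sum of two exponentials. A hedged sketch of such a fit with SciPy (`t_ps` and `counts` are hypothetical streak-camera data; the initial guesses are illustrative):

```python
import numpy as np
from scipy.optimize import curve_fit

def biexp(t, a1, tau1, a2, tau2):
    """Biexponential decay: a fast component (e.g. intrachain radiative
    recombination) plus a slower one (e.g. migration-limited decay)."""
    return a1 * np.exp(-t / tau1) + a2 * np.exp(-t / tau2)

# popt, _ = curve_fit(biexp, t_ps, counts, p0=(1.0, 30.0, 0.3, 300.0))
# popt then holds the fitted amplitudes and decay times (in ps).
```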

Relevance: 20.00%

Abstract:

Thermodynamic properties of bread dough (fusion enthalpy, apparent specific heat, initial freezing point, and unfreezable water) were measured at temperatures from -40 °C to 35 °C using differential scanning calorimetry. The initial freezing point was also calculated from the water activity of the dough. The apparent specific heat varied with temperature: in the freezing region it ranged from 1.7 to 23.1 J g^-1 °C^-1, and it was constant at temperatures above freezing (2.7 J g^-1 °C^-1). Unfreezable water content varied from 0.174 to 0.182 g/g of total product. Heat capacity values as a function of temperature were correlated using thermodynamic models; a modification for low-moisture foodstuffs (such as bread dough) was successfully applied to the experimental data. (C) 2010 Elsevier Ltd. All rights reserved.
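
One standard way to calculate the initial freezing point from water activity is the ideal-solution freezing-point relation; a short Python sketch (this relation is textbook thermodynamics, not necessarily the exact model used in the paper):

```python
import numpy as np

R = 8.314        # J mol^-1 K^-1, gas constant
DH_FUS = 6010.0  # J mol^-1, molar enthalpy of fusion of water
T0 = 273.15      # K, freezing point of pure water

def initial_freezing_point(aw):
    """Initial freezing point (deg C) from water activity aw, using
    ln(aw) = (DH_FUS / R) * (1/T0 - 1/Tf) and solving for Tf."""
    tf = 1.0 / (1.0 / T0 - R * np.log(aw) / DH_FUS)
    return tf - 273.15

print(initial_freezing_point(0.97))  # about -3.1 deg C
```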

Relevance: 20.00%

Abstract:

Activation of the cephalosporin side-chain precursor to the corresponding CoA-thioester is an essential step for its incorporation into the β-lactam backbone. To identify an acyl-CoA ligase involved in the activation of adipate, we searched the genome database of Penicillium chrysogenum for putative structural genes encoding acyl-CoA ligases. Chemostat-based transcriptome analysis was used to identify the gene with the highest expression level when cells were grown in the presence of adipate. Deletion of this gene, renamed aclA, led to a 32% decrease in the specific rate of adipate consumption and a threefold reduction of adipoyl-6-aminopenicillanic acid levels, but did not affect penicillin V production. After overexpression in Escherichia coli, the purified protein was shown to have a broad substrate range that includes adipate. Finally, fusion with cyan fluorescent protein showed co-localization with a microbody-borne acyltransferase. The identification and functional characterization of aclA may aid in developing future metabolic engineering strategies for improving the production of different cephalosporins. (C) 2009 Elsevier Inc. All rights reserved.

Relevance: 20.00%

Abstract:

Penicillium chrysogenum is widely used as an industrial antibiotic producer, in particular in the synthesis of β-lactam antibiotics such as penicillins and cephalosporins. In industrial processes, oxalic acid formation leads to reduced product yields; moreover, precipitation of calcium oxalate complicates product recovery. We observed oxalate production in glucose-limited chemostat cultures of P. chrysogenum grown with or without the addition of adipic acid, the side-chain of the cephalosporin precursor adipoyl-6-aminopenicillanic acid (ad-6-APA). Oxalate accounted for up to 5% of the consumed carbon source. In filamentous fungi, oxaloacetate hydrolase (OAH; EC 3.7.1.1) is generally responsible for oxalate production. The P. chrysogenum genome harbours four orthologs of the A. niger oahA gene. Chemostat-based transcriptome analyses revealed a significant correlation between extracellular oxalate titers and the expression levels of the genes Pc18g05100 and Pc22g24830. To assess their possible involvement in oxalate production, both genes were cloned in Saccharomyces cerevisiae, a yeast that does not produce oxalate. Only the expression of Pc22g24830 led to production of oxalic acid in S. cerevisiae. Subsequent deletion of Pc22g24830 in P. chrysogenum led to complete elimination of oxalate production, whilst improving the yield of the cephalosporin precursor ad-6-APA. (C) 2011 Elsevier Inc. All rights reserved.

Relevance: 20.00%

Abstract:

The influence of guar and xanthan gum, and their combined use, on dough proofing rate and calorimetric properties was investigated. Fusion enthalpy, which is related to the amount of frozen water, was influenced by frozen dough formulation and storage time; specifically, gum addition reduced the fusion enthalpy relative to the control formulation (76.9 J/g for the formulation with both gums versus 81.2 J/g for the control at day 28). Other calorimetric parameters, such as T(g) and freezable water content, were also influenced by frozen storage time. For all formulations, the proofing rate of dough after freezing, frozen storage, and thawing decreased relative to non-frozen dough, indicating that the freezing process itself was more detrimental to the proofing rate than storage time. For all formulations, the mean proofing rate was 2.97 +/- 0.24 cm^3 min^-1 per 100 g of non-frozen dough and 2.22 +/- 0.12 cm^3 min^-1 per 100 g of frozen dough. The proofing rate of non-frozen dough with xanthan gum also decreased significantly relative to dough without gums and dough with only guar gum. Optical microscopy showed that gas cell production after the frozen storage period was reduced, in agreement with the proofing rate results. (C) 2008 Elsevier Ltd. All rights reserved.
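
The proofing rate reported above is a volume growth rate normalized by dough mass. A trivial sketch of how such a figure could be computed from volume-time measurements (function and variable names are ours, not the paper's):

```python
import numpy as np

def proofing_rate(t_min, volume_cm3, dough_mass_g):
    """Proofing rate as the slope of dough volume versus time,
    normalized per 100 g of dough (cm^3 min^-1 per 100 g)."""
    slope = np.polyfit(t_min, volume_cm3, 1)[0]  # cm^3 per minute
    return slope * 100.0 / dough_mass_g
```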

Relevance: 20.00%

Abstract:

We try to shed some light on the question of why technology-intensive businesses often fail in less-developed countries, and under what circumstances they are likely to be a success from the perspective of both domestic and export markets. The answers were drawn from a set of empirical evidence from Brazilian firms applying photonics technologies. Some of the issues faced by these firms relate to state versus private initiative, entering traditional versus niche markets, and technology transfer versus product development management. Overall, we conclude that weak institutions and an inadequate social and organizational demography go a long way toward explaining why countries differ in technological development and diffusion. In this context, we point out obstacles that must be removed in order to make public policies and firms' achievements more efficient. (C) 2008 Elsevier Ltd. All rights reserved.

Relevance: 20.00%

Abstract:

This paper addresses the non-preemptive single-machine scheduling problem of minimizing total tardiness. We are interested in the online version of this problem, where orders arrive at the system at random times and jobs must be scheduled without knowledge of what jobs will come afterwards. The processing times and due dates become known when the order is placed, and orders are released only at the beginning of periodic intervals. A customized approximate dynamic programming method is introduced for this problem. We also present numerical experiments that assess the reliability of the new approach and show that it performs better than a myopic policy.
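
A myopic policy of the kind used as a baseline can be stated in a few lines: whenever the machine frees up, run the released job that looks most urgent, ignoring future arrivals. A sketch with an earliest-due-date rule (the paper's exact myopic rule may differ):

```python
def myopic_total_tardiness(jobs):
    """Online single-machine baseline: jobs are (release, processing,
    due) triples; whenever the machine is idle, start the released job
    with the earliest due date. Returns the total tardiness."""
    jobs = sorted(jobs)                    # by release time
    pending, t, tardiness, i = [], 0, 0, 0
    while i < len(jobs) or pending:
        while i < len(jobs) and jobs[i][0] <= t:
            pending.append(jobs[i]); i += 1
        if not pending:                    # machine idle: wait for next arrival
            t = jobs[i][0]
            continue
        pending.sort(key=lambda j: j[2])   # earliest due date first
        _, p, d = pending.pop(0)
        t += p
        tardiness += max(0, t - d)
    return tardiness

print(myopic_total_tardiness([(0, 3, 4), (1, 2, 3), (5, 4, 6)]))  # -> 5
```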

Relevance: 20.00%

Abstract:

Modern Integrated Circuit (IC) design is characterized by a strong trend of Intellectual Property (IP) core integration into complex system-on-chip (SOC) architectures. These cores require thorough functional verification to avoid erroneous behavior in the final device. Formal verification methods are capable of detecting any design bug, but due to state explosion their use remains limited to small circuits. Alternatively, simulation-based verification can explore hardware descriptions of any size, although the corresponding stimulus generation, as well as the functional coverage definition, must be carefully planned to guarantee its efficacy. In general, static input-space optimization methodologies have shown better efficiency and results than, for instance, Coverage Directed Verification (CDV) techniques, although the two act on different facets of the monitored system and are not mutually exclusive. This work presents a constrained-random simulation-based functional verification methodology in which, on the basis of the Parameter Domains (PD) formalism, irrelevant and invalid test case scenarios are removed from the input space. To this end, a tool to automatically generate PD-based stimuli sources was developed, together with a second tool that generates functional coverage models fitting exactly the PD-based input space. Both enhancements produced a notable increase in testbench efficiency compared to testbenches with traditional stimulation and coverage scenarios: a 22% simulation-time reduction when generating stimuli with our PD-based stimuli sources (still with a conventional coverage model), and a 56% reduction when combining our stimuli sources with their corresponding, automatically generated coverage models.
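
The gist of PD-based stimulation can be illustrated as constrained-random generation that draws only from the valid region of the input space. A toy Python sketch (parameter names, domains, and the constraint are invented for illustration; real flows would use a verification language's constraint solver):

```python
import random

# Hypothetical parameter domains for a bus-transaction stimulus.
DOMAINS = {
    "burst_len": range(1, 17),
    "addr_align": (1, 2, 4, 8),
}

def valid(stim):
    # Example cross-parameter constraint excluding an invalid scenario.
    return not (stim["burst_len"] > 8 and stim["addr_align"] < 4)

def gen_stimulus():
    """Rejection-sample until a stimulus inside the valid domain is hit."""
    while True:
        stim = {k: random.choice(list(v)) for k, v in DOMAINS.items()}
        if valid(stim):
            return stim
```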

Relevance: 20.00%

Abstract:

The classical approach to acoustic imaging consists of beamforming, which produces the source distribution of interest convolved with the array point spread function. This convolution smears the image of interest, significantly reducing its effective resolution. Deconvolution methods have been proposed to enhance acoustic images and have produced significant improvements. Other proposals involve covariance fitting techniques, which avoid deconvolution altogether. However, in their traditional presentation, these enhanced reconstruction methods have very high computational costs, mostly because they have no means of efficiently transforming back and forth between a hypothetical image and the measured data. In this paper, we propose the Kronecker Array Transform (KAT), a fast separable transform for array imaging applications. Under the assumption of a separable array, it accelerates imaging techniques by several orders of magnitude with respect to the fastest previously available methods, and enables the use of state-of-the-art regularized least-squares solvers. Using the KAT, one can reconstruct images with higher resolution than was previously possible, and use more accurate reconstruction techniques, opening new and exciting possibilities for acoustic imaging.
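
The speed-up from separability comes from a classical Kronecker identity: a large matrix-vector product collapses into two small matrix products. A NumPy check of the identity that underlies this kind of fast transform (generic matrices, not the KAT's actual factors):

```python
import numpy as np

# (A kron B) vec(X) == vec(B @ X @ A.T), with vec() stacking columns.
m, n = 8, 6
A = np.random.randn(m, m)
B = np.random.randn(n, n)
X = np.random.randn(n, m)

slow = np.kron(A, B) @ X.flatten(order="F")   # O((mn)^2) product
fast = (B @ X @ A.T).flatten(order="F")       # two O(mn(m+n)) products
assert np.allclose(slow, fast)
```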

Relevance: 20.00%

Abstract:

In Part I ["Fast Transforms for Acoustic Imaging-Part I: Theory," IEEE Transactions on Image Processing], we introduced the Kronecker array transform (KAT), a fast transform for imaging with separable arrays. Given a source distribution, the KAT produces the spectral matrix that would be measured by a separable sensor array. In Part II, we establish connections between the KAT, beamforming, and 2-D convolutions, and show how these results can be used to accelerate classical and state-of-the-art array imaging algorithms. We also propose using the KAT to accelerate general-purpose regularized least-squares solvers. Using this approach, we avoid ill-conditioned deconvolution steps and obtain more accurate reconstructions than previously possible, while maintaining low computational costs. We also show how the KAT performs when imaging near-field source distributions, and illustrate the trade-off between accuracy and computational complexity. Finally, we show that separable designs can deliver accuracy competitive with multi-arm logarithmic spiral geometries, while retaining the computational advantages of the KAT.
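
Plugging such a transform into a general-purpose solver only requires exposing it as a matrix-free linear operator with its adjoint. A hedged SciPy sketch using damped (Tikhonov-regularized) least squares; `A` and `B` are generic stand-ins for the separable factors, not the paper's matrices:

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, lsqr

m, n = 32, 24
A = np.random.randn(m, m)
B = np.random.randn(n, n)

def matvec(x):   # (A kron B) x, without forming the Kronecker product
    return (B @ x.reshape(n, m, order="F") @ A.T).ravel(order="F")

def rmatvec(y):  # adjoint: (A.T kron B.T) y
    return (B.T @ y.reshape(n, m, order="F") @ A).ravel(order="F")

K = LinearOperator((m * n, m * n), matvec=matvec, rmatvec=rmatvec)
b = np.random.randn(m * n)
x = lsqr(K, b, damp=0.1)[0]  # regularized least-squares solution
```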

Relevance: 20.00%

Abstract:

Aims: We aimed to evaluate whether the co-localisation of calcium and necrosis in intravascular ultrasound virtual histology (IVUS-VH) is due to artefact, and whether this effect can be estimated mathematically. Methods and results: We hypothesised that if calcium induces artefactual coding of necrosis, any addition of calcium content would generate an artificial increment in necrotic tissue. Stent struts were used to simulate the "added calcium". The change in the amount and spatial localisation of necrotic tissue was evaluated before and after stenting (n=17 coronary lesions) by means of specially developed imaging software. The area of "calcium" increased from a median of 0.04 mm^2 at baseline to 0.76 mm^2 after stenting (p<0.01). In parallel, the median necrotic content increased from 0.19 mm^2 to 0.59 mm^2 (p<0.01). The "added" calcium strongly predicted a proportional increase in necrosis-coded tissue in the areas surrounding the calcium-like spots (model R^2=0.70; p<0.001). Conclusions: The artificial addition of calcium-like elements to the atherosclerotic plaque led to an increase in necrotic tissue in virtual histology that is probably artefactual. The overestimation of necrotic tissue by calcium strictly followed a linear pattern, indicating that it may be amenable to mathematical correction.
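
The "linear pattern" in the conclusions amounts to a least-squares line relating added calcium area to the increment in necrosis-coded area, with R^2 quantifying the fit. A minimal sketch (variable names are ours, not the paper's):

```python
import numpy as np

def fit_linear(added_calcium_mm2, delta_necrosis_mm2):
    """Fit delta_necrosis ~ a * added_calcium + b and report R^2,
    the kind of model that would support a mathematical correction."""
    x = np.asarray(added_calcium_mm2, dtype=float)
    y = np.asarray(delta_necrosis_mm2, dtype=float)
    a, b = np.polyfit(x, y, 1)
    resid = y - (a * x + b)
    r2 = 1.0 - (resid @ resid) / (((y - y.mean()) ** 2).sum())
    return a, b, r2
```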