969 results for Computer Structure


Relevance:

30.00%

Publisher:

Abstract:

In this paper, a mathematical model is derived via Lagrange's equations for a shear-building structure that acts as the foundation of a non-ideal direct-current electric motor, controlled by a mass that is loose inside a circular carving. Non-ideal sources of vibration are those whose characteristics are coupled to the motion of the structure, rather than being functions of time only, as in the ideal case. In this case, therefore, an additional equation of motion is written for the motor rotation, coupled to the equation describing the horizontal motion of the shear building. This kind of problem can lead to the so-called Sommerfeld effect: the steady-state frequency of the motor usually increases as more power (voltage) is supplied to it in a step-by-step fashion, but when a resonance condition with the structure is reached, most of the additional energy is consumed in generating large-amplitude vibrations of the foundation, with little noticeable change in the motor frequency. If the voltage is increased further, the rotor may jump to a higher rotation regime, with no stable steady states in between. As a passive control device for both the large-amplitude vibrations and the Sommerfeld effect, a scheme is proposed in which a point mass is free to bounce back and forth inside a circular carving in the suspended mass of the structure. Numerical simulations of the model are also presented. Copyright © 2007 by ASME.
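
For reference, a minimal sketch of the kind of coupled non-ideal equations described above, assuming a single-story shear building of effective mass M, damping c and stiffness k, driven by a motor with rotor inertia J, unbalanced mass m at eccentricity r, rotation angle φ and net torque characteristic Γ(φ̇); all symbols and sign conventions here are illustrative assumptions, not taken from the paper:

```latex
M\ddot{x} + c\dot{x} + k x
  = m r\left(\ddot{\varphi}\sin\varphi + \dot{\varphi}^{2}\cos\varphi\right),
\qquad
\left(J + m r^{2}\right)\ddot{\varphi}
  = \Gamma(\dot{\varphi}) + m r\,\ddot{x}\sin\varphi .
```

The coupling term on the right of the rotor equation is what makes the source non-ideal: part of the available torque is spent on structural vibration, which is the mechanism behind the Sommerfeld effect described above.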

Relevance:

30.00%

Publisher:

Abstract:

Numerical modeling of the interaction between waves and coastal structures is a challenge due to the many nonlinear phenomena involved, such as wave propagation, wave transformation with water depth, interaction between incident and reflected waves, run-up / run-down and wave overtopping. Numerical models based on a Lagrangian formulation, like SPH (Smoothed Particle Hydrodynamics), allow simulating complex free-surface flows. The validation of these numerical models is essential, but comparing numerical results with experimental data is not an easy task. In the present paper, two SPH numerical models, SPHysics LNEC and SPH UNESP, are validated by comparing the numerical results of waves interacting with a vertical breakwater with data obtained in physical model tests carried out in one of LNEC's flumes. To achieve this validation, the experimental set-up is designed to be compatible with the characteristics of the numerical models. The flume dimensions are therefore exactly the same in the numerical and physical models, and the incident wave characteristics are identical, which allows the accuracy of the numerical models to be assessed, particularly regarding two complex phenomena: wave breaking and impact loads on the breakwater. It is shown that partial renormalization, i.e. renormalization applied only to particles near the structure, seems to be a promising compromise and an original method that allows waves to be propagated without diffusion while the pressure field near the structure is modeled accurately.
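
As an illustration of what "partial renormalization" can look like in practice, the sketch below applies a Shepard density filter only to particles within a given band of the structure face. The kernel choice, the distance criterion, the function names and the parameter `band` are assumptions made for the example; the actual renormalization scheme in SPHysics LNEC / SPH UNESP may differ.

```python
import numpy as np

def cubic_spline_kernel(r, h):
    """Standard 2-D cubic-spline SPH kernel (one common choice; the kernel
    actually used by the models in the paper is not specified here)."""
    q = r / h
    sigma = 10.0 / (7.0 * np.pi * h**2)
    return sigma * np.where(q < 1.0, 1 - 1.5 * q**2 + 0.75 * q**3,
                    np.where(q < 2.0, 0.25 * (2 - q)**3, 0.0))

def shepard_filter_near_structure(pos, mass, rho, h, x_structure, band=0.5):
    """Illustrative 'partial renormalization': re-initialize the density with a
    Shepard filter only for particles whose x-coordinate lies within `band`
    of the structure face at x_structure."""
    rho_new = rho.copy()
    near = np.abs(pos[:, 0] - x_structure) < band
    for i in np.where(near)[0]:
        r = np.linalg.norm(pos - pos[i], axis=1)
        w = cubic_spline_kernel(r, h)
        num = np.sum(mass * w)
        den = np.sum((mass / rho) * w)
        rho_new[i] = num / den if den > 0 else rho[i]
    return rho_new
```

Applying the filter everywhere tends to smooth (diffuse) propagating waves, which is why the abstract restricts it to the vicinity of the structure.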

Relevance:

30.00%

Publisher:

Abstract:

Identification and classification of overlapping nodes in networks are important topics in data mining. In this paper, a network-based (graph-based) semi-supervised learning method is proposed. It is based on competition and cooperation among particles walking in a network and uncovers overlapping nodes by generating continuous-valued outputs (soft labels) that correspond to the membership levels of each node in each of the communities. Moreover, the proposed method can be applied to detect overlapping items in a data set of general form, such as a vector-based data set, once it is transformed into a network. Label propagation usually involves a risk of error amplification. To avoid this problem, the proposed method offers a mechanism to identify outliers among the labeled data items and consequently prevents error propagation from such outliers. Computer simulations carried out on synthetic and real-world data sets provide a numerical quantification of the performance of the method. © 2012 Springer-Verlag.
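
To make the idea of soft labels concrete, the sketch below uses plain graph label spreading (not the particle competition-and-cooperation mechanism of the paper) to produce a membership vector per node, plus a crude outlier check for labeled nodes; all function names and parameters are illustrative assumptions.

```python
import numpy as np

def soft_label_spreading(adjacency, labels, n_classes, alpha=0.9, n_iter=50):
    """Simplified graph-based label spreading, used here only to illustrate
    continuous membership outputs; this is NOT the particle competition and
    cooperation method of the paper. `labels` holds a class index for labeled
    nodes and -1 for unlabeled ones."""
    n = adjacency.shape[0]
    d = adjacency.sum(axis=1).astype(float)
    d[d == 0] = 1.0
    s = adjacency / np.sqrt(np.outer(d, d))            # symmetric normalization
    y = np.zeros((n, n_classes))
    y[labels >= 0, labels[labels >= 0]] = 1.0           # clamp known labels
    f = y.copy()
    for _ in range(n_iter):
        f = alpha * s @ f + (1 - alpha) * y
    membership = f / (f.sum(axis=1, keepdims=True) + 1e-12)   # soft labels
    # Flag labeled nodes whose propagated class disagrees with the given label
    # (a crude stand-in for the outlier mechanism described in the abstract).
    outliers = [i for i in np.where(labels >= 0)[0]
                if membership[i].argmax() != labels[i]]
    return membership, outliers
```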

Relevance:

30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

Human dentition is naturally translucent, opalescent and fluorescent. Differences between the fluorescence of tooth structure and that of restorative materials may result in distinct metameric properties and, consequently, perceptibly different esthetic behavior, which impairs the esthetic result of restorations and frustrates both patients and staff. In this study, we evaluated the fluorescence of different composites: Durafill shade A2 (Du), Charisma shade A2 (Ch), Venus shade A2 (Ve), Opallis enamel and dentin shade A2 (OPD and OPE), Point 4 shade A2 (P4), Z100 shade A2 (Z1), Z250 shade A2 (Z2), Te-Econom shade A2 (TE), Tetric Ceram shade A2 (TC), Tetric Ceram N shades A1, A2 and A4 (TN1, TN2, TN4), Four Seasons enamel and dentin shade A2 (4SD and 4SE), Empress Direct enamel and dentin shade A2 (EDE and EDD) and Brilliant shade A2 (Br). Cylindrical specimens were prepared, coded and photographed in a standardized manner with a Canon EOS digital camera (ISO 400, f/2.8 aperture, 1/30 s shutter speed), in a dark environment under UV light (25 W). The images were analyzed with the ScanWhite©-DMC/Darwin Systems software. The results showed statistically significant differences between the groups (p < 0.05), and between these groups and the average fluorescence of the dentition of young (18 to 25 years) and adult (40 to 45 years) subjects taken as controls. It can be concluded that the composites Z100, Z250 (3M ESPE) and Point 4 (Kerr) do not match the fluorescence of human dentition, and that the fluorescence of the materials is affected by their shade.
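
Purely as an illustration of how a relative fluorescence level can be quantified from standardized UV photographs, the sketch below averages the pixel intensity inside a specimen's region of an image; this is an assumed metric for the example and not the ScanWhite©-DMC/Darwin Systems procedure used in the study.

```python
from PIL import Image
import numpy as np

def mean_uv_intensity(image_path, box):
    """Estimate a specimen's relative fluorescence as the mean grayscale pixel
    intensity inside the rectangular region `box` = (left, upper, right, lower)
    of a photograph taken under UV light. Illustrative metric only."""
    img = Image.open(image_path).convert("L")   # grayscale luminance
    region = np.asarray(img.crop(box), dtype=float)
    return region.mean()
```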

Relevance:

30.00%

Publisher:

Abstract:

Monte Carlo simulation methods were used to study the conformational properties of partially ionized polyelectrolyte chains with Debye-Hückel screening in a 1:1 electrolyte solution at room temperature. Configurational properties such as the probability distributions of the square end-to-end distance, the square radius of gyration and the angles between polyion bonds were investigated as functions of the chain ionization and the salt concentration. © 1993.
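
For reference, the screened Coulomb pair potential commonly used in Debye-Hückel-level Monte Carlo simulations of polyelectrolytes has the form below, where z_i and z_j are the valences of the charged monomers, ε_r is the relative permittivity of the solvent, and the screening parameter κ is set by the ionic strength I of the added salt (for a 1:1 electrolyte, I equals the salt concentration); the paper's exact parameterization may differ.

```latex
U_{ij}(r_{ij}) = \frac{z_i z_j e^{2}}{4\pi\varepsilon_0\varepsilon_r}\,
                 \frac{e^{-\kappa r_{ij}}}{r_{ij}},
\qquad
\kappa^{2} = \frac{2 N_A e^{2} I}{\varepsilon_0 \varepsilon_r k_B T},
\qquad
I = \tfrac{1}{2}\sum_{s} c_s z_s^{2},
```

with I and the ionic concentrations c_s expressed in mol·m⁻³.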

Relevance:

30.00%

Publisher:

Abstract:

One of the important issues in establishing a fault tolerant connection in a wavelength division multiplexing optical network is computing a pair of disjoint working and protection paths and a free wavelength along the paths. While most of the earlier research focused only on computing disjoint paths, in this work we consider computing both disjoint paths and a free wavelength along the paths. The concept of dependent cost structure (DCS) of protection paths to enhance their resource sharing ability was proposed in our earlier work. In this work we extend the concept of DCS of protection paths to wavelength continuous networks. We formalize the problem of computing disjoint paths with DCS in wavelength continuous networks and prove that it is NP-complete. We present an iterative heuristic that uses a layered graph model to compute disjoint paths with DCS and identify a free wavelength.
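
As a rough illustration of the layered-graph idea for wavelength-continuous networks, the sketch below builds one copy of the physical topology per wavelength and then applies a simple two-pass heuristic (working path first, link-disjoint protection path second). It does not reproduce the paper's dependent-cost-structure (DCS) link costs or its iterative heuristic; the networkx-based formulation and all names are assumptions made for the example.

```python
import networkx as nx

def layered_graph(topology, n_wavelengths, free):
    """Build a layered graph with one copy of the topology per wavelength.
    `free[(u, v)][w]` is True when wavelength w is free on directed link (u, v)."""
    g = nx.DiGraph()
    for w in range(n_wavelengths):
        for u, v, data in topology.edges(data=True):
            if free[(u, v)][w]:
                g.add_edge((u, w), (v, w), weight=data.get("weight", 1.0))
    return g

def disjoint_pair_with_wavelength(g, src, dst, n_wavelengths):
    """Two-pass heuristic: pick the cheapest working path over all wavelength
    layers, then search the layers again for a protection path that avoids the
    working path's physical links. A joint search (as in the paper's iterative
    heuristic with DCS costs) is generally stronger; this is just a sketch."""
    best = None
    for w in range(n_wavelengths):
        if not (g.has_node((src, w)) and g.has_node((dst, w))):
            continue
        try:
            p = nx.shortest_path(g, (src, w), (dst, w), weight="weight")
        except nx.NetworkXNoPath:
            continue
        cost = nx.path_weight(g, p, weight="weight")
        if best is None or cost < best[0]:
            best = (cost, w, p)
    if best is None:
        return None
    _, w_work, work = best
    used = {(a[0], b[0]) for a, b in zip(work[:-1], work[1:])}
    h = g.copy()
    h.remove_edges_from([(a, b) for a, b in g.edges if (a[0], b[0]) in used])
    for w in range(n_wavelengths):
        if not (h.has_node((src, w)) and h.has_node((dst, w))):
            continue
        try:
            prot = nx.shortest_path(h, (src, w), (dst, w), weight="weight")
            return (w_work, work), (w, prot)
        except nx.NetworkXNoPath:
            continue
    return None
```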

Relevance:

30.00%

Publisher:

Abstract:

Network design problems arise in many engineering and science applications. Several of them are known to be NP-hard, and population-based metaheuristics such as evolutionary algorithms (EAs) have been widely investigated for such problems. These optimization methods simultaneously generate a large number of potential solutions to explore the search space in breadth and, consequently, to avoid local optima. Obtaining a potential solution usually involves the construction and maintenance of several spanning trees or, more generally, spanning forests. To explore the search space efficiently, special data structures have been developed to provide operations that manipulate a set of spanning trees (a population). For a tree with n nodes, the most efficient data structures available in the literature require O(n) time to generate a new spanning tree that modifies an existing one and to store the new solution. We propose a new data structure, called the node-depth-degree representation (NDDR), and we demonstrate that, using this encoding, generating a new spanning forest requires average time O(√n). Experiments with an EA based on the NDDR applied to large-scale instances of the degree-constrained minimum spanning tree problem have shown that the implementation adds only small constants and lower-order terms to the theoretical bound.
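
To give a flavor of this family of encodings, the sketch below shows a plain node-depth array (a preorder list of (node, depth) pairs) and the contiguous-slice property that makes subtree prune-and-graft moves cheap. The published NDDR additionally stores node degrees and uses further indexing to reach the O(√n) average time mentioned above; none of that machinery is reproduced here.

```python
def node_depth_array(adj, root):
    """Encode a spanning tree as a preorder list of (node, depth) pairs --
    the basic node-depth idea behind NDDR-style encodings."""
    out, stack = [], [(root, 0, None)]
    while stack:
        node, depth, parent = stack.pop()
        out.append((node, depth))
        for nxt in reversed(adj[node]):
            if nxt != parent:
                stack.append((nxt, depth + 1, node))
    return out

def subtree_slice(nd, i):
    """Return the contiguous slice of the node-depth array holding the subtree
    rooted at position i: it ends just before the next entry whose depth is
    <= the depth of node i."""
    root_depth = nd[i][1]
    j = i + 1
    while j < len(nd) and nd[j][1] > root_depth:
        j += 1
    return nd[i:j]

# Example: a small tree rooted at 0 with edges 0-1, 0-2, 1-3.
adj = {0: [1, 2], 1: [0, 3], 2: [0], 3: [1]}
nd = node_depth_array(adj, 0)      # [(0, 0), (1, 1), (3, 2), (2, 1)]
sub = subtree_slice(nd, 1)         # [(1, 1), (3, 2)] -> subtree rooted at node 1
```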

Relevance:

30.00%

Publisher:

Abstract:

XML similarity evaluation has become a central issue in the database and information communities, with applications ranging over document clustering, version control, data integration and ranked retrieval. Various algorithms for comparing hierarchically structured data, XML documents in particular, have been proposed in the literature. Most of them make use of techniques for finding the edit distance between tree structures, XML documents being commonly modeled as Ordered Labeled Trees. Yet, a thorough investigation of current approaches led us to identify several similarity aspects, i.e., sub-tree-related structural and semantic similarities, that are not sufficiently addressed when comparing XML documents. In this paper, we provide an integrated and fine-grained comparison framework that deals with both structural and semantic similarities in XML documents (detecting the occurrences and repetitions of structurally and semantically similar sub-trees) and allows the end user to adjust the comparison process according to her requirements. Our framework consists of four main modules for (i) discovering the structural commonalities between sub-trees, (ii) identifying sub-tree semantic resemblances, (iii) computing tree-based edit operation costs, and (iv) computing the tree edit distance. Experimental results demonstrate higher comparison accuracy with respect to alternative methods, while timing experiments reflect the impact of semantic similarity on overall system performance.
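
As a very coarse baseline for the tree-comparison methods discussed above, the sketch below flattens each document into its preorder tag sequence and computes a plain Levenshtein edit distance between the sequences. This is explicitly not a tree edit distance (such as Zhang-Shasha) and carries no semantic matching; it is only meant to illustrate how XML documents are turned into comparable structures.

```python
from xml.etree import ElementTree as ET

def preorder_tags(xml_string):
    """Flatten an XML document into its preorder sequence of element tags
    (a very coarse view of the Ordered Labeled Tree)."""
    root = ET.fromstring(xml_string)
    return [el.tag for el in root.iter()]

def edit_distance(a, b):
    """Classic Levenshtein edit distance between two tag sequences, used here
    only as a crude structural proxy for full tree edit distance."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1, d[i][j - 1] + 1, d[i - 1][j - 1] + cost)
    return d[m][n]
```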

Relevance:

30.00%

Publisher:

Abstract:

A structure-activity study of antioxidant behavior is carried out in this work for ten flavonoid compounds using quantum chemistry calculations with the density functional theory (DFT) method. Based on the geometries obtained with B3LYP/6-31G(d), the HOMO, ionization potential, stabilization energies and spin density distribution showed that flavonol is the most antioxidant nucleus. The spin density contribution is decisive for the stability of the free radical, and the number of resonance structures is related to the π-type electron system. 3-Hydroxyflavone is the basic antioxidant structure among the simplified flavonoids studied here. Electron abstraction is more favored in molecules where the ether group and the 3-hydroxyl are present, whereas the 2,3-double bond and the carbonyl moiety are optional.
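
For context, two descriptors commonly used in DFT studies of flavonoid antioxidants are the ionization potential, associated with single-electron transfer, and the O-H bond dissociation enthalpy, associated with hydrogen-atom abstraction. The definitions below are the standard ones; the paper's exact descriptor set may differ.

```latex
\mathrm{IP}  = E\!\left(\mathrm{ArOH}^{\bullet +}\right) - E\!\left(\mathrm{ArOH}\right),
\qquad
\mathrm{BDE} = E\!\left(\mathrm{ArO}^{\bullet}\right) + E\!\left(\mathrm{H}^{\bullet}\right) - E\!\left(\mathrm{ArOH}\right).
```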

Relevance:

30.00%

Publisher:

Abstract:

Drug discovery has moved toward more rational strategies based on our increasing understanding of the fundamental principles of protein-ligand interactions. Structure-based (SBDD) and ligand-based drug design (LBDD) approaches bring together the most powerful concepts in modern chemistry and biology, linking medicinal chemistry with structural biology. The definition and assessment of both chemical and biological space have revitalized the importance of exploring the intrinsic complementary nature of experimental and computational methods in drug design. Major challenges in this field include the identification of promising hits and the development of high-quality leads for further development into clinical candidates. This becomes particularly important in the case of neglected tropical diseases (NTDs), which disproportionately affect poor people living in rural and remote regions worldwide, and for which an insufficient number of new chemical entities is being evaluated owing to the lack of innovation and R&D investment by the pharmaceutical industry. This perspective paper outlines the utility and applications of SBDD and LBDD approaches for the identification and design of new small-molecule agents for NTDs.

Relevance:

30.00%

Publisher:

Abstract:

Aldolase has emerged as a promising molecular target for the treatment of human African trypanosomiasis. In recent years, owing to the increasing number of patients infected with Trypanosoma brucei, the need for new drugs to treat this neglected disease has become urgent. In the present study, two-dimensional fragment-based quantitative structure-activity relationship (QSAR) models were generated for a series of inhibitors of aldolase. Through the application of leave-one-out and leave-many-out cross-validation procedures, significant correlation coefficients were obtained (r² = 0.98 and q² = 0.77), indicating the internal and external statistical consistency of the models. The best model was employed to predict pKi values for a series of test-set compounds, and the predicted values were in good agreement with the experimental results, showing the predictive power of the model for untested compounds. Moreover, structure-based molecular modeling studies were performed to investigate the binding mode of the inhibitors in the active site of the parasitic target enzyme. The structural and QSAR results provide useful molecular information for the design of new aldolase inhibitors within this structural class.
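
For reference, the cross-validated correlation coefficient reported above is conventionally defined as below, where ŷ_{i/i} is the activity of compound i predicted by a model built without that compound (leave-one-out) or without its group (leave-many-out), and ȳ is the mean observed activity:

```latex
q^{2} = 1 - \frac{\sum_{i}\left(y_i - \hat{y}_{i/i}\right)^{2}}
                 {\sum_{i}\left(y_i - \bar{y}\right)^{2}} .
```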