928 results for computational models


Relevance:

60.00%

Publisher:

Abstract:

Nowadays there is great interest in damage identification using non-destructive tests. Predictive maintenance, one of the most important techniques based on vibration analysis, consists basically of monitoring the condition of structures or machines. A complete procedure should be able to detect the damage, to foresee its probable time of occurrence, and to diagnose the type of fault, so that the maintenance operation can be planned for a convenient form and occasion. In practical problems, it is frequently necessary to solve nonlinear equations. These processes have been studied for a long time because of their great utility. Among the available methods there are different approaches, for instance numerical (classic) methods, intelligent methods (artificial neural networks), evolutionary methods (genetic algorithms), and others. For better agreement, the characterization of damage can be classified by levels. A recent scheme uses seven levels of classification: detect the existence of the damage; detect and locate the damage; detect, locate, and quantify the damage; predict the equipment's working life; self-diagnosis; control for automatic structural repair; and simultaneous control and monitoring. Neural networks are computational models, or systems for information processing, that in a general way can be thought of as black-box devices that accept an input and produce an output. Artificial neural networks (ANN) are based on biological neural networks and have abilities for function identification and pattern classification. In this paper a methodology for locating structural damage is presented. The procedure is divided into two phases: the first uses system norms to localize the damage positions, and the second uses ANN to quantify the severity of the damage. The paper concludes with a numerical application to a beam-like structure with five cases of structural damage at different levels of severity.
The results show the applicability of the presented methodology. A great advantage is the possibility of applying this approach to the identification of simultaneous damages.
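As a toy illustration of the first (localization) phase, the sketch below compares a damaged response vector against a healthy baseline and flags the element with the largest residual. The mode-shape values and the five-element beam are fabricated for illustration, not data from the paper.

```python
# Toy sketch of norm-based damage localization (phase one of the procedure):
# the element whose response deviates most from the healthy baseline is
# flagged. All numbers are fabricated for illustration.

def localize_damage(healthy, damaged):
    """Return the index of the element with the largest response residual."""
    residuals = [abs(d - h) for h, d in zip(healthy, damaged)]
    return max(range(len(residuals)), key=residuals.__getitem__)

# Modal amplitudes for a five-element beam; element 2 is "damaged".
healthy = [0.10, 0.35, 0.62, 0.35, 0.10]
damaged = [0.10, 0.36, 0.75, 0.36, 0.10]

print(localize_damage(healthy, damaged))  # -> 2
```

In the paper's full procedure, the severity of the damage at the flagged position would then be quantified by a trained ANN.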

Relevance:

60.00%

Publisher:

Abstract:

The objective of this paper is to use the SIPOC, flowchart, and IDEF0 modeling techniques in combination to elaborate the conceptual model of a simulation project, and to identify the contribution of these techniques to the elaboration of the computational model. To illustrate the application, a practical case from a high-end technology enterprise is presented. The paper concludes that the proposed approach eases the elaboration of the computational model. © 2008 IEEE.

Relevance:

60.00%

Publisher:

Abstract:

The structure of an ecological community is shaped by several temporally varying mechanisms. Such mechanisms depend to a large extent on species interactions, which are themselves manifestations of the community's own structure: dynamics and structure are thus mutually determined. Assembly models are mathematical or computational models that simulate the dynamics of ecological communities resulting from a historical balance between colonizations and local extinctions, by means of sequential species introductions and their interactions with resident species. They make it possible to analyze that double relationship between structure and dynamics while recognizing its temporal dependence. Two spatiotemporal scales are assumed: (i) a local scale, where species co-occur and have their dynamics explicitly simulated, and (ii) a regional scale without dynamics, representing the external environment from which potential colonizers come. The mathematical and computational models used to simulate the local dynamics are quite variable, being distinguished by how populations are represented and by whether intra- or interspecific differences are included. They determine the community state, in terms of abundances, interactions, and extinctions, between two successive colonization attempts. The schedules of species introductions also follow diverse (although arbitrary) rules, which vary qualitatively with respect to the mode of species appearance, whether by speciation or by immigration, and quantitatively with respect to the rates of introduction into the community. Combining these criteria yields a wide range of approaches to assembly models, each with its own limitations and questions, but contributing in a complementary way to elucidating the mechanisms that structure natural communities.
The objectives of the present review are to present such approaches, still incipient as a research field in Brazil, to describe some methods of analysis, and to discuss the implications of their assumptions for the understanding of ecological patterns.
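A minimal assembly-model loop of the kind described above can be sketched as follows: species from a regional pool are introduced one at a time into a local community whose dynamics relax under Lotka-Volterra competition between introductions. The pool size, competition coefficients, propagule size, and extinction threshold are all illustrative assumptions, not parameters from any particular study.

```python
import random

# Minimal assembly-model sketch: sequential introductions from a regional
# pool into a local community governed by Lotka-Volterra competition;
# species whose abundance falls below a threshold go locally extinct.
random.seed(0)
POOL, DT, STEPS, EXTINCT = 5, 0.01, 3000, 1e-3

# Competition coefficients (self-competition = 1), drawn once at random.
alpha = [[1.0 if i == j else random.uniform(0.2, 0.9) for j in range(POOL)]
         for i in range(POOL)]

community = {}                                   # species id -> abundance
for invader in range(POOL):                      # sequential introductions
    community[invader] = 0.05                    # small founding propagule
    for _ in range(STEPS):                       # relax local dynamics
        new = {}
        for i, n in community.items():
            comp = sum(alpha[i][j] * m for j, m in community.items())
            new[i] = n + DT * n * (1.0 - comp)   # Euler step of LV growth
        community = {i: n for i, n in new.items() if n >= EXTINCT}

print(sorted(community))                         # ids of persisting species
```

The community state inspected between two introduction events (abundances, survivors) is exactly the quantity the review's assembly models track through time.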

Relevance:

60.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

60.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

60.00%

Publisher:

Abstract:

Because the biomechanical behavior of dental implants differs from that of natural teeth, clinical problems may occur. The mechanism of stress distribution and load transfer at the implant/bone interface is a critical issue affecting the success rate of implants. Therefore, the aim of this study was to conduct a brief literature review of the available stress analysis methods for studying implant-supported prosthesis loading and to discuss their contributions to the biomechanical evaluation of oral rehabilitation with implants. Several studies have used experimental, analytical, and computational models, by means of the finite element method (FEM), photoelasticity, strain gauges, and associations of these methods, to evaluate the biomechanical behavior of dental implants. The FEM has been used to evaluate new components, configurations, materials, and shapes of implants. The greatest advantage of the photoelastic method is the ability to visualize the stresses in complex structures, such as oral structures, and to observe the stress patterns in the whole model, allowing the researcher to localize and quantify the stress magnitude. Strain gauges can be used to assess in vivo and in vitro stress in prostheses, implants, and teeth. Some authors combine the strain gauge technique with photoelasticity or FEM. These methodologies can be widely applied in dentistry, mainly in the research field, and can therefore guide further research and clinical studies by predicting some disadvantages and streamlining clinical time.
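To make the finite element idea concrete, here is a deliberately tiny 1-D sketch: two bar elements in series under an axial tip load, with assumed titanium-like placeholder values. Real implant analyses are 3-D and vastly more detailed; this only illustrates the stiffness-assembly step behind the FEM studies cited.

```python
# Tiny 1-D finite element sketch: two bar elements in series under an axial
# load. Material and geometry values are assumed placeholders, not data from
# any of the reviewed studies.

E = 110e9        # Young's modulus [Pa], titanium-like
A = 1e-5         # cross-sectional area [m^2]
L = 0.005        # element length [m]
F = 100.0        # axial load at the free end [N]

k = E * A / L    # axial stiffness of one bar element
# Node 0 is fixed; the assembled 2x2 system for nodes 1 and 2 is
#   [ 2k  -k ] [u1]   [0]
#   [ -k   k ] [u2] = [F]
u1 = F / k                     # solved by hand (back-substitution)
u2 = 2.0 * F / k

stress1 = E * u1 / L           # element stress = E * strain
stress2 = E * (u2 - u1) / L
print(round(stress1 / 1e6, 3), round(stress2 / 1e6, 3))  # -> 10.0 10.0 (MPa)
```

Both elements carry the same axial stress F/A, as statics requires; disagreement here would flag an assembly error, which is the kind of sanity check FEM studies rely on.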

Relevance:

60.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

60.00%

Publisher:

Abstract:

The present work aims to contribute to the study of cold-formed steel profiles under the high temperatures caused by the outbreak of a fire. Specifically, it addresses the heat transfer phenomenon in steel frame/dry wall panels, with or without thermal insulation in the cavity. To this end, computational models are proposed that can provide, with reasonable precision, the temperature at any point of the system under study. It then becomes possible to trace temperature distributions (uniform or non-uniform) over the cross-section of the studs that make up the panel, providing input for stability and post-buckling analyses of the structural elements in question. The numerical heat transfer simulations are carried out with the ABAQUS and SAFIR programs, both based on the finite element method.
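As a much-simplified stand-in for those finite element simulations, a 1-D explicit finite-difference model of transient conduction across a single wall layer conveys the basic idea. The thickness, gypsum-like material properties, and boundary conditions below are assumed placeholders, not the calibrated ABAQUS/SAFIR inputs.

```python
# Illustrative 1-D explicit finite-difference model of transient heat
# conduction across a wall layer. All values are assumed placeholders.

L = 0.10                           # wall thickness [m]
N = 21                             # grid nodes
k, rho, cp = 0.25, 800.0, 1000.0   # conductivity, density, specific heat
alpha = k / (rho * cp)             # thermal diffusivity [m^2/s]
dx = L / (N - 1)
dt = 0.4 * dx * dx / alpha         # step within the explicit stability limit

T = [20.0] * N                     # initial uniform temperature [degC]
T[0] = 800.0                       # fire-exposed face held at fixed temperature

t, t_end = 0.0, 600.0              # simulate ten minutes of exposure
while t < t_end:
    Tn = T[:]
    for i in range(1, N - 1):
        Tn[i] = T[i] + alpha * dt / dx**2 * (T[i+1] - 2.0*T[i] + T[i-1])
    Tn[-1] = Tn[-2]                # adiabatic unexposed face (simplification)
    T = Tn
    t += dt

print(round(T[N // 2], 1))         # mid-thickness temperature [degC]
```

The full models replace the fixed hot-face temperature with a standard fire curve plus convective/radiative boundary conditions, and resolve the whole stud cross-section rather than a single 1-D line.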

Relevance:

60.00%

Publisher:

Abstract:

Within cognitive neuroscience, computational models are designed to provide insights into the organization of behavior while adhering to neural principles. These models should provide sufficient specificity to generate novel predictions while maintaining the generality needed to capture behavior across tasks and/or time scales. This paper presents one such model, the Dynamic Field Theory (DFT) of spatial cognition, showing new simulations that provide a demonstration proof that the theory generalizes across developmental changes in performance in four tasks—the Piagetian A-not-B task, a sandbox version of the A-not-B task, a canonical spatial recall task, and a position discrimination task. Model simulations demonstrate that the DFT can accomplish both specificity—generating novel, testable predictions—and generality—spanning multiple tasks across development with a relatively simple developmental hypothesis. Critically, the DFT achieves generality across tasks and time scales with no modification to its basic structure and with a strong commitment to neural principles. The only change necessary to capture development in the model was an increase in the precision of the tuning of receptive fields as well as an increase in the precision of local excitatory interactions among neurons in the model. These small quantitative changes were sufficient to move the model through a set of quantitative and qualitative behavioral changes that span the age range from 8 months to 6 years and into adulthood. We conclude by considering how the DFT is positioned in the literature, the challenges on the horizon for our framework, and how a dynamic field approach can yield new insights into development from a computational cognitive neuroscience perspective.

Relevance:

60.00%

Publisher:

Abstract:

Sudden cardiac death due to ventricular arrhythmia is one of the leading causes of mortality in the world. In recent decades it has been shown that anti-arrhythmic drugs which prolong the refractory period, by means of prolongation of the cardiac action potential duration (APD), play an important role in preventing relevant human arrhythmias. However, it has long been observed that the “class III antiarrhythmic effect” diminishes at faster heart rates, and this phenomenon represents a major weakness, since fast rates are precisely the situation in which arrhythmias are most prone to occur. It is well known that mathematical modeling is a useful tool for investigating cardiac cell behavior. In the last 60 years a multitude of cardiac models has been created: from the pioneering work of Hodgkin and Huxley (1952), who first described the ionic currents of the squid giant axon quantitatively, mathematical modeling has made great strides. The O’Hara model, which I employed in this research work, is one of the modern computational models of the ventricular myocyte, a generation that began in 1991 with the ventricular cell model by Noble et al. The success of these models lies in their ability to generate novel predictions, suggest experiments, and provide a quantitative understanding of the underlying mechanisms. Obviously, the drawback is that they remain simplified models and do not fully represent the real system. The overall goal of this research is to provide an additional tool, through mathematical modeling, for understanding the behavior of the main ionic currents involved during the action potential (AP), especially underlining the differences between slower and faster heart rates. In particular, the aims are to evaluate the role of rate dependence in the action potential duration, to implement a new method for interpreting the behavior of ionic currents after a perturbation, and to verify the validity of the work proposed by Antonio Zaza, using an injected current as the perturbing effect.
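The rate dependence of the APD can be illustrated with a toy restitution map (not the O’Hara model): each APD is taken as a saturating function of the preceding diastolic interval, so the steady-state APD shortens as the pacing cycle length decreases. All constants are assumed for illustration.

```python
import math

# Toy APD-restitution iteration illustrating APD rate dependence: the next
# action potential duration is a saturating function of the preceding
# diastolic interval (DI). Constants are illustrative assumptions, not
# values from the O'Hara model.

APD_MAX, A, TAU = 300.0, 0.6, 120.0   # ms; assumed restitution constants

def steady_apd(bcl, apd0=250.0, beats=200):
    """Pace at basic cycle length `bcl` (ms); return the steady-state APD."""
    apd = apd0
    for _ in range(beats):
        di = max(bcl - apd, 1.0)              # diastolic interval
        apd = APD_MAX * (1.0 - A * math.exp(-di / TAU))
    return apd

slow = steady_apd(1000.0)   # 60 beats/min
fast = steady_apd(400.0)    # 150 beats/min
print(slow > fast)          # APD shortens at the faster rate
```

A drug that simply scales APD_MAX upward would prolong the APD much more at the slow rate than at the fast one in this map, which is the qualitative shape of the “reverse rate dependence” problem described above.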

Relevance:

60.00%

Publisher:

Abstract:

The thesis explores ways to formalize legal knowledge in the public procurement domain by means of ontological patterns suitable, on the one hand, to support awarding authorities in conducting procurement procedures and, on the other hand, to help citizens and economic operators access procurement notices and data. This investigation into the construction of conceptual models for the public procurement domain, in turn, inspires and motivates a reflection on the role of legal ontologies today, as in the past, retracing the steps of ``ontological legal thinking'' from Roman Law up to now. At the same time, I try to forecast the impact, in terms of benefits, challenges, and critical issues, of applying computational models of Law in future e-Governance scenarios.

Relevance:

60.00%

Publisher:

Abstract:

The aim of this thesis is to investigate the nature of quantum computation and the question of the quantum speed-up over classical computation by comparing two different quantum computational frameworks: the traditional quantum circuit model and the cluster-state quantum computer. After an introductory survey of the theoretical and epistemological questions concerning quantum computation, the first part of the thesis provides a presentation of cluster-state computation suitable for a philosophical audience. In spite of the computational equivalence between the two frameworks, their differences can be considered structural. Entanglement is shown to play a fundamental role in both quantum circuits and cluster-state computers; this supports, from a new perspective, the argument that entanglement can reasonably explain the quantum speed-up over classical computation. However, quantum circuits and cluster-state computers diverge with regard to one of the explanations of quantum computation that actually accords a central role to entanglement, namely the Everett interpretation. It is argued that, while cluster-state quantum computation does not expose an Everettian failure in accounting for the computational processes, it threatens to render that interpretation non-explanatory. The analysis presented here should be integrated into a more general work that also covers further frameworks of quantum computation, e.g. topological quantum computation. What this work reveals, however, is that the speed-up question does not capture all that is at stake: both quantum circuits and cluster-state computers achieve the speed-up, but the challenges they pose go beyond that specific question.
The existence of alternative, equivalent quantum computational models then suggests that the ultimate question should be moved from the speed-up to a sort of “representation theorem” for quantum computation, understood as the general goal of identifying the physical features underlying these alternative frameworks that allow them to be labelled “quantum computation”.
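The entangling step at the heart of both frameworks can be made concrete with a minimal two-qubit state-vector calculation: in the circuit model a Hadamard followed by a CNOT turns |00⟩ into a Bell state, the same kind of entangling operation that prepares a cluster state as a resource. Pure Python, with the basis ordered as |q0 q1⟩.

```python
import math

# Minimal state-vector sketch of the entangling step: H on qubit 0, then
# CNOT (control qubit 0, target qubit 1), applied to |00>. Basis order is
# index = 2*q0 + q1.

def apply(gate, state):
    """Multiply a matrix (list of rows) by a state vector."""
    return [sum(gate[r][c] * state[c] for c in range(len(state)))
            for r in range(len(gate))]

s = [1.0, 0.0, 0.0, 0.0]                      # |00>
h = 1.0 / math.sqrt(2.0)
H0 = [[h, 0, h, 0],                           # Hadamard on qubit 0 (H x I)
      [0, h, 0, h],
      [h, 0, -h, 0],
      [0, h, 0, -h]]
CNOT = [[1, 0, 0, 0],                         # flips qubit 1 when qubit 0 is 1
        [0, 1, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 1, 0]]

bell = apply(CNOT, apply(H0, s))
print([round(a, 3) for a in bell])            # (|00> + |11>)/sqrt(2)
```

The resulting state cannot be written as a product of two single-qubit states, which is the formal sense in which entanglement enters both the circuit and the cluster-state stories discussed above.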

Relevance:

60.00%

Publisher:

Abstract:

To assess the impact of screening programmes in reducing the prevalence of Chlamydia trachomatis, mathematical and computational models are used as a guideline for decision support. Unfortunately, large uncertainties exist about the parameters that determine the transmission dynamics of C. trachomatis. Here, we use a SEIRS (susceptible-exposed-infected-recovered-susceptible) model to critically analyze the turnover of C. trachomatis in a population and the impact of a screening programme. We perform a sensitivity analysis on the most important steps during an infection with C. trachomatis. Varying the fraction of the infections becoming symptomatic as well as the duration of the symptomatic period within the range of previously used parameter estimates has little effect on the transmission dynamics. However, uncertainties in the duration of temporary immunity and the asymptomatic period can result in large differences in the predicted impact of a screening programme. We therefore analyze previously published data on the persistence of asymptomatic C. trachomatis infection in women and estimate the mean duration of the asymptomatic period to be longer than anticipated so far, namely 433 days (95% CI: 420-447 days). Our study shows that a longer duration of the asymptomatic period results in a more pronounced impact of a screening programme. However, due to the slower turnover of the infection, a substantial reduction in prevalence can only be achieved after screening for several years or decades.
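A minimal SEIRS sketch conveys the structure of such models. The per-day rates below are rough illustrative values (with the asymptomatic period set near the 433-day estimate), not the study's calibrated parameters, and screening is modelled simply as an extra per-capita recovery rate acting on the infected class.

```python
# Minimal SEIRS sketch of C. trachomatis transmission dynamics. Rates are
# rough illustrative per-day values, not the study's calibrated parameters;
# screening enters as an extra per-capita recovery rate on the infecteds.

def seirs(beta, sigma, gamma, omega, screen=0.0, days=365 * 20, dt=0.1):
    """Euler-integrate SEIRS fractions; return the final infected prevalence."""
    s, e, i, r = 0.99, 0.0, 0.01, 0.0
    for _ in range(int(days / dt)):
        new_inf = beta * s * i                        # transmission
        ds = omega * r - new_inf                      # waning immunity R -> S
        de = new_inf - sigma * e                      # incubation E -> I
        di = sigma * e - (gamma + screen) * i         # recovery / screening
        dr = (gamma + screen) * i - omega * r
        s, e, i, r = s + dt * ds, e + dt * de, i + dt * di, r + dt * dr
    return i

# ~2-week latency, ~433-day asymptomatic period, ~1-year immunity (assumed).
base = seirs(beta=0.008, sigma=1 / 14, gamma=1 / 433, omega=1 / 365)
screened = seirs(beta=0.008, sigma=1 / 14, gamma=1 / 433, omega=1 / 365,
                 screen=1 / 365)
print(base > screened)   # screening lowers the endemic prevalence
```

Because the asymptomatic period sets how long each case transmits, lengthening gamma's inverse raises the endemic prevalence and hence the leverage of screening, while the slow turnover means the run must cover years to decades before the reduction shows, mirroring the study's conclusion.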