985 results for Analisi limite muratura algoritmo collasso sismica
Abstract:
This dissertation presents a methodology for optimizing a building cold-water distribution system. It is a case study of the Tropical Búzios Residential Condominium, located at Búzios Beach in the city of Nísia Floresta, on the east coast of the state of Rio Grande do Norte, twenty kilometers from Natal. Designing cold-water distribution networks according to standard NBR 5626 of the ABNT (the Brazilian Association of Technical Standards) does not guarantee that the solution found is the least-cost one; an optimization methodology is needed that supplies, among all feasible solutions, the minimum-cost solution. To optimize the building water distribution system of the Tropical Búzios Condominium, the Granados Method is used: an iterative optimization algorithm, based on Dynamic Programming, that yields the minimum-cost network as a function of the piezometric head of the reservoir. To apply this method to branched networks, a computer program written in C is used. The process is divided into two stages: obtaining a previous (initial) solution and reducing the piezometric head at the network head. In the first stage, the smallest diameters that respect the maximum-velocity limit and the minimum-pressure requirements are selected, and the head is raised as needed to guarantee those requirements. In the second stage, an iterative process gradually reduces the head by replacing stretches of the network's pipes with the next larger diameter at a minimum addition to the network cost: at each step the diameter change is made in the stretch with the smallest Exchange Gradient. The process ends when the desired head is reached.
The material costs of the optimized network are then calculated and analyzed by comparison with the costs of the conventional design.
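As an illustration only (the segment data, costs, and head losses below are invented, and this is not the dissertation's C program), the second stage described above can be sketched as: repeatedly enlarge the pipe whose Exchange Gradient, i.e. extra cost per unit of head loss saved, is smallest, until the head target is met.

```python
def exchange_gradient(seg):
    """Cost increase per unit of head loss saved by the next larger diameter."""
    d = seg["choice"]
    if d + 1 >= len(seg["options"]):
        return float("inf")                      # no larger diameter left
    cost_now, loss_now = seg["options"][d]
    cost_up, loss_up = seg["options"][d + 1]
    saved = loss_now - loss_up
    return (cost_up - cost_now) / saved if saved > 0 else float("inf")

def granados_stage2(segments, head_now, head_target):
    """Lower the head from head_now to head_target at minimum added cost."""
    while head_now > head_target:
        seg = min(segments, key=exchange_gradient)
        if exchange_gradient(seg) == float("inf"):
            raise ValueError("head target unreachable with available diameters")
        _, loss_now = seg["options"][seg["choice"]]
        seg["choice"] += 1                       # substitute the next diameter
        _, loss_up = seg["options"][seg["choice"]]
        head_now -= loss_now - loss_up           # required head drops
    return head_now, sum(s["options"][s["choice"]][0] for s in segments)

# (cost, head loss) per candidate diameter, smallest diameter first
segments = [
    {"choice": 0, "options": [(100, 5.0), (140, 3.0), (200, 2.0)]},
    {"choice": 0, "options": [(80, 4.0), (130, 2.5), (210, 1.8)]},
]
head, total_cost = granados_stage2(segments, head_now=20.0, head_target=17.0)
```

The greedy choice of the smallest Exchange Gradient is what makes each head reduction cost the least among all single-diameter substitutions.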
Abstract:
This work presents strategies for improving a successful evolutionary metaheuristic for the Asymmetric Traveling Salesman Problem: a Memetic Algorithm designed specifically for this problem. The improvement applies the intensification techniques known as Path-Relinking and Vocabulary Building; the latter is used in two different ways, in order to evaluate the effect of the improvement on the evolutionary metaheuristic. The methods were implemented in C++ and the experiments were run on instances from the TSPLIB library, showing that the proposed procedures were successful on the tests performed.
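Path-Relinking, one of the techniques cited above, can be sketched as follows: walk from an initiating tour toward a guiding tour one swap at a time, keeping the best tour found along the path. The 4-city asymmetric distance matrix here is invented for the example; the actual algorithm in the work operates on TSPLIB instances.

```python
def tour_cost(tour, dist):
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def path_relinking(start, guide, dist):
    current = list(start)
    best, best_cost = list(current), tour_cost(current, dist)
    for i in range(len(guide)):
        if current[i] != guide[i]:
            j = current.index(guide[i])
            current[i], current[j] = current[j], current[i]  # step toward guide
            c = tour_cost(current, dist)
            if c < best_cost:
                best, best_cost = list(current), c
    return best, best_cost

dist = [[0, 2, 9, 10],
        [1, 0, 6, 4],
        [15, 7, 0, 8],
        [6, 3, 12, 0]]
best, cost = path_relinking([0, 1, 2, 3], [0, 2, 3, 1], dist)
```

Intermediate tours on the path may beat both endpoints, which is why the whole trajectory is evaluated rather than only the guiding solution.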
Abstract:
Individuals with hearing loss often have difficulty understanding speech in noisy environments. OBJECTIVE: The objective of this study was to clinically evaluate the speech-perception performance of adults with sensorineural hearing loss using a digital hearing aid with the noise-reduction algorithm known as Speech Sensitive Processing, activated and deactivated in the presence of noise. MATERIALS AND METHODS: This case study was carried out on 32 individuals with mild, moderate, or mild-to-moderate sensorineural hearing loss. The evaluation used a speech-perception test that measured sentence recognition in the presence of noise, in order to obtain the signal-to-noise ratio with the digital hearing aid. RESULTS: The algorithm provided a benefit for most of the hearing-impaired individuals in the signal-to-noise ratio measurements, and the results showed a statistically significant difference between the conditions with the algorithm activated and deactivated. CONCLUSION: The noise-reduction algorithm should be considered as a clinical alternative, since the system proved effective in reducing noise and improving speech perception.
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
BACKGROUND AND OBJECTIVES: Lesions of the tracheal mucosa in contact with the tracheal tube cuff are proportional to the pressure exerted by the cuff and to the exposure time. The objective was to study possible lesions of the mucosa of the tracheal segment in contact with a tracheal tube cuff inflated with just enough air to obtain a "seal" pressure, or with a pressure limit of 25 cmH2O, below the critical pressure of 30 cmH2O for producing tracheal mucosa injury. METHODS: Sixteen dogs underwent intravenous anesthesia and artificial ventilation. The dogs were randomly allocated to two groups according to the tracheal tube cuff pressure (Portex Blue-Line, England): Gseal (n = 8), cuff at the minimum "seal" pressure preventing air leakage during artificial ventilation; and G25 (n = 8), cuff inflated to a pressure of 25 cmH2O. Cuff pressure was measured with a digital manometer at baseline (control) and after 60, 120, and 180 minutes. After the dogs were sacrificed, biopsies were taken from the areas of tracheal mucosa adjacent to the cuff and to the tracheal tube for scanning electron microscopy (SEM) analysis. RESULTS: Mean cuff pressure remained between 24.8 and 25 cmH2O in G25 and between 11.9 and 12.5 cmH2O in Gseal during the experiment. SEM alterations were minor and not significantly different between the groups (p > 0.30), but in both groups lesions were more intense in the areas of tracheal mucosa in contact with the cuff than in mucosal areas adjacent or not to the tracheal tube (p < 0.05). CONCLUSIONS: In dogs, under the experimental conditions used, inflating the tracheal tube cuff with enough air to reach either a 25 cmH2O pressure limit or the "seal" pressure preventing air leakage causes minimal lesions of the tracheal mucosa in contact with the cuff, with no significant difference between the two.
Abstract:
This work proposes and evaluates a modification of Ant Colony Optimization, based on the results of experiments on the Selective Ride Robot problem (PRS), a new problem also proposed in this work. Four metaheuristics are implemented, GRASP, VNS, and two versions of Ant Colony Optimization, and their results are analyzed by running the algorithms on 32 instances created during this work. The metaheuristics are also compared against an exact approach. The results show that the GRASP implementation performs well, and that the multi-colony ant colony algorithm, proposed and evaluated in this work, obtains the best results.
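For orientation, the basic Ant Colony Optimization loop underlying the algorithms compared above can be sketched as below: ants build tours guided by pheromone and inverse distance, pheromone evaporates, and the best tour found is reinforced. This is the generic ACO template, not the modified or multi-colony versions of the work; the instance and parameters are illustrative.

```python
import random

def aco(dist, n_ants=10, n_iter=50, alpha=1.0, beta=2.0, rho=0.5, seed=1):
    n = len(dist)
    tau = [[1.0] * n for _ in range(n)]          # pheromone trails
    rng = random.Random(seed)
    best, best_cost = None, float("inf")
    for _ in range(n_iter):
        for _ant in range(n_ants):
            tour = [0]
            while len(tour) < n:                  # probabilistic construction
                i = tour[-1]
                cand = [j for j in range(n) if j not in tour]
                w = [tau[i][j] ** alpha * (1.0 / dist[i][j]) ** beta for j in cand]
                tour.append(rng.choices(cand, weights=w)[0])
            cost = sum(dist[tour[k]][tour[(k + 1) % n]] for k in range(n))
            if cost < best_cost:
                best, best_cost = tour, cost
        for i in range(n):                        # evaporation
            for j in range(n):
                tau[i][j] *= 1 - rho
        for k in range(n):                        # reinforce best-so-far tour
            i, j = best[k], best[(k + 1) % n]
            tau[i][j] += 1.0 / best_cost
    return best, best_cost

dist = [[0, 2, 9, 10], [1, 0, 6, 4], [15, 7, 0, 8], [6, 3, 12, 0]]
tour, cost = aco(dist)
```

A multi-colony variant, as studied in the work, would run several such colonies with separate pheromone matrices and exchange their best tours periodically.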
Abstract:
This work addresses the optimization problem in high-dose-rate brachytherapy for the treatment of cancer patients, aiming at defining the set of dwell times. The solution technique adopted was Computational Transgenetics supported by the L-BFGS method. The algorithm developed was used to generate non-dominated solutions whose dose distributions are able to eliminate the cancer while preserving the normal regions.
Abstract:
This work studies two important problems arising from operations in the petroleum and natural gas industries. The first, the pipe dimensioning problem on constrained gas distribution networks, consists in finding the least-cost combination of diameters, from a discrete set of commercially available ones, for the pipes of a given gas network, while respecting minimum pressure requirements at each demand node and upstream pipe conditions. The second, the piston pump unit routing problem, comes from the need to define routes for a piston pump unit visiting a number of non-emergent wells in onshore fields, i.e., wells that do not have enough pressure to make the oil rise to the surface. The periodic version of this problem takes the wells' refilling equation into account to provide a more accurate long-term plan. Besides mathematical formulations of both problems, an exact algorithm and a tabu search were developed for the first problem, and a theoretical bound and a ProtoGene transgenetic algorithm for the second. The main concepts of the metaheuristics are presented along with the details of their application to the cited problems. For both applications the results are promising when compared with theoretical bounds and alternative solutions, both in solution quality and in running time.
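A hedged sketch of how a tabu search over discrete diameter choices, as used above for the pipe dimensioning problem, can look: neighbours change one pipe's diameter index, recently reversed moves are tabu, and head-loss (pressure) violations are penalised. All costs, head losses, and the head budget below are invented for illustration.

```python
from collections import deque

COSTS  = [[100, 140, 200], [80, 130, 210], [60, 95, 150]]   # per pipe, per diameter
LOSSES = [[5.0, 3.0, 2.0], [4.0, 2.5, 1.8], [3.5, 2.2, 1.5]]
HEAD_BUDGET = 8.0                                           # max total head loss

def evaluate(sol):
    cost = sum(COSTS[p][d] for p, d in enumerate(sol))
    loss = sum(LOSSES[p][d] for p, d in enumerate(sol))
    return cost + (1000 if loss > HEAD_BUDGET else 0)       # infeasibility penalty

def tabu_search(sol, n_iter=30, tenure=3):
    best, best_val = list(sol), evaluate(sol)
    tabu = deque(maxlen=tenure)
    for _ in range(n_iter):
        moves = [(p, d) for p in range(len(sol)) for d in range(len(COSTS[p]))
                 if d != sol[p] and (p, d) not in tabu]
        # take the best non-tabu neighbour, even if worse than the current one
        p, d = min(moves, key=lambda m: evaluate(sol[:m[0]] + [m[1]] + sol[m[0]+1:]))
        tabu.append((p, sol[p]))                            # forbid undoing the move
        sol = sol[:p] + [d] + sol[p+1:]
        val = evaluate(sol)
        if val < best_val:
            best, best_val = list(sol), val
    return best, best_val

sol, val = tabu_search([0, 0, 0])
```

Accepting worsening moves while forbidding recent reversals is what lets the search escape the infeasible starting region instead of cycling.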
Abstract:
Web services are computational solutions designed according to the principles of Service-Oriented Computing. Web services can be built upon pre-existing services available on the Internet by using composition languages. We propose a method to generate WS-BPEL processes from abstract specifications provided with high-level control-flow information. The proposed method allows the composition designer to concentrate on high-level specifications, in order to increase productivity and generate specifications that are independent of specific web services. We consider service orchestrations, that is, compositions where a central process coordinates all the operations of the application. The generation of compositions is based on a rule-rewriting algorithm, which has been extended to support basic control-flow information. We created a prototype of the extended refinement method and performed experiments on simple case studies.
Abstract:
Hiker Dice is a game recently proposed in software designed by Mara Kuzmich and Leonardo Goldbarg. In the game, a die builds a trail on an n x m board: as the die rests on a cell, it prints on that cell the face touching the surface. The game gives rise to the Simple Maximum Hiker Dice Hamiltonian Path Problem (HiDi-CHS) on compact boards, characterized by the search for a Hamiltonian path that maximizes the sum of the faces printed on the board. The present research models the problem with graphs and proposes two classes of solution algorithms. The first, exact, class consists of a backtracking algorithm whose pruning is driven by logical rules and by bounding against the best solution found. The second class comprises metaheuristics: Evolutionary Computation, randomized local search, and GRASP (Greedy Randomized Adaptive Search Procedure). Three specific operators were created for the algorithms: restructuring, recombination of two solutions, and a random greedy constructor. The exact algorithm was tested on 4x4 to 8x8 boards; larger cases were out of reach due to the explosion in processing time. The heuristic algorithms were tested on 5x5 to 14x14 boards. According to the evaluation methodology applied, the results achieved by the heuristic algorithms suggest the best performance for the GRASP algorithm.
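The GRASP template named above (greedy randomized construction with a restricted candidate list, followed by local search) can be sketched on a small tour-building toy problem; it is shown here instead of on Hiker Dice boards, and the instance, operators, and parameters are illustrative rather than the thesis's own.

```python
import random

DIST = [[0, 2, 9, 10], [1, 0, 6, 4], [15, 7, 0, 8], [6, 3, 12, 0]]

def cost(tour):
    return sum(DIST[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def construct(rng, alpha=0.7):
    tour = [0]
    while len(tour) < len(DIST):
        i = tour[-1]
        cand = sorted((j for j in range(len(DIST)) if j not in tour),
                      key=lambda j: DIST[i][j])
        rcl = cand[:max(1, int(alpha * len(cand)))]   # restricted candidate list
        tour.append(rng.choice(rcl))                  # randomized greedy pick
    return tour

def local_search(tour):
    improved = True
    while improved:                                   # first-improvement swaps
        improved = False
        for a in range(1, len(tour)):
            for b in range(a + 1, len(tour)):
                new = tour[:a] + [tour[b]] + tour[a+1:b] + [tour[a]] + tour[b+1:]
                if cost(new) < cost(tour):
                    tour, improved = new, True
    return tour

def grasp(n_iter=20, seed=1):
    rng = random.Random(seed)
    best = min((local_search(construct(rng)) for _ in range(n_iter)), key=cost)
    return best, cost(best)

best, val = grasp()
```

Each GRASP iteration is independent, so the randomized construction diversifies the starts while the local search intensifies each one.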
Abstract:
Data clustering is applied in various fields, such as data mining, image processing, and pattern recognition. Clustering algorithms split a data set into clusters such that elements within the same cluster have a high degree of similarity, while elements belonging to different clusters have a high degree of dissimilarity. The Fuzzy C-Means algorithm (FCM) is the fuzzy clustering algorithm most used and discussed in the literature. The performance of FCM is strongly affected by the selection of the initial cluster centers, so the choice of a good set of initial centers is very important for the algorithm. However, in FCM this choice is made randomly, which makes it difficult to find a good set. This work proposes three new methods to obtain initial cluster centers deterministically for the FCM algorithm; they can also be used in FCM variants, and here they were applied to the ckMeans variant. With the proposed methods, we intend to obtain a set of initial centers close to the real cluster centers, and thereby to reduce the number of iterations these algorithms need to converge, and their processing time, without degrading, and in some cases even improving, the quality of the clusters. Accordingly, cluster validation indices were used to measure the quality of the clusters obtained by the modified FCM and ckMeans algorithms with the proposed initialization methods on various data sets.
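As a sketch of the idea, the following implements Fuzzy C-Means with a deterministic farthest-point initialization, illustrating the kind of non-random center seeding the work argues for; the three concrete methods proposed there differ, and the data, fuzzifier m, and tolerance below are invented.

```python
EPS = 1e-12  # guards divisions when a point coincides with a center

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def farthest_point_init(data, c):
    centers = [min(data)]                       # deterministic first center
    while len(centers) < c:                     # then repeatedly take the point
        centers.append(max(data,                # farthest from chosen centers
                           key=lambda p: min(dist2(p, q) for q in centers)))
    return centers

def memberships(p, centers, m):
    # u_i = 1 / sum_j (d_i / d_j)^(1/(m-1)), with d = squared distance
    d = [dist2(p, c) + EPS for c in centers]
    return [1.0 / sum((d[i] / dj) ** (1.0 / (m - 1)) for dj in d)
            for i in range(len(d))]

def fcm(data, c=2, m=2.0, tol=1e-8, max_iter=100):
    centers = farthest_point_init(data, c)
    for _ in range(max_iter):
        u = [memberships(p, centers, m) for p in data]
        new = []
        for i in range(c):                       # membership-weighted means
            w = [u_k[i] ** m for u_k in u]
            new.append(tuple(sum(wk * p[d] for wk, p in zip(w, data)) / sum(w)
                             for d in range(len(data[0]))))
        if max(dist2(a, b) for a, b in zip(centers, new)) < tol:
            return new
        centers = new
    return centers

data = [(0.0, 0.0), (0.1, 0.2), (0.2, 0.1), (5.0, 5.0), (5.1, 4.9), (4.9, 5.2)]
centers = fcm(data)
```

Because the seeding is deterministic, repeated runs converge identically, which is exactly the reproducibility benefit the proposed initializations aim at.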
Abstract:
The Traveling Purchaser Problem is a variant of the Traveling Salesman Problem in which there is a set of markets and a set of products. Each product is available in a subset of the markets, and its unit cost depends on the market where it is available. The objective is to buy all the products, departing from and returning to a domicile, at the least possible cost, defined as the sum of the weights of the edges in the tour plus the amount paid to acquire the products. A Transgenetic Algorithm, an evolutionary algorithm based on endosymbiosis, is applied to the capacitated and uncapacitated versions of this problem. Evolution in Transgenetic Algorithms is simulated through the interaction and information sharing between populations of individuals from distinct species. The computational results show that this is a very effective approach for the TPP regarding both solution quality and runtime: seventeen and nine new best results are presented for instances of the capacitated and uncapacitated versions, respectively.
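The objective described above can be made concrete with a small evaluation function: a solution's cost is its route length plus the cheapest available price of each product among the visited markets. Market 0 as the domicile and all distances and prices below are invented for the example.

```python
DIST = [[0, 4, 6, 3], [4, 0, 2, 5], [6, 2, 0, 4], [3, 5, 4, 0]]
# PRICES[market][product]; None = product not sold at that market
PRICES = [[None, None], [7, None], [5, 9], [None, 6]]

def tpp_cost(route):
    """Cost of visiting `route` (market order), starting/ending at domicile 0."""
    tour = [0] + route + [0]
    travel = sum(DIST[a][b] for a, b in zip(tour, tour[1:]))
    purchase = 0
    for product in range(len(PRICES[0])):
        offers = [PRICES[m][product] for m in tour
                  if PRICES[m][product] is not None]
        if not offers:
            return float("inf")           # infeasible: product unavailable
        purchase += min(offers)           # buy each product where cheapest
    return travel + purchase

best = min(([1, 2], [2, 3], [1, 2, 3], [2], [3]), key=tpp_cost)
```

The trade-off the metaheuristic explores is visible even here: visiting more markets lengthens the tour but unlocks cheaper prices.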
Abstract:
The aim of this study was to assess the P-t(Lim) model in swimming, applying the load control available in the fully tethered swim condition. Its physiological meaning for determining the boundary between the heavy and severe domains was assessed from its relationships with critical velocity (CV), critical power (CP), and maximal lactate steady state (MLSS). The velocity at MLSS (v(MLSS) = 1.17 +/- 0.11 m/s) and CV (1.19 +/- 0.12 m/s) were significantly different. Similarly, the power at MLSS (p(MLSS) = 89.2 +/- 15.1 W) and CP (99.4 +/- 22.9 W) were significantly different. There was no difference between the lactate concentrations at v(MLSS) (3.54 +/- 0.9 mM) and p(MLSS) (3.76 +/- 0.6 mM). Significant Pearson coefficients (r > 0.70) were observed between v(MLSS) and p(MLSS) and their respective values in the time-limited model. Thus, the tethered-crawl condition seems valid for determining the heavy/severe domain boundary and for assessing the aerobic capacity of swimmers.
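The P-t(Lim) relation referenced above is commonly the two-parameter hyperbolic model P = CP + W'/t_lim, which is linear in 1/t_lim, so CP and the work capacity W' follow from a least-squares fit. The exhaustion trials below (power in W, time to exhaustion in s) are invented, not the study's data.

```python
trials = [(140, 120.0), (115, 240.0), (105, 480.0)]  # (P, t_lim), illustrative

xs = [1.0 / t for _, t in trials]          # regressor: 1 / t_lim
ys = [p for p, _ in trials]                # response: power
n = len(trials)
mx, my = sum(xs) / n, sum(ys) / n
w_prime = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
           / sum((x - mx) ** 2 for x in xs))          # slope = W' (J)
cp = my - w_prime * mx                                # intercept = CP (W)
```

The intercept is the power asymptote sustainable "indefinitely" (CP), and the slope is the finite work capacity above CP, which is why CP is compared against p(MLSS) as a heavy/severe boundary estimate.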
Abstract:
The objectives of this work were to develop an isotopic analysis method to quantify carbon from the C3 photosynthetic cycle in commercial orange nectars, and to establish the legal limit, based on Brazilian legislation, for identifying beverages not in compliance with the Ministry of Agriculture, Livestock, and Food Supply (MAPA). Beverages were produced in the laboratory in accordance with Brazilian legislation; adulterated nectars, with orange juice content below the minimum allowed by MAPA, were also produced. In the isotopic analysis, the relative isotopic enrichment of the orange nectars and of their fractions, insoluble solids (pulp) and purified sugar, was measured. From these results, the amount of the C3 source was estimated using the isotopic dilution equation. To determine whether adulteration had occurred, a legal limit had to be established according to Brazilian legislation. Eight commercial brands of orange nectar were analyzed, and all were classified as legal. The legal limit was an important methodological innovation that made it possible to identify the beverages in compliance with Brazilian legislation. The methodology developed proved efficient for quantifying carbon of C3 origin in commercial orange nectars.
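The isotopic dilution equation used in studies of this kind is a two-source mixing balance: the C3 fraction follows from the sample's δ13C relative to pure C3 and C4 end members. The end-member values below are typical literature figures, not this study's measurements.

```python
def c3_fraction(delta_sample, delta_c3=-27.0, delta_c4=-12.0):
    """Fraction of carbon from the C3 cycle (e.g., orange juice sugars),
    by linear two-source mixing of delta-13C signatures (per mil)."""
    return (delta_sample - delta_c4) / (delta_c3 - delta_c4)

frac = c3_fraction(-22.5)   # → 0.7, i.e. 70 % of the carbon is C3-derived
```

A legal limit like the one established in the work then amounts to a threshold on this fraction below which the declared juice content cannot hold.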
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)