952 results for Dynamic simulation


Relevance: 20.00%

Abstract:

This work is divided into two distinct parts. The first part studies the metal-organic framework UiO-66Zr, with the aim of determining the force field that best describes the adsorption equilibrium properties of two gases, methane and carbon dioxide. The second part focuses on the topology of the single-wall carbon nanotube for ethane adsorption; here the aim was to simplify the solid-fluid force field model as much as possible to increase the computational efficiency of the Monte Carlo simulations. Both adsorbents were chosen for their potential use in adsorption processes, such as carbon dioxide capture and storage, natural gas storage, separation of biogas components, and olefin/paraffin separations. The adsorption studies on the two porous materials were performed by molecular simulation using the grand canonical Monte Carlo (μ,V,T) method, over the temperature range 298-343 K and the pressure range 0.06-70 bar. Calibration curves of pressure and density as a function of chemical potential and temperature for the three adsorbates under study were obtained by Monte Carlo simulation in the canonical ensemble (N,V,T); polynomial fitting and interpolation of these data allowed the pressure and gas density to be determined at any chemical potential. The adsorption equilibria of methane and carbon dioxide in UiO-66Zr were simulated and compared with the experimental data obtained by Jasmina H. Cavka et al. The results show that the best force field for both gases is a chargeless united-atom force field based on the TraPPE model. Using this validated force field, it was possible to estimate the isosteric heats of adsorption and the Henry constants.
In the grand canonical Monte Carlo simulations of carbon nanotubes, we conclude that the fastest runs are obtained with a force field that approximates the nanotube as a smooth cylinder; this approximation yields execution times 1.6 times faster than typical atomistic runs.
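The calibration step described above (relating pressure to chemical potential via (N,V,T) data, then fitting and interpolating) can be sketched as follows. The data, the exponential stand-in, and the polynomial degree are illustrative assumptions, not the thesis's actual results; density would be handled analogously.

```python
import numpy as np

# Hypothetical (N,V,T) calibration data for one adsorbate at a fixed temperature:
# chemical potential mu (kJ/mol) vs. simulated pressure (bar).
mu = np.linspace(-40.0, -10.0, 16)                # assumed mu grid
pressure = 70.0 * np.exp(0.25 * (mu - mu.max()))  # stand-in for simulated data

# Fit log-pressure with a low-order polynomial in mu, mirroring the abstract's
# "polynomial fit and interpolation" step; degree 3 is an assumption.
coeffs = np.polyfit(mu, np.log(pressure), deg=3)

def pressure_at(mu_query):
    """Interpolated pressure (bar) at an arbitrary chemical potential."""
    return np.exp(np.polyval(coeffs, mu_query))
```

With such a fit in hand, any chemical potential sampled in a (μ,V,T) run can be mapped back to a bulk-phase pressure without rerunning the canonical simulations.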

Relevance: 20.00%

Abstract:

Spin-lattice relaxation, self-diffusion coefficients, and residual dipolar couplings (RDCs) are the basis of well-established Nuclear Magnetic Resonance (NMR) techniques for the physicochemical study of small molecules (typically organic compounds and natural products with MW < 1000 Da), as they have proved to be a powerful and complementary source of information about structural and dynamic processes in solution. The work developed in this thesis applies these NMR techniques to explore, analyze, and systematize patterns in the molecular dynamic behavior of selected small molecules under particular experimental conditions. Two systems were chosen to investigate molecular dynamic behavior: the dynamics of ion-pair formation and ion interaction in ionic liquids (ILs), and the dynamics of molecular reorientation when molecules are placed in oriented phases (alignment media). NMR spin-lattice relaxation and self-diffusion measurements were applied to study the rotational and translational molecular dynamics of the IL 1-butyl-3-methylimidazolium tetrafluoroborate, [BMIM][BF4]. Cation-anion dynamics in the neat IL and in IL-water mixtures was systematically investigated by combining multinuclear NMR relaxation techniques with diffusion data (using 1H, 13C, and 19F NMR spectroscopy). Spin-lattice relaxation times (T1), self-diffusion coefficients, and nuclear Overhauser effect experiments were combined to determine the conditions that favor the formation of long-lived [BMIM][BF4] ion pairs in water. For this purpose, using the self-diffusion coefficients of cation and anion as a probe, different IL-water compositions were screened (from neat IL to infinite dilution) to find the conditions where cation and anion present equal diffusion coefficients (8% water fraction at 25 °C).
This composition, as well as the neat IL and infinite dilution, was then further studied by 13C NMR relaxation in order to determine correlation times (τc) for the molecular reorientational motion, using an iterative mathematical procedure and experimental data obtained between 273 and 353 K. The self-diffusion and relaxation data obtained in our experiments point to the combination of an 8% water molar fraction and a temperature of 298 K as the most favorable condition for the formation of long-lived ion pairs. When molecules undergo weakly anisotropic motion by being placed in special media, residual dipolar couplings (RDCs) can be measured because of the partial alignment these media induce. RDCs are emerging as a powerful routine tool in conformational analysis, as they complement and can even outperform approaches based on the classical NMR NOE or 3J couplings. In this work, three different alignment media were characterized and evaluated in terms of integrity using 2H and 1H 1D-NMR spectroscopy, namely stretched and compressed PMMA gels and the lyotropic liquid crystals CpCl/n-hexanol/brine and cromolyn/water. The influence that different media and degrees of alignment have on the dynamic properties of several molecules was explored. Sugars of different sizes were used; their self-diffusion coefficients were determined, as well as conformational features using RDCs. The results indicate that neither the diffusion nor the conformational features of the small molecules are affected within the alignment range studied (3, 5, and 6% CpCl/n-hexanol/brine for diffusion; 5 and 7.5% for conformation). The small-molecule diffusion coefficients measured in the alignment media were also close to those observed in water, reinforcing the idea that molecular properties are not conditioned in such media.
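As an illustration of how a rotational correlation time can be extracted from a measured 13C T1, the sketch below inverts the standard heteronuclear dipolar relaxation expression for a CH carbon under rigid isotropic tumbling. The field strength, C-H distance, and the assumption of purely dipolar relaxation are illustrative; this is not the thesis's actual iterative procedure.

```python
import math

# Physical constants and assumed experimental conditions (illustrative only).
GAMMA_H, GAMMA_C = 2.675e8, 6.728e7        # gyromagnetic ratios, rad s^-1 T^-1
HBAR, MU0_4PI, R_CH = 1.0546e-34, 1e-7, 1.09e-10
B0 = 9.4                                    # assumed field, tesla
wH, wC = GAMMA_H * B0, GAMMA_C * B0         # Larmor frequencies, rad/s
d = MU0_4PI * HBAR * GAMMA_H * GAMMA_C / R_CH**3   # dipolar coupling, rad/s

def r1(tau_c, n_protons=1):
    """Dipolar 13C relaxation rate 1/T1 for n attached protons."""
    J = lambda w: tau_c / (1.0 + (w * tau_c) ** 2)  # Lorentzian spectral density
    return (n_protons / 10.0) * d**2 * (J(wH - wC) + 3 * J(wC) + 6 * J(wH + wC))

def tau_c_from_t1(t1, lo=1e-13, hi=3e-10):
    """Bisect for tau_c on the fast-motion branch, where r1 is monotonic."""
    target = 1.0 / t1
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if r1(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For example, a T1 of roughly 0.95 s at these assumed conditions corresponds to a τc of about 50 ps; in practice the thesis's temperature-dependent data would yield one τc per condition.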

Relevance: 20.00%

Abstract:

One of today's biggest concerns is the growth of energy needs, especially in developed countries. Among the various clean energies, wind energy is one of the technologies of greatest importance for the sustainable development of humanity. Although wind turbines have been developed and studied over the years, some phenomena are not yet fully understood. This work studies the soil-structure interaction that occurs in a wind turbine foundation composed of a group of piles subjected to dynamic wind loads. This problem assumes special importance when the foundation is built at locations where safety criteria are very demanding, as in the case of a foundation mounted on a dike. The phenomenon of interaction between two piles and the soil between them is called pile-soil-pile interaction. Such behavior is known to be frequency dependent, and therefore this work includes an evaluation of the frequencies relevant to the intended analysis. Two methods were selected to assess pile-soil-pile interaction, one analytical and one numerical. The analytical solution, recently developed, is called the Generalized pile-soil-pile theory, while for the numerical method the commercial finite element software PLAXIS 3D was used. A study of the applicability of the numerical method is also carried out, comparing the finite element solution with a rigorous solution widely accepted by the majority of authors.

Relevance: 20.00%

Abstract:

Breast cancer is the most frequently diagnosed cancer in women. Scientific knowledge and technology have created many different strategies to treat this pathology. Radiotherapy (RT) is in the current standard guidelines for most breast cancer treatments. However, radiation is a double-edged sword: although it may cure cancer, it may also induce secondary cancer. The contralateral breast (CLB) is an organ susceptible to absorbing dose during the treatment of the other breast, placing it at significant risk of developing a secondary tumor. New radiation techniques, with more complex delivery strategies and promising results, are being implemented and used in radiotherapy departments. However, some questions have to be properly addressed, such as: Is it safe to move to complex techniques to achieve better conformation of the target volumes in breast radiotherapy? What happens to the target volumes and surrounding healthy tissues? How accurate is dose delivery? What are the shortcomings and limitations of currently used treatment planning systems (TPS)?

The answers to these questions largely rely on Monte Carlo (MC) simulations using state-of-the-art computer programs to accurately model the different components of the equipment (target, filters, collimators, etc.) and obtain an adequate description of the radiation fields used, as well as a detailed geometric representation and material composition of the organs and tissues involved. This work investigates the impact of treating left breast cancer using different radiation therapy techniques, f-IMRT (forwardly-planned intensity-modulated RT), inversely-planned IMRT (IMRT2, using 2 beams; IMRT5, using 5 beams), and dynamic conformal arc RT (DCART), and their effects on whole-breast irradiation and on the undesirable irradiation of the surrounding healthy tissues. Two algorithms of the BrainLAB iPlan TPS were used: Pencil Beam Convolution (PBC) and the commercial Monte Carlo (iMC). Furthermore, an accurate MC model of the linear accelerator used (a VARIAN Trilogy) was built with the EGSnrc MC code to accurately determine the doses that reach the CLB. For this purpose it was necessary to model the new High Definition multileaf collimator, which had never before been simulated; the model developed is now included in the EGSnrc MC package of the National Research Council Canada (NRC). The linac model was benchmarked against water measurements and later validated against the TPS calculations. The dose distributions in the planning target volume (PTV) and the doses to the organs at risk (OAR) were compared by analyzing dose-volume histograms; further statistical analysis was performed using IBM SPSS v20. For PBC, all the techniques provided adequate coverage of the PTV. However, statistically significant dose differences were observed between the techniques in the PTV, in the OAR, and in the pattern of dose spread into normal tissues.

IMRT5 and DCART spread low doses into greater volumes of normal tissue (right breast, right lung, heart, and even the left lung) than the tangential techniques (f-IMRT and IMRT2). However, IMRT5 plans improved the PTV dose distributions, exhibiting better conformity and homogeneity in the target and reduced high-dose percentages in the ipsilateral OAR. DCART did not present advantages over any of the techniques investigated. Differences were also found between the calculation algorithms: PBC estimated higher doses for the PTV, ipsilateral lung, and heart than the MC algorithms predicted. The MC algorithms presented similar results (within 2% of each other). The PBC algorithm was considered inaccurate in determining dose in heterogeneous media and in build-up regions; therefore, a major effort is under way at the clinic to acquire data to move from PBC to another calculation algorithm. Despite the better PTV homogeneity and conformity, there is an increased risk of CLB cancer development when using non-tangential techniques. The overall results of the studies performed confirm the outstanding predictive power and accuracy in the assessment and calculation of dose distributions in organs and tissues made possible by the use of MC simulation techniques in RT TPS.
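The dose-volume histogram comparison described above can be illustrated with a minimal cumulative DVH computation: for each dose level D, the fraction of an organ's voxels receiving at least D. The dose values below are synthetic stand-ins; a real comparison would use the TPS or Monte Carlo dose matrices.

```python
import numpy as np

# Synthetic per-voxel organ doses (Gy); a gamma distribution is an arbitrary choice.
rng = np.random.default_rng(0)
organ_dose = rng.gamma(shape=4.0, scale=10.0, size=50_000)

def cumulative_dvh(dose, levels):
    """Fraction of voxels receiving at least each dose level."""
    dose = np.asarray(dose)
    return np.array([(dose >= d).mean() for d in levels])

levels = np.linspace(0.0, organ_dose.max(), 200)
dvh = cumulative_dvh(organ_dose, levels)
```

Plotting `dvh` against `levels` for each technique and organ gives the curves whose differences were tested statistically in the study.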

Relevance: 20.00%

Abstract:

Amyotrophic Lateral Sclerosis (ALS) is a neurodegenerative disease characterized by motor neuron degeneration, which reduces muscular force; it is very difficult to diagnose. Mathematical methods are used to analyze the dynamic behavior of the surface electromyographic signal (Fractal Dimension (FD) and Multiscale Entropy (MSE)), to evaluate the synchronization of different muscle groups (coherence and Phase Locking Factor (PLF)), and to evaluate the signal's complexity (Lempel-Ziv (LZ) techniques and Detrended Fluctuation Analysis (DFA)). Surface electromyographic signals were acquired from upper limb muscles, and the analysis was performed on contraction instants of ipsilateral acquisitions for the patient and control groups. Results from the LZ, DFA, and MSE analyses can distinguish between the patient group and the control group, whereas the coherence, PLF, and FD algorithms give very similar results for both groups. LZ, DFA, and MSE therefore appear to be good measures of corticospinal pathway integrity. A classification algorithm was applied to these results in combination with features extracted from the surface electromyographic signal, achieving an accuracy above 70% in 118 combinations for at least one classifier. The classification results demonstrate the ability to distinguish members of the patient and control groups. These results can be of major importance for diagnosing the disease, since surface electromyography (sEMG) may be used as an auxiliary diagnostic method.
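One of the complexity measures named above, Lempel-Ziv complexity, can be sketched for a binarized signal with the classic exhaustive-parsing count (Kaspar-Schuster formulation). The binarization scheme and this particular LZ variant are illustrative assumptions, not necessarily the exact pipeline used in the thesis.

```python
import random

def lz_complexity(s):
    """Lempel-Ziv (1976) complexity: the number of distinct phrases in the
    exhaustive-history parsing of the symbol sequence s."""
    n = len(s)
    if n < 2:
        return n
    c, l, i, k, k_max = 1, 1, 0, 1, 1
    while True:
        if s[i + k - 1] == s[l + k - 1]:
            k += 1                      # current candidate phrase keeps matching
            if l + k > n:
                c += 1
                break
        else:
            k_max = max(k_max, k)
            i += 1
            if i == l:                  # no earlier match: start a new phrase
                c += 1
                l += k_max
                if l + 1 > n:
                    break
                i, k, k_max = 0, 1, 1
            else:
                k = 1
    return c

random.seed(1)
rand_seq = ''.join(random.choice('01') for _ in range(200))  # noisy reference
```

Low-complexity (regular) sequences yield small counts, while irregular ones approach n/log2(n), which is the contrast exploited when comparing patient and control sEMG recordings.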

Relevance: 20.00%

Abstract:

Existing parking simulations, like most simulations, are intended to provide insight into a system or to make predictions. The knowledge they provide has built up over the years, and several research works have devised detailed parking system models. This thesis describes the use of an agent-based parking simulation in the context of a larger parking system development. It focuses more on flexibility than on fidelity, showing a case where it is relevant for a parking simulation to consume dynamically changing GIS data from external, online sources, and how to address this case. The simulation generates the parking occupancy information that sensing technologies should eventually produce and supplies it to the larger parking system. It is built as a Java application based on the MASON toolkit and consumes GIS data from an ArcGIS Server. The application context of the implemented parking simulation is a university campus with free, on-street parking places.
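The occupancy-generation idea can be sketched with a toy agent loop: drivers arrive at random, take a free place if one exists, and leave after a random stay. All rates and the capacity are invented; the thesis's MASON/ArcGIS model is far richer.

```python
import random

random.seed(42)
CAPACITY, ARRIVALS_PER_STEP, MEAN_STAY = 120, 3, 40   # assumed parameters

occupied = []      # departure times of currently parked cars
history = []       # occupancy level per time step
for t in range(500):
    occupied = [d for d in occupied if d > t]         # cars whose stay ended leave
    for _ in range(random.randint(0, ARRIVALS_PER_STEP * 2)):
        if len(occupied) < CAPACITY:                  # park only if a place is free
            occupied.append(t + max(1, int(random.expovariate(1 / MEAN_STAY))))
    history.append(len(occupied))
```

The `history` series plays the role of the occupancy feed that, in the real system, sensing technologies would eventually supply.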

Relevance: 20.00%

Abstract:

Most of today's systems, especially those related to the Web or to multi-agent systems, are not standalone or independent, but are part of a greater ecosystem in which they need to interact with other entities, react to complex changes in the environment, and act both over their own knowledge base and on the external environment itself. Moreover, these systems are clearly not static: they constantly evolve through self-updates or external actions. Whenever actions and updates are possible, the need emerges to ensure properties regarding the outcome of performing such actions. Originally proposed in the context of databases, transactions solve this problem by guaranteeing atomicity, consistency, isolation, and durability for a special set of actions. However, current transaction solutions fail to guarantee such properties in dynamic environments, since they cannot combine transaction execution with reactive features, or with the execution of actions over domains that the system does not completely control (which makes rolling back a non-viable proposition). In this thesis, we investigate which transaction properties can be ensured over these dynamic environments, and how. To achieve this goal, we provide logic-based solutions, based on Transaction Logic, to precisely model and execute transactions in such environments, where knowledge bases can be defined by arbitrary logic theories.
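The atomicity property discussed above can be illustrated with a minimal snapshot-and-rollback transaction over a knowledge base held as a dictionary. This toy deliberately shows the limitation the thesis targets: rollback only works because the dict is fully under the system's control, which is exactly what fails for external, non-undoable actions.

```python
from contextlib import contextmanager

@contextmanager
def transaction(kb):
    """Run a block of updates on dict kb atomically: commit on success,
    restore the pre-transaction snapshot on any exception."""
    snapshot = dict(kb)
    try:
        yield kb
    except Exception:
        kb.clear()
        kb.update(snapshot)   # roll back to the state before the transaction
        raise

kb = {'a': 1}
with transaction(kb) as t:
    t['b'] = 2                # succeeds, so the update is kept
```

A failing block leaves `kb` untouched, which is the all-or-nothing guarantee that Transaction Logic generalizes to settings where such state restoration is not available.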

Relevance: 20.00%

Abstract:

This proposal explores the use of available technologies for the video representation of sets and performers, to support composition processes and artistic rehearsals, focusing on representing the performer's body and movements and their relation to objects belonging to the three-dimensional space of the performance. The project's main goal is to design and develop a system that can spatially represent the performer and their movements, through capture and reconstruction processes using a camera device, and that can enhance the three-dimensional space where the performance occurs by allowing interaction with virtual objects and by adding a video component, either for documentary purposes or for live performance effects (for example, applying video mapping techniques to captured video or to projections during a performance).

Relevance: 20.00%

Abstract:

INTRODUCTION: Sylvatic yellow fever (SYF) is enzootic in Brazil, causing periodic outbreaks in humans living near forest borders or in rural areas. In this study, the cycling patterns of this arboviral disease were analyzed. METHODS: Spectral Fourier analysis was used to capture the periodicity patterns of SYF time series. RESULTS: SYF outbreaks have not increased in frequency, only in the number of cases. There are two dominant cycles in SYF outbreaks: a seven-year cycle for the central-western region and a 14-year cycle for the northern region. Most of the variance was concentrated in the central-western region and dominated the entire endemic region. CONCLUSIONS: The seven-year cycle is predominant in the endemic region of the disease due to the greater contribution of variance from the central-western region; however, it was possible to identify a 14-year cycle that governs SYF outbreaks in the northern region. No periodicities were identified for the remaining geographical regions.
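The spectral approach used above can be sketched by locating the dominant cycle length in a yearly case-count series via the periodogram. The series below is synthetic, with a built-in seven-year cycle standing in for the real surveillance data.

```python
import numpy as np

# Synthetic yearly series: a mean level plus a seven-year cycle (toy data).
years = np.arange(56)
series = 100 + 30 * np.sin(2 * np.pi * years / 7)

# Periodogram: power of each Fourier frequency after removing the mean.
power = np.abs(np.fft.rfft(series - series.mean())) ** 2
k = int(np.argmax(power[1:]) + 1)        # skip the zero-frequency (DC) bin
dominant_period = len(series) / k        # cycle length in years
```

Applied per region, the location of the strongest periodogram peak gives the seven-year and 14-year cycles reported, and the peak heights indicate each region's contribution to the variance.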

Relevance: 20.00%

Abstract:

This work tests different delta-hedging strategies for two products issued by Banco de Investimento Global in 2012. It studies the behaviour of the delta and gamma of autocallables and their impact on results when delta hedging with different rebalancing periods. Given their discontinuous payoff and path dependency, it is suggested that the hedging portfolio be rebalanced daily to better follow market changes. Moreover, a mixed strategy is analysed in which time to maturity is used as a criterion for changing the rebalancing frequency.
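The effect of the rebalancing period can be illustrated on a much simpler product than an autocallable: a vanilla call under Black-Scholes dynamics, comparing the dispersion of delta-hedging P&L for coarse versus daily rebalancing. All market parameters below are invented, and the option premium (a constant) is omitted since it does not affect the dispersion.

```python
import math
import numpy as np

S0, K, T, r, sigma = 100.0, 100.0, 1.0, 0.02, 0.25   # assumed market parameters

def bs_delta(s, k, tau, r, sigma):
    """Black-Scholes delta of a European call (indicator payoff at expiry)."""
    if tau <= 0:
        return 1.0 if s > k else 0.0
    d1 = (math.log(s / k) + (r + 0.5 * sigma**2) * tau) / (sigma * math.sqrt(tau))
    return 0.5 * (1.0 + math.erf(d1 / math.sqrt(2.0)))

def hedge_pnl_std(steps, n_paths=2000, seed=0):
    """Std. dev. of terminal hedge-minus-payoff value over simulated paths."""
    rng = np.random.default_rng(seed)
    dt = T / steps
    s = np.full(n_paths, S0)
    delta = np.full(n_paths, bs_delta(S0, K, T, r, sigma))
    cash = -delta * s                      # borrow to buy the initial hedge
    for j in range(1, steps + 1):
        z = rng.standard_normal(n_paths)
        s = s * np.exp((r - 0.5 * sigma**2) * dt + sigma * math.sqrt(dt) * z)
        cash *= math.exp(r * dt)           # cash account accrues interest
        tau = T - j * dt
        new_delta = np.array([bs_delta(x, K, tau, r, sigma) for x in s])
        cash -= (new_delta - delta) * s    # rebalance at the new price
        delta = new_delta
    pnl = delta * s + cash - np.maximum(s - K, 0.0)
    return pnl.std()
```

With the same seed, `hedge_pnl_std(252)` (daily) comes out well below `hedge_pnl_std(50)` (roughly weekly), which is the qualitative argument for daily rebalancing of path-dependent payoffs.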

Relevance: 20.00%

Abstract:

Nestlé's Dynamic Forecasting Process: Anticipating Risks and Opportunities

This Work Project discusses Nestlé's Dynamic Forecasting Process, implemented within the organization as a way of reengineering its performance management concept and processes, making them more flexible and capable of reacting to volatile business conditions. In stressing the importance of demand planning for reallocating resources and enhancing performance, Nescafé Dolce Gusto serves as the case for improving forecast accuracy. The Project's value lies in providing a more accurate model of capsule sales and in recommending implementations that contribute positively to the Dynamic Forecasting Process.

Relevance: 20.00%

Abstract:

Circulating tumor cells (CTCs) may induce metastases when they detach from the primary tumor. The number of these cells in blood offers a valuable prognostic indication. Magnetoresistive sensing is an attractive option for CTC counting. In this technique, cells are labeled with nanocomposite polymer beads that provide the magnetic signal. Bead properties such as size and magnetic content must be optimized for use as a detection tool in a magnetoresistive platform. Another important component of the platform is the magnet required for proper sensing. Both components are addressed in this work. Nanocomposite polymer beads were produced by nano-emulsion and membrane emulsification. Formulations of the oil phase comprising a mixture of aromatic monomers and iron oxide were employed, and the effect of emulsifier (surfactant) concentration on bead size was studied. Formulations of polydimethylsiloxane (PDMS) with different viscosities were also prepared with the nano-emulsion method, resulting in colloidal beads, and polycaprolactone (PCL) beads were synthesized by the membrane emulsification method. The beads were characterized by different techniques, such as dynamic light scattering (DLS), thermogravimetric analysis (TGA), and scanning electron microscopy (SEM). Additionally, the dimensions of the magnet of the platform designed to detect CTCs were optimized through a COMSOL Multiphysics simulation.

Relevance: 20.00%

Abstract:

This study aims to replicate Apple's stock market movement by modeling major investment profiles and investors. The model recreates a live exchange to test for predictability in stock price variation, given knowledge of how investors act when making investment decisions. This methodology is particularly relevant if risk-adjusted profits can be made just by observing historical prices and knowing the tendencies in other players' behavior. Empirical academic research shows that abnormal returns are hardly consistent without a clear idea of who is in the market at a given moment and of the corresponding market shares. Therefore, even when investors' individual investment profiles are known, it is not clear how they affect aggregate markets.
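A toy version of the modeling idea (price formation from heterogeneous investor profiles) can be sketched as follows: fundamentalists pull the log-price toward a fundamental value, chartists extrapolate the last move, and noise traders add randomness. The profile weights, noise scale, and fundamental level are all invented stand-ins for the study's calibrated market shares.

```python
import numpy as np

rng = np.random.default_rng(7)
fundamental = 4.6                          # assumed log fundamental value
p = [4.6, 4.6]                             # log-price history (needs two lags)
w_fund, w_chart, noise = 0.10, 0.05, 0.01  # assumed profile weights / noise scale

for _ in range(250):
    demand = (w_fund * (fundamental - p[-1])      # fundamentalists: mean revert
              + w_chart * (p[-1] - p[-2])         # chartists: trend follow
              + noise * rng.standard_normal())    # noise traders
    p.append(p[-1] + demand)                      # excess demand moves the price

prices = np.exp(p)
```

Varying the weights changes the simulated dynamics, which mirrors the study's point that aggregate behavior depends on who is in the market and in what proportion.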

Relevance: 20.00%

Abstract:

This research work was structured around four major chapters (four broad thematic lines), all of them extensively developed so as to map some of the main territories and symptoms of contemporary art. Each of them rests on the principles of a malleable structure that is, for all intents and purposes, a work in progress, in this case thanks to the plasticity of the body, of space, of the image, and of the creative use of digital technologies, through which everything around us today seems to be produced, transformed, and disseminated (almost as if it were a genuine interactive journey). From here on, the effort that follows attempts a working hypothesis (an investigation) that may allow us to clear some paths toward the endless tunnels of the future, always hoping to give form, function, and meaning to an irrepressible desire for creative freedom, for contemporary art has the extraordinary capacity to transport us to many other places in the world, as real and imaginary as our own lives. It is therefore worth summarizing the main stages of this investigation.

In a first moment, we begin by reflecting on the broad concept of "crisis" (the crisis of modernity), so that we can then address the crisis of the old aesthetic categories, questioning the concepts of the "beautiful" (Plato), of "taste" (Kant), and of "form" (Focillon), not only to try to understand some of the main reasons behind the so-called "end of art" (Hegel), but also some of those that led to the generalized aestheticization of contemporary experience and its dissemination across the most varied digital platforms. In a second moment, we reflect on some of the main problems in the unsettling history of images, namely to try to understand how the technical transformations linked to the appearance of photography, cinema, video, the computer, and the internet contributed to establishing and extending what we would all come to know as the new "age of the image", or the image in "the age of its technical reproducibility" (Benjamin), for only then can we interrogate this unstoppable process of movement, fragmentation, dissemination, simulation, and interaction of the most varied "forms of life" (Nietzsche, Agamben).

Arriving at the third moment, we are interested in perceiving contemporary art as a kind of interactive platform that leads us to examine some of the main metaphorical and experimental devices of the journey, here understood as a line facilitating access to art, culture, and contemporary life in general; a process of reflection that prompts us to map some of the most attractive symptoms of the aesthetics of the flâneur (in the perspective of Rimbaud, Baudelaire, Long, and Benjamin) and, consequently, to summon some of the main sensations arising from the highly seductive experience of those who live immersed in the interactive orbit of cyberspace (as cyberflâneurs), almost as if the entire world were now simply a poetic space that is "fully navigable" (Manovich). Finally, in the fourth and last moment, we undertake a deep reflection on the unsettling history of the body, chiefly to reinforce the idea that despite its many biological fragilities (a being that sickens and dies), the body remains one of the "most persistent categories of all Western culture" (Ieda Tucherman), not only because it resisted all the transformations historically imposed on it, but also because it patiently reinvented and readapted itself in the face of all those historical transformations. This is a clear sign that its plasticity would grant it, especially from the twentieth century onward ("the century of the body"), a truly special theoretical and performative status.

So special, in fact, that even a brief notion of its unsettling history is enough to grasp immediately the extraordinary importance of some of its most varied transformations, attractions, connections, and exhibitions over recent decades, namely under the creative effect of digital technologies (within which some of the most interesting operations of cultural and artistic energization of our time take place). In short, we sincerely hope that this research work may contribute to widening the increasingly uncertain, dynamic, and interactive frontiers of knowledge of what seems to constitute, today, the fundamental game of our contemporaneity.

Relevance: 20.00%

Abstract:

The use of rebars in construction is the most common method for reinforcing plain concrete, bridging the tensile stresses across concrete crack surfaces. Design codes usually model the bond behaviour of rebars and concrete with a local bond stress-slip relationship that comprises distinct reinforcement mechanisms, such as adhesion, friction, and mechanical anchorage. In this work, numerical simulations of pullout tests were performed within the finite element method framework. The interaction between rebar and concrete was modelled using cohesive elements. Distinct local bond laws were used and compared with the ones proposed by Model Code 2010. Finally, an attempt was made to model the geometry of the rebar ribs in conjunction with a concrete damaged plasticity material model.
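The kind of local bond stress-slip law mentioned above can be sketched as a piecewise function: a power-law ascending branch, a plateau, linear softening, and a residual friction stress. The parameter values below are the ones commonly quoted for the Model Code 2010 pull-out case with good bond conditions, but they should be treated as assumptions and verified against the code text.

```python
import math

def bond_stress(s, fcm=30.0, s1=1.0, s2=2.0, s3=8.0, alpha=0.4):
    """Local bond stress tau (MPa) at slip s (mm) for a ribbed bar.
    fcm: mean concrete compressive strength (MPa); s3 ~ clear rib spacing (mm).
    Assumed: tau_max = 2.5*sqrt(fcm), residual tau_f = 0.4*tau_max."""
    tau_max = 2.5 * math.sqrt(fcm)
    tau_f = 0.4 * tau_max
    if s <= s1:
        return tau_max * (s / s1) ** alpha                          # ascending
    if s <= s2:
        return tau_max                                              # plateau
    if s <= s3:
        return tau_max - (tau_max - tau_f) * (s - s2) / (s3 - s2)   # softening
    return tau_f                                                    # residual
```

In a cohesive-element pullout model, a traction-separation law of this shape is assigned to the steel-concrete interface and compared against alternative local bond laws.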