973 results for Generalized Net


Relevance:

20.00%

Publisher:

Abstract:

This research work was initially structured around four major chapters (four broad thematic lines of orientation), all of them extensively developed so as to map some of the main territories and symptoms of contemporary art. Each of them rests on the principles of a malleable structure that is, for all intents and purposes, a work in progress, in this case thanks to the plasticity of the body, of space, of the image and of the creative use of digital technologies, within which, indeed, everything around us seems to be produced, transformed and disseminated nowadays (almost as if it were a genuine interactive journey). From here onwards, therefore, the effort that follows will attempt to test a working hypothesis (to develop an investigation) that may allow us to open up some paths towards the endless tunnels of the future, always in the expectation of giving form, function and meaning to an irrepressible desire for creative freedom, since contemporary art has the extraordinary capacity to transport us to many other places in the world, as real and imaginary as our own life.

It is therefore worth summarising some of the main stages to be developed throughout this investigation. In a first moment, we begin by reflecting on the broad concept of "crisis" (the crisis of modernity), so that we may then address the crisis of the old aesthetic categories, questioning the concepts of the "beautiful" (Plato), of "taste" (Kant) and of "form" (Focillon), not only in order to understand some of the main reasons behind the so-called "end of art" (Hegel), but also those that led to the generalised aestheticisation of contemporary experience and to its dissemination across the most varied digital platforms. In a second moment, we reflect on some of the main problems of the unsettling history of images, namely in order to understand how the technical transformations linked to the appearance of photography, cinema, video, the computer and the internet contributed to establishing and broadening what we all came to know as the new "age of the image", or the image in the "age of its own technical reproducibility" (Benjamin), since only then will we be able to interrogate this unstoppable process of movement, fragmentation, dissemination, simulation and interaction of the most varied "forms of life" (Nietzsche, Agamben).

In the third major moment, we are interested in perceiving contemporary art as a kind of interactive platform which, in turn, leads us to question some of the main metaphorical and experimental devices of the journey, in this case the journey as a line facilitating access to art, to culture and to contemporary life in general; that is, a process of reflection that prompts us to map some of the most attractive symptoms arising from the aesthetics of the flâneur (in the perspective of Rimbaud, Baudelaire, Long and Benjamin) and, consequently, to summon some of the main sensations arising from the highly seductive experience of those who live immersed in the interactive orbit of cyberspace (as cyberflâneurs), almost as if the whole world were now merely a poetic space that is "entirely navigable" (Manovich).

Finally, in the fourth and last moment, we undertake a thorough reflection on the unsettling history of the body, mainly with the aim of reinforcing the idea that, despite its countless biological fragilities (a being that falls ill and dies), the body remains one of the "most persistent categories of all Western culture" (Ieda Tucherman), not only because it has resisted all the transformations historically imposed upon it, but also because it has patiently reinvented and readapted itself in the face of those historical transformations. This is a clear sign that its plasticity would grant it, particularly from the twentieth century onwards ("the century of the body"), a truly special theoretical and performative status. So special, indeed, that even a brief notion of its unsettling history is enough to make us immediately perceive the extraordinary importance of some of its most varied transformations, attractions, connections and exhibitions over the last decades, namely under the creative effect of digital technologies (within which some of the most interesting operations of cultural and artistic dynamisation of our time take place). In short, we sincerely hope that this research work may contribute to broadening the increasingly uncertain, dynamic and interactive frontiers of knowledge of what seems to constitute, today, the fundamental game of our contemporaneity.

Relevance:

20.00%

Publisher:

Abstract:

This paper aims at developing a collision prediction model for three-leg junctions located on national roads (NR) in Northern Portugal. The focus is to identify factors that contribute to collision-type crashes at those locations, mainly factors related to road geometric consistency, since the literature is scarce on those, and to research the impact of three modeling methods (generalized estimating equations, random-effects negative binomial models and random-parameters negative binomial models) on the factors of those models. The database used included data published between 2008 and 2010 for 177 three-leg junctions. It was split into three groups of contributing factors, which were tested sequentially for each of the adopted models: at first only traffic; then traffic and the geometric characteristics of the junctions within their area of influence; and, lastly, factors expressing the difference between the geometric characteristics of the segments bordering the junctions' area of influence and the segment included in that area. The choice of the best modeling technique was supported by a cross-validation carried out to ascertain the best model for the three sets of researched contributing factors. The models fitted with the random-parameters negative binomial technique performed best in this process. In the best models obtained for every modeling technique, the characteristics of the road environment, including proxy measures for geometric consistency, along with traffic volume, contribute significantly to the number of collisions. Both the variables concerning the junctions and the national road segments in their area of influence, and the variations of those characteristics relative to the roadway segments bordering that area of influence, have proven their relevance; there is therefore a clear need to incorporate the effect of geometric consistency in safety studies of three-leg junctions.
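
As a hedged illustration of the simplest of the three count-data techniques compared above, the sketch below fits a plain negative binomial crash-frequency model to synthetic junction data with statsmodels. The variable names (aadt_major, aadt_minor, speed_diff) and all numbers are hypothetical, and the generalized estimating equations, random-effects and random-parameters variants used in the study are not reproduced here.

# Illustrative sketch (not the paper's data or final specification): fitting a
# negative binomial crash-frequency model with statsmodels on synthetic data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 177  # number of junctions, as in the abstract; all values below are synthetic
df = pd.DataFrame({
    "aadt_major": rng.uniform(2_000, 20_000, n),   # hypothetical traffic volume, major road
    "aadt_minor": rng.uniform(200, 3_000, n),      # hypothetical traffic volume, minor road
    "speed_diff": rng.normal(0.0, 10.0, n),        # hypothetical geometric-consistency proxy
})
# Synthetic crash counts generated with a log link, just to make the example runnable.
mu = np.exp(-7.0 + 0.8 * np.log(df.aadt_major) + 0.3 * np.log(df.aadt_minor)
            + 0.02 * df.speed_diff)
df["crashes"] = rng.poisson(mu)

# Negative binomial regression, the simplest relative of the techniques compared above.
nb = smf.glm("crashes ~ np.log(aadt_major) + np.log(aadt_minor) + speed_diff",
             data=df, family=sm.families.NegativeBinomial(alpha=1.0)).fit()
print(nb.summary())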

Relevance:

20.00%

Publisher:

Abstract:

Master's teaching practice report in Teaching English and Spanish in the 3rd Cycle of Basic Education and in Secondary Education

Relevance:

20.00%

Publisher:

Abstract:

Understanding the behavior of complex composite materials in mixing procedures is fundamental in several industrial processes. For instance, polymer composites are usually manufactured by dispersing fillers in polymer melt matrices. The success of the filler dispersion depends both on the complex flow patterns generated and on the polymer melt rheological behavior. Consequently, the availability of a numerical tool that allows modeling both the fluid and the particles would be very useful to increase the process insight. Nowadays, there are computational tools that allow modeling the behavior of filled systems, taking into account both the behavior of the fluid (Computational Rheology) and of the particles (Discrete Element Method). One example is the DPMFoam solver of the OpenFOAM® framework, where the averaged volume-fraction momentum and mass conservation equations are used to describe the fluid (continuous phase) rheology, and Newton's second law of motion is used to compute the particles' (discrete phase) movement. In this work, the referred solver is extended to take into account the elasticity of the polymer melts in the continuous phase. The solver capabilities are illustrated by studying the effect of the fluid rheology on the filler dispersion, considering different fluid types (generalized Newtonian or viscoelastic) and different particle volume fractions and sizes. The results obtained are used to evaluate the relevance of considering the fluid's complex rheology for the prediction of the composite morphology.
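
To make the fluid-particle coupling described above concrete, the following minimal sketch advances a set of discrete particles with Newton's second law under a Stokes-drag force toward the local fluid velocity, in the spirit of CFD-DEM solvers such as DPMFoam. It is an illustration under simplifying assumptions (Stokes drag, no gravity, a prescribed shear flow), not the OpenFOAM® implementation.

# Minimal sketch of the Lagrangian (discrete-phase) step used in CFD-DEM style solvers:
# each particle is advanced with Newton's second law under a drag force from the local
# fluid velocity. All parameters are illustrative assumptions.
import numpy as np

def advance_particles(x, v, u_fluid, dt, d_p=1e-4, rho_p=2500.0, mu=1.0):
    """One explicit step for particle positions x and velocities v, both of shape (N, 3).

    u_fluid(x) -> (N, 3): fluid velocity interpolated at the particle positions.
    """
    m_p = rho_p * np.pi * d_p**3 / 6.0                    # particle mass
    f_drag = 3.0 * np.pi * mu * d_p * (u_fluid(x) - v)    # Stokes drag (low-Re assumption)
    a = f_drag / m_p                                      # Newton's second law (gravity omitted)
    v_new = v + a * dt
    x_new = x + v_new * dt
    return x_new, v_new

# Usage with a prescribed simple shear flow u = (gamma_dot * y, 0, 0).
rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1e-2, size=(100, 3))
v = np.zeros_like(x)
shear = lambda pos: np.column_stack([10.0 * pos[:, 1],
                                     np.zeros(len(pos)), np.zeros(len(pos))])
tau_p = 2500.0 * (1e-4) ** 2 / (18.0 * 1.0)               # particle relaxation time
dt = 0.2 * tau_p                                          # explicit step must resolve tau_p
for _ in range(2000):
    x, v = advance_particles(x, v, shear, dt)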

Relevance:

20.00%

Publisher:

Abstract:

Understanding the mixing process of complex composite materials is fundamental in several industrial processes. For instance, the dispersion of fillers in polymer melt matrices is commonly employed to manufacture polymer composites, using a twin-screw extruder. The effectiveness of the filler dispersion depends not only on the complex flow patterns generated, but also on the polymer melt rheological behavior. Therefore, the availability of a numerical tool able to predict mixing, taking into account both the fluid and particle phases, would be very useful to increase the process insight and thus provide useful guidelines for its optimization. In this work, a new Eulerian-Lagrangian numerical solver is developed within the OpenFOAM® computational library and used to better understand the mechanisms determining the dispersion of fillers in polymer matrices. Particular attention is given to the effect of the rheological model used to represent the fluid behavior on the level of dispersion obtained. For the Eulerian phase, the averaged volume-fraction governing equations (conservation of mass and linear momentum) are used to describe the fluid behavior. For the Lagrangian phase, Newton's second law of motion is used to compute the particles' trajectories and velocities. To study the effect of the fluid behavior on the filler dispersion, several systems are modeled considering different fluid types (generalized Newtonian or viscoelastic) and different particle volume fractions and sizes. The results obtained are used to correlate the fluid and particle characteristics with the effectiveness of mixing and the morphology obtained.
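
As a small, hedged sketch of what a "generalized Newtonian" description of the Eulerian phase can look like, the function below evaluates a Bird-Carreau shear-thinning viscosity; the model choice and parameter values are illustrative assumptions, not the melt characterized in this work.

# Hedged sketch of a generalized Newtonian constitutive choice: a Bird-Carreau
# shear-thinning viscosity eta(gamma_dot). Parameters are illustrative only.
import numpy as np

def carreau_viscosity(gamma_dot, eta0=1.0e4, eta_inf=10.0, lam=1.0, n=0.4):
    """Bird-Carreau viscosity [Pa.s] as a function of shear rate [1/s]."""
    return eta_inf + (eta0 - eta_inf) * (1.0 + (lam * gamma_dot) ** 2) ** ((n - 1.0) / 2.0)

shear_rates = np.logspace(-2, 3, 6)
print(np.round(carreau_viscosity(shear_rates), 2))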

Relevance:

20.00%

Publisher:

Abstract:

In several industrial applications, materials with highly complex behaviour are processed through intricate mixing operations, which makes it difficult to achieve the desired properties in the produced materials. This is the case of the well-known dispersion of nano-sized fillers in a polymer melt matrix, used to improve the nanocomposite's mechanical and/or electrical properties. This mixing is usually performed in twin-screw extruders, which promote complex flow patterns, and, since an in situ analysis of the material evolution and mixing is difficult to perform, numerical tools can be very useful to predict the evolution and behaviour of the material. This work presents a numerically based study to improve the understanding of mixing processes. Initial numerical studies were performed with generalized Newtonian fluids, but, due to the null relaxation time that characterizes this type of fluid, the assumption of viscoelastic behavior was required. Therefore, the polymer melt was rheologically characterized, and six-mode Phan-Thien-Tanner and Giesekus models were used to fit the rheological data. These viscoelastic rheological models were then used to model the process. The conclusions obtained in this work provide additional and useful data to correlate the type and intensity of the deformation history imposed on the polymer nanocomposite with the quality of the mixing obtained.
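
A multi-mode viscoelastic fit of the kind mentioned above usually starts from a discrete relaxation spectrum fitted to small-amplitude oscillatory shear data. The sketch below performs that linear-viscoelastic step for a Maxwell spectrum using synthetic data; the nonlinear Phan-Thien-Tanner and Giesekus parameters used in this work are not reproduced.

# Hedged sketch of the linear-viscoelastic step behind a multi-mode fit: a discrete
# Maxwell spectrum (g_i, lambda_i) fitted to storage/loss moduli G'(w), G''(w).
import numpy as np
from scipy.optimize import least_squares

def maxwell_moduli(w, g, lam):
    wl = np.outer(w, lam)
    Gp = (g * wl**2 / (1.0 + wl**2)).sum(axis=1)   # storage modulus G'
    Gpp = (g * wl / (1.0 + wl**2)).sum(axis=1)     # loss modulus G''
    return Gp, Gpp

# Synthetic "measured" data generated from a known 3-mode spectrum.
w = np.logspace(-2, 2, 30)
Gp_data, Gpp_data = maxwell_moduli(w, np.array([1e4, 5e3, 1e3]), np.array([0.01, 0.1, 1.0]))

def residuals(p, n_modes=3):
    g, lam = np.exp(p[:n_modes]), np.exp(p[n_modes:])   # positivity via log-parameters
    Gp, Gpp = maxwell_moduli(w, g, lam)
    return np.concatenate([np.log(Gp / Gp_data), np.log(Gpp / Gpp_data)])

fit = least_squares(residuals, x0=np.log([1e3, 1e3, 1e3, 0.005, 0.05, 0.5]))
print("g_i:", np.round(np.exp(fit.x[:3]), 1), "lambda_i:", np.round(np.exp(fit.x[3:]), 4))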

Relevance:

20.00%

Publisher:

Abstract:

Integrated master's dissertation in Psychology

Relevance:

20.00%

Publisher:

Abstract:

Immersive environments (IE) are being increasingly used to perform psychophysical experiments. Their greatest strengths are the versatility in terms of stimulus presentation and control and the less time-consuming procedures. However, to ensure that IE results can be generalized to real-world scenarios, we must first provide evidence that performance in an IE is quantitatively indistinguishable from performance in the real world. Our goal was to perceptually validate distance perception for CAVE-like IEs. Participants performed a Frontal Matching Distance Task (Durgin & Li, 2011) in three different conditions: a real-world scenario (RWS), a photorealistic IE (IEPH), and a non-photorealistic IE (IENPH). Underestimation of distance was found across all conditions, with a significant difference between the three conditions (Wilks' Lambda = .38, F(2,134) = 110.8, p < .01; significant pairwise differences with p < .01). We found a mean error of 2.3 meters for the RWS, 5 meters for the IEPH, and 6 meters for the IENPH in a pooled data set of 5 participants. The results indicate that, while a photorealistic IE with perspective and stereoscopic depth cues might not be enough to elicit real-world performance in distance judgment tasks, this type of environment nevertheless minimizes the discrepancy between simulation and the real world when compared with non-photorealistic IEs.
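
For readers who want to reproduce this kind of condition comparison, the sketch below runs a repeated-measures ANOVA on synthetic per-condition distance errors with statsmodels. It is illustrative only: the numbers are fabricated around the mean errors quoted above, and it is a simpler analysis than the MANOVA (Wilks' Lambda) reported in the abstract.

# Illustrative sketch (synthetic numbers, not the study's data): comparing distance
# estimation error across the three viewing conditions with a repeated-measures ANOVA.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(2)
subjects = np.repeat(np.arange(5), 3)                 # 5 participants, as in the abstract
conditions = np.tile(["RWS", "IEPH", "IENPH"], 5)
mean_error = {"RWS": 2.3, "IEPH": 5.0, "IENPH": 6.0}  # mean errors quoted above
error = np.array([mean_error[c] for c in conditions]) + rng.normal(0, 0.5, 15)

df = pd.DataFrame({"subject": subjects, "condition": conditions, "error_m": error})
res = AnovaRM(df, depvar="error_m", subject="subject", within=["condition"]).fit()
print(res)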

Relevance:

20.00%

Publisher:

Abstract:

Master's dissertation in Applied Psychology

Relevance:

20.00%

Publisher:

Abstract:

Doctoral thesis in Administration Sciences

Relevance:

20.00%

Publisher:

Abstract:

There is currently an increasing demand for robots able to acquire the sequential organization of tasks from social learning interactions with ordinary people. Interactive learning-by-demonstration and communication is a promising topic in current robotics research. However, the efficient acquisition of generalized task representations that allow the robot to adapt to different users and contexts is a major challenge. In this paper, we present a dynamic neural field (DNF) model that is inspired by the hypothesis that the nervous system uses the off-line re-activation of initial memory traces to incrementally incorporate new information into structured knowledge. To achieve this, the model combines fast activation-based learning, to robustly represent sequential information from single task demonstrations, with slower, weight-based learning during internal simulations, to establish longer-term associations between neural populations representing individual subtasks. The efficiency of the learning process is tested in an assembly paradigm in which the humanoid robot ARoS learns to construct a toy vehicle from its parts. User demonstrations with different serial orders, together with the correction of initial prediction errors, allow the robot to acquire generalized task knowledge about possible serial orders and the longer-term dependencies between subgoals in very few social learning interactions. This success is shown in a joint action scenario in which ARoS uses the newly acquired assembly plan to construct the toy together with a human partner.
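
As a rough, hedged sketch of the population dynamics a DNF model builds on, the code below integrates a one-dimensional Amari-type neural field with a local-excitation/broader-inhibition kernel and a transient localized input; the kernel and all parameters are illustrative choices, not the tuned model used with ARoS.

# Minimal sketch of a one-dimensional dynamic neural field (Amari-type equation).
import numpy as np

n, dx, dt, tau, h = 181, 1.0, 1.0, 10.0, -2.0           # grid, steps, time scale, resting level
x = np.arange(n) * dx
dist = np.abs(x[:, None] - x[None, :])
kernel = (3.0 * np.exp(-dist**2 / (2 * 4.0**2))         # local excitation
          - 1.5 * np.exp(-dist**2 / (2 * 12.0**2)))     # broader inhibition

def f(u):                                               # sigmoidal output nonlinearity
    return 1.0 / (1.0 + np.exp(-4.0 * u))

u = np.full(n, h)                                       # field activation
stimulus = 4.0 * np.exp(-(x - 90.0)**2 / (2 * 5.0**2))  # transient localized input
for step in range(400):
    s = stimulus if step < 200 else 0.0                 # input removed halfway through
    du = (-u + h + s + kernel @ f(u) * dx) / tau
    u = u + dt * du

# With these settings a localized bump of activation typically remains after the input
# is removed, which is the kind of activation-based memory trace referred to above.
print("peak field activation after input removal:", round(u.max(), 2))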

Relevance:

20.00%

Publisher:

Abstract:

Adatom-decorated graphene offers a promising new path towards spintronics in the ultrathin limit. We combine experiment and theory to investigate the electronic properties of dilutely fluorinated bilayer graphene, where the fluorine adatoms covalently bond to the top graphene layer. We show that fluorine adatoms give rise to resonant impurity states near the charge neutrality point of the bilayer, leading to strong scattering of charge carriers and hopping conduction inside a field-induced band gap. Remarkably, the application of an electric field across the layers is shown to tune the resonant scattering amplitude from fluorine adatoms by nearly twofold. The experimental observations are well explained by a theoretical analysis combining Boltzmann transport equations and fully quantum-mechanical methods. This paradigm can be generalized to many bilayer graphene-adatom materials, and we envision that the realization of electrically tunable resonance may be a key advantage in graphene-based spintronic devices.
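
For reference, and as our own hedged recap rather than the paper's full treatment, the semiclassical side of such an analysis rests on the relaxation-time form of the Boltzmann conductivity for an isotropic two-dimensional band, with the impurity physics entering through the energy-dependent scattering time:

\[
\sigma = \frac{e^{2}}{2} \int dE \left( -\frac{\partial f}{\partial E} \right) D(E)\, v^{2}(E)\, \tau(E),
\]

where D(E) is the density of states, v(E) the band velocity, f the Fermi-Dirac distribution, and \tau(E) the transport scattering time, which resonant adatom states make strongly energy dependent.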

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we characterize the existence of, and give an expression for, the group inverse of a product of two regular elements by means of a ring unit.
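
As a hedged recap of standard terminology (added here for orientation, not stated in the abstract itself), the group inverse of a ring element a is the element a^{#}, when it exists, that satisfies:

\[
a\,a^{\#}\,a = a, \qquad a^{\#}\,a\,a^{\#} = a^{\#}, \qquad a\,a^{\#} = a^{\#}\,a .
\]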

Relevance:

20.00%

Publisher:

Abstract:

The theory of orthogonal polynomials of one real or complex variable is well established, as is its generalization to the multidimensional case. Hypercomplex function theory (or Clifford analysis) provides an alternative approach to dealing with higher dimensions. In this context, we study systems of orthogonal polynomials of a hypercomplex variable with values in a Clifford algebra and prove some of their properties.
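
As a hedged reminder, added here for orientation, the classical one-variable notion being generalized is orthogonality of a polynomial system {P_n} with respect to a weight w on a domain Ω:

\[
\langle P_m, P_n \rangle = \int_{\Omega} P_m(x)\, P_n(x)\, w(x)\, dx = 0 \quad (m \neq n).
\]

In the hypercomplex setting considered here, the polynomials take values in a Clifford algebra, so the pairing is usually taken to be Clifford-algebra-valued.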

Relevance:

20.00%

Publisher:

Abstract:

Amanita phalloides is responsible for more than 90 % of mushroom-related fatalities, and no effective antidote is available. α-Amanitin, the main toxin of A. phalloides, inhibits RNA polymerase II (RNAP II), causing hepatic and kidney failure. In silico studies, comprising docking and molecular dynamics simulations coupled to molecular mechanics with generalized Born and surface area (MM/GB-SA) energy decomposition on RNAP II, were performed with polymyxin B, a clinical drug that shares chemical similarities with α-amanitin. The results show that polymyxin B potentially binds to RNAP II at the same interface as α-amanitin, preventing the toxin from binding to RNAP II. In vivo, the inhibition of mRNA transcription elicited by α-amanitin was efficiently reverted by polymyxin B in the kidneys. Moreover, polymyxin B significantly decreased the hepatic and renal α-amanitin-induced injury, as shown by the histology and plasma hepatic aminotransferase data. In the survival assay, all animals exposed to α-amanitin died within 5 days, whereas 50 % survived up to 30 days when polymyxin B was administered 4, 8, and 12 h post-α-amanitin. Moreover, a single dose of polymyxin B administered concomitantly with α-amanitin was able to guarantee 100 % survival. Polymyxin B protects RNAP II from inactivation, leading to an effective prevention of organ damage and increased survival in α-amanitin-treated animals. The present use of clinically relevant concentrations of an already human-use-approved drug prompts the use of polymyxin B as an antidote for A. phalloides poisoning in humans.
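
The generalized Born and surface area (MM/GB-SA) energy decomposition mentioned above is conventionally written as the following estimate of the binding free energy, recalled here for orientation (our recap of the standard method, not a result of the paper):

\[
\Delta G_{\mathrm{bind}} \approx \langle \Delta E_{\mathrm{MM}} \rangle + \langle \Delta G_{\mathrm{GB}} \rangle + \langle \Delta G_{\mathrm{SA}} \rangle - T\Delta S,
\qquad
\Delta E_{\mathrm{MM}} = \Delta E_{\mathrm{int}} + \Delta E_{\mathrm{ele}} + \Delta E_{\mathrm{vdW}},
\]

with the gas-phase molecular-mechanics energy, the generalized Born polar solvation term, the nonpolar surface-area term, and the configurational entropy contribution averaged over the molecular dynamics ensemble.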