520 results for Artifact


Relevance:

10.00%

Publisher:

Abstract:

With the proliferation of media connected to the World Wide Web, opportunities for expression and creation have expanded for an ever-growing public. This technological development, combined with the characteristics of the subject described by the concept of liquid modernity, has been gradually altering social relations. The motivation for this work arises from the urgency of alerting the educational context, from basic to higher education, to the changes in the profile of its target audience, which make a revision of methods necessary so that they can bring new approaches to the emerging challenges. The first objective, based on the theoretical framework and a case study, is to identify the characteristics of the subject influenced by media consumerism, pointing to transmedia narratives as influences arising from the interactive consumer market while nonetheless highlighting the interest in the art of narrative. The second objective, within the same structure, seeks to highlight examples of methodologies that take the academic context as a locus of integration and dialogue between related fields. It also seeks to approach the context of transmedia communication, with its regimes of control, its consumption resources, and the informal formative processes that arise from these dynamics. The third objective proposes reflections on the results of a proposal involving two research groups and the spectators of a digital art installation entitled E-Reflexos. The dynamic consisted of integrating students and spectators in hybrid practices, combining methodological approaches with prior knowledge in the creation of narratives and games, through which the concepts and reflections arising from this work were problematized and evaluated. The fourth objective integrates reflections on the main results of the objectives above, concluding that the academic context can mediate proposals that channel the emerging potential of the media narrator back into the cultural context, and highlighting the relevance of the fields of Art in this formative process.

Relevance:

10.00%

Publisher:

Abstract:

This work presents and describes the procedural methodology, and the reflections that sustain it, behind the creation of an interactive digital artifact built so that some of its two-dimensional elements are manipulated in a way that gives the player the illusion of three-dimensionality. In 1992, with the game Wolfenstein 3D, Id Software introduced a visual reference to three-dimensionality using 2D technology: through a system of resizing and positioning images, it managed to convey the notion of three-dimensionality, at the time employing a first-person game type, in which the player experiences a visual field that aims to reproduce the experience of the tactile world in the relationship between spaces and objects. These conceptual objectives are reproduced using Processing, a programming language built on Java, seeking, on the one hand, to convey this apparent illusion of three-dimensionality and, on the other, not to use a type of digital artifact that provides first-person gameplay, but rather to give the player a visual experience of the entire space in which they are allowed to move, in which the adversities they must overcome in order to progress are laid out. To make this possible, the player assumes the role of a character and, through interaction with the artifact, builds a visual narrative aimed at their engagement with the represented theme. The theme is a representation of the search for the sarcophagus of the 18th Dynasty pharaoh Tutankhamun (1332-1323 BC) by the British explorer Howard Carter (1874-1939), whose 1922 expedition in the Valley of the Kings remains to this day the most celebrated archaeological discovery related to Ancient Egypt. Throughout this dissertation, topics are addressed that aim at solutions both in the technical and technological field, through programming and its language, and in the visual and aesthetic field, seeking a conscious connection with the theme to be represented and experienced.
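
The pseudo-3D effect described here rests on perspective scaling: a 2D sprite is drawn larger or smaller in inverse proportion to its distance from the viewer. The dissertation implements this in Processing; the sketch below illustrates only the underlying relationship, in Python, with invented viewport constants.

```python
# Perspective scaling: a sprite's on-screen size falls off inversely with
# its distance from the viewer -- the 2D trick behind Wolfenstein-style
# pseudo-3D rendering. Constants are illustrative.

FOCAL_LENGTH = 300.0  # projection constant chosen for the viewport

def projected_height(object_height: float, distance: float) -> float:
    """On-screen height (pixels) of a billboard sprite at a given distance."""
    return FOCAL_LENGTH * object_height / distance

# Equally tall columns receding from the viewer shrink as distance doubles:
for d in (1.0, 2.0, 4.0, 8.0):
    print(f"distance {d:4.1f} -> draw sprite {projected_height(1.0, d):6.1f} px tall")
```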

Relevance:

10.00%

Publisher:

Abstract:

This research aims to study dimensions of urban life in the contemporary city. It is an effort to understand the functioning of the contemporary city as an artifact that somehow affects social relations. The study focuses on the limits and possibilities of urbanity in the city today, understanding urbanity as a set of factors that favor the wealth, diversity, and spontaneity of public life. The research aims to show that cities today tend to create urban life that is fragmented along at least one of three dimensions of urbanity: the spatial, the social, and the temporal. The study involves the analysis of two public spaces in Fortaleza (the Praça do Ferreira and the open urban public spaces of the Centro Cultural Dragão do Mar), using Space Syntax Analysis methods and Post-Occupancy Evaluation procedures. The research shows that the temporal dimension of urbanity is limited in the public spaces studied. In the Praça do Ferreira, the spatial and social dimensions are present, but their effects are limited by the temporal dimension. At the Dragão do Mar, on the other hand, the spatial and social dimensions of urban life are more limited and more concentrated in time.
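
Space Syntax Analysis quantifies how integrated each space is within the configuration of a spatial network. As a rough illustration of the kind of computation involved (not the authors' actual procedure), the sketch below derives mean depth and Relative Asymmetry, RA = 2(MD − 1)/(k − 2), for an invented axial-line graph; lower RA indicates a more integrated, more accessible line.

```python
# Toy space syntax integration measure: BFS mean depth on a tiny axial-line
# graph, then Relative Asymmetry RA = 2*(MD - 1)/(k - 2). Graph is invented.

from collections import deque

axial_graph = {0: [1], 1: [0, 2, 3], 2: [1], 3: [1, 4], 4: [3]}

def mean_depth(graph, root):
    """Average topological distance from root to every other line."""
    depth, seen, queue = {root: 0}, {root}, deque([root])
    while queue:
        node = queue.popleft()
        for nb in graph[node]:
            if nb not in seen:
                seen.add(nb)
                depth[nb] = depth[node] + 1
                queue.append(nb)
    others = [d for n, d in depth.items() if n != root]
    return sum(others) / len(others)

k = len(axial_graph)
for line in axial_graph:
    md = mean_depth(axial_graph, line)
    ra = 2 * (md - 1) / (k - 2)
    print(f"line {line}: mean depth {md:.2f}, RA {ra:.2f}")
```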

Relevance:

10.00%

Publisher:

Abstract:

The electrical and optical coupling between subcells in a multijunction solar cell affects its external quantum efficiency (EQE) measurement. In this study, we show how a low breakdown voltage of a component subcell impacts the EQE determination of a multijunction solar cell and demands the use of a finely adjusted external voltage bias. The optimum voltage bias for the EQE measurement of a Ge subcell in two different GaInP/GaInAs/Ge triple-junction solar cells is determined both by sweeping the external voltage bias and by tracing the I–V curve under the same light bias conditions applied during the EQE measurement. It is shown that the I–V curve gives rapid and valuable information about the adequate light and voltage bias needed, and also helps to detect problems associated with non-ideal I–V curves that might affect the EQE measurement. The results also show that, if a non-optimum voltage bias is applied, a measurement artifact can result. Only when the problems associated with a non-ideal I–V curve and/or a low breakdown voltage have been ruled out can any remaining measurement artifacts be attributed to other effects, such as luminescent coupling between subcells.
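
The bias-sweep procedure can be pictured as locating the plateau where the measured signal stops depending on the applied bias. The sketch below does this on synthetic data; the response curve, flatness threshold, and numbers are invented stand-ins for a real EQE-versus-bias sweep.

```python
# Synthetic illustration of choosing the external voltage bias from a sweep:
# find the region where the signal is near its maximum and insensitive to
# the bias. Curve, threshold, and numbers are invented.

import numpy as np

bias = np.linspace(-3.0, 1.0, 81)                    # external voltage bias (V)
signal = 0.9 / (1.0 + np.exp(-5.0 * (bias + 1.5)))   # synthetic saturating signal

sensitivity = np.abs(np.gradient(signal, bias))      # d(signal)/d(bias)
plateau = (sensitivity < 0.01) & (signal > 0.8 * signal.max())
optimum = bias[plateau].mean()                       # center of the flat region

print(f"suggested voltage bias: {optimum:.2f} V")
```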

Relevance:

10.00%

Publisher:

Abstract:

Produced water is a byproduct of petroleum production and carries a high load of contaminants, such as suspended oil droplets, organic compounds, and metals. The latter pollutants are very difficult to treat because of their high solubility in water. The objective of this work is to use and evaluate a microemulsified system to remove metals (K, Mg, Ba, Ca, Cr, Mn, Li, Fe) from synthetic produced water. For the extraction of the metals, a pseudoternary diagram was used containing the following phases: synthetic produced water as the aqueous phase (AP), hexane as the organic phase (OP), and a cosurfactant/surfactant ratio equal to four (C/S = 4) as the third phase, in which saponified coconut oil (OCS) was used as the surfactant and n-butanol as the cosurfactant. The synthetic produced water was prepared at bench scale, and the region of interest in the diagram for metal removal was determined through an experimental design. Ten points located in the Winsor II region were selected, in an area with a large amount of water and small amounts of reagents. The samples were analyzed by atomic absorption spectrometry, and the results were evaluated through a statistical assessment, allowing analysis of the efficiency of the effects and their interactions. The results showed extraction percentages above 90% for manganese, iron, chromium, calcium, barium, and magnesium, and around 45% for lithium and potassium. The optimal point for the simultaneous removal of the metals was calculated using the statistical device of a multiple response (MR) function. This calculation showed that the greatest extraction of metals occurs at point J, with the composition [72% AP, 9% OP, 19% C/S], yielding a global extraction percentage of about 80%. Considering the aspects analyzed, the microemulsified system has shown itself to be an effective alternative for the extraction of metals in the remediation of synthetic produced water.
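
The abstract does not spell out the multiple response (MR) function it uses; a common formulation normalizes each response across the candidate points and averages the normalized values, picking the point with the highest combined score. The sketch below illustrates that idea with invented extraction percentages for three of the ten points.

```python
# Illustrative multiple-response (MR) score: normalize each metal's
# extraction percentage across candidate points to [0, 1], average them,
# and pick the point with the highest combined score. Numbers are invented.

points = {
    "H": {"Mn": 88, "Fe": 90, "Cr": 85, "Ca": 87, "Ba": 84, "Mg": 89, "Li": 40, "K": 38},
    "I": {"Mn": 91, "Fe": 93, "Cr": 90, "Ca": 90, "Ba": 88, "Mg": 92, "Li": 42, "K": 41},
    "J": {"Mn": 95, "Fe": 96, "Cr": 94, "Ca": 93, "Ba": 92, "Mg": 95, "Li": 45, "K": 44},
}

metals = list(next(iter(points.values())))

def mr_score(name: str) -> float:
    score = 0.0
    for m in metals:
        values = [p[m] for p in points.values()]
        lo, hi = min(values), max(values)
        score += (points[name][m] - lo) / (hi - lo)  # normalize to [0, 1]
    return score / len(metals)

best = max(points, key=mr_score)
print(best, f"{mr_score(best):.2f}")  # J wins in this toy data
```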

Relevance:

10.00%

Publisher:

Abstract:

Measurement and modeling techniques were developed to improve over-water gaseous air-water exchange measurements for persistent bioaccumulative and toxic chemicals (PBTs). Analytical methods were applied to atmospheric measurements of hexachlorobenzene (HCB), polychlorinated biphenyls (PCBs), and polybrominated diphenyl ethers (PBDEs). Additionally, the sampling and analytical methods are well suited to the study of semivolatile organic compounds (SOCs) in air, with applications to secondary organic aerosol formation and to urban and indoor air quality. A novel gas-phase cleanup method is described for use with thermal desorption methods for analysis of atmospheric SOCs using multicapillary denuders. The cleanup selectively removed hydrogen-bonding chemicals from samples, including much of the background matrix of oxidized organic compounds in ambient air, and thereby improved precision and method detection limits for nonpolar analytes. A model is presented that predicts gas collection efficiency and particle collection artifact for SOCs in multicapillary denuders using polydimethylsiloxane (PDMS) sorbent. An approach is presented to estimate the equilibrium PDMS-gas partition coefficient (Kpdms) from an Abraham solvation parameter model for any SOC. A high-flow-rate (300 L min⁻¹) multicapillary denuder was designed for measurement of trace atmospheric SOCs. Overall method precision and detection limits were determined using field duplicates and compared to the conventional high-volume sampler method. The high-flow denuder is an alternative to high-volume or passive samplers when separation of gas- and particle-associated SOCs upstream of a filter and a short sample collection time are advantageous. A Lagrangian internal boundary layer transport exchange (IBLTE) Model is described. The model predicts the near-surface variation in several quantities with fetch in coastal, offshore flow: 1) modification in potential temperature and gas mixing ratio, 2) surface fluxes of sensible heat, water vapor, and trace gases using the NOAA COARE Bulk Algorithm and Gas Transfer Model, and 3) vertical gradients in potential temperature and mixing ratio. The model was applied to interpret micrometeorological measurements of air-water exchange flux of HCB and several PCB congeners in Lake Superior. The IBLTE Model can be applied to any scalar, including water vapor, carbon dioxide, dimethyl sulfide, and other scalar quantities of interest with respect to hydrology, climate, and ecosystem science.
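
The Abraham solvation parameter model expresses a log partition coefficient as a linear combination of solute descriptors, log Kpdms = c + eE + sS + aA + bB + lL. The sketch below shows the functional form only; the system coefficients and the descriptor values are placeholders, not the values fitted in this work.

```python
# Functional form of an Abraham solvation parameter model for the PDMS-gas
# partition coefficient: log Kpdms = c + e*E + s*S + a*A + b*B + l*L.
# System coefficients and descriptors below are placeholders.

def log_k_pdms(E: float, S: float, A: float, B: float, L: float,
               c: float = 0.0, e: float = 0.0, s: float = 0.0,
               a: float = 0.0, b: float = 0.0, l: float = 0.5) -> float:
    """Linear free-energy estimate of log K(PDMS-gas) from solute descriptors."""
    return c + e * E + s * S + a * A + b * B + l * L

# Hypothetical descriptor set (E, S, A, B, L) for an illustrative SOC:
print(f"log Kpdms ~ {log_k_pdms(E=1.5, S=1.0, A=0.0, B=0.2, L=8.0):.2f}")
```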

Relevance:

10.00%

Publisher:

Abstract:

This thesis focuses on advanced reconstruction methods and Dual Energy (DE) Computed Tomography (CT) applications for proton therapy, aiming to improve patient positioning and to investigate approaches for dealing with metal artifacts. To tackle the first goal, an algorithm for post-processing input DE images has been developed. The outputs are tumor- and bone-canceled images, which help in recognizing structures in the patient's body. We showed that the positioning error is substantially reduced when contrast-enhanced images are used, suggesting the potential of this application. If positioning plays a key role in delivery, even more important is the quality of the planning CT. Here, modern CT scanners offer the possibility of tackling challenging cases, such as the treatment of tumors close to metal implants. Possible approaches for dealing with the artifacts introduced by such implants were investigated experimentally at the Paul Scherrer Institut (Switzerland) by simulating several treatment plans on an anthropomorphic phantom. In particular, we examined cases in which no correction, manual correction, or the Iterative Metal Artifact Reduction (iMAR) algorithm was used to correct the artifacts, using both Filtered Back Projection and Sinogram Affirmed Iterative Reconstruction as image reconstruction techniques. Moreover, direct stopping power calculation from DE images with iMAR was also considered as an alternative approach. The delivered dose, measured with Gafchromic EBT3 films, was compared with the dose calculated in the Treatment Planning System. Residual positioning errors, daily machine-dependent uncertainties, and film quenching were taken into account in the analyses. Although plans with multiple fields seemed more robust than single-field plans, the results generally showed better agreement between prescribed and delivered dose when iMAR was used, especially when combined with the DE approach. We have thus demonstrated the potential of these advanced algorithms to improve dosimetry for plans in the presence of metal implants.
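
The tumor- and bone-canceled images mentioned above suggest a dual-energy material-cancellation step; a common textbook version forms a weighted difference of the low- and high-energy images so that a chosen material loses its signal. The sketch below illustrates that weighted subtraction on a toy image pair; the abstract does not specify the thesis's actual algorithm, and the weight here is derived from the invented data.

```python
# Illustrative dual-energy weighted subtraction: choose a weight so that a
# target material (here the bright "bone" pixels) vanishes in the difference
# image. Toy 2x2 CT numbers; not the thesis's algorithm.

import numpy as np

low_kvp  = np.array([[30.0, 900.0], [40.0, 880.0]])   # same slice, low energy
high_kvp = np.array([[32.0, 600.0], [41.0, 590.0]])   # same slice, high energy

w = 600.0 / 900.0                  # ratio of bone signal at the two energies
bone_canceled = high_kvp - w * low_kvp

print(bone_canceled)               # bone pixels -> ~0, soft tissue remains
```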

Relevance:

10.00%

Publisher:

Abstract:

Describes the position claiming that the contemporary technological, sociopolitical, and socioeconomic environment gives us pause to consider the core theory and practices of bibliography, combining bibliography of the work (in library and information science), bibliography of the text (in textual studies and scholarly editing), and bibliography of the artifact (in book history and now digital forensics), and calls for collaborative multidisciplinary research at the intersection of these fields to ask: is there a new bibliography?

Relevance:

10.00%

Publisher:

Abstract:

A prospective randomised controlled clinical trial of treatment decisions informed by invasive functional testing of coronary artery disease severity, compared with standard angiography-guided management, was implemented in 350 patients with a recent non-ST elevation myocardial infarction (NSTEMI) admitted to 6 hospitals in the National Health Service. The main aim of this study was to examine the utility of both invasive fractional flow reserve (FFR) and non-invasive cardiac magnetic resonance imaging (MRI) among patients with a recent diagnosis of NSTEMI. In summary, the findings of this thesis are: (1) the use of FFR combined with intravenous adenosine was feasible and safe among patients with NSTEMI and has clinical utility; (2) there was discordance between the visual, angiographic estimation of lesion significance and FFR; (3) the use of FFR led to changes in treatment strategy and an increase in the prescription of medical therapy in the short term compared with an angiographically guided strategy; and (4) the incidence of major adverse cardiac events (MACE) at 12 months of follow-up was similar in the two groups. Cardiac MRI was used in a subset of patients enrolled in two hospitals in the West of Scotland. T1 and T2 mapping methods were used to delineate territories of acute myocardial injury. T1 and T2 mapping were superior to conventional T2-weighted dark-blood imaging for estimating the ischaemic area-at-risk (AAR), with less artifact, in NSTEMI. There was poor correlation between the angiographic AAR and the MRI methods of AAR estimation in patients with NSTEMI. FFR had high accuracy in predicting inducible perfusion defects demonstrated on stress perfusion MRI. This thesis describes the largest randomised trial published to date specifically examining the clinical utility of FFR in the NSTEMI population. We have provided evidence of the diagnostic and clinical utility of FFR in this group of patients, and evidence to inform larger studies. This thesis also describes the largest MRI cohort to date, including myocardial stress perfusion assessments, specifically in the NSTEMI population. We have demonstrated the diagnostic accuracy of FFR in predicting reversible ischaemia, referenced to a non-invasive gold standard with MRI. This thesis has also shown the futility of dark-blood oedema imaging among all-comer NSTEMI patients when compared with novel T1 and T2 mapping methods.
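
For context on the index under study: FFR is conventionally computed as the ratio of mean distal coronary pressure to mean aortic pressure during adenosine-induced hyperemia, with values at or below about 0.80 usually treated as haemodynamically significant. A toy calculation (numbers invented):

```python
# Toy FFR calculation: FFR = Pd / Pa under adenosine-induced hyperemia;
# values <= ~0.80 are conventionally treated as significant. Numbers invented.

def ffr(pd_mmhg: float, pa_mmhg: float) -> float:
    """Fractional flow reserve: mean distal pressure over mean aortic pressure."""
    return pd_mmhg / pa_mmhg

value = ffr(pd_mmhg=68.0, pa_mmhg=92.0)
print(f"FFR = {value:.2f} -> "
      f"{'haemodynamically significant' if value <= 0.80 else 'not significant'}")
```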

Relevance:

10.00%

Publisher:

Abstract:

Every construction process (whether of buildings, machines, software, etc.) first requires making a model of the artifact that is going to be built. This model should be based on a paradigm, or meta-model, which defines the basic modeling elements: which real-world concepts can be represented, which relationships can be established among them, and so on. There should also be a language to represent, manipulate, and think about that model. Usually the model must be redefined at various levels of abstraction, so both the paradigm and the language must have the capacity for abstraction. In this paper I characterize the relationships that exist between these concepts: model, language, and abstraction. I also analyze some historical models, such as the relational model for databases, the imperative programming model, and the object-oriented model. Finally, I note the need to teach this model-driven approach to students, and even to go further, toward higher-level models such as component models or business models.
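
To make the contrast between meta-models concrete, the snippet below expresses the same real-world concept under two of the paradigms the paper analyzes: as a tuple in a relation, and as an object combining state and behavior. The example is illustrative and not drawn from the paper.

```python
# One real-world concept, two meta-models. All names are illustrative.

# Relational model: the world is tuples in relations (tables).
book_relation = [
    {"isbn": "0-123", "title": "Models and Languages", "year": 1999},
]

# Object-oriented model: the world is objects combining state and behavior.
class Book:
    def __init__(self, isbn: str, title: str, year: int):
        self.isbn, self.title, self.year = isbn, title, year

    def age(self, current_year: int) -> int:  # behavior lives with the data
        return current_year - self.year

book = Book("0-123", "Models and Languages", 1999)
print(book.age(2024), book_relation[0]["year"])
```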