866 results for New Media Art
Abstract:
Breast cancer is the most frequently diagnosed cancer in women. Scientific knowledge and technology have created many different strategies to treat this pathology. Radiotherapy (RT) is part of the current standard guidelines for most breast cancer treatments. However, radiation is a double-edged sword: although it may cure cancer, it may also induce secondary cancer. The contralateral breast (CLB) is an organ susceptible to absorbing dose during the treatment of the other breast, placing it at significant risk of developing a secondary tumor. New radiation techniques, with more complex delivery strategies and promising results, are being implemented and used in radiotherapy departments. However, some questions have to be properly addressed, such as: Is it safe to move to complex techniques to achieve better conformity in the target volumes in breast radiotherapy? What happens to the target volumes and the surrounding healthy tissues? How accurate is dose delivery? What are the shortcomings and limitations of currently used treatment planning systems (TPS)? The answers to these questions largely rely on Monte Carlo (MC) simulations with state-of-the-art computer programs that accurately model the different components of the equipment (target, filters, collimators, etc.), providing an adequate description of the radiation fields used as well as a detailed geometric representation and material composition of the organs and tissues involved. This work investigates the impact of treating left breast cancer with different RT techniques, f-IMRT (forwardly-planned intensity-modulated RT), inversely-planned IMRT (IMRT2, using 2 beams; IMRT5, using 5 beams) and dynamic conformal arc RT (DCART), and their effects on whole-breast irradiation and on the undesirable irradiation of the surrounding healthy tissues. Two algorithms of the BrainLAB iPlan TPS were used: Pencil Beam Convolution (PBC) and the commercial Monte Carlo algorithm (iMC). Furthermore, an accurate MC model of the linear accelerator used (a Varian Trilogy) was built with the EGSnrc MC code to determine the doses that reach the CLB. For this purpose it was necessary to model the new High Definition multileaf collimator, which had never been simulated before. The model developed has since been included in the EGSnrc MC package of the National Research Council Canada (NRC). The linac model was benchmarked against water measurements and later validated against the TPS calculations. The dose distributions in the planning target volume (PTV) and the doses to the organs at risk (OAR) were compared by analyzing dose-volume histograms; further statistical analysis was performed using IBM SPSS v20. For PBC, all the techniques provided adequate coverage of the PTV. However, statistically significant dose differences were observed between the techniques in the PTV, in the OAR and in the pattern of dose spread into normal tissues. IMRT5 and DCART spread low doses over greater volumes of normal tissue (right breast, right lung, heart and even the left lung) than the tangential techniques (f-IMRT and IMRT2). However, IMRT5 plans improved the dose distribution in the PTV, exhibiting better conformity and homogeneity in the target and reduced high-dose percentages in the ipsilateral OAR. DCART presented no advantages over the other techniques investigated. Differences were also found between the calculation algorithms: PBC estimated higher doses for the PTV, ipsilateral lung and heart than the MC algorithms predicted, while the MC algorithms agreed with each other to within 2%. The PBC algorithm was considered inaccurate in determining dose in heterogeneous media and in build-up regions; consequently, a major effort is under way at the clinic to acquire the data needed to move from PBC to another calculation algorithm. Despite the better PTV homogeneity and conformity, there is an increased risk of CLB cancer development when non-tangential techniques are used. The overall results of the studies performed confirm the outstanding predictive power and accuracy in the assessment and calculation of dose distributions in organs and tissues made possible by the use of MC simulation techniques in RT treatment planning systems.
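As an illustration of the dose-volume histogram (DVH) comparison described above, here is a minimal sketch of how a cumulative DVH can be computed from a 3D dose grid and a structure mask. The array shapes, the synthetic dose values and the function names are assumptions for the example, not the actual iPlan or EGSnrc tooling:

```python
import numpy as np

def cumulative_dvh(dose, mask, n_bins=200):
    """Cumulative DVH: fraction of a structure receiving at least each dose level.

    dose: 3D array of absorbed dose (Gy); mask: boolean array of the same
    shape selecting the structure's voxels (e.g., the PTV or an OAR).
    """
    voxels = dose[mask]
    levels = np.linspace(0.0, voxels.max(), n_bins)
    # For each dose level d, the percentage of the structure with dose >= d.
    volume_pct = np.array([(voxels >= d).mean() * 100.0 for d in levels])
    return levels, volume_pct

# Toy example: a synthetic dose grid and a spherical "organ".
rng = np.random.default_rng(0)
dose = rng.gamma(shape=2.0, scale=10.0, size=(40, 40, 40))  # fake doses in Gy
zz, yy, xx = np.mgrid[:40, :40, :40]
organ = (zz - 20) ** 2 + (yy - 20) ** 2 + (xx - 20) ** 2 < 15 ** 2

levels, vol = cumulative_dvh(dose, organ)
v20 = vol[np.searchsorted(levels, 20.0)]  # % of the organ receiving >= 20 Gy
print(f"mean dose: {dose[organ].mean():.1f} Gy, V20: {v20:.1f}%")
```

Curves like these, one per structure and per technique, are the kind of input the statistical comparison (mean dose, V20, conformity indices, etc.) would then operate on.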
Abstract:
Generating personalized movie recommendations most commonly relies on user-movie ratings. These ratings are generally used either to understand a user's preferences or to recommend movies that users with similar rating patterns have rated highly. However, movie recommenders are often subject to the Cold-Start problem: new movies have not been rated by anyone, so they will not be recommended to anyone; likewise, the preferences of new users who have not rated any movie cannot be learned. In parallel, Social-Media platforms such as Twitter collect great amounts of user feedback on movies, a hugely popular topic nowadays. This thesis proposes to explore feedback shared on Twitter to predict the popularity of new movies and shows how it can be used to tackle the Cold-Start problem. At a finer grain, it also proposes to explore the reputation of directors and actors on IMDb for the same purpose. To assess these aspects, a Reputation-enhanced Recommendation Algorithm is implemented and evaluated on a crawled IMDb dataset with previous user ratings of old movies, together with Twitter data crawled from January 2014 to March 2014, to recommend 60 movies affected by the Cold-Start problem. Twitter proved to be a strong reputation predictor, and the Reputation-enhanced Recommendation Algorithm improved over several baseline methods. Additionally, the algorithm also proved useful when recommending movies in an extreme Cold-Start scenario, where both new movies and new users are affected by the Cold-Start problem.
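A minimal sketch of how such a reputation-enhanced prediction could work is shown below; the blending weight, the 1-5 rating-scale mapping and all names are illustrative assumptions, not the thesis's actual implementation. The key point is the fallback: for a Cold-Start movie with no ratings, the prediction is driven entirely by the reputation signal mined from Twitter and IMDb:

```python
def predict_rating(user, movie, cf_model, reputation, alpha=0.7, default=3.0):
    """Blend a collaborative-filtering estimate with a reputation prior.

    cf_model.estimate(user, movie) returns a rating, or None when the movie
    has no ratings yet (Cold-Start); reputation[movie] is a score in [0, 1]
    mined from Twitter volume/sentiment and IMDb cast/director reputation.
    """
    rep_score = default + 2.0 * (reputation.get(movie, 0.5) - 0.5)  # map to 1-5
    cf_score = cf_model.estimate(user, movie)
    if cf_score is None:  # Cold-Start: nothing to learn from ratings
        return rep_score
    return alpha * cf_score + (1 - alpha) * rep_score

class StubCF:
    """Stand-in for a trained collaborative-filtering model."""
    def estimate(self, user, movie):
        return None  # pretend every movie is new (extreme Cold-Start)

reputation = {"new_release": 0.9}  # hypothetical Twitter/IMDb-derived score
print(predict_rating("alice", "new_release", StubCF(), reputation))  # -> 3.8
```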
Abstract:
Contains the papers presented at the International Conference “Uncertain Spaces: Virtual Configurations in Contemporary Art and Museums”, held at the Fundação Calouste Gulbenkian (Lisbon), 31 October to 1 November 2014: Helena Barranha and Susana S. Martins - Introduction: Art, Museums and Uncertainty (pp. 1-12); Alexandra Bounia and Eleni Myrivili - Beyond the ‘Virtual’: Intangible Museographies and Collaborative Museum Experiences (pp. 15-32); Annet Dekker - Curating in Progress. Moving Between Objects and Processes (pp. 33-54); Giselle Beiguelman - Corrupted Memories. The aesthetics of Digital Ruins and the Museum of the Unfinished (pp. 55-82); Andrew Vaas Brooks - The Planetary Datalinks (pp. 85-110); Sören Meschede - Curators’ Network: Creating a Promotional Database for Contemporary Visual Arts (pp. 111-130); Stefanie Kogler - Divergent Histories and Digital Archives of Latin American and Latino Art in the United States - Old Problems in New Digital Formats (pp. 131-156); Luise Reitstätter and Florian Bettel - Right to the City! Right to the Museum! (pp. 159-182); Roberto Terracciano - On Geo-poetic systems: virtual interventions inside and outside the museum space (pp. 183-210); and Catarina Carneiro de Sousa and Luís Eustáquio - Art Practice in Collaborative Virtual Environments (pp. 211-240).
Abstract:
A sample of 445 consumers resident in distinct Lisbon areas was analyzed through direct observation in order to estimate the current proportion of each lifestyle, applying the Whitaker Lifestyle™ Method. Hypothesis tests on the population proportions reveal that the Neo-Traditional and Modern Whitaker lifestyles have significantly the highest proportions, while the overall presence of the different lifestyles varies across neighborhoods. The research further demonstrates the validity of the Whitaker observation techniques, the differences in media consumption among lifestyles, and the importance of style and aesthetics when segmenting consumers by lifestyle. Finally, market opportunities are identified for firms operating in Lisbon.
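For illustration, a one-proportion z-test of the kind reported above can be sketched as follows; the counts and the null proportion are hypothetical, since the abstract does not give the per-lifestyle figures:

```python
from math import sqrt
from scipy.stats import norm

def one_proportion_ztest(successes, n, p0):
    """Test H0: population proportion equals p0, two-sided alternative."""
    p_hat = successes / n
    se = sqrt(p0 * (1 - p0) / n)   # standard error under H0
    z = (p_hat - p0) / se
    p_value = 2 * norm.sf(abs(z))  # two-sided tail probability
    return z, p_value

# Hypothetical: 140 of the 445 observed consumers showed one lifestyle;
# is that proportion significantly different from a null value of 0.25?
z, p = one_proportion_ztest(140, 445, 0.25)
print(f"z = {z:.2f}, p = {p:.4f}")  # z = 3.15, p = 0.0016
```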
Abstract:
Following the Introduction, which surveys the existing literature on technology advances and regulation in telecommunications and on two-sided markets, we address specific issues in the industries of the New Economy, which are characterized by the existence of network effects. We seek to explore how each of these industries works, identify potential market failures, and find new solutions at the level of economic regulation that promote social welfare. In Chapter 1 we analyze a regulatory issue on access prices and investments in the telecommunications market. The existing literature on access prices and investment has pointed out that networks underinvest under a regime of mandatory access provision with a fixed access price per end-user. We propose a new access pricing rule, the indexation approach: the access price, per end-user, that network i pays to network j is a function of the investment levels set by both networks. We show that indexation can enhance economic efficiency beyond what is achieved with a fixed access price. In particular, access price indexation can simultaneously induce lower retail prices and higher investment and social welfare compared with a fixed access price or a regulatory holidays regime. Furthermore, we provide sufficient conditions under which indexation can implement the socially optimal investment or the Ramsey solution, which would be impossible to obtain under fixed access pricing. Our results contradict the notion that investment efficiency must be sacrificed for gains in pricing efficiency. In Chapter 2 we investigate the effect of regulations that limit advertising airtime on advertising quality and on social welfare. We show, first, that advertising time regulation may reduce the average quality of advertising broadcast on TV networks. Second, an advertising cap may reduce media platforms' and firms' profits, while the net effect on viewers' (subscribers') welfare is ambiguous, because the ad-quality reduction resulting from a regulatory cap offsets the subscribers' direct gain from watching fewer ads. We find that if subscribers are sufficiently sensitive to ad quality, i.e., if the ad-quality reduction outweighs the direct effect of the cap, a cap may reduce social welfare. The welfare results suggest that a regulatory authority trying to increase welfare by regulating the volume of advertising on TV might also need to regulate advertising quality or, if regulating quality proves impractical, take the effect on advertising quality into consideration. In Chapter 3 we investigate the rules that govern Electronic Payment Networks (EPNs). In EPNs, the No-Surcharge Rule (NSR) requires that merchants charge at most the same amount for a payment card transaction as for cash. In this chapter, we analyze a three-party model (consumers, merchants, and a proprietary EPN) with endogenous transaction volumes and heterogeneous merchants' transactional benefits of accepting cards, to assess the welfare impacts of the NSR. We show that, if merchants are local monopolists and the network externalities from merchants to cardholders are sufficiently strong, all agents except the EPN will be worse off with the NSR, and therefore the NSR is socially undesirable. The positive role of the NSR in improving retail price efficiency for cardholders is also highlighted.
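As a stylized illustration of the indexation approach in Chapter 1, one could write the access price in a hypothetical linear form; the abstract only states that the price depends on both networks' investment levels, so the functional form below is an assumption chosen for concreteness:

```latex
a_{ij}(I_i, I_j) = a_0 - \beta_i I_i - \beta_j I_j, \qquad \beta_i, \beta_j > 0
```

Here the per-end-user access price that network i pays to network j falls as either network invests, so each network internalizes part of the incentive to invest; the fixed access price criticized in the literature is the special case \beta_i = \beta_j = 0.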
Abstract:
The Social Media @ Galp Project had a very specific purpose – to analyze the feasibility of Galp entering new Social Media platforms and, if appropriate, to develop a short-term entry strategy, some of whose guidelines remain valid for the medium to long term. As expected, the majority of the project focused on the second part, which consists of an analysis of aspects concerning the organization itself as well as its relationship with customers and the general public.
Abstract:
The purpose of this project was to study a possible presence of Galp on Social Media. The importance of this study stems from the company's need to adapt to a new means of communication that is changing our society and the way companies do business. In the consulting labs, the analysis took into account both the best practices for business on Social Media and the singularities of the company. The output of this study was a collection of specific guidelines, covering several fields, for developing a strategic presence on Social Media.
Abstract:
This study examined user-generated (UG) advertising in the context of social media networks. The focus was on how the source of the advertisement, whether an expert in the area, a non-expert, or a friend, influences the reader. Furthermore, the study analyzed how the certainty level of the UG advertisement influences the person viewing the ad. The study showed that for the friend source a high-certainty message was more persuasive; however, no significant certainty effects were found for the expert and non-expert sources. Moreover, the type of source had a considerable impact on persuasion: someone we know personally (e.g., a friend) was rated most positively on all analyzed variables. This shows that, with the rising usage of social media, there are great opportunities for new, effective advertising strategies that could include a new type of endorser – friends.
Abstract:
This article aims to reconstruct the critical debate regarding the crisis in the disciplines of art history and art criticism, with a particular focus on the proposals formulated by the U.S. theorists who contributed to the journal October. The discrediting of many modernist critical methods, particularly that of Clement Greenberg – the formalist diktat – marked the birth of the journal and gave rise to proposals set forth by critics committed to a new approach. Their divergent positions have nonetheless contributed to undermining the traditional concepts of the autonomy of art and criticism. The proposals discussed over the course of the journal's publication were the result of a reappraisal of the disciplinary instruments of art history and criticism in response to the crucial cultural changes that took place in the 1980s.
Abstract:
This article addresses the work of Mizrahi women artists, i.e., Israeli-Jewish women of Asian or African ethnic origin, using the artist Vered Nissim as a case study. Nissim seeks to affirm the politics of identity and recognition, as well as feminism, in order to create a paradigm shift with regard to the local regime of cultural representations in the Israeli art scene. Endeavouring to find ways of undermining the rigid imbalances between different social groups, she calls for a comprehensive reform of the status quo through artistic activism. Nissim employs a style, content, and medium that disrupt the accepted social order, using humour and irony as unique weapons with which she takes liberties with conventional moral, social, and economic values. Placing issues of race, class and gender at the centre of her work, she seeks to undermine and problematize essentialist attitudes, highlighting the political intersections of different identity categories in line with the critical analysis of intersectionality.
Abstract:
Both culture coverage and digital journalism are contemporary phenomena that have undergone several transformations within a short period of time. Whenever the media enter a period of uncertainty such as the present one, they attempt to innovate in order to seek sustainability, sidestep the crisis or find a new public. This indicates that there are new trends to be understood and explored: how are media innovating in a digital environment? Not only does the professional debate about the future of journalism justify the need to explore the issue, but so do the academic approaches to cultural journalism. However, no study so far has considered innovation as a motto or driver and tried to explain how the media are covering culture, achieving sustainability and engaging with readers in a digital environment. This research examines how European media that specialize in culture, or have an important cultural section, are innovating in a digital environment. Specifically, we look at how these innovation strategies play out in the approach to culture and dominant cultural areas, editorial models, the use of digital tools for telling stories, overall brand positioning and extensions, engagement with the public, and business models. We conducted a mixed-methods study combining case studies of four media projects, integrating qualitative web feature and content analysis with quantitative web content analysis. The four case studies chosen were two major general-interest journalistic brands that started as print newspapers, The Guardian (London, UK) and Público (Lisbon, Portugal); a magazine specialized in international affairs, culture and design, Monocle (London, UK); and a digital-native media project launched by a cultural organization, Notodo, by La Fábrica. Findings suggest, on the one hand, that we are witnessing a paradigm shift in culture coverage in a digital environment, challenging traditional boundaries related to cultural themes and scope, angles, genres, content format and delivery, engagement and business models. Innovation in the four case studies lies especially along the product dimensions (format and content), brand positioning and process (business model and ways to engage with users). On the other hand, there are still perennial values that are crucial to innovation and sustainability, such as commitment to journalism, consistency (to the reader, to brand extensions and to the advertiser), intelligent differentiation, and the capability of knowing what innovation means and how it can be applied, since this thesis also confirms that one formula doesn't suit all. Changing minds, overcoming cultural inertia and optimizing the memory of websites, looking at them as living, organic bodies that continuously interact with readers in many different ways rather than as a closed collection of articles, remain the main challenges for some media.
Abstract:
The use of chemical analysis of microbial components, including proteins, became an important achievement for microbial identification in the 1980s. It led to a more objective identification scheme, called chemotaxonomy, whose main analytical tools are 1D/2D gel electrophoresis, spectrophotometry, high-performance liquid chromatography, gas chromatography, and combined gas chromatography-mass spectrometry. The Edman degradation reaction was also applied to peptide sequencing, giving important insights for microbial identification. The rapid development of these techniques, together with the knowledge generated by DNA sequencing and by phylogenies based on rRNA gene and housekeeping gene sequences, boosted microbial identification to an unparalleled scale. Recent results of mass spectrometry (MS), namely Matrix-Assisted Laser Desorption/Ionisation Time-of-Flight (MALDI-TOF), show considerable promise for rapid and reliable microbial identification. The technique is also inexpensive in terms of labour and consumables when compared with other biological techniques. At present, MALDI-TOF MS adds a step to polyphasic identification, which is essential when there is a paucity of characters or when high DNA homologies make it difficult to delimit very closely related species. The full impact of this approach is now being appreciated as more diverse species are studied in detail and successfully identified. However, even with the best polyphasic system, identification of some taxa remains time-consuming, and determining what represents a species remains subjective. The possibilities opened by new and even more robust mass spectrometers, combined with sound and reliable databases, allow not only microbial identification based on proteome fingerprinting but also, as an additional step, de novo sequencing of specific proteins. These approaches are pushing the boundaries of the microbial identification field.
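A minimal sketch of the proteome-fingerprint matching idea, assuming peak lists have already been extracted from the MALDI-TOF spectra; the m/z range, bin width and cosine-similarity scoring are illustrative simplifications of what commercial identification databases do:

```python
import numpy as np

def bin_spectrum(mz, intensity, lo=2000.0, hi=20000.0, width=5.0):
    """Collapse a peak list into a unit-norm intensity vector over m/z bins."""
    vec = np.zeros(int((hi - lo) / width))
    for m, i in zip(mz, intensity):
        if lo <= m < hi:
            vec[int((m - lo) / width)] += i
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec

def identify(query, reference_db):
    """Rank reference species by cosine similarity to the query fingerprint."""
    scores = {species: float(vec @ query) for species, vec in reference_db.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical two-entry reference database and a query spectrum (m/z, intensity).
db = {
    "E. coli":   bin_spectrum([4365, 5096, 9742], [1.0, 0.6, 0.9]),
    "S. aureus": bin_spectrum([3875, 6888, 9632], [0.8, 1.0, 0.5]),
}
query = bin_spectrum([4366, 5098, 9740], [0.9, 0.7, 1.0])
print(identify(query, db))  # "E. coli" ranks first
```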
Abstract:
Dissertation for the integrated master's degree in Telecommunications and Informatics Engineering
Abstract:
Dissertation for the master's degree in Administration of Justice
Abstract:
Dissertation for the master's degree in Interactive Media