Abstract:
TESSA is a toolkit for experimenting with sensory augmentation. It includes hardware and software to facilitate rapid prototyping of interfaces that can enhance one sense using information gathered from another sense. The toolkit contains a range of sensors (e.g. ultrasonic rangefinders, temperature sensors) and actuators (e.g. tactors or stereo sound), designed modularly so that inputs and outputs can be easily swapped in and out and customized using TESSA’s graphical user interface (GUI), with real-time feedback. The system runs on a Raspberry Pi with a built-in touchscreen, providing a compact and portable form well suited to field trials. At CHI Interactivity, the audience will have the opportunity to experience sensory augmentation effects using this system and to design their own sensory augmentation interfaces.
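As an illustration of the kind of cross-sensory mapping such a toolkit prototypes, the sketch below maps an ultrasonic distance reading onto vibrotactile intensity. It is a generic C sketch with stubbed hardware shims; none of the function names belong to TESSA's actual API.

```c
#include <stdio.h>
#include <unistd.h>

/* Hypothetical hardware shims, NOT TESSA's API: on a real device these
 * would talk to the ultrasonic sensor and the tactor driver. */
static double read_ultrasonic_cm(void)    { return 120.0; }            /* stub */
static void   set_tactor_pwm(double duty) { printf("duty=%.2f\n", duty); }

int main(void)
{
    const double max_cm = 400.0;           /* assumed 4 m sensor range */
    for (;;) {
        double d = read_ultrasonic_cm();
        if (d > max_cm) d = max_cm;
        set_tactor_pwm(1.0 - d / max_cm);  /* closer obstacle -> stronger buzz */
        usleep(50 * 1000);                 /* ~20 Hz update loop */
    }
}
```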
Abstract:
This paper describes a fast integer sorting algorithm, herein referred to as Bit-index sort, which is a non-comparison sorting algorithm for partial permutations, with linear complexity order in execution time. Bit-index sort uses a bit-array to classify input sequences of distinct integers, and exploits hardware-supported built-in bit functions in C compilers to retrieve the ordered output sequence. Results show that Bit-index sort outperforms quicksort and counting sort in execution time. A parallel approach for Bit-index sort using two simultaneous threads is included, which obtains speedups of up to 1.6.
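A minimal single-threaded sketch of the idea as described, in C: mark one bit per key, then scan the words and extract set bits with a compiler builtin. The value bound and names are illustrative; this is not the paper's implementation.

```c
#include <stdio.h>
#include <stdint.h>
#include <string.h>

#define MAX_VAL 4096               /* illustrative bound: keys in [0, MAX_VAL) */
#define WORDS   (MAX_VAL / 64)

/* Sort n DISTINCT integers below MAX_VAL into ascending order. */
static void bit_index_sort(const uint32_t *in, size_t n, uint32_t *out)
{
    uint64_t bits[WORDS];
    memset(bits, 0, sizeof bits);

    for (size_t i = 0; i < n; i++)                  /* classify: set one bit per key */
        bits[in[i] >> 6] |= 1ULL << (in[i] & 63);

    size_t k = 0;
    for (size_t w = 0; w < WORDS; w++) {
        uint64_t word = bits[w];
        while (word) {                              /* retrieve in ascending order */
            int b = __builtin_ctzll(word);          /* GCC/Clang: count trailing zeros */
            out[k++] = (uint32_t)(w * 64 + b);
            word &= word - 1;                       /* clear lowest set bit */
        }
    }
}

int main(void)
{
    uint32_t in[] = {42, 7, 1999, 3, 512}, out[5];
    bit_index_sort(in, 5, out);
    for (int i = 0; i < 5; i++) printf("%u ", out[i]);  /* 3 7 42 512 1999 */
    putchar('\n');
    return 0;
}
```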
Abstract:
Subdermal magnetic implants originated as an art form in the world of body modification, but to date an in-depth scientific analysis of the benefits of this implant has yet to be established. This research explores the concept of extending the tactile sense utilising this form of implantation. The relatively simple procedure enables the tactile sense to respond to static and alternating magnetic fields. This is not to say that the underlying biology of the system has changed; the concept does not increase our tactile frequency response range or sensitivity to pressure, but it does invoke a perceptual response to a stimulus that is not innately available to humans. Within this research, two social surveys were conducted: one to ascertain the social acceptance of the general notion of human enhancement, and one to capture the perceptual experiences of individuals with the magnetic implants themselves. Regarding acceptance of the notion of sensory improvement via implantation, ~39% of the general population questioned responded positively, with a further ~25% undecided. Thus, with careful dissemination, a large proportion of individuals might adopt a technology such as this if it became available to consumers. Interestingly, ~60% of the respondents to the magnetic implant survey had undergone the implant for "magnetic vision" purposes. The main contribution of this research, however, comes from a series of psychophysical tests in which 7 subjects with subdermal magnetic implants were compared with 7 subjects who had similar magnets superficially attached to their dermis. The experimentation examined multiple psychometric thresholds, including intensity, frequency and temporal thresholds. Whilst relatively simple, the experimental setup was novel in that custom hardware and protocols were created to determine the subjective thresholds of the individuals. The overall purpose of this research is to utilise this concept in high-stress scenarios, such as driving or piloting, whereby alerts and warnings could be relayed to an operator without intruding upon their other (typically overloaded) exterior senses, i.e. the auditory and visual senses. Hence each of the thresholding experiments was designed with the intention of utilising the results in the design of signals for information transfer. The findings show that the implanted group significantly outperformed the superficial group in the absolute intensity threshold experiment, i.e. the implanted group required significantly less force than the superficial group to perceive the stimulus. The results for the frequency difference threshold showed no significant difference between the two groups. Interestingly, however, at low frequencies (20 and 50 Hz) the subjects' ability to discriminate frequencies significantly increased with more complex waveforms, i.e. square and sawtooth, compared with the typically used sine wave. Furthermore, a novel protocol for establishing the temporal gap detection threshold during a temporal numerosity study is established in this thesis. This experiment measured the subjects' capability to correctly determine the number of concatenated signals presented to them as the time between the signals, referred to as pulses, tended to zero.
A significant finding was that altering the length, frequency and number of cycles of the pulses changed the inter-pulse time required for correct recognition. This finding will ultimately aid the design of tactile alerts for this method of information transfer. Preliminary development work on using this method of input to the body in an automotive scenario is also presented in this thesis, in the form of a driving simulation whose overall goal is to present warning alerts to a driver, such as rear-end collision or excessive speed warnings, in order to prevent incidents and penalties. The broader utility of this implant is also discussed, reflecting on its potential use as a basis for vibrotactile and sensory-substitution devices. The discussion closes with postulations on its use as a human-machine interface, as well as on how a similar implant could be used within the ear as a hearing aid device.
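The thesis's custom threshold protocols are not detailed in the abstract; for readers unfamiliar with psychophysical thresholding, a generic adaptive staircase of the kind commonly used for absolute intensity thresholds looks roughly like the C sketch below. The subject-response function is a toy stand-in; none of the names come from the thesis.

```c
#include <stdbool.h>
#include <stdio.h>

/* Toy stand-in: a real run would present the stimulus at this intensity
 * and record whether the subject reports feeling it. */
static bool subject_detects(double intensity) { return intensity > 0.30; }

int main(void)
{
    double level = 1.0, step = 0.2;
    int reversals = 0;
    bool last_seen = true;

    while (reversals < 8) {              /* stop after 8 direction changes */
        bool seen = subject_detects(level);
        if (seen != last_seen) { reversals++; step /= 2.0; }
        level += seen ? -step : step;    /* 1-up-1-down rule: converge on threshold */
        last_seen = seen;
    }
    printf("estimated threshold ~ %.3f\n", level);
    return 0;
}
```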
Abstract:
Bloom filters are a data structure for storing data in compressed form. They offer excellent space and time efficiency at the cost of some loss of accuracy (so-called lossy compression). This work presents the yes-no Bloom filter, a data structure consisting of two parts: the yes-filter, which is a standard Bloom filter, and the no-filter, another Bloom filter whose purpose is to represent those objects that were recognised incorrectly by the yes-filter (that is, to recognise the false positives of the yes-filter). By querying the no-filter after an object has been recognised by the yes-filter, we get a chance to reject it, which improves the accuracy of data recognition in comparison with a standard Bloom filter of the same total length. A further increase in accuracy is possible if one chooses the objects to include in the no-filter so that it recognises as many false positives as possible but no true positives, thus producing the most accurate yes-no Bloom filter among all yes-no Bloom filters. This paper studies how optimization techniques can be used to maximize the number of false positives recognised by the no-filter, under the constraint that it recognise no true positives. To achieve this aim, an Integer Linear Program (ILP) is proposed for the optimal selection of false positives. In practice the problem size is normally large, making the optimal solution intractable. Given the similarity of the ILP to the Multidimensional Knapsack Problem, an Approximate Dynamic Programming (ADP) model is developed, using a reduced ILP for the value function approximation. Numerical results show that the ADP model performs best compared with a number of heuristics as well as the CPLEX built-in branch-and-bound solver, and it is therefore what we recommend for use in yes-no Bloom filters. In the wider context of the study of lossy compression algorithms, our research is an example of how the arsenal of optimization methods can be applied to improving the accuracy of compressed data.
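The paper itself concerns optimally populating the no-filter; the underlying query logic is simple. A minimal sketch in C, assuming illustrative sizes and two toy hash functions (a real filter would use k independent hashes):

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

#define BITS 1024   /* illustrative length for each of the two filters */

typedef struct { uint8_t b[BITS / 8]; } bloom;

/* Two toy multiplicative hashes; real filters use k independent hashes. */
static size_t h1(uint64_t x) { return (size_t)((x * 0x9E3779B97F4A7C15ULL) % BITS); }
static size_t h2(uint64_t x) { return (size_t)((x * 0xC2B2AE3D27D4EB4FULL) % BITS); }

static void set_bit(bloom *f, size_t i)       { f->b[i / 8] |= (uint8_t)(1u << (i % 8)); }
static bool get_bit(const bloom *f, size_t i) { return f->b[i / 8] & (1u << (i % 8)); }

static void bloom_add(bloom *f, uint64_t x)         { set_bit(f, h1(x)); set_bit(f, h2(x)); }
static bool bloom_query(const bloom *f, uint64_t x) { return get_bit(f, h1(x)) && get_bit(f, h2(x)); }

/* yes-no Bloom filter: accept only if the yes-filter matches AND the
 * no-filter (trained on known false positives of the yes-filter) does not. */
typedef struct { bloom yes, no; } yes_no_bloom;

static bool ynb_query(const yes_no_bloom *f, uint64_t x)
{
    return bloom_query(&f->yes, x) && !bloom_query(&f->no, x);
}

int main(void)
{
    yes_no_bloom f = {0};
    bloom_add(&f.yes, 42);    /* a stored object */
    bloom_add(&f.no, 1337);   /* pretend 1337 was found to be a false positive */
    return ynb_query(&f, 42) ? 0 : 1;
}
```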
Abstract:
Architectures based on Coordinated Atomic action (CA action) concepts have been used to build concurrent fault-tolerant systems. This conceptual model combines concurrent exception handling with action nesting to provide a general mechanism both for enclosing interactions among system components and for coordinating forward error recovery measures. This article presents an architectural model to guide the formal specification of concurrent fault-tolerant systems. The architecture provides built-in Communicating Sequential Processes (CSP) specifications and predefined channels to coordinate the exception handling of user-defined components. Hence safety properties concerning action scoping and concurrent exception handling can be proved using the FDR (Failures-Divergences Refinement) verification tool. As a result, a formal and general architecture supporting software fault tolerance is ready to be used and proved as users define components with normal and exceptional behaviors.
Abstract:
This study was undertaken to evaluate the prevalence of GB virus C (GBV-C) viraemia and anti-E2 antibody, and to assess the effect of co-infection with GBV-C and HIV during a 10-year follow-up of a cohort of 248 HIV-infected women. Laboratory variables (mean and median CD4 counts, and HIV and GBV-C viral loads) and clinical parameters were investigated. At baseline, 115 women had past exposure to GBV-C: 57 (23%) were GBV-C RNA positive and 58 (23%) were anti-E2 positive. There was no statistical difference between the groups (GBV-C RNA + /anti-E2 -, GBV-C RNA - /anti-E2 + and GBV-C RNA - /anti-E2 -) regarding baseline CD4 counts or HIV viral loads (P = 0.360 and 0.713, respectively). Relative risk of death for the GBV-C RNA + /anti-E2 - group was 63% lower than that for the GBV-C RNA - /anti-E2 - group. Multivariate analysis demonstrated that only HIV loads >= 100,000 copies/mL and AIDS-defining illness during follow-up were associated with shorter survival after AIDS development. It is likely that antiretroviral therapy (ART) use in our cohort blurred a putative protective effect related to the presence of GBV-C RNA.
Abstract:
If you move from Sweden to Portugal to work as an Art Director at an ad agency, there are certain things you ought to know, perhaps even before you decide to move. Quite a lot is different in Portugal compared with Sweden: not only the language, but also the design, the software, the working hours and much more. For example, you receive double salary for two months of the year, you work until six p.m., and maternity leave gives 100% compensation, tax-free, for the first four months of every pregnancy. The tasks you handle as an Art Director in Portugal may, as in Sweden, be varied; the difference lies more in how you solve a task and what the design looks like. As a Swede in Portugal, the solution will probably look like a mixture of the Portuguese “expressive” and the Swedish “clean” design. This degree project discusses questions that might come up when a Swede is employed as an Art Director in Portugal. What will the salary be? When, in the morning, should you show up at the office? How many weeks of vacation are you entitled to? There are many differences, but also many similarities. The report is mainly based on the author’s own experiences at the ad agency McCann-Erickson in Lisbon. McCann-Erickson’s history began in New York in 1930, and the firm grew rapidly into one of the leading corporations in its field, with agencies all over the world. It remains one of the world’s largest ad companies, with clients such as Coca-Cola and General Motors, and after almost a century its slogan “truth well told” still lights up all of its agencies like a guiding star.
Abstract:
IPTV is now offered by several operators in Europe, the US and Asia using broadcast video over private IP networks that are isolated from the Internet. IPTV services rely on transmission of live (real-time) and/or stored video. Video on Demand (VoD) and time-shifted TV are implemented by IP unicast, while Broadcast TV (BTV) and near video on demand are implemented by IP multicast. IPTV services require QoS guarantees and can tolerate no more than a 10⁻⁶ packet loss probability, 200 ms delay, and 50 ms jitter. Low delay is essential for satisfactory trick-mode performance (pause, resume, fast forward) in VoD, and for fast channel change times in BTV. Internet Traffic Engineering (TE) is defined in RFC 3272 and involves both capacity management and traffic management. Capacity management includes capacity planning, routing control, and resource management. Traffic management includes (1) nodal traffic control functions such as traffic conditioning, queue management, and scheduling, and (2) other functions that regulate traffic flow through the network or arbitrate access to network resources. An IPTV network architecture includes multiple networks (core network, metro network, access network and home network) that connect devices (super head-end, video hub office, video serving office, home gateway, set-top box). Each IP router in the core and metro networks implements some queueing and packet scheduling mechanism at the output link controller. Popular schedulers in IP networks include Priority Queueing (PQ), Class-Based Weighted Fair Queueing (CBWFQ), and Low Latency Queueing (LLQ), which combines PQ and CBWFQ. The thesis analyzes several packet scheduling algorithms that can optimize the trade-off between system capacity and end-user performance for the traffic classes. FIFO, PQ and GPS queueing methods had previously been implemented in the simulator. This thesis implements the LLQ scheduler in the simulator and evaluates the performance of these packet schedulers. The simulator was provided by Ernst Nordström; it was built in the Visual C++ 2008 environment and tested and analyzed in Matlab 7.0 under Windows Vista.
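LLQ's behaviour can be summarised as strict priority for the latency-sensitive class, policed so it cannot starve the other classes, which share the remaining bandwidth CBWFQ-style. A minimal selection-logic sketch in C; the data layout, token-bucket policing and class count are illustrative, not the simulator's actual code:

```c
#include <stdio.h>

#define NCLASS 3

typedef struct {
    int  backlog[NCLASS];   /* packets waiting per weighted data class */
    int  quantum[NCLASS];   /* bytes credited per visit ~ CBWFQ weight */
    int  deficit[NCLASS];   /* deficit counter per class (DRR-style) */
    int  pq_backlog;        /* strict-priority queue (e.g. BTV video) */
    long pq_tokens;         /* token bucket policing the priority queue */
} llq;

/* Pick the next queue to serve: -1 = priority queue, 0..NCLASS-1 = weighted
 * class, -2 = nothing eligible. pkt_len is in bytes. */
static int llq_select(llq *q, int pkt_len)
{
    if (q->pq_backlog > 0 && q->pq_tokens >= pkt_len) {
        q->pq_tokens -= pkt_len;       /* priority traffic, within its rate cap */
        q->pq_backlog--;
        return -1;
    }
    for (int round = 0; round < NCLASS; round++) {   /* bounded scan keeps sketch simple */
        for (int c = 0; c < NCLASS; c++) {
            if (q->backlog[c] == 0) { q->deficit[c] = 0; continue; }
            q->deficit[c] += q->quantum[c];
            if (q->deficit[c] >= pkt_len) {
                q->deficit[c] -= pkt_len;
                q->backlog[c]--;
                return c;
            }
        }
    }
    return -2;
}

int main(void)
{
    llq q = { .backlog = {3, 2, 1}, .quantum = {1500, 1000, 500},
              .pq_backlog = 2, .pq_tokens = 3000 };
    for (;;) {
        int c = llq_select(&q, 1200);
        if (c == -2) break;
        printf("serve queue %d (-1 = priority)\n", c);
    }
    return 0;
}
```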
Abstract:
Increasingly strict legal requirements make it difficult to improve the energy efficiency of existing buildings without changing their appearance. The purpose of this thesis is to investigate how large an energy-efficiency improvement can be achieved for three existing single-family houses built during the 20th century by improving the buildings' climate shells, that is, roofs, walls, floors, windows and doors, without distorting the buildings' appearance while preserving their cultural-historical value. The work consisted of a pre-study in which three buildings were identified, an investigation phase in which information about the buildings was collected, and a concluding phase in which energy-saving measures were proposed and evaluated. Buildings that were good representatives of their time and style were sought, and buildings from the 1910s, 1930s and 1970s were located. Case studies with interviews and inventories were then carried out. To assess each building's climate shell, U-value and energy calculations were performed for the existing buildings and for the buildings with the proposed measures applied. After the proposed measures, none of the buildings reached the passive-house requirement of 59 kWh/year/m² Atemp or the BBR requirement of 110 kWh/year/m² Atemp for a building's specific energy use. The largest energy-efficiency improvements achievable for the three 20th-century buildings without distorting their appearance while preserving their cultural-historical values are 13.0, 49.7 and 64.8 kWh/year/m² Atemp, respectively. The conclusions are that buildings from the 1910s can be improved by insulating the windows, fitting an extra door on the inside of the front door, and adding insulation to the sloping roof. Buildings from the 1930s can be improved by insulating the windows with an insulating pane on the inside of the window and the doors with an extra door on the inside of the front door. For buildings from the 1970s, the windows can be replaced with energy-efficient windows, no measure is taken for the floor, and the facade is insulated externally with vacuum insulation. The 1970s building fared best in the comparison because it was in authentic condition from the start, so the improvement was larger than for, for example, the 1910s building, which had already been altered before measures were proposed.
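For context, the U-value calculations mentioned here conventionally combine the envelope layers' thermal resistances in series (the textbook relation, e.g. as standardised in ISO 6946; not a formula quoted from the thesis):

$$ U = \frac{1}{R_{si} + \sum_i d_i/\lambda_i + R_{se}} \qquad \left[\mathrm{W/m^2 K}\right] $$

where $d_i$ and $\lambda_i$ are the thickness and thermal conductivity of layer $i$, and $R_{si}$, $R_{se}$ are the internal and external surface resistances. Adding insulation increases $\sum_i d_i/\lambda_i$ and thus lowers $U$, which is what the proposed measures for roofs, walls, windows and doors aim at.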
Abstract:
Ukraine has repeatedly shifted between the two sub-types of semi-presidentialism, i.e. between premier-presidentialism and president-parliamentarism. The aim of this article is to discuss to what extent theoretical arguments against premier-presidential and president-parliamentary systems are relevant for understanding the shifting directions of the Ukrainian regime. As a point of departure, I formulate three main claims from the literature: 1) “President-parliamentarism is less conducive to democratization than premier-presidentialism.”; 2) “Semi-presidentialism in both its variants has built-in incentives for intra-executive conflict between the president and the prime minister.”; 3) “Semi-presidentialism in general, and president-parliamentarism in particular, encourages presidentialization of political parties.” I conclude from the study’s empirical overview that the president-parliamentary system, the constitutional arrangement with the most dismal record of democratization, has been instrumental in strengthening presidential dominance and authoritarian tendencies. The premier-presidential period 2006–2010 was by no means smooth and stable, but presidential dominance weakened and the survival of the government was firmly anchored in the parliament. During this period, there were also indications of a gradual strengthening of institutional capacity among the main political parties, and the parliament began to emerge as a significant political arena.
Abstract:
A system built in terms of autonomous agents may require even greater correctness assurance than one which merely reacts to the immediate control of its users. Agents make substantial decisions for themselves, so thorough testing is an important consideration. However, autonomy also makes testing harder: by their nature, autonomous agents may react in different ways to the same inputs over time because, for instance, their goals and knowledge change. For this reason, we argue that testing autonomous agents requires a procedure that caters for a wide range of test-case contexts, and that can search for the most demanding of these test cases, even when they are not apparent to the agents’ developers. In this paper, we address this problem, introducing and evaluating an approach to testing autonomous agents that uses evolutionary optimization to generate demanding test cases.
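A minimal sketch of the evolutionary loop the paper describes, in C: test cases are parameter vectors, and fitness scores how demanding a test case is for the agent. The evaluate_agent() function below is a toy stand-in; in the paper's setting it would run the agent on the test case and measure how close it came to failing its requirements.

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define POP 20
#define GENES 8
#define GENERATIONS 100

typedef struct { double genes[GENES]; double fitness; } testcase;

/* Toy stand-in: a real harness would execute the agent on the encoded
 * test case and score how close it came to violating its requirements. */
static double evaluate_agent(const double *genes, size_t n)
{
    double s = 0.0;
    for (size_t i = 0; i < n; i++) s += genes[i];
    return s;
}

static double frand(void) { return (double)rand() / RAND_MAX; }

static const testcase *tournament(const testcase *pop)   /* binary tournament */
{
    const testcase *a = &pop[rand() % POP], *b = &pop[rand() % POP];
    return a->fitness > b->fitness ? a : b;
}

static void evolve(testcase *pop)
{
    for (int g = 0; g < GENERATIONS; g++) {
        for (int i = 0; i < POP; i++)
            pop[i].fitness = evaluate_agent(pop[i].genes, GENES);

        testcase next[POP];
        for (int i = 0; i < POP; i++) {
            const testcase *p1 = tournament(pop), *p2 = tournament(pop);
            int cut = rand() % GENES;                /* one-point crossover */
            for (int k = 0; k < GENES; k++)
                next[i].genes[k] = k < cut ? p1->genes[k] : p2->genes[k];
            if (frand() < 0.1)                       /* occasional mutation */
                next[i].genes[rand() % GENES] += frand() - 0.5;
        }
        memcpy(pop, next, sizeof next);
    }
    for (int i = 0; i < POP; i++)                    /* score final generation */
        pop[i].fitness = evaluate_agent(pop[i].genes, GENES);
}

int main(void)
{
    testcase pop[POP];
    for (int i = 0; i < POP; i++)
        for (int k = 0; k < GENES; k++)
            pop[i].genes[k] = frand();               /* random initial tests */
    evolve(pop);
    int best = 0;
    for (int i = 1; i < POP; i++)
        if (pop[i].fitness > pop[best].fitness) best = i;
    printf("most demanding test found: fitness %.3f\n", pop[best].fitness);
    return 0;
}
```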
Abstract:
The goal of this paper is to investigate how the United States federal government, specifically through the National Endowment for the Arts (NEA), has acted in the position of an arts patron in the past few decades. Specifically, this paper focuses on the past decade and a half, since the 'arts crisis' of the late 1980s and the social and political backlash against the art community in the 1990s, a backlash directed only at 'offensive' art seen as morally and culturally corruptive. I explore the political, social, and economic forms the backlash took, rooted particularly in a perceived fear of degenerate art as both a corruption of American culture and values and a catalyst for their eventual collapse. Additionally, I analyse the role the federal government played in 'ameliorating' the situation. I investigate how state arts patronage has affected and continues to affect both the concepts behind and the manifestations of art, as well as who is encouraged, sanctioned, or neglected in the production of art. To accomplish this, I explore how and why the federal government employs the arts to define and redefine morality and culture, and how it expresses, or allows the expression of, these through art.
Abstract:
As far back as I can remember, I have always been interested in studio art. Whether it be painting, drawing, printmaking, or photography, it has consistently been a part of my life. Upon enrolling at Colby, I became interested in computers and decided to major in Computer Science. Not forgetting past interests, I continued my studio art education, taking several classes within the Art department. In due time, I began combining interests and studying Computer Graphics and Design. With limited resources in this field at Colby, the majority of my computer graphics education and experience has been gained on my own time, apart from regular classroom work. As time progressed, so did my interests. Starting with simple image manipulation of digitally scanned photographs, I moved on to web page design, eventually leading to desktop publishing. Ultimately, I wanted to take a step further and expand my overall computer graphics knowledge by learning 3D modeling and animation. With even fewer resources in 3D animation at Colby, I anticipated having trouble finding the information and tools I would need to gain the necessary skills in this new field. The Senior Scholars program gave me the opportunity to find and acquire the necessary tools to pursue my interest, and allowed me to devote the proper amount of time to learning them.
Abstract:
We begin this work with a comparative study of two compression load cells, both cylindrical in shape and each containing a plate clamped internally at mid-height of the body. Both load cells are made of SAE 1045 steel. For one of them, designated Load Cell I, we adapted a theoretical mathematical model through which its dimensions were predetermined, aiming at optimal behaviour with respect to stress and strain distributions. In the other, designated Load Cell II, we altered some dimensions in order to compare its behaviour with that of the first. Throughout the work we also analyse and compare the mathematical results with the practical values found. Based on the results of this preliminary study, we designed, built and analysed five further load cells (Load Cells III, IV, V, VI and VII) in terms of geometry, material and heat treatment, aiming to improve such force transducers with regard to machining, electrical signal response, limitations and industrial applications.
Abstract:
This work discusses the application of Operational Research models as a basis for operations management and technology analysis in a freight transport company. First, the scope of the work is placed in the context of the current competitive environment, so that the connections between existing operations systems and the models presented can be perceived. A preliminary discussion of the concept of logistics is also provided, since the term admits a range of meanings. A review then follows of modelling tools commonly applied to supply, distribution and transshipment problems. Based on this review, a hierarchical approach to logistics problems is proposed. A case study carried out in a freight transport company then applies some Operational Research tools at the various levels of operations management. Finally, the work assesses the potential of a hierarchical approach and the importance of adopting such models in freight transport companies as a way of leveraging their competitiveness in the face of new market demands.
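Among the modelling tools such a review typically covers, the supply/distribution/transshipment problems mentioned above are classically written as a minimum-cost network flow program; a textbook formulation (not the case study's actual model) is:

$$ \min \sum_{(i,j)\in A} c_{ij}\,x_{ij} \quad \text{s.t.} \quad \sum_{j:(i,j)\in A} x_{ij} - \sum_{j:(j,i)\in A} x_{ji} = b_i \;\; \forall i \in N, \qquad 0 \le x_{ij} \le u_{ij}, $$

where $x_{ij}$ is the flow shipped on arc $(i,j)$ at unit cost $c_{ij}$, $u_{ij}$ is the arc capacity, and $b_i$ is positive at supply nodes, negative at demand nodes and zero at pure transshipment nodes (with $\sum_i b_i = 0$).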