884 results for Digital video
Abstract:
With the availability of huge amounts of video data from various sources, efficient video retrieval tools are increasingly in demand. Since video is multi-modal data, the perception of "relevance" between the user-provided query video (in Query-By-Example video search) and the retrieved video clips is subjective in nature. We present an efficient video retrieval method that takes the user's feedback on the relevance of retrieved videos and iteratively reformulates the input query feature vectors (QFV) for improved video retrieval. The QFV reformulation is done by a simple but powerful feature weight optimization method based on the Simultaneous Perturbation Stochastic Approximation (SPSA) technique. A video retrieval system with video indexing, searching and relevance feedback (RF) phases is built to demonstrate the performance of the proposed method. The query and database videos are indexed using conventional video features such as color, texture, etc. However, we use comprehensive and novel methods of feature representation, and a spatio-temporal distance measure, to retrieve the top M videos that are most similar to the query. In the feedback phase, user-activated iterative feedback on the previously retrieved videos is used to reformulate the QFV weights (a measure of importance) to reflect the user's preference automatically. We observe that a few iterations of such feedback are generally sufficient for retrieving the desired video clips. The novel application of SPSA-based RF for user-oriented feature weight optimization distinguishes the proposed method from existing ones. Experimental results show that the proposed RF-based video retrieval exhibits good performance.
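The abstract names SPSA as the engine of the feature-weight reformulation. As a purely illustrative sketch (the relevance_score callback, step sizes and non-negativity constraint below are assumptions, not the authors' implementation), one SPSA iteration on the QFV weights could look like this:

```python
import numpy as np

def spsa_update_weights(weights, relevance_score, a=0.1, c=0.05, rng=None):
    """One SPSA iteration: estimate the gradient of a relevance objective
    from two perturbed evaluations and move the feature weights uphill.

    relevance_score(w) is assumed to return a scalar measuring how well the
    weighted query features rank the videos the user marked as relevant."""
    rng = rng or np.random.default_rng()
    # Rademacher (+/-1) simultaneous perturbation of every weight
    delta = rng.choice([-1.0, 1.0], size=weights.shape)
    j_plus = relevance_score(weights + c * delta)
    j_minus = relevance_score(weights - c * delta)
    # SPSA gradient estimate: only two evaluations regardless of dimension
    grad_hat = (j_plus - j_minus) / (2.0 * c * delta)
    new_w = weights + a * grad_hat      # ascend the relevance objective
    return np.clip(new_w, 0.0, None)    # keep feature weights non-negative
```

Each relevance-feedback round would supply a fresh relevance_score built from the user's latest judgements, so a few such iterations re-weight the colour, texture and other features toward the user's preference.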
Abstract:
Digital holography is the direct recording of holograms using a CCD camera and is an alternative to the use of a film or a plate. In this communication, in-line digital holographic microscopy has been explored for its application in particle imaging in 3D. Holograms of particles of about 10 µm size have been digitally reconstructed. Digital focusing was done to image the particles in different planes along the depth of focus. Digital holographic particle imaging results were compared with conventional optical microscope imaging. A methodology for dynamic analysis of microparticles in 3D using in-line digital holography has been proposed.
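As a hedged illustration of the kind of numerical refocusing such work relies on (the angular spectrum method is one standard choice; the wavelength, pixel size and depths are placeholders, not the paper's parameters):

```python
import numpy as np

def angular_spectrum_refocus(hologram, wavelength, pixel_size, z):
    """Propagate a recorded hologram to a plane at distance z (metres)
    using the angular spectrum method, a common way to refocus
    in-line digital holograms numerically."""
    ny, nx = hologram.shape
    fx = np.fft.fftfreq(nx, d=pixel_size)
    fy = np.fft.fftfreq(ny, d=pixel_size)
    FX, FY = np.meshgrid(fx, fy)
    # Free-space propagation transfer function (evanescent terms suppressed)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    H = np.exp(1j * 2 * np.pi * z / wavelength * np.sqrt(np.maximum(arg, 0.0)))
    field_z = np.fft.ifft2(np.fft.fft2(hologram) * H)
    return np.abs(field_z) ** 2  # reconstructed intensity in the z-plane
```

Scanning z over a range of depths yields the stack of focal planes in which individual micrometre-scale particles can be located in 3D.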
Abstract:
Online communities have fundamentally changed how humans connect and are now so common that they are fundamental to the human experience. As the Internet developed from Web 1.0 to Web 2.0, the functionality of these communities has far exceeded initial expectations. These communities have shifted from being simply places to share information to being ways to access products and services that bridge the online and offline worlds. This shift has led to the disruption of many industries, the transportation industry being one such sector. Both private transport providers and public transport systems face competition from online communities that are able to link service providers and customers more effectively and innovatively. These types of communities fall under what has been popularised as collaborative consumption or the sharing economy. The aim of this study is to explore the role of Design-led Innovation in the creation of digital futures, specifically online connected communities for successful new mobility solutions. To explore this proposition, multiple data collection methods are proposed: i) Content Analysis, ii) a Comparative Qualitative Study consisting of Qualitative Interviews and Focus Groups / Design Workshops, and iii) an Action Research Cycle of Embedded Practice. The multidisciplinary nature of this study grounds this research in a novel position, contributing new knowledge to the field of design as well as a deeper understanding of the larger, fast-growing online community phenomenon.
Abstract:
With the introduction of the PCEHR (Personally Controlled Electronic Health Record), the Australian public is being asked to accept greater responsibility for the management of their health information. However, the implementation of the PCEHR has been met with poor adoption rates, underscored by criticism from stakeholders with concerns about transparency, accountability, privacy, confidentiality, governance, and limited capabilities. This study adopts an ethnographic lens to observe how information is created and used during the patient journey, and the social factors impacting the adoption of the PCEHR at the micro level, in order to develop a conceptual model that will encourage the sharing of patient information within the cycle of care. Objective: This study aims, firstly, to establish a basic understanding of healthcare professionals' attitudes toward a national platform for sharing patient summary information in the form of a PCEHR. Secondly, the study aims to map the flow of patient-related information as it traverses a patient's personal cycle of care. Thus, an ethnographic approach was used to bring a "real world" lens to information flow in a series of case studies in the Australian healthcare system, to discover themes and issues that are important from the patient's perspective. Design: Qualitative study utilising ethnographic case studies. Setting: Case studies were conducted with primary and allied healthcare professionals located in Brisbane, Queensland, between October 2013 and July 2014. Results: In the first dimension, it was identified that healthcare professionals' concerns about trust and medico-legal issues related to patient control and information quality, and the lack of clinical value available with the PCEHR, emerged as significant barriers to use. The second dimension of the study, which attempted to map patient information flow, identified information quality issues, clinical workflow inefficiencies and interoperability misconceptions, resulting in duplication of effort, unnecessary manual processes, data quality and integrity issues, and an over-reliance on the understanding and communication skills of the patient. Conclusion: Opportunities for process efficiencies, improved data quality and increased patient safety emerge with the adoption of an appropriate information sharing platform. More importantly, large-scale eHealth initiatives must be aligned with the value proposition of individual stakeholders in order to achieve widespread adoption. Leveraging an Australian national eHealth infrastructure and the PCEHR, we offer a practical example of a service-driven digital ecosystem suitable for co-creating value in healthcare.
Abstract:
This paper presents the new trend of FPGA (Field Programmable Gate Array) based digital platforms for the control of power electronic systems. There is rising interest in using digital controllers in power electronic applications as they provide many advantages over their analog counterparts. A board comprising an Altera Cyclone device (EP1C12Q240C8) is used for developing this platform. The details of this board are presented. The developed platform can be used for controller applications such as UPS systems, induction motor drives and front-end converters. Real-time simulation of a system can also be performed. An open-loop induction motor drive has been implemented using this board and experimental results are presented.
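For readers unfamiliar with open-loop induction motor drives, the sketch below shows the volts-per-hertz (V/f) reference generation such a drive typically computes. It is an illustrative Python model, not the FPGA (HDL) implementation described in the paper, and all constants are assumptions.

```python
import numpy as np

def vf_three_phase_references(f_cmd, t, v_rated=1.0, f_rated=50.0, v_boost=0.05):
    """Open-loop V/f control: the stator voltage magnitude tracks the
    commanded frequency so the air-gap flux stays roughly constant.
    Returns the three-phase modulation references at time t (seconds)."""
    v = min(v_rated, v_boost + (v_rated - v_boost) * f_cmd / f_rated)
    theta = 2 * np.pi * f_cmd * t
    return np.array([
        v * np.sin(theta),
        v * np.sin(theta - 2 * np.pi / 3),
        v * np.sin(theta + 2 * np.pi / 3),
    ])

# These references would be compared against a triangular carrier to produce
# the PWM gating signals for the inverter driving the induction motor.
```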
Abstract:
Data generated via user activity on social media platforms is routinely used for research across a wide range of social sciences and humanities disciplines. The availability of data through the Twitter APIs in particular has afforded new modes of research, including in media and communication studies; however, there are practical and political issues with gaining access to such data, and with the consequences of how that access is controlled. In their paper ‘Easy Data, Hard Data’, Burgess and Bruns (2015) discuss both the practical and political aspects of Twitter data as they relate to academic research, describing how communication research has been enabled, shaped and constrained by Twitter’s “regimes of access” to data, the politics of data use, and emerging economies of data exchange. This conceptual model, including the ‘easy data, hard data’ formulation, can also be applied to Sina Weibo. In this paper, we build on this model to explore the practical and political challenges and opportunities associated with the ‘regimes of access’ to Weibo data, and their consequences for digital media and communication studies. We argue that in the Chinese context, the politics of data access can be even more complicated than in the case of Twitter, which makes scientific research relying on large social data from this platform more challenging in some ways, but potentially richer and more rewarding in others.
Abstract:
A high speed digital signal averager with programmable features for the sampling period, for the number of channels and for the number of sweeps is described. The system implements a stable averaging algorithm (Deadroff and Trimble 1968) to provide a stable, calibrated display. The performance of the instrument has been evaluated for the reduction of random noise and for comb-filter action. Special uses of the instrument as a box-car integrator and as a transient recorder are also indicated.
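As an illustration of the general idea behind stable, calibrated averaging (this is a generic recursive running mean, not a reproduction of the Deadroff and Trimble algorithm cited above), a per-channel update might look like:

```python
import numpy as np

def add_sweep(average, sweep, n_sweep):
    """Update the per-channel running average after the n_sweep-th sweep.
    The recursive form keeps the display calibrated (a true mean) after
    every sweep and avoids accumulating a large raw sum."""
    return average + (sweep - average) / n_sweep

# Example: a repetitive signal buried in noise; averaging N sweeps improves
# the signal-to-noise ratio by roughly sqrt(N).
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1024)            # 1024 channels per sweep
signal = np.sin(2 * np.pi * 5 * t)
average = np.zeros_like(t)
for n in range(1, 101):                # 100 sweeps
    sweep = signal + rng.normal(0, 1, t.size)
    average = add_sweep(average, sweep, n)
```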
Abstract:
The 2008 US election has been heralded as the first presidential election of the social media era, but took place at a time when social media were still in a state of comparative infancy; so much so that the most important platform was not Facebook or Twitter, but the purpose-built campaign site my.barackobama.com, which became the central vehicle for the most successful electoral fundraising campaign in American history. By 2012, the social media landscape had changed: Facebook and, to a somewhat lesser extent, Twitter are now well-established as the leading social media platforms in the United States, and were used extensively by the campaign organisations of both candidates. As third-party spaces controlled by independent commercial entities, however, their use necessarily differs from that of home-grown, party-controlled sites: from the point of view of the platform itself, a @BarackObama or @MittRomney is technically no different from any other account, except for the very high follower count and an exceptional volume of @mentions. In spite of the significant social media experience which Democrat and Republican campaign strategists had already accumulated during the 2008 campaign, therefore, the translation of such experience to the use of Facebook and Twitter in their 2012 incarnations still required a substantial amount of new work, experimentation, and evaluation. This chapter examines the Twitter strategies of the leading accounts operated by both campaign headquarters: the ‘personal’ candidate accounts @BarackObama and @MittRomney as well as @JoeBiden and @PaulRyanVP, and the campaign accounts @Obama2012 and @TeamRomney. Drawing on datasets which capture all tweets from and at these accounts during the final months of the campaign (from early September 2012 to the immediate aftermath of the election night), we reconstruct the campaigns’ approaches to using Twitter for electioneering from the quantitative and qualitative patterns of their activities, and explore the resonance which these accounts have found with the wider Twitter userbase. A particular focus of our investigation in this context will be on the tweeting styles of these accounts: the mixture of original messages, @replies, and retweets, and the level and nature of engagement with everyday Twitter followers. We will examine whether the accounts chose to respond (by @replying) to the messages of support or criticism which were directed at them, whether they retweeted any such messages (and whether there was any preferential retweeting of influential or – alternatively – demonstratively ordinary users), and/or whether they were used mainly to broadcast and disseminate prepared campaign messages. Our analysis will highlight any significant differences between the accounts we examine, trace changes in style over the course of the final campaign months, and correlate such stylistic differences with the respective electoral positioning of the candidates. Further, we examine the use of these accounts during moments of heightened attention (such as the presidential and vice-presidential debates, or in the context of controversies such as that caused by the publication of the Romney “47%” video; additional case studies may emerge over the remainder of the campaign) to explore how they were used to present or defend key talking points, and exploit or avert damage from campaign gaffes. 
A complementary analysis of the messages directed at the campaign accounts (in the form of @replies or retweets) will also provide further evidence for the extent to which these talking points were picked up and disseminated by the wider Twitter population. Finally, we also explore the use of external materials (links to articles, images, videos, and other content on the campaign sites themselves, in the mainstream media, or on other platforms) by the campaign accounts, and the resonance which these materials had with the wider follower base of these accounts. This provides an indication of the integration of Twitter into the overall campaigning process, by highlighting how the platform was used as a means of encouraging the viral spread of campaign propaganda (such as advertising materials) or of directing user attention towards favourable media coverage. By building on comprehensive, large datasets of Twitter activity (as of early October, our combined datasets comprise some 3.8 million tweets) which we process and analyse using custom-designed social media analytics tools, and by using our initial quantitative analysis to guide further qualitative evaluation of Twitter activity around these campaign accounts, we are able to provide an in-depth picture of the use of Twitter in political campaigning during the 2012 US election which will provide detailed new insights into social media use in contemporary elections. This analysis will then also be able to serve as a touchstone for the analysis of social media use in subsequent elections, in the USA as well as in other developed nations where Twitter and other social media platforms are utilised in electioneering.
Abstract:
New Video Gamer: Africa Needs More Technology (CNN 12/12/2011) In December 2011, the online edition of the CNN news service (Sutter, 2011) posted a short item about Cwi Nqane, a Khoisan man who entered Samsung's Namibian World Cyber Games (WCG) heats held at the 2011 annual Windhoek Show. Cwi Nqane won a place on the Namibian WCG team playing a smartphone game called Asphalt 6: Adrenaline (Gameloft, 2011). Cwi was presented with a 'top of the line' Samsung Galaxy tablet and subsequently sent to compete in Korea. Later, other news and game news websites re-reported the incident, which inspired a variety of enthusiastic commentary about technology and 'new knowledge'. Then the Kotaku news service picked up the item (Narcisse, 2011) and took a very different slant. Kotaku proposed that Samsung was exploiting Cwi and had assumed the role of a Techno-Tarzan: "striding into Nqane's homeland and swinging him off into the wonders of the modern world where they can trot him out as a curiosity". These two perspectives on the story of Cwi's WCG entry expose two dominant views on Indigenous knowledges and technologies: ICTs as progress for indigenous peoples and ICTs as disruptive and exploitative. Neither position, however, allows for the claiming of digital technology by indigenous communities; indeed, both views position indigenous cultures as outsiders.
Abstract:
The integration of digital technologies in pedagogy is positioned as an important change in education, but widespread innovative use of digital technologies is yet to be truly realised. The gap between the potential and the reality of digital technology integration is commonly attributed to a range of challenging extrinsic and intrinsic influences. Activity Theory (Engeström, 2009) is used to analyse challenges created by extrinsic influences (Nielsen, Miller, & Hoban, 2012); a complementary theory is needed to conceptualise intrinsic influences. System 1 and System 2 thinking theory (Kahneman, 2011) will be advanced as a conceptual framework for understanding conscious and unconscious aspects of teacher practice, particularly the interaction between innovation and teacher routine, attitudes and beliefs. Transformative Learning Theory (Mezirow, 2009) will be positioned to comprehend the nexus of extrinsic and intrinsic influences. This paper will propose how, when faced with extrinsic and intrinsic influences on innovative practice, educators can use these theories to conceptualise the challenge of integrating digital technologies in pedagogy.
Abstract:
Social marketing uses commercial marketing techniques to deliver interventions for social benefit in order to improve quality of life for individuals and communities. Behaviour change is the primary objective of social marketing interventions (Andreasen 2002). The aim of this systematic review is to provide insight into social marketing interventions and their evaluations published in peer-reviewed journals, so as to identify the key elements of social marketing employed by these interventions as well as to understand the use of digital channels for engagement.
Abstract:
This thesis examines the professionalism and translation process of Finnish television subtitlers, and the effects of digital subtitling software on the subtitling process, from the perspective of professional subtitlers. The digitalisation of Finnish television has caused upheavals in the subtitling field as well, as the video material to be subtitled is now delivered to translation agencies and subtitlers in digital form. The theoretical section discusses translation and subtitling research and training in Finland, professional skill and professionalism, and translation aids. Subtitling is presented as a specialised form of translation, while noting that translation is only one stage of the subtitling process. The theoretical section concludes with a discussion of the everyday work and current professional landscape of Finnish television subtitlers: subtitlers work under a wide variety of terms of employment, and quality criteria may have to be reassessed. The empirical section opens by noting that Finnish television subtitlers have been interviewed surprisingly rarely and, drawing on Jääskeläinen's ideas, that much in the field of subtitling remains unexplored, particularly the Finnish subtitling process. The subjects of the study are translators who produce television subtitles professionally. In early winter 2008, a questionnaire was sent to subtitlers working for a Finnish translation agency specialising in subtitling; using both multiple-choice and open questions, it surveyed their professionalism, working methods, translation and subtitling process, professional pride and identity, time management, and the digital subtitling software they use. The study revealed that nearly a third of the respondents hold a neutral or even negative view of their profession; what these subtitlers have in common is that all have less than five years of experience in the field. The majority of respondents, however, are proud to work as professionals of the Finnish language. In the questionnaire, the subtitling process was divided into a preview stage, a translation stage, a cueing (timing) stage and a revision viewing stage. The subtitlers were asked, among other things, to estimate the total duration of their subtitling process. Large differences emerged in the durations, at least some of which correlate with experience. Just over half of the respondents have acquired digital subtitling software of their own, while some still do the cueing at the translation agency, partly because of the high cost of the software. Digital software has brought changes to the subtitling process and working practices, as video recorders and television sets have given way to working on a computer alone. It is now possible to work remotely from distant locations, to alternate between translating and cueing, or to pre-cue and then translate. Digital technology has thus enabled the subtitling process to change and alternative working methods to emerge, but not all of these methods necessarily benefit the subtitler. The traditional subtitling process (preview, marking subtitle divisions in the script, translating and composing the subtitles, corrections and a final check viewing) still appears to be the most efficient. Although working practices differ, the overall impression is that, after the initial stumbles of digitalisation, subtitlers' work has become more efficient.
Abstract:
At present, the most reliable method to obtain end-user perceived quality is through subjective tests. In this paper, the impact of automatic region-of-interest (ROI) coding on the perceived quality of mobile video is investigated. The evidence, which is based on perceptual comparison analysis, shows that the coding strategy improves perceptual quality. This is particularly true in low bit rate situations. The ROI detection method used in this paper is based on two approaches: (1) automatic ROI, obtained by analyzing the visual contents automatically; and (2) eye-tracking-based ROI, obtained by aggregating eye-tracking data across many users, which is used both to evaluate the accuracy of automatic ROI detection and to assess the subjective quality of automatic-ROI-encoded video. The perceptual comparison analysis is based on subjective assessments with 54 participants across different content types, screen resolutions, and target bit rates, comparing the two ROI detection methods. The results from the user study demonstrate that ROI-based video encoding has higher perceived quality compared to normal video encoded at a similar bit rate, particularly in the lower bit rate range.
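To make the idea of ROI coding concrete, the sketch below builds a per-block quantization-parameter map that spends more bits inside the region of interest. The block size and QP values are illustrative and are not the encoder configuration used in the study.

```python
import numpy as np

def roi_qp_map(roi_mask, block=16, qp_roi=26, qp_background=38):
    """Build a per-macroblock quantization-parameter map from a binary
    ROI mask: blocks overlapping the ROI get a lower QP (finer
    quantization, more bits), the background gets a higher QP."""
    h, w = roi_mask.shape
    qp = np.full((h // block, w // block), qp_background, dtype=int)
    for by in range(h // block):
        for bx in range(w // block):
            patch = roi_mask[by*block:(by+1)*block, bx*block:(bx+1)*block]
            if patch.any():
                qp[by, bx] = qp_roi
    return qp

# A mask from either automatic detection or aggregated eye-tracking data
# could serve as roi_mask when driving an ROI-capable encoder.
```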
Abstract:
Complex amplitude encoded in any digital hologram must undergo quantization, usually in either polar or rectangular format. In this paper these two schemes are compared under the constraints and conditions inherent in digital holography. For Fourier transform holograms, when the spectrum is levelled through phase coding, the rectangular format is shown to be optimal. In the absence of phase coding, and also if the amplitude spectrum has a large dynamic range, the polar format may be preferable.
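A small numerical sketch of the two quantization formats being compared (the number of levels and the test field are illustrative, not the paper's experimental settings):

```python
import numpy as np

def quantize(x, levels, lo, hi):
    """Uniformly quantize x to `levels` levels over [lo, hi]."""
    step = (hi - lo) / (levels - 1)
    return lo + np.round((np.clip(x, lo, hi) - lo) / step) * step

def quantize_rectangular(field, levels):
    """Quantize the real and imaginary parts independently."""
    m = np.abs(field).max()
    return quantize(field.real, levels, -m, m) + 1j * quantize(field.imag, levels, -m, m)

def quantize_polar(field, levels):
    """Quantize the amplitude and phase independently."""
    amp = quantize(np.abs(field), levels, 0.0, np.abs(field).max())
    phase = quantize(np.angle(field), levels, -np.pi, np.pi)
    return amp * np.exp(1j * phase)

# Compare mean squared reconstruction error of both formats on a random field.
rng = np.random.default_rng(1)
field = rng.normal(size=256) + 1j * rng.normal(size=256)
for name, q in [("rectangular", quantize_rectangular), ("polar", quantize_polar)]:
    err = np.mean(np.abs(field - q(field, 64)) ** 2)
    print(name, err)
```

Which format wins depends on the statistics of the complex field, which is the trade-off the abstract describes for levelled versus high-dynamic-range spectra.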