381 results for User generated contents (UGC)


Relevance: 20.00%

Abstract:

This paper explores novel driving experiences that make use of gamification and augmented reality in the car. We discuss our design considerations, which are grounded in road safety psychology and video game design theory. We aim to address the tension between safe driving practices and player engagement. Specifically, we propose a holistic, iterative thinking process inspired by game design cognition and share our insights generated through the application of this process. We present preliminary game concepts that blend digital components with physical elements from the driving environment. We further highlight how this design process helped us to iteratively evolve these concepts towards being safer while maintaining fun. These insights and game design cognition itself will be useful to the AutomotiveUI community investigating similar novel driving experiences.

Relevance: 20.00%

Abstract:

In 2009, the National Research Council of the National Academies released a report on A New Biology for the 21st Century. The council preferred the term ‘New Biology’ to capture the convergence and integration of the various disciplines of biology. The National Research Council stressed: ‘The essence of the New Biology, as defined by the committee, is integration—re-integration of the many sub-disciplines of biology, and the integration into biology of physicists, chemists, computer scientists, engineers, and mathematicians to create a research community with the capacity to tackle a broad range of scientific and societal problems.’ They define the ‘New Biology’ as ‘integrating life science research with physical science, engineering, computational science, and mathematics’.

The National Research Council reflected: 'Biology is at a point of inflection. Years of research have generated detailed information about the components of the complex systems that characterize life––genes, cells, organisms, ecosystems––and this knowledge has begun to fuse into greater understanding of how all those components work together as systems. Powerful tools are allowing biologists to probe complex systems in ever greater detail, from molecular events in individual cells to global biogeochemical cycles. Integration within biology and increasingly fruitful collaboration with physical, earth, and computational scientists, mathematicians, and engineers are making it possible to predict and control the activities of biological systems in ever greater detail.'

The National Research Council contended that the New Biology could address a number of pressing challenges. First, it stressed that the New Biology could ‘generate food plants to adapt and grow sustainably in changing environments’. Second, the New Biology could ‘understand and sustain ecosystem function and biodiversity in the face of rapid change’. Third, the New Biology could ‘expand sustainable alternatives to fossil fuels’. Moreover, it was hoped that the New Biology could lead to a better understanding of individual health: ‘The New Biology can accelerate fundamental understanding of the systems that underlie health and the development of the tools and technologies that will in turn lead to more efficient approaches to developing therapeutics and enabling individualized, predictive medicine.’

Biological research has certainly been changing direction in response to changing societal problems. Over the last decade, increasing awareness of the impacts of climate change and dwindling supplies of fossil fuels can be seen to have generated investment in fields such as biofuels, climate-ready crops and storage of agricultural genetic resources. In considering biotechnology’s role in the twenty-first century, biological future-predictor Carlson’s firm Biodesic states: ‘The problems the world faces today – ecosystem responses to global warming, geriatric care in the developed world or infectious diseases in the developing world, the efficient production of more goods using less energy and fewer raw materials – all depend on understanding and then applying biology as a technology.’ This collection considers the roles of intellectual property law in regulating emerging technologies in the biological sciences.
Stephen Hilgartner comments that patent law plays a significant part in social negotiations about the shape of emerging technological systems or artefacts: 'Emerging technology – especially in such hotbeds of change as the life sciences, information technology, biomedicine, and nanotechnology – became a site of contention where competing groups pursued incompatible normative visions. Indeed, as people recognized that questions about the shape of technological systems were nothing less than questions about the future shape of societies, science and technology achieved central significance in contemporary democracies. In this context, states face ongoing difficulties trying to mediate these tensions and establish mechanisms for addressing problems of representation and participation in the sociopolitical process that shapes emerging technology.' The introduction to the collection will provide a thumbnail, comparative overview of recent developments in intellectual property and biotechnology – as a foundation to the collection. Section I of this introduction considers recent developments in United States patent law, policy and practice with respect to biotechnology – in particular, highlighting the Myriad Genetics dispute and the decision of the Supreme Court of the United States in Bilski v. Kappos. Section II considers the cross-currents in Canadian jurisprudence in intellectual property and biotechnology. Section III surveys developments in the European Union – and the interpretation of the European Biotechnology Directive. Section IV focuses upon Australia and New Zealand, and considers the policy responses to the controversy of Genetic Technologies Limited’s patents in respect of non-coding DNA and genomic mapping. Section V outlines the parts of the collection and the contents of the chapters.

Relevance: 20.00%

Abstract:

The first User-Focused Service Engineering, Consumption and Aggregation workshop (USECA) in 2011 was held in conjunction with the WISE 2011 conference in Sydney, Australia. Web services and related technology are a widely accepted standard architectural paradigm for application development. The idea of reusing existing software components to build new applications has been well documented and supported for the world of enterprise computing and professional developers. However, this powerful idea has not been transferred to end-users who have limited or no computing knowledge. The current methodologies, models, languages and tools developed for Web service composition are suited to IT professionals and people with years of training in computing technologies. It is still hard to imagine any of these technologies being used by business professionals, as opposed to computing professionals. © 2013 Springer-Verlag.

Relevance: 20.00%

Abstract:

Section 180 of the Property Law Act 1974 (Qld) makes provision for an applicant to seek a statutory right of user over a neighbour’s property where such a right of use is reasonably necessary in the interests of effective use in any reasonable manner of the dominant land. A key issue in an application under s 180 is compensation. Unfortunately, while s 180 expressly contemplates that an order will include provision for payment of compensation to the owner of the servient land, there are certain issues that are less clear. One of these is the basis for determination of the amount of compensation. In this regard, s 180(4)(a) provides that, in making an order for a statutory right of user, the court ‘(a) shall, except in special circumstances, include provision for payment by the applicant to such person or persons as may be specified in the order of such amount by way of compensation or consideration as in the circumstances appears to the court to be just’. The operation of this statutory provision was considered by de Jersey CJ (as he then was) in Peulen v Agius [2015] QSC 137.

Relevance: 20.00%

Abstract:

Analysing the engagement of students in university-based Facebook groups can shed light on the nature of their learning experience and highlight leverage points to build on student success. While post-semester surveys and demographic participation data can highlight who was involved and how they subsequently felt about the experience, these techniques do not necessarily reflect real-time engagement. One way to gain insight into in-situ student experiences is by categorising the original posts and comments into predetermined frameworks of learning. This paper offers a systematic method of coding Facebook contributions within various engagement categories: motivation, discourse, cognition and emotive responses.
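
To illustrate how coded contributions might be tallied against these engagement categories, here is a minimal sketch; the record layout, field names and sample posts are illustrative assumptions, not the authors' coding instrument.

```python
from collections import Counter
from dataclasses import dataclass

# Engagement categories named in the abstract; everything else below is illustrative.
CATEGORIES = {"motivation", "discourse", "cognition", "emotive"}

@dataclass
class Contribution:
    post_id: str
    author: str
    text: str
    category: str  # one label per contribution, assigned by a human coder

def tally_engagement(contributions):
    """Count coded Facebook contributions per engagement category."""
    counts = Counter(c.category for c in contributions if c.category in CATEGORIES)
    return {cat: counts.get(cat, 0) for cat in sorted(CATEGORIES)}

sample = [
    Contribution("p1", "s1", "Can anyone explain question 3?", "cognition"),
    Contribution("p2", "s2", "Good luck everyone for the exam!", "motivation"),
]
print(tally_engagement(sample))
```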

Relevance: 20.00%

Abstract:

Historically, drug use has been understood as a problem of epidemiology, psychiatry, physiology, and criminality requiring legal and medical governance. Consequently drug research tends to be underpinned by an imperative to better govern, and typically proposes policy interventions to prevent or solve drug problems. We argue that categories of ‘addictive’ and ‘recreational’ drug use are discursive forms of governance that are historically, politically and socially contingent. These constructions of the drug problem shape what drug users believe about themselves and how they enact these beliefs in their drug use practices. Based on qualitative interviews with young illicit drug users in Brisbane, Australia, this paper uses Michel Foucault’s concept of governmentality to provide insights into how the governance of illicit drugs intersects with self-governance to create a drug user self. We propose a reconceptualisation of illicit drug use that takes into account the contingencies and subjective factors that shape the drug experience. This allows for an understanding of the relationships between discourses, policies, and practices in constructions of illicit drug users.

Relevance: 20.00%

Abstract:

Background: Chlamydia (C.) trachomatis is the most prevalent bacterial sexually transmitted infection worldwide and the leading cause of preventable blindness. Genetic approaches to investigate C. trachomatis have only recently been developed, owing to the organism’s intracellular developmental cycle. HtrA is a critical stress-response serine protease and chaperone in many bacteria; in C. trachomatis it has previously been shown, using a chemical inhibitor of CtHtrA activity, to be important for heat stress and the replicative phase of development. In this study, chemically induced SNVs in the cthtrA gene that resulted in amino acid substitutions (A240V, G475E, and P370L) were identified and characterized.
Methods: SNVs were initially characterized biochemically in vitro using recombinant protein techniques to confirm a functional impact on proteolysis. The C. trachomatis strains containing the SNVs with marked reductions in proteolysis were investigated in cell culture to identify phenotypes that could be linked to CtHtrA function.
Results: The strain harboring the SNV with the most marked impact on proteolysis (cthtrAP370L) showed a significant reduction in the production of infectious elementary bodies.
Conclusions: This provides genetic evidence that CtHtrA is critical for the C. trachomatis developmental cycle.

Relevance: 20.00%

Abstract:

Electronic cigarette-generated mainstream aerosols were characterized in terms of particle number concentrations and size distributions through a Condensation Particle Counter and a Fast Mobility Particle Sizer spectrometer, respectively. A thermodilution system was also used to properly sample and dilute the mainstream aerosol. Different types of electronic cigarettes, liquid flavors, liquid nicotine contents, as well as different puffing times were tested. Conventional tobacco cigarettes were also investigated. The total particle number concentration peak (for a 2-s puff), averaged across the different electronic cigarette types and liquids, was measured at 4.39 ± 0.42 × 10⁹ part. cm⁻³, comparable to that of the conventional cigarette (3.14 ± 0.61 × 10⁹ part. cm⁻³). Puffing times and nicotine contents were found to influence the particle concentration, whereas no significant differences were recognized in terms of flavors and types of cigarettes used. Particle number distribution modes of the electronic cigarette-generated aerosol were in the 120–165 nm range, similar to that of the conventional cigarette.

Relevance: 20.00%

Abstract:

During their entire lives, people are exposed to the pollutants present in indoor air. Recently, Electronic Nicotine Delivery Systems, mainly known as electronic cigarettes, have been widely commercialized: they deliver particles into the lungs of users, but a “second-hand smoke” has yet to be associated with this indoor source. On the other hand, the naturally occurring radioactive gas radon represents a significant risk for lung cancer, and the cumulative action of these two agents could be worse than that of either agent alone. To investigate in more depth the interaction between radon progeny and second-hand aerosol from different types of cigarettes, a dedicated experimental study was carried out by generating aerosol from e-cigarette vaping as well as from second-hand traditional smoke inside a walk-in radon chamber at the National Institute of Ionizing Radiation Metrology (INMRI) of Italy. In this chamber, the radon present in the air comes naturally from the floor and ambient conditions are controlled. To characterize the sidestream smoke emitted by cigarettes, condensation particle counters and a scanning mobility particle sizer were used. Radon concentration in the air was measured with an Alphaguard ionization chamber, whereas radon decay products in the air were measured with the Tracelab BWLM Plus-2S Radon daughter Monitor. An increase of the Potential Alpha-Energy Concentration (PAEC), due to the radon decay products attached to the aerosol, was found at higher particle number concentrations: it varied from 7.47 ± 0.34 MeV L⁻¹ to 12.6 ± 0.26 MeV L⁻¹ (69%) for the e-cigarette. In the case of the traditional cigarette, and at the same radon concentration, the increase was from 14.1 ± 0.43 MeV L⁻¹ to 18.6 ± 0.19 MeV L⁻¹ (31%). The equilibrium factor also increases, varying from 23.4% ± 1.11% to 29.5% ± 0.26% and from 30.9% ± 1.0% to 38.1% ± 0.88% for the e-cigarette and traditional cigarette, respectively. These increases persist for a long time after combustion, raising the exposure risk.
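
As a quick arithmetic check of the relative increases reported above, the following sketch recomputes them from the rounded values quoted in the abstract (the small discrepancy for the traditional cigarette comes from rounding).

```python
def relative_increase(before, after):
    """Percentage increase from 'before' to 'after'."""
    return (after - before) / before * 100.0

# Values as quoted in the abstract (PAEC in MeV/L, equilibrium factor in %).
print(f"e-cigarette PAEC:            {relative_increase(7.47, 12.6):.0f}%")  # ~69%
print(f"traditional cigarette PAEC:  {relative_increase(14.1, 18.6):.0f}%")  # ~32% from rounded values; reported as 31%
print(f"e-cigarette equilibrium F:   {relative_increase(23.4, 29.5):.0f}%")
print(f"traditional equilibrium F:   {relative_increase(30.9, 38.1):.0f}%")
```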

Relevance: 20.00%

Abstract:

Species identification based on short sequences of DNA markers, that is, DNA barcoding, has emerged as an integral part of modern taxonomy. However, software for the analysis of large and multilocus barcoding data sets is scarce. The Basic Local Alignment Search Tool (BLAST) is currently the fastest tool capable of handling large databases (e.g. >5000 sequences), but its accuracy is a concern and it has been criticized for its local optimization. More accurate current software, however, requires sequence alignment or complex calculations, which are time-consuming when dealing with large data sets during data preprocessing or during the search stage. It is therefore imperative to develop a practical program for both accurate and scalable species identification for DNA barcoding. In this context, we present VIP Barcoding: user-friendly software with a graphical user interface for rapid DNA barcoding. It adopts a hybrid, two-stage algorithm. First, an alignment-free composition vector (CV) method is utilized to reduce the search space by screening a reference database. The alignment-based K2P distance nearest-neighbour method is then employed to analyse the smaller data set generated in the first stage. In comparison with other software, we demonstrate that VIP Barcoding has (i) higher accuracy than Blastn and several alignment-free methods and (ii) higher scalability than alignment-based distance methods and character-based methods. These results suggest that this platform is able to deal with both large-scale and multilocus barcoding data with accuracy and can contribute to DNA barcoding for modern taxonomy. VIP Barcoding is free and available at http://msl.sls.cuhk.edu.hk/vipbarcoding/.
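
A minimal sketch of the alignment-based second stage described above (K2P distance plus nearest-neighbour assignment), assuming pre-aligned sequences of equal length; the helper names and toy reference data are illustrative, and VIP Barcoding's actual implementation may differ.

```python
import math

PURINES = {"A", "G"}
PYRIMIDINES = {"C", "T"}

def k2p_distance(seq1: str, seq2: str) -> float:
    """Kimura two-parameter (K2P) distance between two aligned sequences.

    Assumes equal length; sites with gaps or ambiguity codes are skipped.
    """
    transitions = transversions = length = 0
    for a, b in zip(seq1.upper(), seq2.upper()):
        if a not in "ACGT" or b not in "ACGT":
            continue  # skip gaps / ambiguous bases
        length += 1
        if a == b:
            continue
        if (a in PURINES and b in PURINES) or (a in PYRIMIDINES and b in PYRIMIDINES):
            transitions += 1
        else:
            transversions += 1
    p, q = transitions / length, transversions / length
    return -0.5 * math.log((1 - 2 * p - q) * math.sqrt(1 - 2 * q))

def nearest_neighbour(query: str, references: dict) -> tuple:
    """Assign the query to the reference with the smallest K2P distance."""
    return min((k2p_distance(query, seq), species) for species, seq in references.items())

refs = {"Species A": "ACGTACGTACGT", "Species B": "ACGTACGAACGA"}
print(nearest_neighbour("ACGTACGTACGA", refs))
```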

Relevance: 20.00%

Abstract:

Gac fruits were physically measured and stored under ambient conditions for up to 2 weeks to observe changes in the carotenoid contents (lycopene and beta carotene) of the aril. Initial concentrations in the aril ranged from 2.378 to 3.728 mg/g fresh weight (FW) for lycopene and from 0.257 to 0.379 mg/g FW for beta carotene. Carotenoid concentrations in the aril remained stable after 1 week but declined sharply after 2 weeks of storage. Gac oil, pressed from gac aril, had similar concentrations of lycopene and beta carotene (2.436 and 2.592 mg/g, respectively). Oil was treated with 0.02% butylated hydroxytoluene, flushed with a stream of nitrogen, or left untreated, and then stored in the dark for up to 15 or 19 weeks at different temperatures (5 °C, ambient, 45 °C and 60 °C). Lycopene and beta carotene in the control gac oil degraded following a first-order kinetic model. The degradation rates of lycopene and beta carotene in the treated oil samples were lower than those in the control oil, but first-order kinetics were not always followed. However, both lycopene and beta carotene degraded quickly in gac oil, following first-order kinetics, under high-temperature conditions (45 and 60 °C) regardless of the treatment used. © 2009 Elsevier Ltd. All rights reserved.
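
A minimal sketch of fitting the first-order kinetic model C(t) = C0 * exp(-k t) referred to above, by linear regression on log-concentration; the time points and concentrations are placeholder values for illustration, not measurements from the study.

```python
import numpy as np

def fit_first_order(t_weeks, concentrations):
    """Fit C(t) = C0 * exp(-k t) by linear regression on ln(C).

    Returns (C0, k), where k is the first-order degradation rate constant (per week).
    """
    slope, intercept = np.polyfit(t_weeks, np.log(concentrations), 1)
    return float(np.exp(intercept)), float(-slope)

# Placeholder data (mg/g oil over storage weeks), for illustration only.
t = np.array([0, 3, 6, 9, 12, 15])
lycopene = np.array([2.44, 1.95, 1.57, 1.26, 1.01, 0.81])

c0, k = fit_first_order(t, lycopene)
half_life = np.log(2) / k
print(f"C0 = {c0:.2f} mg/g, k = {k:.3f} per week, half-life = {half_life:.1f} weeks")
```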

Relevance: 20.00%

Abstract:

This paper investigates the effects of experience on the intuitiveness of physical and visual interactions performed by airport security screeners. Using portable eye-tracking glasses, 40 security screeners were observed in the field as they performed search, examination and interface interactions during airport security x-ray screening. Data from semi-structured interviews were used to further explore the nature of visual and physical interactions. Results show there are positive relationships between experience and the intuitiveness of the visual and physical interactions performed by security screeners. As experience is gained, security screeners are found to perform search, examination and interface interactions more intuitively. In addition to experience, the results suggest that intuitiveness is affected by the nature and modality of the activities performed. This inference was made on the basis of the dominant processing styles associated with search and examination activities. The paper concludes by discussing the implications of this research for the design of visual and physical interfaces. We recommend designing interfaces that build on users’ already established intuitive processes, and that reduce the cognitive load incurred during transitions between visual and physical interactions.

Relevance: 20.00%

Abstract:

Gene expression is arguably the most important indicator of biological function. Thus, identifying differentially expressed genes is one of the main aims of high-throughput studies that use microarray and RNAseq platforms to study deregulated cellular pathways. There are many tools for analysing differential gene expression from transcriptomic datasets. The major challenge in this area is estimating gene expression variance, owing to the high amount of ‘background noise’ generated by biological equipment and the lack of biological replicates. Bayesian inference has been widely used in the bioinformatics field. In this work, we show that the prior knowledge employed in the Bayesian framework also helps to improve the accuracy of differential gene expression analysis when using a small number of replicates. We have developed a differential analysis tool that uses Bayesian estimation of the variance of gene expression for use with small numbers of biological replicates. Our method is more consistent than the widely used Cyber-T tool, which successfully introduced the Bayesian framework to differential analysis. We also provide a user-friendly, web-based graphical user interface for biologists to use with microarray and RNAseq data. Bayesian inference can compensate for the instability of variance estimates caused by a small number of biological replicates by using pseudo replicates as prior knowledge. We also show that our new strategy for selecting pseudo replicates improves the performance of the analysis.
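
The abstract does not give implementation details, so the following is a minimal sketch of the general variance-shrinkage idea behind Cyber-T-style Bayesian estimates: each gene's observed variance is blended with a background (prior) variance taken from genes of similar mean expression, which act as pseudo replicates. The window size and prior weight are illustrative assumptions, not the authors' settings.

```python
import numpy as np

def bayesian_variance(expression, n_prior=10, window=101):
    """Shrink per-gene variance toward a local background variance.

    expression: (genes x replicates) array for one condition, with at least
    two replicates. Genes with similar mean expression serve as pseudo
    replicates: the prior variance for each gene is the mean sample variance
    within a sliding window of `window` genes ranked by mean expression.
    """
    n = expression.shape[1]
    means = expression.mean(axis=1)
    variances = expression.var(axis=1, ddof=1)

    order = np.argsort(means)
    half = window // 2
    prior_var = np.empty_like(variances)
    for rank, gene in enumerate(order):
        lo = max(0, rank - half)
        hi = min(len(order), rank + half + 1)
        prior_var[gene] = variances[order[lo:hi]].mean()

    # Posterior (shrunken) variance: weighted combination of prior and observed.
    return (n_prior * prior_var + (n - 1) * variances) / (n_prior + n - 2)

# Illustrative usage with simulated data: 500 genes, 3 replicates.
rng = np.random.default_rng(0)
data = rng.normal(size=(500, 3))
print(bayesian_variance(data)[:5])
```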

Relevance: 20.00%

Abstract:

Many websites now provide the facility for users to rate the quality of items based on their opinions. These ratings are later used to produce item reputation scores. The majority of websites apply the mean method to aggregate user ratings. This method is very simple and is not considered an accurate aggregator. Many methods have been proposed to make aggregators produce more accurate reputation scores. In the majority of proposed methods the authors use extra information about the rating providers or about the context (e.g. time) in which the rating was given. However, this information is not always available, and in such cases these methods fall back on the mean method or other simple alternatives. In this paper, we propose a novel reputation model that generates more accurate item reputation scores based on the collected ratings alone. Our proposed model embeds previously disregarded statistical properties of a given rating dataset in order to enhance the accuracy of the generated reputation scores. In more detail, we use the Beta distribution to produce weights for ratings and aggregate the ratings using the weighted mean method. Experiments show that the proposed model outperforms current state-of-the-art models.
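
The abstract does not spell out the weighting scheme, so the sketch below shows one plausible reading: fit a Beta distribution to the normalized ratings by the method of moments and weight each rating by the fitted density before taking the weighted mean. The normalization and fallback behaviour are assumptions for illustration only.

```python
import math
from statistics import fmean, pvariance

def beta_pdf(x, a, b):
    """Density of Beta(a, b) at x in (0, 1)."""
    log_c = math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
    return math.exp(log_c + (a - 1) * math.log(x) + (b - 1) * math.log(1 - x))

def beta_weighted_reputation(ratings, scale=5):
    """Aggregate 1..scale star ratings into a reputation score.

    Ratings are mapped into (0, 1), a Beta distribution is fitted by the
    method of moments, and each rating is weighted by the fitted density,
    so ratings consistent with the bulk of the data count more than outliers.
    """
    xs = [(r - 0.5) / scale for r in ratings]       # map 1..scale into (0, 1)
    m, v = fmean(xs), pvariance(xs)
    if v == 0:                                      # all ratings identical
        return fmean(ratings)
    common = m * (1 - m) / v - 1
    if common <= 0:                                 # variance too large for a Beta fit
        return fmean(ratings)
    a, b = m * common, (1 - m) * common
    weights = [beta_pdf(x, a, b) for x in xs]
    return sum(w * r for w, r in zip(weights, ratings)) / sum(weights)

print(beta_weighted_reputation([5, 5, 4, 5, 1]))    # the outlying rating of 1 is down-weighted
```

Under this reading, ratings near the bulk of the fitted distribution receive higher weight, so isolated outliers pull the aggregated score less than they would under the plain mean.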