Abstract:
Purpose: There are some limited reports, based on questionnaire data, which suggest that outdoor activity decreases the risk of myopia in children and may offset the myopia risk associated with prolonged near work. The aim of this study was to explore the relationship between near work, indoor illumination, daily sunlight and ultraviolet (UV) exposure in emmetropic and myopic university students, given that university students perform significant amounts of near work and as a group have a high prevalence of myopia. Methods: Participants were 35 students, aged 17 to 25 years, who were classified as being emmetropic (n=13), or as having stable (n=12) or progressing (n=10) myopia. During waking hours on three separate days, participants wore a light sensor data logger (HOBO) and a polysulphone UV dosimeter; these devices measured daily illuminance and cumulative UV exposure respectively. Participants also completed a daily activity log. Results: No significant between-group differences were observed for average daily illuminance (p=0.732), number of hours per day spent in sunlight (p=0.266), outdoor shade (p=0.726), bright indoor/dim outdoor light (p=0.574) or dim room illumination (p=0.484). Daily UV exposure was significantly different across the groups (p=0.003), with stable myopes experiencing the greatest UV exposure (versus emmetropes p=0.002; versus progressing myopes p=0.004). Conclusions: The current literature suggests there is a link between myopia protection and time spent outdoors in children. Our data provide some evidence of this relationship in young adults and highlight the need for larger studies to investigate this relationship longitudinally.
Abstract:
Citizen Science projects are initiatives in which members of the general public participate in scientific research and perform or manage research-related tasks such as data collection and/or data annotation. Citizen Science is technologically feasible and scientifically significant. However, as the gathered information comes from the crowd, data quality is always hard to manage. There are many ways to manage data quality, and reputation management is one common approach. In recent years, many research teams have deployed audio or image sensors in natural environments in order to monitor the status of animals or plants, with the collected data analysed by ecologists. However, as the amount of collected data is extremely large and the number of ecologists is very limited, it is impossible for scientists to analyse all of these data manually. The capabilities of existing automated tools to process the data are still very limited and their results are not yet very accurate. Therefore, researchers have turned to recruiting members of the public who are interested in helping scientific research to perform pre-processing tasks such as species tagging. Although research teams can save time and money by recruiting citizens who volunteer their time and skills to help with data analysis, the reliability of contributed data varies considerably. This research therefore aims to investigate techniques to enhance the reliability of data contributed by citizens in scientific research projects, especially acoustic sensing projects. In particular, we aim to investigate how reputation management can be used to enhance data reliability. Reputation systems have been used to resolve uncertainty and improve data quality in many marketing and e-commerce domains, and the commercial organizations that have embraced reputation management and implemented the technology have gained many benefits.
Data quality issues are significant to the domain of Citizen Science due to the quantity and diversity of people and devices involved. However, research on reputation management in this area is relatively new. We therefore start our investigation by examining existing reputation systems in different domains. Then we design novel reputation management approaches for Citizen Science projects to categorise participants and data. We have investigated some critical elements which may influence data reliability in Citizen Science projects. These elements include personal information such as location and education and performance information such as the ability to recognise certain bird calls. The designed reputation framework is evaluated by a series of experiments involving many participants for collecting and interpreting data, in particular, environmental acoustic data. Our research in exploring the advantages of reputation management in Citizen Science (or crowdsourcing in general) will help increase awareness among organizations that are unacquainted with its potential benefits.
Abstract:
Motor unit number estimation (MUNE) is a method which aims to provide a quantitative indicator of the progression of diseases that lead to loss of motor units, such as motor neurone disease. However, the development of a reliable, repeatable and fast real-time MUNE method has hitherto proved elusive. Ridall et al. (2007) implemented a reversible jump Markov chain Monte Carlo (RJMCMC) algorithm to produce a posterior distribution for the number of motor units, using a Bayesian hierarchical model that takes into account biological information about motor unit activation. However, we find that this approach can be unreliable for some datasets, since it can suffer from poor cross-dimensional mixing. Here we focus on improving inference by marginalising over the latent variables to form the likelihood. In particular, we explore how this can improve RJMCMC mixing and investigate alternative approaches that utilise the likelihood (e.g. DIC (Spiegelhalter et al., 2002)). For this model, the marginalisation is over a set of latent binary variables whose joint sample space grows exponentially with the number of motor units, so that for larger numbers of motor units the summation over all combinations is intractable. We provide a tractable and accurate approximation for this quantity and also investigate simulation approaches incorporated into RJMCMC using the results of Andrieu and Roberts (2009).
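The computational difficulty can be illustrated with a minimal sketch (hypothetical unit amplitudes, firing probabilities and noise level; not the authors' implementation): the exact marginal likelihood of one observed response enumerates all 2^N firing patterns, while an unbiased Monte Carlo estimate of the same sum, in the spirit of the pseudo-marginal results of Andrieu and Roberts (2009), avoids the enumeration.

```python
import itertools
import math
import random

def normal_pdf(y, mean, sigma):
    return math.exp(-0.5 * ((y - mean) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def exact_marginal_likelihood(y, amps, p_fire, sigma):
    """Marginal likelihood of one observed amplitude y, summing over all
    2^N firing patterns of N motor units -- intractable for large N."""
    total = 0.0
    for pattern in itertools.product([0, 1], repeat=len(amps)):
        prior = 1.0
        mean = 0.0
        for s, a, p in zip(pattern, amps, p_fire):
            prior *= p if s else (1.0 - p)
            mean += s * a
        total += prior * normal_pdf(y, mean, sigma)
    return total

def mc_marginal_likelihood(y, amps, p_fire, sigma, n_samples=20000, seed=1):
    """Unbiased Monte Carlo estimate of the same sum: sample firing
    patterns from the prior and average the likelihood."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        mean = sum(a for a, p in zip(amps, p_fire) if rng.random() < p)
        total += normal_pdf(y, mean, sigma)
    return total / n_samples
```

For two units the exact sum has only four terms and the Monte Carlo estimate agrees closely; the point is that the estimator's cost does not grow with 2^N.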
Abstract:
Exponential growth of genomic data over the last two decades has made manual analyses impractical for all but trial studies. As genomic analyses have become more sophisticated, and move toward comparisons across large datasets, computational approaches have become essential. One of the most important biological questions is to understand the mechanisms underlying gene regulation. Genetic regulation is commonly investigated and modelled through the use of transcriptional regulatory network (TRN) structures. These model the regulatory interactions between two key components: transcription factors (TFs) and the target genes (TGs) they regulate. Transcriptional regulatory networks have proven to be invaluable scientific tools in Bioinformatics. When used in conjunction with comparative genomics, they have provided substantial insights into the evolution of regulatory interactions. Current approaches to regulatory network inference, however, omit two additional key entities: promoters and transcription factor binding sites (TFBSs). In this study, we set out to explore the relationships among these regulatory components in bacteria. Our primary goal was to identify relationships that can assist in reducing the high false positive rates associated with transcription factor binding site predictions and thereby enhance the reliability of the inferred transcription regulatory networks. In our preliminary exploration of relationships between the key regulatory components in Escherichia coli transcription, we discovered a number of potentially useful features. The combination of location score and sequence dissimilarity scores increased de novo binding site prediction accuracy by 13.6%. Another important observation concerned the relationship between transcription factors grouped by their regulatory role and the corresponding promoter strength.
Our study of E.coli σ70 promoters found support at the 0.1 significance level for our hypothesis that weak promoters are preferentially associated with activator binding sites to enhance gene expression, whilst strong promoters have more repressor binding sites to repress or inhibit gene transcription. Although the observations were specific to σ70, they nevertheless strongly encourage additional investigations when more experimentally confirmed data are available. In our preliminary exploration of relationships between the key regulatory components in E.coli transcription, we discovered a number of potentially useful features, some of which proved successful in reducing the number of false positives when applied to re-evaluate binding site predictions. Of chief interest was the relationship observed between promoter strength and TFs with respect to their regulatory role. Based on the common assumption that promoter homology positively correlates with transcription rate, we hypothesised that weak promoters would have more transcription factors that enhance gene expression, whilst strong promoters would have more repressor binding sites. The t-tests assessed for E.coli σ70 promoters returned a p-value of 0.072, which at the 0.1 significance level suggested support for our (alternative) hypothesis, although this trend may only be present for promoters where the corresponding TFBSs are either all repressors or all activators. Nevertheless, such suggestive results strongly encourage additional investigations when more experimentally confirmed data become available. Much of the remainder of the thesis concerns a machine learning study of binding site prediction, using the SVM and kernel methods, principally the spectrum kernel.
Spectrum kernels have been successfully applied in previous studies of protein classification [91, 92], as well as the related problem of promoter prediction [59], and we have here successfully applied the technique to refining TFBS predictions. The advantages provided by the SVM classifier were best seen in 'moderately' conserved transcription factor binding sites, as represented by our E.coli CRP case study. Inclusion of additional position feature attributes further increased accuracy by 9.1%, but more notable was the considerable decrease in false positive rate from 0.8 to 0.5 while retaining 0.9 sensitivity. Improved prediction of transcription factor binding sites is in turn extremely valuable in improving inference of regulatory relationships, a problem notoriously prone to false positive predictions. Here, the number of false regulatory interactions inferred using the conventional two-component model was substantially reduced when we integrated de novo transcription factor binding site predictions as an additional criterion for acceptance in a case study of inference in the Fur regulon. This initial work was extended to a comparative study of the iron regulatory system across 20 Yersinia strains. This work revealed interesting, strain-specific differences, especially between pathogenic and non-pathogenic strains. Such differences were made clear through interactive visualisations using the TRNDiff software developed as part of this work, and would have remained undetected using conventional methods. This approach led to the nomination of the Yfe iron-uptake system as a candidate for further wet-lab experimentation, due to its potential active functionality in non-pathogens and its known participation in full virulence of the bubonic plague strain. Building on this work, we introduced novel structures we have labelled 'regulatory trees', inspired by the phylogenetic tree concept.
Instead of using gene or protein sequence similarity, the regulatory trees were constructed based on the number of similar regulatory interactions. While common phylogenetic trees convey information regarding changes in gene repertoire, which we might regard as analogous to 'hardware', the regulatory tree informs us of changes in regulatory circuitry, in some respects analogous to 'software'. In this context, we explored the 'pan-regulatory network' for the Fur system: the entire set of regulatory interactions found for the Fur transcription factor across a group of genomes. In the pan-regulatory network, emphasis is placed on how the regulatory network for each target genome is inferred from multiple sources instead of a single source, as is the common approach. The benefit of using multiple reference networks is a more comprehensive survey of the relationships and increased confidence in the predicted regulatory interactions. In the present study, we distinguish between relationships found across the full set of genomes, the 'core-regulatory-set', and interactions found only in a subset of the genomes explored, the 'sub-regulatory-set'. We found nine Fur target gene clusters present across the four genomes studied, this core set potentially identifying basic regulatory processes essential for survival. Species-level differences are seen at the sub-regulatory-set level; for example, the known virulence factors YbtA and PchR were found in Y.pestis and P.aeruginosa respectively, but were not present in either E.coli or B.subtilis. Such factors, and the iron-uptake systems they regulate, are ideal candidates for wet-lab investigation to determine whether or not they are pathogen-specific. In this study, we employed a broad range of approaches to address our goals and assessed these methods using the Fur regulon as our initial case study.
We identified a set of promising feature attributes; demonstrated their success in increasing transcription factor binding site prediction specificity while retaining sensitivity, and showed the importance of binding site predictions in enhancing the reliability of regulatory interaction inferences. Most importantly, these outcomes led to the introduction of a range of visualisations and techniques, which are applicable across the entire bacterial spectrum and can be utilised in studies beyond the understanding of transcriptional regulatory networks.
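The spectrum kernel used in the SVM study above can be sketched with its standard k-mer count construction (a generic illustration, not the thesis code): each sequence is mapped to a vector of counts of every length-k substring, and the kernel is the inner product of those vectors.

```python
from collections import Counter
from itertools import product

def spectrum_features(seq, k=3, alphabet="ACGT"):
    # k-spectrum feature map: counts of every possible length-k string.
    kmers = ["".join(p) for p in product(alphabet, repeat=k)]
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    return [counts.get(m, 0) for m in kmers]

def spectrum_kernel(s1, s2, k=3):
    # Kernel value = inner product of the two k-mer count vectors.
    return sum(a * b for a, b in zip(spectrum_features(s1, k),
                                     spectrum_features(s2, k)))
```

For example, `spectrum_kernel("ACGT", "ACGT", 2)` is 3, since the sequence contains three distinct 2-mers, each once. In practice the explicit feature vector is avoided for large k, but the kernel value is the same.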
Abstract:
In this paper, we review the sequential slotted amplify-decode-and-forward (SADF) protocol with half-duplex single-antenna terminals and evaluate its performance in terms of pairwise error probability (PEP). We obtain the PEP upper bound of the protocol and find that its achievable diversity order is two for an arbitrary number of relay terminals. To achieve the maximum achievable diversity order, we propose a simple precoder that is easy to implement with any number of relay terminals and transmission slots. Simulation results show that the proposed precoder achieves the maximum achievable diversity order and exhibits BER performance similar to that of some existing precoders.
Abstract:
The use of Bayesian methodologies for solving optimal experimental design problems has increased. Many of these methods have been found to be computationally intensive for design problems that require a large number of design points. A simulation-based approach that can be used to solve optimal design problems in which one is interested in finding a large number of (near) optimal design points for a small number of design variables is presented. The approach involves the use of lower dimensional parameterisations that consist of a few design variables, which generate multiple design points. Using this approach, one simply has to search over a few design variables, rather than searching over a large number of optimal design points, thus providing substantial computational savings. The methodologies are demonstrated on four applications, including the selection of sampling times for pharmacokinetic and heat transfer studies, and involve nonlinear models. Several Bayesian design criteria are also compared and contrasted, as well as several different lower dimensional parameterisation schemes for generating the many design points.
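The idea of a lower-dimensional parameterisation can be sketched as follows (a hypothetical power-law spacing rule and a toy utility for a one-compartment decay model; the actual criteria and models in the study differ): two design variables generate all n sampling times, so the search is over two dimensions rather than over n individual time points.

```python
import math

def generate_times(n, t_max, b):
    # Two design variables (t_max, b) generate n sampling times
    # via a power-law spacing rule: t_i = t_max * (i/n)^b.
    return [t_max * (i / n) ** b for i in range(1, n + 1)]

def toy_utility(times, ke=0.2):
    # Toy surrogate utility (hypothetical): sum of squared parameter
    # sensitivities t * exp(-ke * t) of a one-compartment decay model.
    return sum((t * math.exp(-ke * t)) ** 2 for t in times)

# Search over two design variables instead of over n time points.
candidates = [(t_max, b) for t_max in (12, 24, 48) for b in (0.5, 1.0, 2.0)]
best = max(candidates, key=lambda p: toy_utility(generate_times(10, *p)))
```

Here a grid over nine (t_max, b) pairs replaces an optimisation over ten separate sampling times; any optimiser could replace the grid.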
Abstract:
Food has been a major agenda in political, socio-cultural, and environmental domains throughout history. The significance of food has been particularly highlighted in recent years with the growing public awareness of the unfolding impacts of climate change, challenging our understanding, practice, and expectations of our relationship with food. Parallel to this development has been the rise of web applications such as blogs, wikis, video and photo sharing sites, and social networking systems that are arguably more open, collaborative, and personalisable. These so-called ‘Web 2.0’ technologies have contributed to a more participatory Internet experience than had previously been possible. An increasing number of these social applications are now available on mobile technologies, where they take advantage of device-specific features such as sensors and location and context awareness, further expanding the potential for a culture of participation and creativity. This international volume assembles a diverse collection of book chapters that contribute towards exploring and better understanding the opportunities and challenges provided by the tools, interfaces, methods, and practices of social and mobile technology for enabling engagement with people and creativity in the domain of food in contemporary society. It brings together an international group of academics and practitioners from a diverse range of disciplines, such as computing and engineering, social sciences, digital media and human-computer interaction, to critically examine a range of applications of social and mobile technology, such as social networking, mobile interaction, wikis, Twitter, blogging, mapping, shared displays and urban screens, and their impact in fostering a better understanding and practice of environmentally, socio-culturally, economically, and health-wise sustainable food culture.
Abstract:
Over the past 30 years the nature of airport precincts has changed significantly, from purely aviation services to a full range of retail, commercial, industrial and other non-aviation uses. Most major airports in Australia are owned and operated by the private sector but are subject to long-term head leases to the Federal Government, with subsequent sub-leases in place to users of the land. The lease term available for both aviation and non-aviation tenants is subject to the head lease term, and at a number of Australian airport locations these head leases are now two-thirds through their initial 50-year term, raising a number of issues from a valuation and ongoing development perspective. For our airport precincts to continue to offer levels of infrastructure and services that are comparable to or better than many commercial centres in the same location, policy makers need to understand the impact of the uncertainty that arises as the current lease term nears expiry, especially in relation to the renewed lease term and rental payments. This paper reviews the changes in airport precinct ownership, management and development in Australia and highlights the valuation and rental assessment issues currently facing this property sector.
Thinking about Australia and its location in the modern world in the Australian Curriculum: History
Abstract:
The first national history curriculum is being implemented in Australia from 2013. As with the curriculums of other nations, this curriculum has evolved in response to a range of factors and its merits continue to be debated. In critiquing the sort of history education approach encapsulated in the new curriculum, I discuss some of the contextual factors and debates that have shaped the Australian Curriculum: History v0.3 (ACARA, 2012). In doing so, I also explore some of the recent international literature on how students think and learn about history in the classroom. In the third and final part of the paper, I raise some logistical issues and also question how students might engage with the notion of Australia as a nation in the modern world rapidly reshaped by the transformations occurring in Asia and share some concerns about the curriculum’s ‘world history approach’ for Year 10.
Abstract:
Readily accepted knowledge regarding crash causation is consistently omitted from efforts to model, and subsequently understand, motor vehicle crash occurrence and its contributing factors. For instance, distracted and impaired driving accounts for a significant proportion of crash occurrence, yet is rarely modeled explicitly. In addition, spatially allocated influences such as local law enforcement efforts, proximity to bars and schools, and roadside chronic distractions (advertising, pedestrians, etc.) play a role in contributing to crash occurrence and yet are routinely absent from crash models. By and large, these well-established omitted effects are simply assumed to contribute to model error, with the predominant focus on modeling the engineering and operational effects of transportation facilities (e.g. AADT, number of lanes, speed limits, width of lanes, etc.). The typical analytical approach, with a variety of statistical enhancements, has been to model crashes that occur at system locations as negative binomial (NB) distributed events that arise from a singular, underlying crash generating process. These models and their statistical kin dominate the literature; however, it is argued in this paper that these models fail to capture the underlying complexity of motor vehicle crash causes, and thus thwart deeper insights regarding crash causation and prevention. This paper first describes hypothetical scenarios that collectively illustrate why current models mislead highway safety researchers and engineers. It is argued that current model shortcomings are significant and will lead to poor decision-making. Exploiting our current state of knowledge of crash causation, crash counts are postulated to arise from three processes: observed network features, unobserved spatial effects, and 'apparent' random influences that largely reflect behavioral influences of drivers.
It is argued, furthermore, that these three processes can in theory be modeled separately to gain deeper insight into crash causes, and that such a model represents a more realistic depiction of reality than the state-of-practice NB regression. An admittedly imperfect empirical model that mixes three independent crash occurrence processes is shown to outperform the classical NB model. The questioning of current modeling assumptions and the implications of the latent mixture model for current practice are the most important contributions of this paper, with an initial but rather vulnerable attempt to model the latent mixtures as a secondary contribution.
Abstract:
In this study, natural convection heat transfer and buoyancy-driven flows have been investigated in a right-angled triangular enclosure. The heater is located on the bottom wall, while the inclined wall is colder and the remaining walls are maintained as adiabatic. The governing equations of natural convection are solved through a finite volume approach, in which buoyancy is modeled via the Boussinesq approximation. The effects of different parameters such as Rayleigh number, aspect ratio, Prandtl number and heater location are considered. Results show that heat transfer increases when the heater is moved toward the right corner of the enclosure. It is also revealed that increasing the Rayleigh number increases the strength of the free convection regime and consequently increases the heat transfer rate. Moreover, enclosures with larger aspect ratios have larger Nusselt numbers. To give better insight, streamlines and isotherms are also shown.
Abstract:
Technical: This looped video work is made up of a number of still images animated in a sequence, not intended to be a smooth action, but syncopated. The gaps in real time suggest blinking, lapses in technology, warps in time and distance. The still images have been manipulated in Photoshop and imported into an animation software program, with subtle emphasis on various aspects of the features of the face and location from the screen shots. Content: Obsession is one of the most constant expressions of love, whether it is a negative attribute, such as stalking, or the need to see or stroke the beloved, or telling and re-telling the story of first meeting. “like the beat, beat, beat of the tom tom when the jungle shadows fall like the tick, tick, tock of the stately clock as it stands against the wall like the drip, drip, drip of the rain drops when the summer shower’s through a voice within me keeps repeating you, you, you… only you...” (Cole Porter, Night and Day) My desire to immerse myself in my newly-met adult daughter as a reality is as obsessive as any new parent’s. The primary means of contact is the video cam, which has become a kind of surveillance for me. I stare at her mouth as it moves, her ear as she moves her head, her smile, her grimaces, her high cheekbones, her eyes that are like mine, the shape of her head, her teeth that are like her father’s… hundreds of candid pictures, screen-shot over a year and a half, were the source of the video images.
Abstract:
Several major human pathogens, including the filoviruses, paramyxoviruses, and rhabdoviruses, package their single-stranded RNA genomes within helical nucleocapsids, which bud through the plasma membrane of the infected cell to release enveloped virions. The virions are often heterogeneous in shape, which makes it difficult to study their structure and assembly mechanisms. We have applied cryo-electron tomography and sub-tomogram averaging methods to derive structures of Marburg virus, a highly pathogenic filovirus, both after release and during assembly within infected cells. The data demonstrate the potential of cryo-electron tomography methods to derive detailed structural information for intermediate steps in biological pathways within intact cells. We describe the location and arrangement of the viral proteins within the virion. We show that the N-terminal domain of the nucleoprotein contains the minimal assembly determinants for a helical nucleocapsid with variable number of proteins per turn. Lobes protruding from alternate interfaces between each nucleoprotein are formed by the C-terminal domain of the nucleoprotein, together with viral proteins VP24 and VP35. Each nucleoprotein packages six RNA bases. The nucleocapsid interacts in an unusual, flexible "Velcro-like" manner with the viral matrix protein VP40. Determination of the structures of assembly intermediates showed that the nucleocapsid has a defined orientation during transport and budding. Together the data show striking architectural homology between the nucleocapsid helix of rhabdoviruses and filoviruses, but unexpected, fundamental differences in the mechanisms by which the nucleocapsids are then assembled together with matrix proteins and initiate membrane envelopment to release infectious virions, suggesting that the viruses have evolved different solutions to these conserved assembly steps.
Abstract:
Game jams provide design researchers with an extraordinary opportunity to watch creative teams in action, and recent years have seen a number of projects which seek to illuminate the design process as seen in these events. For example, Gaydos, Harris and Martinez discuss the opportunity of the jam to expose students to principles of design process and design spaces (2011). Rouse muses on the game jam ‘as radical practice’ and a ‘corrective to game creation as it is normally practiced’. His observations about his own experience in a jam emphasise the same artistic endeavour forefronted earlier, where the experience is about creation that is divorced from the instrumental motivations of commercial game design (Rouse 2011) and where the focus is on process over product. Other participants remark on the social milieu of the event as a critical factor and the collaborative opportunity as a rich site for engaging participants in design processes (Shin et al., 2012). Shin et al. are particularly interested in the notion of the site of the process and the ramifications of participants being in the same location, and they applaud the more localized event where there is an emphasis on local participation and collaboration. For other commentators, it is specifically the social experience in the place of the jam that is the most important aspect (see Keogh 2011): not the material site but rather the physical, embodied experience of ‘being there’ and being part of the event. Participants talk about game jams they have attended in a manner similar to the observations made by Dourish, where the experience is layered on top of the physical space of the event (Dourish 2006). It is as if the event has taken on qualities of place, where we find echoes of Tuan’s description of a particular site having an aura of history that makes it a very different place, redolent and evocative (Tuan 1977).
Re-presenting the experience in place has become the goal of the data visualisation project that has become the focus of our own curated 48hr game jam. Taking our cue from the work of Tim Ingold on embodied practice, we have now established the 48hr game making challenge as a site for data visualisation research in place.
Abstract:
Herein, the mechanical properties of graphene, including Young’s modulus, fracture stress and fracture strain, have been investigated by molecular dynamics simulations. The simulation results show that the mechanical properties of graphene are sensitive to temperature changes but insensitive to the number of layers in multilayer graphene. Increasing temperature has a significant adverse effect on the mechanical properties of graphene, whereas the adverse effect of an increasing layer number is marginal. Isotope substitutions in graphene, on the other hand, play a negligible role in modifying its mechanical properties.