272 results for short-range ordering
Abstract:
PRESENTED by the Escapists, Boy Girl Wall tells the unlikely yet strangely inevitable story of the series of events by which an odd assortment of people, objects and chance occurrences conspire to bring together lonely neighbours Thomas and Alethea...
Abstract:
SET on a sparse stage with a ladder, a table, a few chairs and a backdrop of plastic sheeting, Hamlet Apocalypse retells the core of Shakespeare's story in combination with the actor's relation to the concept of the end of everything.
Abstract:
Electronic Health Record (EHR) retrieval processes are complex and place heavy, rapidly growing demands on Information Technology (IT) resources, memory in particular. A Database-as-a-Service (DAS) model is proposed to meet the scalability requirements of EHR retrieval. A simulation study using a range of EHR record volumes under the DAS model is presented. A bucket-indexing model, incorporating partitioning fields and Bloom filters within a Singleton design pattern, was used to implement a custom database encryption system. It delivered faster range-query responses than the other query types examined, such as aggregation queries, across the DAS, built-in-encryption and plain-text DBMS configurations. The study also discusses constraints of the approach that should be considered in other practical applications.
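The bucket-indexing idea is easiest to see in code. The sketch below is a minimal, illustrative reading of the approach (coarse buckets over a partitioning field, a per-bucket Bloom filter, and a Singleton around the encryption routine); the class names, the toy XOR "cipher" and the bucket width are assumptions, not the study's implementation.

```python
# Minimal sketch of bucket indexing with per-bucket Bloom filters under a DAS
# model. All names (CipherSingleton, EHRStore, BUCKET_WIDTH) and the toy XOR
# "cipher" are illustrative assumptions, not the study's implementation.
import hashlib

class CipherSingleton:
    """Single shared encryption object, per the Singleton pattern mentioned above."""
    _instance = None

    def __new__(cls, key: bytes = b"demo-key"):
        if cls._instance is None:
            cls._instance = super().__new__(cls)
            cls._instance.key = key
        return cls._instance

    def encrypt(self, plaintext: bytes) -> bytes:
        # Toy XOR stream for illustration only; a real system needs an authenticated cipher.
        return bytes(b ^ self.key[i % len(self.key)] for i, b in enumerate(plaintext))

    decrypt = encrypt  # XOR is its own inverse

class BloomFilter:
    """Compact membership filter over the partitioning-field values in one bucket."""
    def __init__(self, size: int = 256, hashes: int = 3):
        self.size, self.hashes, self.bits = size, hashes, 0

    def _positions(self, item: str):
        for i in range(self.hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:4], "big") % self.size

    def add(self, item: str) -> None:
        for p in self._positions(item):
            self.bits |= 1 << p

    def might_contain(self, item: str) -> bool:
        return all((self.bits >> p) & 1 for p in self._positions(item))

BUCKET_WIDTH = 10  # coarseness of the partitioning field (e.g. patient age)

class EHRStore:
    """Server side: ciphertexts grouped by bucket id, plus one Bloom filter per bucket."""
    def __init__(self):
        self.buckets, self.filters = {}, {}

    def insert(self, field_value: int, record: str) -> None:
        bucket = field_value // BUCKET_WIDTH
        self.buckets.setdefault(bucket, []).append(CipherSingleton().encrypt(record.encode()))
        self.filters.setdefault(bucket, BloomFilter()).add(str(field_value))

    def range_query(self, lo: int, hi: int):
        # Return every candidate bucket overlapping [lo, hi]; the client decrypts and refines.
        for b in range(lo // BUCKET_WIDTH, hi // BUCKET_WIDTH + 1):
            yield from self.buckets.get(b, [])

    def point_query(self, field_value: int):
        # The Bloom filter lets the server skip buckets that cannot hold the value.
        bucket = field_value // BUCKET_WIDTH
        if bucket in self.filters and self.filters[bucket].might_contain(str(field_value)):
            yield from self.buckets[bucket]

store = EHRStore()
store.insert(34, "patient-0001: age 34")
store.insert(61, "patient-0002: age 61")
cipher = CipherSingleton()
print([cipher.decrypt(ct).decode() for ct in store.range_query(30, 40)])
print([cipher.decrypt(ct).decode() for ct in store.point_query(61)])
```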
Abstract:
U-Healthcare refers to the provision of healthcare services "at anytime and anywhere" using wired, wireless and ubiquitous sensor network technologies. As a main field of U-Healthcare, Telehealth has been developed as an enhancement of Telemedicine. The system includes two-way interactive web-video communications, sensor technology, and health informatics. With these components, it can assist patients in receiving an initial diagnosis. Furthermore, Telehealth can help doctors diagnose patients' diseases at early stages and recommend treatments. However, the system has several limitations, such as privacy issues, interruption of real-time service and incorrect orders arising from remote diagnosis. To address these flaws, security procedures such as authorised access should be applied as an indispensable component of the medical environment. As a consequence, a Telehealth system incorporating these protection procedures in clinical services will be able to cope with the anticipated vulnerabilities of U-Healthcare services and the security issues involved.
Time dependency of molecular rate estimates and systematic overestimation of recent divergence times
Abstract:
Studies of molecular evolutionary rates have yielded a wide range of rate estimates for various genes and taxa. Recent studies based on population-level and pedigree data have produced remarkably high estimates of mutation rate, which strongly contrast with substitution rates inferred in phylogenetic (species-level) studies. Using Bayesian analysis with a relaxed-clock model, we estimated rates for three groups of mitochondrial data: avian protein-coding genes, primate protein-coding genes, and primate d-loop sequences. In all three cases, we found a measurable transition between the high, short-term (<1–2 Myr) mutation rate and the low, long-term substitution rate. The relationship between the age of the calibration and the rate of change can be described by a vertically translated exponential decay curve, which may be used for correcting molecular date estimates. The phylogenetic substitution rates in mitochondria are approximately 0.5% per million years for avian protein-coding sequences and 1.5% per million years for primate protein-coding and d-loop sequences. Further analyses showed that purifying selection offers the most convincing explanation for the observed relationship between the estimated rate and the depth of the calibration. We rule out the possibility that it is a spurious result arising from sequence errors, and find it unlikely that the apparent decline in rates over time is caused by mutational saturation. Using a rate curve estimated from the d-loop data, several dates for last common ancestors were calculated: modern humans and Neandertals (354 ka; 222–705 ka), Neandertals (108 ka; 70–156 ka), and modern humans (76 ka; 47–110 ka). If the rate curve for a particular taxonomic group can be accurately estimated, it can be a useful tool for correcting divergence date estimates by taking the rate decay into account. Our results show that it is invalid to extrapolate molecular rates of change across different evolutionary timescales, which has important consequences for studies of populations, domestication, conservation genetics, and human evolution.
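The "vertically translated exponential decay" can be written explicitly; the parameterisation below is one natural reading of that phrase, not the authors' fitted equation.

```latex
% Apparent rate r as a function of calibration age t (illustrative parameter names):
%   r_inf      : long-term (phylogenetic) substitution rate, the vertical translation
%   r_inf + a  : short-term (pedigree/population) rate recovered as t -> 0
%   lambda     : decay constant governing how quickly the apparent rate falls to r_inf
\[
  r(t) \;=\; r_{\infty} + a\, e^{-\lambda t}
\]
% A date estimated with a short-term rate can then be corrected by re-dating the
% node with r(t) evaluated at (or iterated towards) the node's own age.
```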
Abstract:
Long-term changes in the genetic composition of a population occur by the fixation of new mutations, a process known as substitution. The rate at which mutations arise in a population and the rate at which they are fixed are expected to be equal under neutral conditions (Kimura, 1968). Between the appearance of a new mutation and its eventual fate of fixation or loss, there will be a period in which it exists as a transient polymorphism in the population (Kimura and Ohta, 1971). If the majority of mutations are deleterious (and nonlethal), the fixation probabilities of these transient polymorphisms are reduced and the mutation rate will exceed the substitution rate (Kimura, 1983). Consequently, different apparent rates may be observed on different time scales of the molecular evolutionary process (Penny, 2005; Penny and Holmes, 2001). The substitution rate of the mitochondrial protein-coding genes of birds and mammals has been traditionally recognized to be about 0.01 substitutions/site/million years (Myr) (Brown et al., 1979; Ho, 2007; Irwin et al., 1991; Shields and Wilson, 1987), with the noncoding D-loop evolving several times more quickly (e.g., Pesole et al., 1992; Quinn, 1992). Over the past decade, there has been mounting evidence that instantaneous mutation rates substantially exceed substitution rates, in a range of organisms (e.g., Denver et al., 2000; Howell et al., 2003; Lambert et al., 2002; Mao et al., 2006; Mumm et al., 1997; Parsons et al., 1997; Santos et al., 2005). The immediate reaction to the first of these findings was that the polymorphisms generated by the elevated mutation rate are short-lived, perhaps extending back only a few hundred years (Gibbons, 1998; Macaulay et al., 1997). That is, purifying selection was thought to remove these polymorphisms very rapidly.
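The neutral-theory identity cited here (Kimura, 1968) has a one-line derivation that is worth recalling; it is the textbook form rather than anything specific to this study.

```latex
% In a diploid population of size N, 2N\mu new neutral mutations enter per
% generation and each fixes with probability 1/(2N), so the substitution rate
% k is independent of population size:
\[
  k \;=\; 2N\mu \times \frac{1}{2N} \;=\; \mu .
\]
% If many mutations are (non-lethally) deleterious, their fixation probability
% drops below 1/(2N) and k < \mu, which is the gap between short-term mutation
% rate estimates and long-term substitution rates discussed above.
```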
Abstract:
Effective planning is an important part of meeting the learning needs of students in middle years classrooms. Yet this is a task with inherent complexity. Teachers who subscribe to a contemporary philosophy of middle years education often come together to cooperatively plan and teach a major project. These teachers often represent a range of teaching areas and between them they work with groups of students. Working in this way requires teachers to juggle the demands of curriculum, pedagogy and assessment, while considering short and long term goals and objectives, the diverse needs of students, and relevant contextual factors. Because all teachers, regardless of their specialist teaching area, are teachers of literacy, expertise in planning for content learning as well as literacy learning is essential.
Abstract:
Local governments struggle to engage time-poor and seemingly apathetic citizens, as well as the city's young digital natives, the digital locals. Capturing the attention of this digitally literate, technologically and socially savvy community adds a new quality to the challenge of community engagement for urban planning. This project developed and tested a lightweight design intervention aimed at removing the hierarchy between those who plan the city and those who use it. The aim is to narrow this gap by enhancing people's experience of physical spaces with digital, civic technologies that are directly accessible within that space. The study informed the development of a public screen system called Discussions In Space (DIS). It provides a feedback platform on specific topics, e.g., a particular urban planning project, and encourages direct, in-situ, real-time user responses via SMS and Twitter. The thesis presents the findings of deploying and integrating DIS in a wide range of public and urban environments, including the iconic urban screen at Federation Square in Melbourne, to explore the related Human-Computer Interaction (HCI) challenges and implications. It was also deployed in conjunction with a major urban planning project in Brisbane to explore the system's opportunities and challenges in better engaging with Australia's new digital locals. Finally, the merits of the short-texted and ephemeral data generated by the system were evaluated in three focus groups with professional urban planners. DIS offers additional benefits for civic participation, as it gives voice to residents who otherwise would not easily be heard. It also promotes a positive attitude towards local governments and gathers complementary information that is different from that captured by more traditional public engagement tools.
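As a rough illustration of the in-situ feedback loop described above, the sketch below gathers short messages for a topic, applies a simple moderation filter and queues them for the public screen. Everything here (the `fetch_sms`/`fetch_tweets` stubs, the blocklist rule) is a hypothetical stand-in, not the DIS implementation or any real SMS gateway or Twitter API.

```python
# Hypothetical sketch of an in-situ feedback loop: gather short messages for a
# topic, apply a simple moderation filter, and queue them for the public screen.
# fetch_sms() and fetch_tweets() are stand-in stubs, not real gateway/API calls.
from dataclasses import dataclass
from typing import List

@dataclass
class Message:
    source: str   # "sms" or "twitter"
    text: str

BLOCKLIST = {"spam-word"}          # placeholder moderation rule

def fetch_sms(topic_keyword: str) -> List[Message]:
    return []                      # stub: would read from an SMS gateway

def fetch_tweets(topic_hashtag: str) -> List[Message]:
    return []                      # stub: would read from a social media feed

def moderate(messages: List[Message]) -> List[Message]:
    return [m for m in messages
            if m.text and not any(w in m.text.lower() for w in BLOCKLIST)]

def screen_queue(topic: str) -> List[str]:
    incoming = fetch_sms(topic) + fetch_tweets("#" + topic)
    return [f"[{m.source}] {m.text}" for m in moderate(incoming)]

print(screen_queue("ourcityplan"))   # messages ready to rotate on the screen
```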
Abstract:
This paper first surveys previously reported vision-based aircraft detection flight tests, and then presents new flight test results examining the impact of camera field-of-view choice on the detection range and false alarm rate characteristics of a vision-based aircraft detection technique. Using data collected from approaching aircraft, we examine the impact of camera field-of-view choice and confirm that, when aiming for similar levels of detection confidence, an improvement in detection range can be obtained by choosing a smaller effective field of view (in terms of degrees per pixel).
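To make the "degrees per pixel" trade-off concrete, the toy calculation below compares a wide and a narrow lens on the same sensor; the aircraft size, range and sensor resolution are illustrative assumptions, not values from the flight tests.

```python
# Back-of-the-envelope illustration of the field-of-view trade-off: for a fixed
# sensor width in pixels, a narrower field of view gives fewer degrees per
# pixel, so a target of fixed physical size subtends more pixels at a given
# range and reaches a detection threshold further away.
import math

def degrees_per_pixel(horizontal_fov_deg: float, horizontal_pixels: int) -> float:
    return horizontal_fov_deg / horizontal_pixels

def pixels_on_target(wingspan_m: float, range_m: float, deg_per_px: float) -> float:
    angular_size_deg = math.degrees(2 * math.atan(wingspan_m / (2 * range_m)))
    return angular_size_deg / deg_per_px

for fov in (60.0, 30.0):                       # wide vs narrow lens, same sensor
    dpp = degrees_per_pixel(fov, horizontal_pixels=1024)
    px = pixels_on_target(wingspan_m=10.0, range_m=4000.0, deg_per_px=dpp)
    print(f"FOV {fov:4.1f} deg -> {dpp:.4f} deg/px, "
          f"~{px:.1f} px across a 10 m target at 4 km")
```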
Abstract:
Modelling activities in crowded scenes is very challenging, as object tracking is not robust in complicated scenes and optical flow does not capture long-range motion. We propose a novel approach to analysing activities in crowded scenes using a “bag of particle trajectories”. Particle trajectories are extracted from foreground regions within short video clips using particle video, which estimates long-range motion, in contrast to optical flow, which only captures inter-frame motion. Our applications include temporal video segmentation and anomaly detection, and we perform our evaluation on several real-world datasets containing complicated scenes. We show that our approaches achieve state-of-the-art performance for both tasks.
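A compact way to picture the "bag of particle trajectories" is as quantization of per-trajectory motion descriptors against a codebook, as in the sketch below; the descriptor, the codebook size and the random data are illustrative assumptions rather than the authors' exact pipeline.

```python
# Minimal sketch of a "bag of particle trajectories" clip descriptor, assuming
# each trajectory is summarized by a fixed-length vector (concatenated
# displacements here) and quantized against a learned codebook.
import numpy as np

def trajectory_descriptor(points: np.ndarray) -> np.ndarray:
    """points: (T, 2) particle positions within one clip -> flattened displacements."""
    disp = np.diff(points, axis=0)              # frame-to-frame motion
    return disp.flatten()

def bag_of_trajectories(descriptors: np.ndarray, codebook: np.ndarray) -> np.ndarray:
    """Quantize each trajectory descriptor to its nearest codeword and histogram."""
    d2 = ((descriptors[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    words = d2.argmin(axis=1)
    hist = np.bincount(words, minlength=len(codebook)).astype(float)
    return hist / max(hist.sum(), 1.0)          # normalized clip signature

# Example: 50 trajectories of 11 points each, against a 32-word codebook that
# would normally be learned offline (e.g. by k-means over training trajectories).
rng = np.random.default_rng(0)
tracks = rng.normal(size=(50, 11, 2)).cumsum(axis=1)
descs = np.stack([trajectory_descriptor(t) for t in tracks])   # (50, 20)
codebook = rng.normal(size=(32, descs.shape[1]))
signature = bag_of_trajectories(descs, codebook)
# Clip signatures can then be compared over time (segmentation) or scored
# against a model of normal activity (anomaly detection).
```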
Abstract:
This paper investigates the effects of limited speech data in the context of speaker verification using a probabilistic linear discriminant analysis (PLDA) approach. Being able to reduce the length of required speech data is important to the development of automatic speaker verification systems in real-world applications. When sufficient speech is available, previous research has shown that heavy-tailed PLDA (HTPLDA) modeling of speakers in the i-vector space provides state-of-the-art performance; however, the robustness of HTPLDA to limited speech resources in development, enrolment and verification is an important issue that has not yet been investigated. In this paper, we analyze speaker verification performance with regard to the duration of utterances used both for speaker evaluation (enrolment and verification) and for score normalization and PLDA modeling during development. Two different approaches to total-variability representation are analyzed within the PLDA approach to show improved performance in short-utterance mismatched evaluation conditions and in conditions for which insufficient speech resources are available for adequate system development. The results presented within this paper, using the NIST 2008 Speaker Recognition Evaluation dataset, suggest that the HTPLDA system can continue to achieve better performance than Gaussian PLDA (GPLDA) as evaluation utterance lengths are decreased. We also highlight the importance of matching durations for score normalization and PLDA modeling to the expected evaluation conditions. Finally, we found that a pooled total-variability approach to PLDA modeling can achieve better performance than the traditional concatenated total-variability approach for short utterances in mismatched evaluation conditions and in conditions for which insufficient speech resources are available for adequate system development.
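For readers unfamiliar with PLDA scoring, the sketch below is a deliberately simplified Gaussian PLDA ("two-covariance") log-likelihood ratio between a pair of i-vectors; it is an illustrative stand-in, not the heavy-tailed PLDA system evaluated in the paper, and the covariances B and W are toy values rather than parameters trained on development data.

```python
# Simplified Gaussian PLDA (two-covariance model) scoring sketch: the
# verification score is the log-likelihood ratio of the same-speaker vs
# different-speaker hypotheses for a pair of i-vectors.
import numpy as np
from scipy.stats import multivariate_normal

def gplda_llr(x_enrol, x_test, B, W):
    """Log-likelihood ratio: same speaker vs different speakers."""
    d = len(x_enrol)
    stacked = np.concatenate([x_enrol, x_test])
    # Same speaker: the shared speaker factor correlates the two i-vectors.
    cov_same = np.block([[B + W, B], [B, B + W]])
    # Different speakers: the two i-vectors are independent.
    cov_diff = np.block([[B + W, np.zeros((d, d))], [np.zeros((d, d)), B + W]])
    return (multivariate_normal.logpdf(stacked, mean=np.zeros(2 * d), cov=cov_same)
            - multivariate_normal.logpdf(stacked, mean=np.zeros(2 * d), cov=cov_diff))

d = 4                                   # toy i-vector dimension
B = 0.8 * np.eye(d)                     # between-speaker (speaker) variability
W = 0.3 * np.eye(d)                     # within-speaker (channel/duration) variability
speaker = np.random.default_rng(1).normal(scale=np.sqrt(0.8), size=d)
x1 = speaker + np.random.default_rng(2).normal(scale=np.sqrt(0.3), size=d)
x2 = speaker + np.random.default_rng(3).normal(scale=np.sqrt(0.3), size=d)
x3 = np.random.default_rng(4).normal(scale=np.sqrt(1.1), size=d)   # impostor
print("target LLR  :", round(gplda_llr(x1, x2, B, W), 2))
print("impostor LLR:", round(gplda_llr(x1, x3, B, W), 2))
```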
Abstract:
An elevated particle number concentration (PNC) observed during nucleation events can make a significant contribution to the total particle load and therefore to air pollution in urban environments. A field measurement study of PNC was therefore undertaken to investigate the temporal and spatial variations of PNC within the urban airshed of Brisbane, Australia. PNC was monitored at urban (QUT), roadside (WOO) and semi-urban (ROC) areas around the Brisbane region during 2009. During the morning traffic peak period, the highest relative fraction of PNC reached about 5% at QUT and WOO on weekdays. PNC peaks were observed around noon, which correlated with the highest solar radiation levels at all three stations, suggesting that high PNC levels were likely to be associated with new particle formation caused by photochemical reactions. Wind rose plots showed relatively higher PNC for the NE direction, which was associated with industrial pollution, accounting for 12%, 9% and 14% of overall PNC at QUT, WOO and ROC, respectively. Although there was no significant correlation between PNC at each station, the variation of PNC was well correlated among the three stations during regional nucleation events. In addition, PNC at ROC was significantly influenced by upwind urban pollution during the nucleation burst events, with an average enrichment factor of 15.4. This study provides insight into the influence of regional nucleation events on PNC in the Brisbane region and is the first study to quantify the effect of urban pollution on semi-urban PNC through nucleation events. © 2012 Author(s).
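The abstract does not define the enrichment factor; a common definition for this kind of quantity (an assumption here, not necessarily the authors' exact formulation) is the ratio of event-time concentration to background concentration at the receptor site.

```latex
\[
  \mathrm{EF} \;=\; \frac{\overline{\mathrm{PNC}}_{\text{nucleation event}}}{\overline{\mathrm{PNC}}_{\text{background}}}
\]
% On this reading, an average EF of 15.4 at the semi-urban (ROC) site would mean
% event-time particle number concentrations roughly fifteen times the local
% background when upwind urban pollution reached the site.
```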