Abstract:
With an increased emphasis on genotyping of single nucleotide polymorphisms (SNPs) in disease association studies, the genotyping platform of choice is constantly evolving. In addition, the development of more specific SNP assays and appropriate genotype validation applications is becoming increasingly critical to elucidate ambiguous genotypes. In this study, we have used SNP-specific Locked Nucleic Acid (LNA) hybridization probes on a real-time PCR platform to genotype an association cohort and propose three criteria to address ambiguous genotypes. Based on the kinetic properties of PCR amplification, the three criteria address PCR amplification efficiency, the net fluorescent difference between maximal and minimal fluorescent signals, and the beginning of the exponential growth phase of the reaction. Initially observed SNP allelic discrimination curves were confirmed by DNA sequencing (n = 50), and application of our three genotype criteria corroborated both sequencing and observed real-time PCR results. In addition, the tested Caucasian association cohort was in Hardy-Weinberg equilibrium, and observed allele frequencies were very similar to those of two independently tested Caucasian association cohorts for the same SNP. We present here a novel approach to effectively determine ambiguous genotypes generated on a real-time PCR platform. Application of our three novel criteria provides an easy-to-use, semi-automated genotype-confirmation protocol.
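The three kinetic criteria could be combined into a simple pass/fail check per reaction. The sketch below is a minimal Python illustration; all threshold values are hypothetical placeholders, not the values used in the study:

```python
def confirm_genotype(efficiency, f_max, f_min, ct,
                     min_efficiency=1.8,  # amplification efficiency (2.0 = perfect doubling)
                     min_delta_f=0.5,     # required net fluorescence difference
                     max_ct=35.0):        # latest acceptable onset of exponential phase
    """Evaluate a real-time PCR genotype call against the three kinetic
    criteria: amplification efficiency, net fluorescence (f_max - f_min),
    and the cycle at which exponential growth begins (ct).
    Returns (passed_all, per-criterion detail)."""
    checks = {
        "efficiency": efficiency >= min_efficiency,
        "net_fluorescence": (f_max - f_min) >= min_delta_f,
        "exponential_onset": ct <= max_ct,
    }
    return all(checks.values()), checks

ok, detail = confirm_genotype(efficiency=1.92, f_max=3.1, f_min=0.4, ct=28.0)
```

A call failing any single criterion would be flagged for manual review or re-sequencing rather than auto-accepted.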
Abstract:
We advocate for the use of predictive techniques in interactive computer music systems. We suggest that the inclusion of prediction can assist in the design of proactive rather than reactive computational performance partners. We summarize the significant role prediction plays in human musical decisions, and the so-far modest use of prediction in interactive music systems. After describing how we are working toward employing predictive processes in our own metacreation software, we reflect on future extensions to these approaches.
Abstract:
This paper presents an investigation into event detection in crowded scenes, where the event of interest co-occurs with other activities and only binary labels at the clip level are available. The proposed approach incorporates a fast feature descriptor from the MPEG domain, and a novel multiple instance learning (MIL) algorithm using sparse approximation and random sensing. MPEG motion vectors are used to build particle trajectories that represent the motion of objects in uniform video clips, and the MPEG DCT coefficients are used to compute a foreground map to remove background particles. Trajectories are transformed into the Fourier domain, and the Fourier representations are quantized into visual words using the K-Means algorithm. The proposed MIL algorithm models the scene as a linear combination of independent events, where each event is a distribution of visual words. Experimental results show that the proposed approaches achieve promising results for event detection compared to the state-of-the-art.
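The trajectory-to-visual-word step can be illustrated with a small NumPy sketch: take the magnitude spectrum of each 1-D trajectory signal as its Fourier-domain descriptor and quantize the descriptors with a plain Lloyd's-algorithm k-means. This is a stand-in for the paper's K-Means stage, with a simplified initialization (the first k descriptors), not the authors' exact pipeline:

```python
import numpy as np

def trajectory_words(trajectories, k=2, iters=20):
    """Quantize Fourier-domain trajectory descriptors into k visual words
    using plain Lloyd's-algorithm k-means."""
    # Fourier descriptor: magnitude spectrum of each 1-D displacement signal
    desc = np.abs(np.fft.rfft(np.asarray(trajectories, dtype=float), axis=1))
    centers = desc[:k].copy()
    for _ in range(iters):
        # assign each descriptor to its nearest center, then recompute centers
        labels = np.argmin(((desc[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = desc[labels == j].mean(axis=0)
    return labels

# Toy example: slow drifts and fast oscillations fall into different words
t = np.linspace(0, 1, 16)
data = [np.sin(2 * np.pi * t), np.sin(12 * np.pi * t),
        1.1 * np.sin(2 * np.pi * t), 1.1 * np.sin(12 * np.pi * t)]
labels = trajectory_words(data, k=2)
```

Because the magnitude spectrum discards phase, trajectories with the same motion pattern but different starting offsets map to nearby descriptors, which is what makes the quantization into shared visual words meaningful.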
Abstract:
Price-based techniques are one way to handle increases in peak demand and to deal with voltage violations in residential distribution systems. This paper proposes an improved real-time pricing scheme for residential customers with a demand response option. Smart meters and in-home display units are used to broadcast the price and appropriate load-adjustment signals. Customers are given an opportunity to respond to the signals and adjust their loads. This scheme helps distribution companies deal with overloading problems and voltage issues more efficiently. Also, variations in wholesale electricity prices are passed on to electricity customers so that they can take collective measures to reduce network peak demand. The scheme ensures that both customers and the utility benefit.
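A customer-side response to a broadcast price signal might look like the following sketch. The rule, threshold, and all numbers are hypothetical illustrations of the general idea (shed deferrable load in proportion to how far the price exceeds a comfort threshold), not the paper's scheme:

```python
def adjust_load(price, baseline_kw, deferrable_kw, price_threshold=0.30):
    """Demand-response rule sketch: when the broadcast real-time price
    ($/kWh, hypothetical units) exceeds a comfort threshold, deferrable
    appliances are shed in proportion to the price overshoot."""
    if price <= price_threshold:
        return baseline_kw  # price acceptable: no load adjustment
    # shed up to all deferrable load, proportionally to the overshoot
    overshoot = min((price - price_threshold) / price_threshold, 1.0)
    return baseline_kw - deferrable_kw * overshoot
```

For example, at a price 50% above the threshold, half of the deferrable load is shed; at double the threshold or more, all of it is.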
Abstract:
This paper describes the theory and practice of stable haptic teleoperation of a flying vehicle. It extends the passivity-based control framework for haptic teleoperation of aerial vehicles to the longest intercontinental setting, which presents great challenges. The practicality of the control architecture has been shown in maneuvering and obstacle-avoidance tasks over the Internet in the presence of significant time-varying delays and packet losses. Experimental results are presented for teleoperation of a slave quadrotor in Australia from a master station in the Netherlands. The results show that the remote operator is able to safely maneuver the flying vehicle through a structure using haptic feedback of the state of the slave and the perceived obstacles.
Abstract:
The ability to measure surface temperature and represent it on a metrically accurate 3D model has proven applications in many areas such as medical imaging, building energy auditing, and search and rescue. A system is proposed that enables this task to be performed with a handheld sensor and, for the first time, with results that can be visualized and analyzed in real time. A device comprising a thermal-infrared camera and range sensor is calibrated geometrically and used for data capture. The device is localized using a combination of ICP and video-based pose estimation from the thermal-infrared video footage, which is shown to reduce the occurrence of failure modes. Furthermore, the problem of misregistration, which can introduce severe distortions in assigned surface temperatures, is avoided through the use of a risk-averse neighborhood weighting mechanism. Results demonstrate that the system is more stable and accurate than previous approaches, and can be used to accurately model complex objects and environments for practical tasks.
Abstract:
Big Data presents many challenges related to volume, whether one is interested in studying past datasets or, even more problematically, attempting to work with live streams of data. The most obvious challenge, in a ‘noisy’ environment such as contemporary social media, is to collect the pertinent information: information for a specific study, tweets that can inform emergency services or other responders to an ongoing crisis, or posts that give an advantage to those involved in prediction markets. Often, such a process is iterative, with keywords and hashtags changing with the passage of time, and both collection and analytic methodologies need to be continually adapted to respond to this changing information. While many of the datasets collected and analyzed are preformed, that is, built around a particular keyword, hashtag, or set of authors, they still contain a large volume of information, much of which is unnecessary for the current purpose and/or potentially useful for future projects. Accordingly, this panel considers methods for separating and combining data to optimize big data research and report findings to stakeholders. The first paper considers possible coding mechanisms for incoming tweets during a crisis, taking a large stream of incoming tweets and selecting which of those need to be placed immediately in front of responders for manual filtering and possible action. The paper suggests two solutions: content analysis and user profiling. In the former, aspects of the tweet are assigned a score to assess its likely relationship to the topic at hand and the urgency of the information, whilst the latter attempts to identify those users who either serve as amplifiers of information or are known as an authoritative source.
Through these techniques, the information contained in a large dataset can be filtered down to match the expected capacity of emergency responders, and knowledge of the core keywords or hashtags relating to the current event is constantly refined for future data collection. The second paper is also concerned with identifying significant tweets, but in this case tweets relevant to a particular prediction market: tennis betting. As increasing numbers of professional sportsmen and sportswomen create Twitter accounts to communicate with their fans, information is being shared regarding injuries, form, and emotions that has the potential to impact future results. As has already been demonstrated with leading US sports, such information is extremely valuable. Tennis, like American Football (NFL) and Baseball (MLB), has paid subscription services which manually filter incoming news sources, including tweets, for information valuable to gamblers, gambling operators, and fantasy sports players. However, whilst such services remain niche operations, much of the value of the information is lost by the time it reaches one of these services. The paper thus considers how information could be filtered from Twitter user lists and hashtag or keyword monitoring, assessing the value of the source, the information, and the prediction markets to which it may relate. The third paper examines methods for collecting Twitter data and following changes in an ongoing, dynamic social movement, such as the Occupy Wall Street movement. It involves the development of technical infrastructure to collect the tweets and make them available for exploration and analysis. A strategy to respond to changes in the social movement is also required, or the resulting tweets will only reflect the discussions and strategies the movement used at the time the keyword list was created: in a way, keyword creation is part strategy and part art.
In this paper we describe strategies for the creation of a social media archive, specifically tweets related to the Occupy Wall Street movement, and methods for continuing to adapt data collection strategies as the movement’s presence on Twitter changes over time. We also discuss the opportunities and methods for extracting smaller slices of data from an archive of social media data to support a multitude of research projects in multiple fields of study. The common theme amongst these papers is that of constructing a dataset, filtering it for a specific purpose, and then using the resulting information to aid future data collection. The intention is that, through the papers presented and the subsequent discussion, the panel will inform the wider research community not only on the objectives and limitations of data collection, live analytics, and filtering, but also on current and in-development methodologies that could be adopted by those working with such datasets, and how such approaches could be customized depending on the project stakeholders.
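The first paper's content-analysis coding could be sketched as a keyword-weighted triage score, with responders seeing only as many tweets as they can manually process. The keyword lists, weights, and function names below are hypothetical examples in Python, not the panel's actual coding scheme:

```python
# Hypothetical relevance and urgency term weights for crisis triage
RELEVANCE_TERMS = {"flood": 2, "evacuate": 3, "trapped": 3, "road": 1}
URGENCY_TERMS = {"now": 2, "help": 3, "urgent": 3}

def triage_score(tweet):
    """Score a tweet by summing relevance and urgency term weights."""
    words = tweet.lower().split()
    relevance = sum(RELEVANCE_TERMS.get(w, 0) for w in words)
    urgency = sum(URGENCY_TERMS.get(w, 0) for w in words)
    return relevance + urgency

def filter_for_responders(tweets, capacity):
    """Keep only the top-scoring `capacity` tweets, matching responders'
    expected manual-filtering throughput."""
    return sorted(tweets, key=triage_score, reverse=True)[:capacity]
```

In practice the term lists themselves would be iteratively refined as the event's vocabulary shifts, which is exactly the feedback loop the panel describes.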
Abstract:
Wireless networked control systems (WNCSs) have been increasingly deployed in industrial applications. As they require timely data packet transmissions, it is difficult to make efficient use of the limited channel resources, particularly in contention-based wireless networks in the layered network architecture. Aiming to maintain WNCSs under the critical real-time traffic condition, at which they marginally meet the real-time requirements, a cross-layer design (CLD) approach is presented in this paper that adaptively adjusts the control period to achieve improved channel utilization while still maintaining effective and timely packet transmissions. The effectiveness of the proposed approach is demonstrated through simulation studies.
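The general shape of such an adaptation rule can be sketched as follows. This is an illustration of the cross-layer idea (feed a network-layer metric, here a packet deadline-miss ratio, back into the control period), not the paper's algorithm; all parameters are hypothetical:

```python
def adapt_control_period(period, deadline_miss_ratio,
                         target=0.05, step=0.9, p_min=0.01, p_max=0.5):
    """Cross-layer adaptation sketch: lengthen the control period (seconds)
    when the network misses too many packet deadlines, shorten it when
    there is slack, keeping the loop near the critical real-time
    traffic condition."""
    if deadline_miss_ratio > target:
        period /= step   # relax: longer period, less channel traffic
    else:
        period *= step   # tighten: shorter period, better control performance
    return min(max(period, p_min), p_max)
```

Iterating this rule drives the period toward the boundary where the real-time requirements are just barely met, which maximizes channel utilization without losing timeliness.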
Abstract:
Real-time image analysis and classification onboard robotic marine vehicles, such as AUVs, is a key step in the realisation of adaptive mission planning for large-scale habitat mapping in previously unexplored environments. This paper describes a novel technique to train, process, and classify images collected onboard an AUV used in relatively shallow waters with poor visibility and non-uniform lighting. The approach utilises Förstner feature detectors and Laws texture energy masks for image characterisation, and a bag of words approach for feature recognition. To improve classification performance we propose a usefulness gain to learn the importance of each histogram component for each class. Experimental results illustrate the performance of the system in characterisation of a variety of marine habitats and its ability to operate onboard an AUV's main processor suitable for real-time mission planning.
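One plausible formulation of a per-class weighting of histogram components is the ratio of a visual word's mean frequency within a class to its mean frequency overall, so that words distinctive of a habitat class get gains above 1. This sketch is illustrative, not necessarily the paper's exact definition of the usefulness gain:

```python
import numpy as np

def usefulness_gains(histograms, labels, eps=1e-9):
    """For each class, weight each visual-word histogram component by
    (mean frequency within the class) / (mean frequency overall).
    Returns a dict mapping class label -> per-component gain vector."""
    h = np.asarray(histograms, dtype=float)
    labels = np.asarray(labels)
    overall = h.mean(axis=0) + eps
    return {int(c): (h[labels == c].mean(axis=0) + eps) / overall
            for c in np.unique(labels)}

# Two classes, two visual words: each class uses one word almost exclusively
gains = usefulness_gains([[4, 0], [4, 0], [0, 4], [0, 4]], [0, 0, 1, 1])
```

At classification time, the bag-of-words histogram of a new image would be compared against class models with each component scaled by that class's gain, down-weighting words that carry no discriminative information.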
Abstract:
Orthotopic or intracardiac injection of human breast cancer cell lines into immunocompromised mice allows study of the molecular basis of breast cancer metastasis. We have established a quantitative real-time PCR approach to analyze metastatic spread of human breast cancer cells inoculated into nude mice via these routes. We employed MDA-MB-231 human breast cancer cells genetically tagged with a bacterial β-galactosidase (Lac-Z) retroviral vector, enabling their detection by TaqMan® real-time PCR. PCR detection was linear, specific, more sensitive than conventional PCR, and could be used to directly quantitate metastatic burden in bone and soft organs. Attesting to the sensitivity and specificity of the PCR detection strategy, as few as several hundred metastatic MDA-MB-231 cells were detectable in 100 μm segments of paraffin-embedded lung tissue, and only in samples adjacent to sections that scored positive by histological detection. Moreover, the measured real-time PCR metastatic burden in the bone environment (mouse hind-limbs, n = 48) displayed a high correlation to the degree of osteolytic damage observed by high resolution X-ray analysis (r2 = 0.972). Such a direct linear relationship to tumor burden and bone damage substantiates the so-called 'vicious cycle' hypothesis in which metastatic tumor cells promote the release of factors from the bone which continue to stimulate the tumor cells. The technique provides a useful tool for molecular and cellular analysis of human breast cancer metastasis to bone and soft organs, can easily be extended to other cell/marker/organ systems, and should also find application in preclinical assessment of anti-metastatic modalities.
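Quantitating metastatic burden from real-time PCR rests on the standard log-linear relation between threshold cycle and starting template amount. The sketch below fits such a standard curve and inverts it; the dilution-series numbers are made up for illustration (a slope near -3.3 cycles per decade corresponds to ~100% PCR efficiency), not the study's data:

```python
import math

def fit_standard_curve(cell_counts, ct_values):
    """Least-squares fit of Ct = slope * log10(cells) + intercept,
    the usual log-linear real-time PCR standard curve."""
    xs = [math.log10(c) for c in cell_counts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ct_values) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ct_values))
    slope = sxy / sxx
    return slope, my - slope * mx

def cells_from_ct(ct, slope, intercept):
    """Invert the standard curve to estimate tagged-cell number."""
    return 10 ** ((ct - intercept) / slope)

# Hypothetical dilution series: ~3.3 cycles per 10-fold dilution
slope, intercept = fit_standard_curve([1e2, 1e3, 1e4, 1e5],
                                      [33.3, 30.0, 26.7, 23.4])
```

A sample's Ct can then be mapped directly to an estimated number of Lac-Z-tagged cells, which is what makes the per-organ metastatic burden quantitative rather than merely positive/negative.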
Abstract:
Combining human-computer interaction and urban informatics, this design research developed and tested novel interfaces offering users real-time feedback on their paper and energy consumption. Findings from deploying these interfaces in both domestic and office environments in Australia, the UK, and Ireland will inform future generations of resource-monitoring technologies. The study draws conclusions with implications for government policy, the energy industry, and sustainability researchers.
Abstract:
Tunable synthesis of bimetallic AuxAg1-x alloyed nanoparticles and in situ monitoring of their plasmonic responses are presented. This new conceptual approach is based on green, energy-efficient, reactive, and highly non-equilibrium microplasma chemistry.
Abstract:
Quantum cascade laser absorption spectroscopy was used to measure the absolute concentration of acetylene in situ during nanoparticle growth in Ar + C2H2 RF plasmas. It is demonstrated that the nanoparticle growth exhibits periodic behavior, with the growth cycle period strongly dependent on the initial acetylene concentration in the chamber. The growth cycle period is 300 s at 7.5% acetylene in the gas mixture and decreases as the acetylene concentration increases; the growth eventually disappears when the acetylene concentration exceeds 32%. During nanoparticle growth, the acetylene concentration is small and does not exceed 4.2% at a radio frequency (RF) power of 4 W, and 0.5% at an RF power of 20 W. An injection of a single acetylene pulse into the discharge also results in nanoparticle nucleation and growth. The absorption spectroscopy technique was found to be very effective for time-resolved measurement of the hydrocarbon content in nanoparticle-generating plasmas.
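The principle behind retrieving an absolute species concentration from line-of-sight absorption is the Beer-Lambert law. The sketch below illustrates that principle only; the absorption cross-section, path length, and gas density used in the example are invented round numbers, not values from this experiment:

```python
import math

def acetylene_fraction(i0, i, sigma, n_total, path_length):
    """Beer-Lambert retrieval of a species fraction from absorption:
    I = I0 * exp(-sigma * n * L)  =>  n = ln(I0 / I) / (sigma * L),
    where sigma is the absorption cross-section (cm^2), L the path
    length (cm), and n_total the total gas density (cm^-3)."""
    n = math.log(i0 / i) / (sigma * path_length)  # absorber density, cm^-3
    return n / n_total                             # fraction of total gas

# Illustrative numbers: 20% attenuation, sigma = 1e-18 cm^2,
# L = 40 cm, total density 3.3e16 cm^-3 (~1 mbar at room temperature)
frac = acetylene_fraction(i0=1.0, i=0.8, sigma=1e-18,
                          n_total=3.3e16, path_length=40.0)
```

In a real QCL measurement the cross-section is line-specific and the retrieval integrates over the measured line shape, but the density still scales with ln(I0/I) as above.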
Abstract:
Polymorphisms of glutathione transferases (GST) are important genetic determinants of susceptibility to environmental carcinogens (Rebbeck, 1997). The GSTs are a multigene family of dimeric enzymes involved in detoxification and, in a few cases, the bioactivation of a variety of xenobiotics (Hayes et al., 1995). The cytosolic GST enzyme family consists of four major classes of enzymes, referred to as alpha, mu, pi and theta. Several members of this family (for example, GSTM1, GSTT1 and GSTP1) are polymorphic in human populations (Wormhoudt et al., 1999). Molecular epidemiology studies have examined the role of GST polymorphisms as susceptibility factors for environmentally and/or occupationally induced cancers (Wormhoudt et al., 1999). In particular, case-control studies showed a relationship between the GSTM1 null genotype and the development of cancer in association with smoking habits, which has been shown for cancers of the respiratory and gastrointestinal tracts as well as other cancer types (Miller et al., 1997). Only a few molecular epidemiological studies have addressed the role of GSTT1 and GSTP1 polymorphisms in cancer susceptibility. Since GSTP1 is a key player in the biotransformation/bioactivation of benzo(a)pyrene, GSTP1 may be even more important than GSTM1 in the prevention of tobacco-induced cancers (Harries et al., 1997; Harris et al., 1998). To date, this relationship has not been sufficiently addressed in humans. Comprehensive molecular epidemiological studies may add to the current knowledge of the role of GST polymorphisms in cancer susceptibility and extend the knowledge gained from approaches that used phenotyping, such as GSTM1 activity as it relates to trans-stilbene oxide, or polymerase chain reaction (PCR)-based genotyping of polymorphic isoenzymes (Bell et al., 1993; Pemble et al., 1994; Harries et al., 1997).