635 results for time dependence


Relevance: 20.00%

Publisher:

Abstract:

This paper proposes an online learning control system that uses the strategy of Model Predictive Control (MPC) in a model-based locally weighted learning framework. The new approach, named Locally Weighted Learning Model Predictive Control (LWL-MPC), is proposed as a solution for learning to control robotic systems with nonlinear and time-varying dynamics. This paper demonstrates the capability of LWL-MPC to perform online learning while controlling the joint trajectories of a low-cost, three-degree-of-freedom elastic-joint robot. The learning performance is investigated both in an initial learning phase and when the system dynamics change due to a heavy object added to the tool point. An experiment on the real elastic-joint robot is presented, and LWL-MPC is shown to successfully learn to control the system with and without the object. The results highlight the capability of the learning control system to accommodate the lack of mechanical consistency and linearity in a low-cost robot arm.
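The abstract does not spell out the locally weighted learning component, so as a hedged sketch the snippet below shows plain locally weighted (linear) regression, the standard building block such frameworks use to fit a local model of the dynamics around a query state. The Gaussian bandwidth `tau` and the toy sine "dynamics" are illustrative assumptions, not details from the paper.

```python
import numpy as np

def lwr_predict(X, y, x_query, tau=0.2):
    """Locally weighted linear regression: fit an affine model around
    x_query, with training points weighted by a Gaussian kernel."""
    w = np.exp(-np.sum((X - x_query) ** 2, axis=1) / (2 * tau ** 2))
    A = np.hstack([X, np.ones((X.shape[0], 1))])      # affine design matrix
    W = np.diag(w)
    theta, *_ = np.linalg.lstsq(A.T @ W @ A, A.T @ W @ y, rcond=None)
    return np.append(x_query, 1.0) @ theta

# toy stand-in for a nonlinear mapping: y = sin(x)
X = np.linspace(0.0, 3.0, 50).reshape(-1, 1)
y = np.sin(X).ravel()
pred = lwr_predict(X, y, np.array([1.5]))
```

In an MPC setting, a model like this would be refit (or incrementally updated) online as new state transitions arrive, which is what allows the controller to track time-varying dynamics.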


Public transport travel time variability (PTTV) is essential for understanding deterioration in travel time reliability, optimizing transit schedules and informing route choices. This paper establishes key definitions of PTTV, the first including all buses on a route and the second only a single service from that route. The paper then analyses the day-to-day distribution of public transport travel time using Transit Signal Priority data. A comprehensive approach combining a parametric-bootstrap Kolmogorov-Smirnov test with the Bayesian Information Criterion is developed, and recommends the lognormal distribution as the best descriptor of bus travel time on urban corridors. The probability density function of the lognormal distribution is finally used to calculate probability indicators of PTTV. The findings of this study are useful for both traffic managers and statisticians in planning and researching transit systems.
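The parametric-bootstrap Kolmogorov-Smirnov procedure mentioned above can be sketched as follows: fit a lognormal to the observed travel times, then compare the observed KS statistic against KS statistics from repeatedly refitted simulated samples. The synthetic travel times below stand in for the paper's Transit Signal Priority data; the parameter values are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# synthetic stand-in for day-to-day bus travel times (seconds)
times = rng.lognormal(mean=6.0, sigma=0.25, size=500)

# fit a lognormal (location fixed at zero) to the observed times
shape, loc, scale = stats.lognorm.fit(times, floc=0)
d_obs = stats.kstest(times, 'lognorm', args=(shape, loc, scale)).statistic

# parametric bootstrap: refit on data simulated from the fitted model to
# obtain the null distribution of the KS statistic
boot = []
for _ in range(200):
    sim = stats.lognorm.rvs(shape, loc, scale, size=times.size, random_state=rng)
    s, l, sc = stats.lognorm.fit(sim, floc=0)
    boot.append(stats.kstest(sim, 'lognorm', args=(s, l, sc)).statistic)
p_value = float(np.mean(np.array(boot) >= d_obs))
```

Refitting inside the bootstrap loop matters: comparing against the plain KS tables would be anti-conservative because the parameters were estimated from the same data being tested.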


The ability to measure surface temperature and represent it on a metrically accurate 3D model has proven applications in many areas such as medical imaging, building energy auditing, and search and rescue. A system is proposed that enables this task to be performed with a handheld sensor, and, for the first time, with results able to be visualized and analyzed in real time. A device comprising a thermal-infrared camera and a range sensor is calibrated geometrically and used for data capture. The device is localized using a combination of ICP and video-based pose estimation from the thermal-infrared video footage, which is shown to reduce the occurrence of failure modes. Furthermore, the problem of misregistration, which can introduce severe distortions in assigned surface temperatures, is avoided through the use of a risk-averse neighborhood weighting mechanism. Results demonstrate that the system is more stable and accurate than previous approaches, and can be used to accurately model complex objects and environments for practical tasks.
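The ICP half of the localization pipeline is standard enough to sketch. Below is one point-to-point ICP iteration in 2D: brute-force nearest-neighbour matching followed by the Kabsch solution for the best-fit rigid transform. This is a minimal illustration of the technique named in the abstract, not the paper's implementation, which fuses ICP with thermal-video pose estimation.

```python
import numpy as np

def icp_step(src, dst):
    """One point-to-point ICP iteration: match nearest neighbours, then
    solve the best-fit rigid transform with the Kabsch algorithm."""
    # nearest-neighbour correspondences (brute force)
    d2 = ((src[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
    matched = dst[d2.argmin(axis=1)]
    # Kabsch: optimal rotation and translation between matched point sets
    mu_s, mu_d = src.mean(0), matched.mean(0)
    H = (src - mu_s).T @ (matched - mu_d)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_d - R @ mu_s
    return (R @ src.T).T + t

# toy check: recover a pure translation in one step
dst = np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.]])
src = dst + np.array([0.3, -0.2])
aligned = icp_step(src, dst)
```

In practice the step is iterated until the correspondences stop changing; the video-based pose estimate in the paper serves to keep ICP out of the failure modes that arise when the initial alignment is poor.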


Technological advances have led to an influx of affordable hardware that supports sensing, computation and communication. This hardware is increasingly deployed in public and private spaces, tracking and aggregating a wealth of real-time environmental data. Although these technologies are the focus of several research areas, there is a lack of research dealing with the problem of making these capabilities accessible to everyday users. This thesis represents a first step towards developing systems that will allow users to leverage the available infrastructure and create custom-tailored solutions. It explores how this notion can be utilized in the context of energy monitoring to improve conventional approaches. The project adopted a user-centered design process to inform the development of a flexible system for real-time data stream composition and visualization. This system features an extensible architecture and defines a unified API for heterogeneous data streams. Rather than displaying the data in a predetermined fashion, it makes this information available as building blocks that can be combined and shared. It is based on the insight that individual users have diverse information needs and presentation preferences. Therefore, it allows users to compose rich information displays, incorporating personally relevant data from an extensive information ecosystem. The prototype was evaluated in an exploratory study to observe its natural use in a real-world setting, gathering empirical usage statistics and conducting semi-structured interviews. The results show that a high degree of customization does not by itself ensure sustained usage. Other factors were identified, yielding recommendations for increasing the impact on energy consumption.
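A "unified API for heterogeneous data streams" composed into building blocks could look roughly like the sketch below. The class and method names are hypothetical, invented here for illustration; the thesis's actual interface is not described in the abstract.

```python
from abc import ABC, abstractmethod

class DataStream(ABC):
    """Minimal unified interface heterogeneous sources could share, so
    display building blocks can be composed without caring about origin."""
    @abstractmethod
    def read(self):
        """Return the latest (timestamp, value) sample."""

class PowerMeterStream(DataStream):
    """Wraps a concrete source, here just a recorded sample list."""
    def __init__(self, samples):
        self._samples = list(samples)
    def read(self):
        return self._samples[-1]

class CompositeStream(DataStream):
    """A building block combining two streams, e.g. total household load."""
    def __init__(self, a, b):
        self.a, self.b = a, b
    def read(self):
        (t1, v1), (_, v2) = self.a.read(), self.b.read()
        return (t1, v1 + v2)

kitchen = PowerMeterStream([(0, 120.0), (1, 180.0)])
lounge = PowerMeterStream([(0, 60.0), (1, 40.0)])
total = CompositeStream(kitchen, lounge).read()
```

Because composites implement the same `read` interface as leaf streams, users can nest them arbitrarily, which is the essence of the "building blocks" idea in the abstract.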


The rapid development of the World Wide Web has created massive amounts of information, leading to the problem of information overload. Under these circumstances, personalization techniques have been developed to help users find content that meets their individual interests or needs amid massively increasing information. User profiling techniques play the core role in this research. Traditionally, most user profiling techniques create user representations in a static way. However, in real-world applications user interests may change over time. In this research we develop algorithms for mining user interests by integrating time decay mechanisms into topic-based user interest profiling. Time forgetting functions are integrated into the calculation of topic interest measurements at an in-depth level. The experimental study shows that accounting for the temporal effects of user interests through time forgetting mechanisms improves recommendation performance.
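The abstract does not define its time forgetting function, but the usual choice in this line of work is exponential decay. As a hedged sketch under that assumption, the snippet below aggregates topic interest scores where each past interaction's weight halves every `half_life_days`; the half-life value and event format are illustrative.

```python
import math
from datetime import datetime, timedelta

def decayed_interest(events, now, half_life_days=30.0):
    """Aggregate topic interest with an exponential forgetting function:
    each past event's weight halves every `half_life_days`."""
    lam = math.log(2) / half_life_days
    score = {}
    for topic, when, weight in events:
        age_days = (now - when).days
        score[topic] = score.get(topic, 0.0) + weight * math.exp(-lam * age_days)
    return score

now = datetime(2024, 1, 31)
events = [("sports", now - timedelta(days=60), 1.0),   # two half-lives old
          ("movies", now - timedelta(days=1), 1.0)]    # recent
scores = decayed_interest(events, now)
```

A static profile would rank both topics equally here; with decay, the recent "movies" interaction dominates, which is precisely the effect the study exploits to improve recommendation.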


Big Data presents many challenges related to volume, whether one is interested in studying past datasets or, even more problematically, attempting to work with live streams of data. The most obvious challenge, in a ‘noisy’ environment such as contemporary social media, is to collect the pertinent information; be that information for a specific study, tweets which can inform emergency services or other responders to an ongoing crisis, or give an advantage to those involved in prediction markets. Often, such a process is iterative, with keywords and hashtags changing with the passage of time, and both collection and analytic methodologies need to be continually adapted to respond to this changing information. While many of the data sets collected and analyzed are preformed, that is, they are built around a particular keyword, hashtag, or set of authors, they still contain a large volume of information, much of which is unnecessary for the current purpose and/or potentially useful for future projects. Accordingly, this panel considers methods for separating and combining data to optimize big data research and report findings to stakeholders.

The first paper considers possible coding mechanisms for incoming tweets during a crisis, taking a large stream of incoming tweets and selecting which of those need to be immediately placed in front of responders for manual filtering and possible action. The paper suggests two solutions for this: content analysis and user profiling. In the former case, aspects of the tweet are assigned a score to assess its likely relationship to the topic at hand and the urgency of the information, whilst the latter attempts to identify those users who are either serving as amplifiers of information or are known as an authoritative source. Through these techniques, the information contained in a large dataset can be filtered down to match the expected capacity of emergency responders, and knowledge of the core keywords or hashtags relating to the current event is constantly refined for future data collection.

The second paper is also concerned with identifying significant tweets, but in this case tweets relevant to a particular prediction market: tennis betting. As increasing numbers of professional sportsmen and women create Twitter accounts to communicate with their fans, information is being shared regarding injuries, form and emotions that has the potential to impact on future results. As has already been demonstrated with leading US sports, such information is extremely valuable. Tennis, like American Football (NFL) and Baseball (MLB), has paid subscription services which manually filter incoming news sources, including tweets, for information valuable to gamblers, gambling operators, and fantasy sports players. However, whilst such services remain niche operations, much of the value of the information is lost by the time it reaches one of these services. The paper thus considers how information could be filtered from Twitter user lists and hashtag or keyword monitoring, assessing the value of the source, the information, and the prediction markets to which it may relate.

The third paper examines methods for collecting Twitter data and following changes in an ongoing, dynamic social movement, such as the Occupy Wall Street movement. It involves the development of technical infrastructure to collect the tweets and make them available for exploration and analysis. A strategy to respond to changes in the social movement is also required, or the resulting tweets will only reflect the discussions and strategies the movement used at the time the keyword list was created; in a way, keyword creation is part strategy and part art. In this paper we describe strategies for the creation of a social media archive, specifically tweets related to the Occupy Wall Street movement, and methods for continuing to adapt data collection strategies as the movement’s presence on Twitter changes over time. We also discuss the opportunities and methods for extracting smaller slices of data from an archive of social media data to support a multitude of research projects in multiple fields of study.

The common theme amongst these papers is that of constructing a data set, filtering it for a specific purpose, and then using the resulting information to aid future data collection. The intention is that, through the papers presented and subsequent discussion, the panel will inform the wider research community not only on the objectives and limitations of data collection, live analytics, and filtering, but also on current and in-development methodologies that could be adopted by those working with such datasets, and how such approaches could be customized depending on the project stakeholders.
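The first paper's content-analysis and user-profiling scoring can be illustrated with a toy ranking function. Everything here is hypothetical: the weights, cue words, handles and hashtag are invented for illustration, not taken from the panel papers.

```python
def tweet_score(tweet, keywords, trusted_authors):
    """Toy content-analysis score: keyword hits plus urgency cues,
    boosted when the author is a known authoritative source."""
    text = tweet["text"].lower()
    score = sum(2 for kw in keywords if kw in text)            # topic relevance
    score += sum(1 for cue in ("urgent", "help", "evacuate")   # urgency cues
                 if cue in text)
    if tweet["author"] in trusted_authors:                     # user profiling
        score *= 2
    return score

tweets = [
    {"author": "@qldpolice", "text": "Evacuate low-lying areas now #bnefloods"},
    {"author": "@someone", "text": "nice weather today"},
]
ranked = sorted(tweets,
                key=lambda t: tweet_score(t, ["#bnefloods"], {"@qldpolice"}),
                reverse=True)
```

Only the top of such a ranking would be surfaced to responders, matching the filtered stream to their manual-review capacity, and the keywords that score well feed back into the next round of collection.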


This paper uses data from interviews with representatives of national and state organisations that have a policy interest in student-working in Australia. The interviewees included representatives from employer bodies and trade unions as well as government organisations. The data are used to discuss these stakeholders’ perceptions of the main advantages and disadvantages of working by young full-time students and the ways in which organisations in the business and educational sectors have adapted their policies and practices for student-working. The analysis is then used to inform a discussion about whether this is a legitimate area for public policy formulation and if so, what principles might underpin such policy and what some policies might look like.


Background: Transmission of Plasmodium vivax malaria is dependent on vector availability, biting rates and parasite development. In turn, each of these is influenced by climatic conditions. Correlations have previously been detected between seasonal rainfall, temperature and malaria incidence patterns in various settings. An understanding of seasonal patterns of malaria, and their weather drivers, can provide vital information for control and elimination activities. This research aimed to describe temporal patterns in malaria, rainfall and temperature, and to examine the relationships between these variables within four counties of Yunnan Province, China. Methods: Plasmodium vivax malaria surveillance data (1991–2006), and average monthly temperature and rainfall were acquired. Seasonal trend decomposition was used to examine secular trends and seasonal patterns in malaria. Distributed lag non-linear models were used to estimate the weather drivers of malaria seasonality, including the lag periods between weather conditions and malaria incidence. Results: There was a declining trend in malaria incidence in all four counties. Increasing temperature resulted in increased malaria risk in all four areas, and increasing rainfall resulted in increased malaria risk in one area and decreased malaria risk in another. The lag times for these associations varied between areas. Conclusions: The differences detected between the four counties highlight the need for local understanding of seasonal patterns of malaria and its climatic drivers.
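Seasonal trend decomposition, as used in the Methods, separates a series into trend and seasonal components. As a hedged sketch (the study likely used an STL-style implementation; the classical moving-average variant below is the simplest cousin), the snippet decomposes a synthetic monthly series with a declining trend and an annual cycle; the data are invented, not the Yunnan surveillance counts.

```python
import numpy as np

# toy monthly malaria counts: declining trend plus an annual seasonal cycle
months = np.arange(120)
series = 100.0 - 0.4 * months + 15.0 * np.sin(2 * np.pi * months / 12)

# classical decomposition: 2x12 centred moving average as the trend
kernel = np.ones(13)
kernel[[0, -1]] = 0.5
kernel /= 12.0
trend = np.convolve(series, kernel, mode='valid')   # drops 6 months each end

# monthly means of the detrended series give the seasonal component
detrended = series[6:-6] - trend
months_valid = months[6:-6]
seasonal = np.array([detrended[months_valid % 12 == m].mean()
                     for m in range(12)])
```

The recovered trend is the declining secular component and `seasonal` peaks in the month where the sinusoid peaks; on real surveillance data the same separation is what lets lagged weather effects be estimated against the seasonal signal.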


Objective: To examine the space-time clustering of dengue fever (DF) transmission in Bangladesh using a geographical information system and spatial scan statistics (SaTScan). Methods: We obtained data on monthly suspected DF cases and deaths by district in Bangladesh for the period 2000–2009 from the Directorate General of Health Services. Population and district boundary data for each district were collected from the national census managed by the Bangladesh Bureau of Statistics. To identify space-time clusters of DF transmission, a discrete Poisson model was fitted using SaTScan software. Results: The space-time distribution of DF transmission was clustered during three periods: 2000–2002, 2003–2005 and 2006–2009. Dhaka was the most likely cluster for DF in all three periods. Several other districts were significant secondary clusters. However, the geographical range of DF transmission appears to have declined in Bangladesh over the last decade. Conclusion: There were significant space-time clusters of DF in Bangladesh over the last decade. Our results should prompt future studies to explore how social and ecological factors may affect DF transmission, and would also be useful for improving DF control and prevention programs in Bangladesh.
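At the core of SaTScan's discrete Poisson model is a log-likelihood ratio comparing observed and expected cases inside a candidate space-time window. The sketch below evaluates that single-window statistic over toy district-year cells; real SaTScan scans cylindrical windows of varying size and obtains p-values by Monte Carlo simulation, and the figures here are invented.

```python
import math

def poisson_llr(c, E, C):
    """Log-likelihood ratio of the discrete Poisson scan statistic for a
    window with c observed and E expected cases out of C total cases."""
    if c <= E:
        return 0.0
    return c * math.log(c / E) + (C - c) * math.log((C - c) / (C - E))

# toy data: cases per (district, year); expected cases proportional to
# person-years at risk (population summed over the study years)
cases = {("Dhaka", 2000): 120, ("Dhaka", 2001): 150,
         ("Khulna", 2000): 20, ("Khulna", 2001): 25}
pop = {"Dhaka": 9_000_000, "Khulna": 1_500_000}
C = sum(cases.values())
person_years = sum(pop[d] for d, _ in cases)
best = max(cases,
           key=lambda k: poisson_llr(cases[k], C * pop[k[0]] / person_years, C))
```

The window with the highest ratio is reported as the most likely cluster, which in this toy example is the Dhaka cell with the largest excess over its population-based expectation.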


Currently, the GNSS computing modes are of two classes: network-based data processing and user receiver-based processing. A GNSS reference receiver station essentially contributes raw measurement data in either the RINEX file format or as real-time data streams in the RTCM format. Very little computation is carried out by the reference station. The existing network-based processing modes, regardless of whether they are executed in real-time or post-processed modes, are centralised or sequential. This paper describes a distributed GNSS computing framework that incorporates three GNSS modes: reference station-based, user receiver-based and network-based data processing. Raw data streams from each GNSS reference receiver station are processed in a distributed manner, i.e., either at the station itself or at a hosting data server/processor, to generate station-based solutions, or reference receiver-specific parameters. These may include precise receiver clock, zenith tropospheric delay, differential code biases, ambiguity parameters, ionospheric delays, as well as line-of-sight information such as azimuth and elevation angles. Covariance information for estimated parameters may also be optionally provided. In such a mode the nearby precise point positioning (PPP) or real-time kinematic (RTK) users can directly use the corrections from all or some of the stations for real-time precise positioning via a data server. At the user receiver, PPP and RTK techniques are unified under the same observation models, and the distinction is how the user receiver software deals with corrections from the reference station solutions and the ambiguity estimation in the observation equations. Numerical tests demonstrate good convergence behaviour for differential code bias and ambiguity estimates derived individually with single reference stations. 
With station-based solutions from three reference stations within distances of 22–103 km the user receiver positioning results, with various schemes, show an accuracy improvement of the proposed station-augmented PPP and ambiguity-fixed PPP solutions with respect to the standard float PPP solutions without station augmentation and ambiguity resolutions. Overall, the proposed reference station-based GNSS computing mode can support PPP and RTK positioning services as a simpler alternative to the existing network-based RTK or regionally augmented PPP systems.
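On the user side, applying the reference-receiver-specific parameters amounts to subtracting them from the raw observable before estimation. The sketch below is deliberately simplified and hypothetical: it maps the zenith tropospheric delay with a crude 1/sin(elevation) factor and ignores ionosphere, phase observables and full mapping functions, all of which a real PPP/RTK filter would handle.

```python
import math

def apply_station_corrections(pseudorange_m, corr):
    """Toy sketch: subtract reference-station products (receiver clock,
    zenith tropospheric delay mapped to the line of sight, code bias)
    from a raw pseudorange. All values in metres."""
    mapped_tropo = corr["ztd_m"] / math.sin(math.radians(corr["elev_deg"]))
    return (pseudorange_m - corr["clock_m"]
            - mapped_tropo - corr["code_bias_m"])

# illustrative correction set from one station-based solution
corr = {"clock_m": 12.4, "ztd_m": 2.4, "elev_deg": 30.0, "code_bias_m": 0.8}
corrected = apply_station_corrections(22_000_000.0, corr)
```

In the distributed mode described above, each station would publish such a parameter set, and a nearby user combines corrections from one or several stations before forming its own observation equations.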


Travel time estimation and prediction on motorways has long been a topic of research. Prediction modelling generally assumes that the estimation is perfect; yet no matter how good the prediction model is, errors in estimation can significantly deteriorate the accuracy and reliability of the prediction. Models have been proposed to estimate travel time from loop detector data. Generally, detectors are closely spaced (say 500 m) and travel time can be estimated accurately. However, detectors are not always perfect, and even during normal running conditions a few detectors malfunction, increasing the spacing between the functional detectors. Under such conditions, the error in travel time estimation is significantly large and generally unacceptable. This research evaluates the in-practice travel time estimation model under different traffic conditions. It is observed that the existing models fail to accurately estimate travel time with large detector spacing and during congestion shoulder periods. Addressing this issue, an innovative Hybrid model that considers only loop data for travel time estimation is proposed. The model is tested using simulation and is validated with real Bluetooth data from the Pacific Motorway, Brisbane. Results indicate that during non-free-flow conditions and with larger detector spacing, the Hybrid model provides a significant improvement in the accuracy of travel time estimation.
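The in-practice baseline the abstract refers to is typically a spot-speed method: each detector's measured speed is assumed to hold over half the spacing to each neighbour. The sketch below shows that estimate (the positions and speeds are illustrative, and this is the baseline, not the proposed Hybrid model, whose details the abstract does not give).

```python
def estimate_travel_time(detectors):
    """Piecewise spot-speed estimate: each detector's speed is assumed to
    hold over half the spacing to each neighbouring detector.
    detectors: list of (position_km, speed_kmh), ordered by position."""
    total_h = 0.0
    for (x1, v1), (x2, v2) in zip(detectors, detectors[1:]):
        seg = x2 - x1
        total_h += (seg / 2) / v1 + (seg / 2) / v2
    return total_h * 3600  # seconds

# positions in km and spot speeds in km/h along a motorway section
detectors = [(0.0, 80.0), (0.5, 60.0), (1.0, 30.0)]
tt = estimate_travel_time(detectors)
```

The failure mode discussed above is visible in the formula: when a detector drops out, `seg` grows and a single spot speed is extrapolated over a long stretch, which is exactly where the assumption breaks down during congestion onset and offset.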


A fiber Bragg grating (FBG) accelerometer using transverse forces is more sensitive than one using axial forces with the same mass of the inertial object, because a barely stretched FBG fixed at its two ends is much more sensitive to transverse forces than to axial ones. The spring-mass theory, with the assumption that the axial force changes little during vibration, cannot accurately predict its sensitivity and resonant frequency in the gravitational direction, because the assumption does not hold when the FBG is barely prestretched. The theory was modified but still required experimental verification due to limitations in the original experiments, such as (1) the friction between the inertial object and the shell; (2) errors involved in estimating the time-domain records; (3) limited data; and (4) the large interval (∼5 Hz) between the tested frequencies in the frequency-response experiments. The experiments presented here verify the modified theory by overcoming those limitations. On the frequency responses, it is observed that the optimal condition for simultaneously achieving high sensitivity and high resonant frequency is at infinitesimal prestretch. On the sensitivity at the same frequency, the experimental sensitivities of the FBG accelerometer with a 5.71-gram inertial object at 6 Hz (1.29, 1.19, 0.88, 0.64, and 0.31 nm/g at the 0.03, 0.69, 1.41, 1.93, and 3.16 nm prestretches, respectively) agree with the predicted static sensitivities (1.25, 1.14, 0.83, 0.61, and 0.29 nm/g, correspondingly). On the resonant frequency, (1) the assumption that the resonant frequencies in the forced and free vibrations are similar is experimentally verified; (2) its dependence on the distance between the FBG's fixed ends is examined, showing it to be independent; (3) the predictions of the spring-mass theory and the modified theory are compared with the experimental results, showing that the modified theory predicts more accurately. The modified theory can therefore be used more confidently in guiding the accelerometer's design by predicting its static sensitivity and resonant frequency, and may find applications in other fields where the spring-mass theory fails.
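The spring-mass baseline that the modified theory improves upon predicts a natural frequency of f = (1/2π)·√(k/m). As a hedged numerical illustration using the 5.71 g inertial mass from the abstract, with an effective transverse stiffness value assumed here for illustration (the paper does not quote one):

```python
import math

def resonant_frequency_hz(k, m):
    """Spring-mass natural frequency f = (1/(2*pi)) * sqrt(k/m),
    with stiffness k in N/m and mass m in kg."""
    return math.sqrt(k / m) / (2 * math.pi)

# 5.71 g inertial object (from the abstract); 10 N/m effective transverse
# stiffness is an assumed, illustrative value
f = resonant_frequency_hz(k=10.0, m=5.71e-3)
```

The abstract's point is that for a barely prestretched FBG the effective transverse stiffness itself depends on the vibration amplitude and prestretch, so a single constant `k` (and hence this formula) mispredicts both sensitivity and resonance, which is what the modified theory corrects.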


The usage of the mobile Internet has increased tremendously within the last couple of years, and thereby the vision of accessing information anytime, anywhere has become more realistic and a dominant design principle for providing content. However, this study challenges this paradigm of unlimited and unrestricted access, and explores the question whether constraints and restrictions can positively influence the motivation and enticement of mobile users to engage with location-specific content. Restrictions, such as a particular time or location that gives a user access to content, may be used to foster participation and engagement, as well as to support content production and to enhance the user’s experience. In order to explore this, a Mobile Narrative and a Narrative Map have been created. For the former, the access to individual chapters of the story was restricted. Authors can specify constraints, such as a location or time, which need to be met by the reader if they want to read the story. This concept allows creative writers of the story to exploit the fact that the reader’s context is known, by intensifying the user experience and integrating this knowledge into the writing process. The latter, the Narrative Map, provides users with extracts from stories or information snippets about authors at relevant locations. In both concepts, a feedback channel was also integrated, on which location, time, and size constraints were imposed. In a user-centred design process involving authors and potential readers, those concepts have been implemented, followed by an evaluation comprising four user studies. The results show that restrictions and constraints can indeed lead to more enticing and engaging user experiences, and restricted contribution opportunities can lead to a higher motivation to participate as well as to an improved quality of submissions. 
These findings are relevant for future developments in the area of mobile narratives and creative writing, as well as for common mobile services that aim for enticing user experiences.
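An author-specified location-and-time constraint on a chapter, as described for the Mobile Narrative, can be sketched as a simple gate: the chapter unlocks only when the reader is within a radius of a point during a time window. The coordinates, radius and window below are hypothetical examples, not values from the study.

```python
import math
from datetime import datetime, time

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two latitude/longitude points, in km."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 6371.0 * 2 * math.asin(math.sqrt(a))

def chapter_unlocked(reader_pos, reader_dt, chapter):
    """A chapter opens only within the author-set radius and time window."""
    near = haversine_km(*reader_pos, *chapter["pos"]) <= chapter["radius_km"]
    open_now = chapter["from"] <= reader_dt.time() <= chapter["to"]
    return near and open_now

# hypothetical chapter unlocked near Brisbane's South Bank, evenings only
chapter = {"pos": (-27.475, 153.020), "radius_km": 0.5,
           "from": time(18, 0), "to": time(23, 0)}
ok = chapter_unlocked((-27.4752, 153.0203), datetime(2024, 5, 1, 19, 30), chapter)
```

The same gate, applied to the feedback channel, yields the restricted contribution opportunities that the study found improved motivation and submission quality.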


An Artificial Neural Network (ANN) is a computational modeling tool which has found extensive acceptance in many disciplines for modeling complex real-world problems. An ANN can model problems through learning by example, rather than by fully understanding the detailed characteristics and physics of the system. In the present study, the accuracy and predictive power of an ANN was evaluated in predicting the kinematic viscosity of biodiesels over a wide range of temperatures typically encountered in diesel engine operation. In this model, temperature and chemical composition of biodiesel were used as input variables. In order to obtain the necessary data for model development, the chemical composition and temperature-dependent fuel properties of ten different types of biodiesel were measured experimentally using laboratory-standard testing equipment, following internationally recognized testing procedures. The Neural Networks Toolbox of MATLAB R2012a software was used to train, validate and simulate the ANN model on a personal computer. The network architecture was optimised following a trial-and-error method to obtain the best prediction of the kinematic viscosity. The predictive performance of the model was determined by calculating the absolute fraction of variance (R²), root mean square (RMS) error and maximum average error percentage (MAEP) between predicted and experimental results. This study found that the ANN is highly accurate in predicting the viscosity of biodiesel, and demonstrates the ability of the ANN model to find a meaningful relationship between biodiesel chemical composition and fuel properties at different temperature levels. Therefore, the model developed in this study can be a useful tool for accurately predicting biodiesel fuel properties instead of undertaking costly and time-consuming experimental tests.
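The study used MATLAB's Neural Networks Toolbox; as a language-neutral sketch of the same idea, the snippet below trains a tiny one-hidden-layer network from scratch on synthetic viscosity-temperature data. The Arrhenius-style coefficients, network size and learning rate are all illustrative assumptions, not values from the study, and temperature is the only input (the real model also used chemical composition).

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic stand-in data: kinematic viscosity falls with temperature,
# roughly following an Arrhenius-type law nu = A * exp(B / T)
T = rng.uniform(280.0, 380.0, 200)          # K
nu = 1e-3 * np.exp(2000.0 / T)              # toy mm^2/s values
x = ((T - 330.0) / 50.0).reshape(-1, 1)     # scaled input for stable training
y = nu.reshape(-1, 1)

# one hidden layer of 8 tanh units, trained by full-batch gradient descent
W1 = rng.normal(0, 0.5, (1, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(5000):
    h = np.tanh(x @ W1 + b1)
    out = h @ W2 + b2
    err = out - y                            # gradient of 0.5 * squared error
    gW2 = h.T @ err / len(x); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)         # backpropagate through tanh
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

def predict(temp_K):
    xs = np.array([[(temp_K - 330.0) / 50.0]])
    return (np.tanh(xs @ W1 + b1) @ W2 + b2).item()

cold, hot = predict(290.0), predict(370.0)
```

After training, the network reproduces the qualitative behaviour the study exploits: viscosity predictions decrease monotonically as temperature rises across the operating range.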


Guerrilla theatre tends, by its very definition, to pop up unpredictably – it interrupts what people might see as the proper or typical flow of time, place and space. The subversive tenor of such work means that questions about ‘what has happened’ tend to the decidedly less polite form of ‘WTF’ as passersby struggle to make sense of, and move on from, moments in which accustomed narratives of action and interaction no longer apply. In this paper I examine examples of guerrilla theatre by performers with disabilities in terms of these ruptures in time, and the way they prompt reflection, reconfigure relations, or recede into traditional relations again, focusing particularly on comedian Laurence Clark. Many performers with disabilities – Bill Shannon, Katherine Araniello, Aaron Williamson, Ju Gosling, and others – find guerrilla-style interventions in public places apposite to their aesthetic and political agendas. They prompt passersby to reflect on their relationship to people with disabilities. They can be recorded for later dissection and display, teaching people something about the way social performers, social spectators and society as a whole deal with disability. In this paper, as I unpack Clark's work, I note that the embarrassment that characterises these encounters can be a flag of an ethical process taking place for passersby. Caught between two moments in which time, roles and relationships suddenly fail to flow along the smooth routes of socially determined habits, passersby's frowns, gasps and giggles flag difficulties dealing with questions about their attitude to disabled people they do not now know how to answer. I consider the productivity, politics and performerly ethics of drawing passersby into such a process – a chaotic, challenging interstitial time in which a passersby's choices become fodder for public consumption – in such a wholly public way.