975 results for Average models


Relevance: 20.00%

Publisher:

Abstract:

Whole-image descriptors such as GIST have been used successfully for persistent place recognition when combined with temporal or sequential filtering techniques. However, whole-image descriptor localization systems often apply a heuristic rather than a probabilistic approach to place recognition, requiring substantial environment-specific tuning prior to deployment. In this paper we present a novel online solution that uses statistical approaches to calculate place recognition likelihoods for whole-image descriptors, without requiring either environmental tuning or pre-training. Using a real-world benchmark dataset, we show that this method creates distributions appropriate to a specific environment in an online manner. Our method performs comparably to FAB-MAP in raw place recognition performance, and integrates into a state-of-the-art probabilistic mapping system to provide superior performance to whole-image methods that are not based on true probability distributions. The method provides a principled means of combining the powerful change-invariant properties of whole-image descriptors with probabilistic back-end mapping systems, without the need for prior training or system tuning.
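
As a rough illustration of the general idea only (not the paper's actual method), the sketch below fits a simple Gaussian background model to whole-image descriptor distances accumulated online and converts new distances into normalized match likelihoods; the descriptor dimension, distance metric and distribution choice are assumptions made for the example.

```python
import numpy as np
from scipy.stats import norm

class OnlineLikelihood:
    """Toy sketch: turn whole-image descriptor distances into normalized
    place-recognition likelihoods using a Gaussian fitted online (no pre-training)."""

    def __init__(self):
        self.distances = []          # running sample of observed distances

    def likelihoods(self, descriptor, stored_descriptors):
        # L1 distance between the query descriptor and each stored place
        d = np.abs(stored_descriptors - descriptor).sum(axis=1)
        self.distances.extend(d.tolist())
        mu = np.mean(self.distances)
        sigma = np.std(self.distances) + 1e-9
        # small distances are unlikely under the background model -> likely matches
        scores = norm.cdf(-(d - mu) / sigma)
        return scores / scores.sum()

# Usage with 10 stored places and 512-D GIST-like descriptors (all synthetic).
rng = np.random.default_rng(0)
stored = rng.random((10, 512))
query = stored[3] + 0.01 * rng.random(512)     # near-duplicate of place 3
print(OnlineLikelihood().likelihoods(query, stored).argmax())   # -> 3
```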

Relevance: 20.00%

Publisher:

Abstract:

An important aspect of robotic path planning is ensuring that the vehicle is in the best location to collect the data necessary for the problem at hand. Given that features of interest are dynamic and move with oceanic currents, vehicle speed is an important factor in any planning exercise to ensure vehicles are at the right place at the right time. Here, we examine different Gaussian process models to find a suitable predictive kinematic model that enables the speed of an underactuated, autonomous surface vehicle to be accurately predicted from a set of input environmental parameters.
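
A minimal sketch of this kind of predictive kinematic model, using scikit-learn's Gaussian process regressor; the environmental inputs (wind and current speed) and training values are illustrative placeholders, not the paper's data or kernel choices.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Hypothetical training data: environmental inputs -> observed vehicle speed.
# Columns: wind speed (m/s), current speed (m/s); values are illustrative only.
X = np.array([[2.0, 0.1], [5.0, 0.3], [8.0, 0.5], [3.0, 0.8], [6.0, 0.2]])
y = np.array([1.4, 1.1, 0.7, 1.0, 1.0])    # observed speed over ground (m/s)

# RBF kernel for smooth dependence on the environment, plus observation noise.
kernel = 1.0 * RBF(length_scale=[2.0, 0.3]) + WhiteKernel(noise_level=0.01)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

# Predict speed (with uncertainty) for new environmental conditions.
mean, std = gp.predict(np.array([[4.0, 0.4]]), return_std=True)
print(f"predicted speed: {mean[0]:.2f} +/- {std[0]:.2f} m/s")
```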

Relevance: 20.00%

Publisher:

Abstract:

Topic modelling, such as Latent Dirichlet Allocation (LDA), was proposed to generate statistical models that represent multiple topics in a collection of documents, and it has been widely utilized in fields such as machine learning and information retrieval. However, little is known about its effectiveness in information filtering. Patterns are generally thought to be more representative than single terms for representing documents. In this paper, a novel information filtering model, the Pattern-based Topic Model (PBTM), is proposed to represent text documents not only using topic distributions at a general level but also using semantic pattern representations at a more detailed, specific level, both of which contribute to accurate document representation and document relevance ranking. Extensive experiments are conducted to evaluate the effectiveness of PBTM using the TREC data collection Reuters Corpus Volume 1. The results show that the proposed model achieves outstanding performance.
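
PBTM itself is not reproduced here; the sketch below shows only the standard LDA ingredient it builds on, i.e. document-level topic distributions estimated with scikit-learn on a toy corpus (RCV1 is not included).

```python
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

# Tiny illustrative corpus; the paper used Reuters Corpus Volume 1.
docs = [
    "stock market trading shares profit",
    "market prices rise as trading volume grows",
    "football match goal team win",
    "team wins the cup after a late goal",
]

counts = CountVectorizer().fit(docs)
X = counts.transform(docs)

# Two-topic LDA: document-topic distributions plus topic-word distributions.
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
theta = lda.transform(X)                 # document-level topic distributions
print(theta.round(2))

vocab = counts.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = topic.argsort()[-5:][::-1]     # five most probable words per topic
    print(f"topic {k}:", [vocab[i] for i in top])
```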

Relevance: 20.00%

Publisher:

Abstract:

Monitoring stream networks through time provides important ecological information. The sampling design problem is to choose the locations where measurements are taken so as to maximise the information gathered about physicochemical and biological variables on the stream network. This paper uses a pseudo-Bayesian approach, averaging a utility function over a prior distribution, to find a design that maximises the average utility. We use models for correlations of observations on the stream network that are based on stream network distances and described by moving average error models. The utility functions used reflect the needs of the experimenter, such as prediction of values at network locations or estimation of parameters. We propose an algorithmic approach to design in which the mean utility of a design is estimated using Monte Carlo techniques and an exchange algorithm is used to search for optimal sampling designs. In particular, we focus on finding an optimal design from a set of fixed designs and finding an optimal subset of a given set of sampling locations. As there are many different variables to measure, such as chemical, physical and biological measurements at each location, designs are derived from models based on different types of response variables: continuous, counts and proportions. We apply the methodology to a synthetic example and to the Lake Eacham stream network on the Atherton Tablelands in Queensland, Australia. We show that the optimal designs depend very much on the choice of utility function, varying from space-filling to clustered designs and mixtures of these, but that, given the utility function, designs are relatively robust to the type of response variable.
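
A minimal sketch of the algorithmic approach described, under simplifying assumptions: the utility shown is an illustrative one (expected log-determinant of the design's correlation matrix, averaged over a prior on the correlation range via Monte Carlo) rather than the paper's stream-network utilities, combined with a simple exchange search over a fixed candidate set.

```python
import numpy as np

rng = np.random.default_rng(1)
candidates = rng.random((40, 2))      # candidate sampling locations (x, y)

def expected_utility(idx, n_mc=20):
    """Monte Carlo estimate of an illustrative utility: the expected
    log-determinant of the design's correlation matrix, averaged over a
    prior on the correlation range (a stand-in for the stream-network model)."""
    pts = candidates[list(idx)]
    d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
    vals = []
    for _ in range(n_mc):
        rho = rng.gamma(2.0, 0.2)                        # prior draw
        cov = np.exp(-d / rho) + 1e-6 * np.eye(len(idx))
        vals.append(np.linalg.slogdet(cov)[1])
    return np.mean(vals)

def exchange_search(n_design=6, n_pass=3):
    """Greedy exchange algorithm: swap one design point at a time if it
    improves the Monte Carlo estimate of the expected utility."""
    design = list(rng.choice(len(candidates), n_design, replace=False))
    best = expected_utility(design)
    for _ in range(n_pass):
        for i in range(n_design):
            for c in set(range(len(candidates))) - set(design):
                trial = design[:i] + [c] + design[i + 1:]
                u = expected_utility(trial)
                if u > best:
                    design, best = trial, u
    return sorted(design), best

print(exchange_search())
```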

Relevance: 20.00%

Publisher:

Abstract:

Background: Transmission of Plasmodium vivax malaria is dependent on vector availability, biting rates and parasite development. In turn, each of these is influenced by climatic conditions. Correlations have previously been detected between seasonal rainfall, temperature and malaria incidence patterns in various settings. An understanding of the seasonal patterns of malaria, and of their weather drivers, can provide vital information for control and elimination activities. This research aimed to describe temporal patterns in malaria, rainfall and temperature, and to examine the relationships between these variables within four counties of Yunnan Province, China.

Methods: Plasmodium vivax malaria surveillance data (1991–2006) and average monthly temperature and rainfall were acquired. Seasonal trend decomposition was used to examine secular trends and seasonal patterns in malaria. Distributed lag non-linear models were used to estimate the weather drivers of malaria seasonality, including the lag periods between weather conditions and malaria incidence.

Results: There was a declining trend in malaria incidence in all four counties. Increasing temperature resulted in increased malaria risk in all four areas, while increasing rainfall resulted in increased malaria risk in one area and decreased malaria risk in another. The lag times for these associations varied between areas.

Conclusions: The differences detected between the four counties highlight the need for a local understanding of the seasonal patterns of malaria and its climatic drivers.
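
Distributed lag non-linear models are usually fitted with the R dlnm package and are not sketched here; the snippet below shows only the seasonal-trend decomposition step, applied to a synthetic monthly incidence series with statsmodels' STL.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL

# Synthetic monthly malaria incidence: a declining trend plus a seasonal peak.
months = pd.date_range("1991-01", "2006-12", freq="MS")
t = np.arange(len(months))
noise = np.random.default_rng(0).normal(0, 3, len(t))
incidence = (50 - 0.15 * t) + 15 * np.sin(2 * np.pi * t / 12) + noise
series = pd.Series(incidence, index=months)

# Seasonal-trend decomposition by Loess: secular trend vs. seasonal pattern.
result = STL(series, period=12).fit()
print(result.trend.head())
print(result.seasonal.head())
```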

Relevance: 20.00%

Publisher:

Abstract:

Previous studies have demonstrated the importance of weather variables in influencing the incidence of influenza. However, the role of air pollution is often ignored when identifying the environmental drivers of influenza. This research aims to examine the impacts of air pollutants and temperature on the incidence of pediatric influenza in Brisbane, Australia. Lab-confirmed daily influenza counts among children aged 0-14 years in Brisbane from 1 January 2001 to 31 December 2008 were retrieved from Queensland Health. Daily data on maximum and minimum temperatures for the same period were supplied by the Australian Bureau of Meteorology. Winter was chosen as the main study season because it has the highest pediatric influenza incidence. Four Poisson log-linear regression models, with daily pediatric seasonal influenza counts as the outcome, were used to examine the impacts of air pollutants (i.e., ozone (O3), particulate matter ≤10 μm (PM10) and nitrogen dioxide (NO2)) and temperature (using a ten-day moving average for these variables) on pediatric influenza. The results show that mean temperature (relative risk (RR): 0.86; 95% confidence interval (CI): 0.82-0.89) was negatively associated with pediatric seasonal influenza in Brisbane, and that high concentrations of O3 (RR: 1.28; 95% CI: 1.25-1.31) and PM10 (RR: 1.11; 95% CI: 1.10-1.13) were associated with more pediatric influenza cases. There was a significant interaction effect (RR: 0.94; 95% CI: 0.93-0.95) between PM10 and mean temperature on pediatric influenza, and adding the interaction term between mean temperature and PM10 substantially improved the model fit. This study provides evidence that PM10 needs to be taken into account when evaluating the temperature-influenza relationship. O3 was also an important predictor, independent of temperature.
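
A minimal sketch of a Poisson log-linear model with a temperature-by-PM10 interaction, written with statsmodels formulas; the data frame is simulated for illustration and is not the Queensland Health series.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Illustrative daily winter data (not the Brisbane surveillance data).
rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "temp": rng.normal(15, 3, n),    # 10-day moving average of mean temperature
    "pm10": rng.normal(18, 5, n),    # 10-day moving average of PM10
    "o3":   rng.normal(20, 6, n),    # 10-day moving average of ozone
})
df["cases"] = rng.poisson(np.exp(1.0 - 0.05 * df.temp + 0.03 * df.pm10 + 0.02 * df.o3))

# Poisson log-linear model; `temp * pm10` expands to main effects plus interaction.
fit = smf.glm("cases ~ temp * pm10 + o3", data=df, family=sm.families.Poisson()).fit()
print(np.exp(fit.params))      # rate ratios, comparable in form to the reported RRs
```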

Relevance: 20.00%

Publisher:

Abstract:

OBJECTIVES: To identify the meteorological drivers of dengue vector density and to determine high- and low-risk transmission zones for dengue prevention and control in Cairns, Australia.

METHODS: Weekly adult female Ae. aegypti data were obtained from 79 double sticky ovitraps (SOs) located in Cairns for the period September 2007-May 2012. Maximum temperature, total rainfall and average relative humidity data were obtained from the Australian Bureau of Meteorology for the study period. Time-series distributed lag non-linear models were used to assess the relationship between meteorological variables and vector density. Spatial autocorrelation was assessed via semivariography, and ordinary kriging was undertaken to predict vector density across Cairns.

RESULTS: Ae. aegypti density was associated with temperature and rainfall; however, these relationships differed between short (0-6 weeks) and long (0-30 weeks) lag periods. Semivariograms showed that vector distributions were spatially autocorrelated in September 2007-May 2008 and January 2009-May 2009, and vector density maps identified high transmission zones in the most populated parts of Cairns city, as well as Machans Beach.

CONCLUSION: Spatiotemporal patterns of Ae. aegypti in Cairns are complex, showing spatial autocorrelation and associations with temperature and rainfall. Sticky ovitraps should be placed no more than 1.2 km apart to ensure entomological coverage and efficient use of resources. Vector density maps provide evidence for targeting prevention and control activities. Further research is needed to explore the possibility of developing an early warning system for dengue based on meteorological and environmental factors.
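
A minimal numpy sketch of the semivariography step only (a binned empirical semivariogram of trap counts); the coordinates and counts are synthetic, and the kriging and lag-model stages are not reproduced.

```python
import numpy as np

def empirical_semivariogram(coords, values, n_bins=10):
    """Binned empirical semivariogram: gamma(h) = half the mean squared
    difference of values for pairs whose separation falls in distance bin h."""
    d = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)          # count each pair once
    dists, semis = d[iu], sq[iu]
    edges = np.linspace(0, dists.max(), n_bins + 1)
    centres = 0.5 * (edges[:-1] + edges[1:])
    gamma = [semis[(dists >= lo) & (dists < hi)].mean()
             for lo, hi in zip(edges[:-1], edges[1:])]
    return centres, np.array(gamma)

# Illustrative trap locations (km) and weekly female Ae. aegypti counts.
rng = np.random.default_rng(2)
coords = rng.random((79, 2)) * 5.0
counts = rng.poisson(3 + 2 * np.sin(coords[:, 0]))    # spatially structured toy counts
lags, gamma = empirical_semivariogram(coords, counts.astype(float), n_bins=8)
print(np.column_stack([lags, gamma]).round(2))
```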

Relevance: 20.00%

Publisher:

Abstract:

BACKGROUND: Physical activity, particularly walking, is greatly beneficial to health; yet a sizeable proportion of older adults are insufficiently active. The importance of built environment attributes for walking is known, but few studies of older adults have examined neighbourhood destinations, and none have investigated access to specific, objectively measured commercial destinations and walking.

METHODS: We undertook a secondary analysis of data from the Western Australian state government's health surveillance survey for those aged 65-84 years and living in the Perth metropolitan region from 2003 to 2009 (n = 2,918). Individual-level road network service areas were generated at 400 m and 800 m distances, and the presence or absence of six commercial destination types within the neighbourhood service areas was identified (food retail, general retail, medical care services, financial services, general services, and social infrastructure). Adjusted logistic regression models examined access to, and mix of, commercial destination types within neighbourhoods for associations with self-reported walking behaviour.

RESULTS: On average, the sample was aged 72.9 years (SD = 5.4), and was predominantly female (55.9%) and married (62.0%). Overall, 66.2% reported some weekly walking and 30.8% reported sufficient walking (≥150 min/week). Older adults with access to general services within 400 m (OR = 1.33, 95% CI = 1.07-1.66) and 800 m (OR = 1.20, 95% CI = 1.02-1.42), and to social infrastructure within 800 m (OR = 1.19, 95% CI = 1.01-1.40), were more likely to engage in some weekly walking. Access to medical care services within 400 m (OR = 0.77, 95% CI = 0.63-0.93) and 800 m (OR = 0.83, 95% CI = 0.70-0.99) reduced the odds of sufficient walking. Access to food retail, general retail, financial services, and the mix of commercial destination types within the neighbourhood were all unrelated to walking.

CONCLUSIONS: The types of neighbourhood commercial destinations that encourage older adults to walk appear to differ slightly from those reported for adult samples. Destinations that facilitate more social interaction (for example, eating at a restaurant or church involvement) or provide opportunities for incidental social contact (for example, visiting the pharmacy or hairdresser) were the strongest predictors of walking among seniors in this study. This underscores the importance of planning neighbourhoods with proximate access to social infrastructure, and highlights the need to create residential environments that support activity across the life course.
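
A minimal sketch of the adjusted logistic regression step, relating access to a destination type within a service area to self-reported walking; the variable names and simulated data are illustrative, not the survey data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative respondent-level data (not the Western Australian survey).
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "some_walking": rng.integers(0, 2, n),            # any weekly walking (0/1)
    "general_services_400m": rng.integers(0, 2, n),   # destination within 400 m (0/1)
    "age": rng.normal(73, 5, n),
    "female": rng.integers(0, 2, n),
})

# Adjusted logistic regression; exponentiated coefficients are odds ratios.
fit = smf.logit("some_walking ~ general_services_400m + age + female", data=df).fit(disp=0)
print(np.exp(fit.params))
print(np.exp(fit.conf_int()))      # 95% confidence intervals for the odds ratios
```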

Relevance: 20.00%

Publisher:

Abstract:

Caveolae and their proteins, the caveolins, transport macromolecules, compartmentalize signalling molecules, and are involved in various repair processes. There is little information regarding their role in the pathogenesis of significant renal syndromes such as acute renal failure (ARF). In this study, an in vivo rat model of 30 min bilateral renal ischaemia followed by reperfusion times from 4 h to 1 week was used to map the temporal and spatial association between caveolin-1 and tubular epithelial damage (desquamation, apoptosis, necrosis). An in vitro model of ischaemic ARF was also studied, in which cultured renal tubular epithelial cells or arterial endothelial cells were subjected to injury initiators modelled on ischaemia-reperfusion (hypoxia, serum deprivation, free radical damage or hypoxia-hyperoxia). Expression of caveolin proteins was investigated using immunohistochemistry, immunoelectron microscopy, and immunoblots of whole cell, membrane or cytosol protein extracts. In vivo, healthy kidney had abundant caveolin-1 in vascular endothelial cells and also some expression on the membrane surfaces of the distal tubular epithelium. In the kidneys of ARF animals, punctate cytoplasmic localization of caveolin-1 was identified, with high-intensity expression in injured proximal tubules that were losing basement membrane adhesion or were apoptotic, 24 h to 4 days after ischaemia-reperfusion. Western immunoblots indicated a marked increase in caveolin-1 expression in the cortex, where some proximal tubular injury was located. In vitro, the main treatment-induced change in both cell types was translocation of caveolin-1 from the original plasma membrane site into membrane-associated sites in the cytoplasm. Overall, expression levels did not alter for whole cell extracts and the protein remained membrane-bound, as indicated by cell fractionation analyses. Caveolin-1 was also found to localize intensely within apoptotic cells. The results are indicative of a role for caveolin-1 in ARF-induced renal injury. Whether it functions in cell repair or in cell death remains to be elucidated.

Relevance: 20.00%

Publisher:

Abstract:

Agent-based modelling (ABM), like other modelling techniques, is used to answer specific questions about real-world systems that could otherwise be expensive or impractical to study. Its recent gain in popularity can be attributed to some degree to its capacity to use information at a fine level of detail of the system, both geographically and temporally, and to generate information at a higher level, where emerging patterns can be observed. The technique is data-intensive, as explicit data at a fine level of detail are used, and computer-intensive, as many interactions between agents, which can learn and have goals, are required. With the growing availability of data and the increase in computing power, these concerns are fading. Nonetheless, being able to update or extend the model as more information becomes available can become problematic because of the tight coupling of the agents and their dependence on the data, especially when modelling very large systems.

One large system to which ABM is currently applied is electricity distribution, where thousands of agents representing the network and the consumers' behaviours interact with one another. A framework that aims at answering a range of questions regarding the potential evolution of the grid has been developed and is presented here. It uses agent-based modelling to represent the engineering infrastructure of the distribution network and has been built with flexibility and extensibility in mind. What distinguishes the method presented here from the usual ABMs is that this ABM has been developed in a compositional manner. This encompasses not only the software tool, whose core is named MODAM (MODular Agent-based Model), but also the model itself. Using such an approach enables the model to be extended as more information becomes available, or modified as the electricity system evolves, leading to an adaptable model.

Two well-known modularity principles in the software engineering domain are information hiding and separation of concerns. These principles were used to develop the agent-based model on top of OSGi and Eclipse plugins, which have good support for modularity. Information regarding the model entities was separated into (a) assets, which describe the entities' physical characteristics, and (b) agents, which describe their behaviour according to their goals and previous learning experiences. This approach diverges from the traditional approach, in which both aspects are often conflated. It has many advantages in terms of reusability of one or the other aspect for different purposes, as well as composability when building simulations. For example, the way an asset is used on a network can vary greatly while its physical characteristics stay the same; this is the case for two identical battery systems whose usage will vary depending on the purpose of their installation. While any battery can be described by its physical properties (e.g. capacity, lifetime, and depth of discharge), its behaviour will vary depending on who is using it and what their aim is. The model is populated using data describing both aspects (physical characteristics and behaviour) and can be updated as required depending on what simulation is to be run. For example, data can be used to describe the environment to which the agents respond (e.g. weather for solar panels), or to describe the assets and their relation to one another (e.g. the network assets).
Finally, when running a simulation, MODAM calls on its module manager, which coordinates the different plugins, automates the creation of the assets and agents using factories, and schedules their execution, which can be done sequentially or in parallel for faster execution. Building agent-based models in this way has proven fast when adding new complex behaviours as well as new types of assets. Simulations have been run to understand the potential impact of changes on the network in terms of assets (e.g. installation of decentralised generators) or behaviours (e.g. responses to different management aims). While this platform has been developed within the context of a project focusing on the electricity domain, the core of the software, MODAM, can be extended to other domains such as transport, which is planned as future work together with the addition of electric vehicles.
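
MODAM itself is a Java/OSGi code base, so the sketch below only illustrates the asset/agent separation described above in Python: a data-only asset, a behaviour-only agent, and a factory that composes them when a simulation is built. All names and the toy dispatch rule are assumptions made for the example, not MODAM's API.

```python
from dataclasses import dataclass

@dataclass
class BatteryAsset:
    """Physical characteristics only; no behaviour lives here."""
    capacity_kwh: float
    depth_of_discharge: float      # physical property, unused by this toy agent
    stored_kwh: float = 0.0

class PeakShavingAgent:
    """Behaviour only: discharge the battery it controls when demand is high."""
    def __init__(self, asset: BatteryAsset, threshold_kw: float):
        self.asset, self.threshold_kw = asset, threshold_kw

    def step(self, demand_kw: float) -> float:
        # toy dispatch: shave demand above the threshold using stored energy
        if demand_kw > self.threshold_kw and self.asset.stored_kwh > 0:
            delivered = min(demand_kw - self.threshold_kw, self.asset.stored_kwh)
            self.asset.stored_kwh -= delivered
            return demand_kw - delivered
        return demand_kw

def build_simulation():
    """Factory: compose an asset with whichever agent behaviour the scenario needs."""
    asset = BatteryAsset(capacity_kwh=10.0, depth_of_discharge=0.8, stored_kwh=8.0)
    return PeakShavingAgent(asset, threshold_kw=5.0)

agent = build_simulation()
print([round(agent.step(d), 2) for d in (4.0, 7.0, 9.0)])   # net demand per step
```

The same BatteryAsset could be paired with a different agent (for example, one that arbitrages on price) without touching the asset definition, which is the reusability the compositional approach is after.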

Relevance: 20.00%

Publisher:

Abstract:

Social networking sites (SNSs), with their large numbers of users and large information base, seem to be perfect breeding grounds for exploiting the vulnerabilities of people, the weakest link in security. Deceiving, persuading, or influencing people to provide information or to perform an action that will benefit the attacker is known as "social engineering." While technology-based security has been addressed by research and may be well understood, social engineering is more challenging to understand and manage, especially in new environments such as SNSs, owing to characteristics of SNSs that reduce users' ability to detect an attack and increase attackers' ability to launch one. This work contributes to the knowledge of social engineering by presenting the first two conceptual models of social engineering attacks in SNSs. Phase-based and source-based models are presented, along with a comprehensive overview of different aspects of social engineering threats in SNSs.

Relevance: 20.00%

Publisher:

Abstract:

We describe recent biologically-inspired mapping research incorporating brain-based multi-sensor fusion and calibration processes and a new multi-scale, homogeneous mapping framework. We also review the interdisciplinary approach to the development of the RatSLAM robot mapping and navigation system over the past decade and discuss the insights gained from combining pragmatic modelling of biological processes with attempts to close the loop back to biology. Our aim is to encourage the pursuit of truly interdisciplinary approaches to robotics research by providing successful case studies.

Relevance: 20.00%

Publisher:

Abstract:

In this paper, a model-predictive control (MPC) method is detailed for the control of nonlinear systems, with stability considerations. It is assumed that the plant is described by a local input/output ARX-type model, with the control input potentially included in the premise variables, which enables the control of systems that are nonlinear in both the state and the control input. Additionally, for the case of set-point regulation, a suboptimal controller is derived which has the dual purpose of ensuring stability and enabling finite-iteration termination of the iterative procedure used to solve the nonlinear optimization problem that determines the control signal.
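
A minimal receding-horizon sketch under simplifying assumptions (a fixed linear ARX model, a quadratic tracking cost, and scipy's SLSQP solver); it does not include the paper's premise-variable formulation or the stability-guaranteeing suboptimal controller, and all coefficients are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative ARX model: y[k+1] = a1*y[k] + a2*y[k-1] + b1*u[k]
a1, a2, b1 = 1.2, -0.3, 0.5
setpoint, horizon = 1.0, 10

def predict(y_hist, u_seq):
    """Roll the ARX model forward over the horizon for a candidate input sequence."""
    y = list(y_hist)
    for u in u_seq:
        y.append(a1 * y[-1] + a2 * y[-2] + b1 * u)
    return np.array(y[2:])

def mpc_cost(u_seq, y_hist):
    # quadratic tracking cost plus a small penalty on control effort
    y_pred = predict(y_hist, u_seq)
    return np.sum((y_pred - setpoint) ** 2) + 0.01 * np.sum(np.asarray(u_seq) ** 2)

# Receding horizon: optimize the input sequence, apply only the first move.
y_hist = [0.0, 0.0]
for k in range(20):
    res = minimize(mpc_cost, np.zeros(horizon), args=(y_hist,), method="SLSQP",
                   bounds=[(-2.0, 2.0)] * horizon)
    u0 = res.x[0]
    y_next = a1 * y_hist[-1] + a2 * y_hist[-2] + b1 * u0
    y_hist = [y_hist[-1], y_next]

print(round(y_hist[-1], 3))     # output should settle near the set point
```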

Relevance: 20.00%

Publisher:

Abstract:

MapReduce is a computation model for processing large data sets in parallel on large clusters of machines in a reliable, fault-tolerant manner. A MapReduce computation is broken down into a number of map tasks and reduce tasks, which are performed by so-called mappers and reducers, respectively. The placement of the mappers and reducers on the machines directly affects the performance and cost of the MapReduce computation in cloud computing. From a computational point of view, the mapper/reducer placement problem is a generalization of the classical bin-packing problem, which is NP-complete. Thus, in this paper we propose a new heuristic algorithm for the mapper/reducer placement problem in cloud computing and evaluate it by comparing it with several other heuristics on solution quality and computation time, solving a set of test problems with various characteristics. The computational results show that our heuristic algorithm is much more efficient than the other heuristics and can obtain a better solution in a reasonable time. Furthermore, we verify the effectiveness of our heuristic algorithm by comparing the mapper/reducer placement it generates for a benchmark problem with a conventional mapper/reducer placement that places a fixed number of mappers/reducers on each machine. The comparison shows that the computation using our mapper/reducer placement is much cheaper than the computation using the conventional placement, while still satisfying the computation deadline.
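
Because the placement problem is framed as a generalization of bin packing, the sketch below shows a standard first-fit decreasing heuristic for placing tasks on machines with a fixed slot capacity; it is not the paper's algorithm, and the slot requirements are illustrative.

```python
def first_fit_decreasing(task_slots, machine_capacity):
    """Place tasks (given as slot requirements) onto machines of fixed capacity
    using first-fit decreasing, a classic bin-packing heuristic shown here only
    to illustrate the framing; it is not the paper's placement algorithm."""
    machines = []                       # each machine is a list of placed tasks
    free = []                           # remaining capacity per machine
    for slots in sorted(task_slots, reverse=True):
        for i, remaining in enumerate(free):
            if slots <= remaining:      # first machine with enough free slots
                machines[i].append(slots)
                free[i] -= slots
                break
        else:                           # no machine fits: provision a new one
            machines.append([slots])
            free.append(machine_capacity - slots)
    return machines

# 8 mapper tasks and 3 reducer tasks with illustrative slot requirements,
# placed on machines that each provide 8 slots.
tasks = [2, 2, 2, 2, 1, 1, 1, 1, 4, 4, 3]
print(first_fit_decreasing(tasks, machine_capacity=8))
```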