972 results for Adaptive methods
Abstract:
Heterogeneous multi-core FPGAs contain different types of cores, which can improve efficiency when paired with an effective online task scheduler. However, it is not easy to match tasks to the right cores when there are multiple objectives or dozens of cores, and inappropriate scheduling may cause hot spots that decrease the reliability of the chip. Given that, our research builds a simulation platform to evaluate a wide range of scheduling algorithms on a variety of architectures. On this platform, we provide an online scheduler based on a multi-objective evolutionary algorithm (EA). Comparing the EA with existing algorithms such as Predictive Dynamic Thermal Management (PDTM) and Adaptive Temperature Threshold Dynamic Thermal Management (ATDTM), we identify several drawbacks in previous work. First, existing algorithms depend heavily on manually set constant parameters. Second, they neglect optimization for heterogeneous architectures. Third, they use single-objective methods, or a linear weighting method that converts a multi-objective optimization into a single-objective one. Unlike these algorithms, the EA is adaptive and does not require resetting parameters when workloads switch from one to another. The EA also improves performance on heterogeneous architectures, and an efficient Pareto front can be obtained for multi-objective optimization.
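The Pareto selection at the heart of such a multi-objective EA can be sketched briefly. This is an illustrative sketch: the objectives (latency and peak temperature, lower is better) and the scores below are assumptions, not the platform's actual metrics.

```python
# Illustrative sketch of Pareto (non-dominated) selection for candidate
# schedules. Objectives are (latency, peak temperature); lower is better.

def dominates(a, b):
    """True if a is at least as good as b everywhere and strictly better once."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    """Keep only candidates that no other candidate dominates."""
    return [c for c in candidates
            if not any(dominates(o, c) for o in candidates if o is not c)]

# Hypothetical (latency_ms, peak_temp_C) scores for four candidate schedules:
scores = [(10, 80), (12, 70), (11, 90), (15, 65)]
front = pareto_front(scores)  # (11, 90) drops out: (10, 80) dominates it
```

An EA keeps such a front as its population elite instead of collapsing the objectives into a single weighted number.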
Abstract:
Heuristics, simulation, artificial intelligence techniques and combinations thereof have all been employed in the attempt to make computer systems adaptive, context-aware, reconfigurable and self-managing. This paper complements such efforts by exploring the possibility of achieving runtime adaptiveness using mathematically based techniques from the area of formal methods. It is argued that formal methods @ runtime represents a feasible approach, and promising preliminary results are summarised to support this viewpoint. The survey of existing approaches to employing formal methods at runtime is accompanied by a discussion of their challenges and of the future research required to overcome them. © 2011 Springer-Verlag.
Abstract:
Airborne Light Detection and Ranging (LIDAR) technology has become the primary method to derive high-resolution Digital Terrain Models (DTMs), which are essential for studying Earth's surface processes such as flooding and landslides. The critical step in generating a DTM is to separate ground and non-ground measurements in a voluminous LIDAR point dataset using a filter, because the DTM is created by interpolating ground points. As one of the widely used filtering methods, the progressive morphological (PM) filter has the advantages of classifying the LIDAR data at the point level, linear computational complexity, and preservation of the geometric shapes of terrain features. The filter works well in an urban setting with a gentle slope and a mixture of vegetation and buildings. However, the PM filter often incorrectly removes ground measurements in topographically high areas, along with large non-ground objects, because it uses a constant threshold slope, resulting in "cut-off" errors. A novel cluster analysis method was developed in this study and incorporated into the PM filter to prevent the removal of ground measurements at topographic highs. Furthermore, to obtain optimal filtering results for an area with undulating terrain, a trend analysis method was developed to adaptively estimate the slope-related thresholds of the PM filter based on changes in topographic slope and the characteristics of non-terrain objects. The comparison of the PM and generalized adaptive PM (GAPM) filters for selected study areas indicates that the GAPM filter preserves most of the "cut-off" points removed incorrectly by the PM filter. The application of the GAPM filter to seven ISPRS benchmark datasets shows that it reduces the filtering error by 20% on average compared with the method used by the popular commercial software TerraScan.
The combination of the cluster method, adaptive trend analysis, and the PM filter allows users without much experience in processing LIDAR data to identify ground measurements effectively and efficiently for complex terrains in large LIDAR datasets. The GAPM filter is highly automatic and requires little human input, so it can significantly reduce the effort of manually processing voluminous LIDAR measurements.
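The progressive thresholding behind a PM filter can be sketched in one dimension. This is an illustrative sketch, not the GAPM implementation: real filters operate on 2-D grids of LIDAR points, and the window sizes, slope and threshold parameters below are assumptions.

```python
# Illustrative 1-D sketch of a progressive morphological (PM) filter pass.

def opening(z, w):
    """Grey-scale morphological opening (erosion then dilation), window w."""
    half = w // 2
    ero = [min(z[max(0, i - half):i + half + 1]) for i in range(len(z))]
    return [max(ero[max(0, i - half):i + half + 1]) for i in range(len(ero))]

def pm_filter(z, windows, slope=0.3, cell=1.0, dh0=0.2, dh_max=3.0):
    """Flag a point as non-ground when its elevation drop after opening
    exceeds a slope-dependent threshold that grows with the window size."""
    ground = [True] * len(z)
    surface = list(z)
    for w in windows:
        opened = opening(surface, w)
        dh = min(dh0 + slope * (w - 1) * cell, dh_max)  # threshold grows with w
        for i in range(len(z)):
            if surface[i] - opened[i] > dh:
                ground[i] = False  # too abrupt to be terrain at this scale
        surface = opened
    return ground

# A narrow spike (e.g. a building) on flat ground is flagged as non-ground:
flags = pm_filter([0, 0, 0, 5, 0, 0, 0], windows=[3])
```

The GAPM filter's trend analysis would, in effect, adapt `slope` to the local terrain instead of keeping it constant, which is what causes the "cut-off" errors at topographic highs.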
Abstract:
Brain-computer interfaces (BCI) have the potential to restore communication or control abilities in individuals with severe neuromuscular limitations, such as those with amyotrophic lateral sclerosis (ALS). The role of a BCI is to extract and decode relevant information that conveys a user's intent directly from brain electro-physiological signals and translate this information into executable commands to control external devices. However, the BCI decision-making process is error-prone due to noisy electro-physiological data, representing the classic problem of efficiently transmitting and receiving information via a noisy communication channel.
This research focuses on P300-based BCIs, which rely predominantly on event-related potentials (ERPs) that are elicited as a function of a user's uncertainty regarding stimulus events in either an acoustic or a visual oddball recognition task. The P300-based BCI system enables users to communicate messages from a set of choices by selecting a target character or icon that conveys a desired intent or action. P300-based BCIs have been widely researched as a communication alternative, especially for individuals with ALS, who represent a target BCI user population. For the P300-based BCI, repeated data measurements are required to enhance the low signal-to-noise ratio of the elicited ERPs embedded in electroencephalography (EEG) data, in order to improve the accuracy of the target character estimation process. As a result, BCIs are relatively slow compared with other commercial assistive communication devices, which limits their adoption by the target user population. The goal of this research is to develop algorithms that take into account the physical limitations of the target BCI population to improve the efficiency of ERP-based spellers for real-world communication.
In this work, it is hypothesised that building adaptive capabilities into the BCI framework can potentially give the BCI system the flexibility to improve performance by adjusting system parameters in response to changing user inputs. The research in this work addresses three potential areas for improvement within the P300 speller framework: information optimisation, target character estimation and error correction. The visual interface and its operation control the method by which the ERPs are elicited through the presentation of stimulus events. The parameters of the stimulus presentation paradigm can be modified to modulate and enhance the elicited ERPs. A new stimulus presentation paradigm is developed in order to maximise the information content that is presented to the user by tuning stimulus paradigm parameters to positively affect performance. Internally, the BCI system determines the amount of data to collect and the method by which these data are processed to estimate the user's target character. Algorithms that exploit language information are developed to enhance the target character estimation process and to correct erroneous BCI selections. In addition, a new model-based method to predict BCI performance is developed, an approach which is independent of stimulus presentation paradigm and accounts for dynamic data collection. The studies presented in this work provide evidence that the proposed methods for incorporating adaptive strategies in the three areas have the potential to significantly improve BCI communication rates, and the proposed method for predicting BCI performance provides a reliable means to pre-assess BCI performance without extensive online testing.
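The dynamic data collection idea above, collecting only as much EEG data as needed before committing to a character, can be sketched as a simple Bayesian stopping rule. This is an illustrative sketch, not the algorithms developed in this work: the Gaussian classifier-score likelihoods, their means, and the 0.9 threshold are all assumptions.

```python
import math

def gauss(x, mu, sigma=1.0):
    """Gaussian density, used as an assumed classifier-score likelihood."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def update(posterior, flashed, score, mu_target=1.0, mu_nontarget=0.0):
    """Bayes update: characters in the flashed subset use the target
    likelihood, all others the non-target likelihood; then renormalise."""
    post = {c: p * gauss(score, mu_target if c in flashed else mu_nontarget)
            for c, p in posterior.items()}
    total = sum(post.values())
    return {c: p / total for c, p in post.items()}

def spell(chars, flashes, threshold=0.9):
    """flashes: iterable of (flashed_subset, classifier_score) pairs.
    Returns the estimated character and its posterior probability."""
    posterior = {c: 1.0 / len(chars) for c in chars}
    for flashed, score in flashes:
        posterior = update(posterior, flashed, score)
        best = max(posterior, key=posterior.get)
        if posterior[best] >= threshold:
            return best, posterior[best]  # dynamic stopping: enough evidence
    best = max(posterior, key=posterior.get)
    return best, posterior[best]
```

With scores that consistently favour one character, the posterior concentrates on it and data collection stops early; otherwise the best guess after all flashes is returned.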
Abstract:
Background: The relationship between mental health and climate change is poorly understood. Participatory methods represent ethical, feasible, and culturally appropriate approaches to engaging community members in mental health promotion in the context of climate change. Aim: Photovoice, a community-based participatory research methodology, uses images as a tool to deconstruct problems by posing meaningful questions in a community to find actionable solutions. This community-enhancing technique was used to elicit experiences of climate change among women in rural Nepal and the association of climate change with mental health. Subjects and methods: Mixed methods, including in-depth interviews and self-report questionnaires, were used to evaluate the experience of 10 women participating in photovoice. Quantitative tools included Nepali versions of the Beck Depression Inventory (BDI), the Beck Anxiety Inventory (BAI), and a resilience scale. Results: In qualitative interviews after photovoice, women reported climate change adaptation and behavior change strategies including environmental knowledge-sharing, group mobilization, and increased hygiene practices. Women also reported beneficial effects on mental health. The mean BDI score prior to photovoice was 23.20 (SD=9.00); two weeks after completion of photovoice, the mean BDI score was 7.40 (SD=7.93) (paired t-test = 8.02, p<.001, n=10). Conclusion: Photovoice, as a participatory method, has the potential to inform resources, adaptive strategies, and potential interventions for climate change and mental health.
Abstract:
Optimal assistance from an adult, adapted to the current level of understanding of the student (scaffolding), can help students with emotional and behavioural problems (EBD) to demonstrate a level of understanding on scientific tasks similar to that of students from regular education (Van Der Steen, Steenbeek, Wielinski & Van Geert, 2012). In the present study the optimal scaffolding techniques for EBD students were investigated, as well as how these differ from scaffolding techniques used for regular students. A researcher visited five EBD students and five regular students (aged three to six years old) three times over a 1.5-year period. Student and researcher worked together on scientific tasks about gravity and air pressure, while the researcher asked questions. An adaptive protocol was used, so that all children were asked the same basic questions about the mechanisms of the task. Besides this, the researcher was also allowed to ask follow-up questions and use scaffolding methods when these seemed necessary. We found a greater amount of scaffolding in the group of EBD students compared to the regular students. The scaffolding techniques that were used also differed between the two groups. For EBD students, we saw more scaffolding strategies focused on keeping the student committed to the task, and fewer strategies aimed at the relationship between the child and the researcher. Furthermore, in the group of regular students we saw a decreasing trend in the amount of scaffolding over the course of the three visits. This trend was not visible for the EBD students. These results highlight the importance of using different scaffolding strategies when working with EBD students compared to regular students. Future research can give a clearer picture of the differences in scaffolding needs between these two groups.
Abstract:
During the epoch when the first collapsed structures formed (6<z<50), our Universe went through an extended period of changes. Some of the radiation from the first stars and accreting black holes in those structures escaped and changed the state of the Intergalactic Medium (IGM). The era of this global phase change, in which the state of the IGM was transformed from cold and neutral to warm and ionized, is called the Epoch of Reionization. In this thesis we focus on numerical methods to calculate the effects of this escaping radiation. We start by considering the performance of the cosmological radiative transfer code C2-Ray. We find that although this code efficiently and accurately solves for the changes in the ionized fractions, it can yield inaccurate results for the temperature changes. We introduce two new elements to improve the code. The first element, an adaptive time step algorithm, quickly determines an optimal time step by considering only the computational cells relevant for this determination. The second element, asynchronous evolution, allows different cells to evolve with different time steps. An important constituent of methods to calculate the effects of ionizing radiation is the transport of photons through the computational domain, or "ray-tracing". We devise a novel ray-tracing method called PYRAMID which uses a new geometry: the pyramidal geometry. This geometry shares properties with both the standard Cartesian and spherical geometries, which makes it on the one hand easy to use in conjunction with a Cartesian grid and on the other hand ideally suited to trace radiation from a radially emitting source. A time-dependent photoionization calculation not only requires tracing the path of photons but also solving the coupled set of photoionization and thermal equations. Several different solvers for these equations are in use in cosmological radiative transfer codes.
We conduct a detailed and quantitative comparison of four different standard solvers, evaluating how their accuracy depends on the choice of the time step. This comparison shows that their performance can be characterized by two simple parameters and that C2-Ray generally performs best.
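The adaptive time step idea, advancing no cell by more than a set fractional change per step, can be sketched generically. This is an illustrative sketch under assumed names and a 10% tolerance; C2-Ray's actual algorithm also restricts the search to the cells relevant for the determination.

```python
# Illustrative sketch of an adaptive time step choice for a photoionization
# solver: pick the largest step for which no cell's tracked quantity
# (e.g. its neutral fraction) changes by more than a fraction `tol`.

def adaptive_dt(values, rates, tol=0.1, dt_max=1.0):
    """Largest dt with |rate| * dt <= tol * value in every cell."""
    dt = dt_max
    for v, r in zip(values, rates):
        if r != 0.0:
            dt = min(dt, tol * v / abs(r))
    return dt

# Cell 1 changes fastest, so it sets the global step:
dt = adaptive_dt(values=[1.0, 0.5], rates=[0.0, 5.0])
```

Asynchronous evolution generalises this: instead of imposing the global minimum on every cell, each cell advances with its own admissible step.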
Abstract:
This keynote presentation will report some of our research work and experience on the development and applications of relevant methods, models, systems and simulation techniques in support of different types and various levels of decision making for business, management and engineering. In particular, the following topics will be covered:
- Modelling, multi-agent-based simulation and analysis of the allocation management of carbon dioxide emission permits in China (Nanfeng Liu & Shuliang Li)
- Agent-based simulation of the dynamic evolution of enterprise carbon assets (Yin Zeng & Shuliang Li)
- A framework & system for extracting and representing project knowledge contexts using topic models and dynamic knowledge maps: a big data perspective (Jin Xu, Zheng Li, Shuliang Li & Yanyan Zhang)
- Open innovation: intelligent model, social media & complex adaptive system simulation (Shuliang Li & Jim Zheng Li)
- A framework, model and software prototype for modelling and simulation of deshopping behaviour and how companies respond (Shawkat Rahman & Shuliang Li)
- Integrating multiple agents, simulation, knowledge bases and fuzzy logic for international marketing decision making (Shuliang Li & Jim Zheng Li)
- A Web-based hybrid intelligent system for combined conventional, digital, mobile, social media and mobile marketing strategy formulation (Shuliang Li & Jim Zheng Li)
- A hybrid intelligent model for Web & social media dynamics, and evolutionary and adaptive branding (Shuliang Li)
- A hybrid paradigm for modelling, simulation and analysis of brand virality in social media (Shuliang Li & Jim Zheng Li)
- Network configuration management: attack paradigms and architectures for computer network survivability (Tero Karvinen & Shuliang Li)
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
In parallel adaptive finite element simulations the work load on the individual processors may change frequently. To (re)distribute the load evenly over the processors, a load balancing heuristic is needed. Common strategies try to minimise subdomain dependencies by optimising the cutsize of the partitioning. However, for certain solvers cutsize plays only a minor role, and their convergence depends highly on the subdomain shapes: degenerated subdomain shapes cause them to need significantly more iterations to converge. In this work a new parallel load balancing strategy is introduced which directly addresses the problem of generating and conserving reasonably good subdomain shapes in a dynamically changing finite element simulation. Geometric data are used to formulate several cost functions that rate elements in terms of their suitability to be migrated. The well-known diffusive method which calculates the necessary load flow is enhanced by weighting the subdomain edges with the help of these cost functions. The proposed methods have been tested and results are presented.
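The diffusive method mentioned above can be sketched minimally. This is an illustrative sketch assuming a simple synchronous first-order scheme with a uniform diffusion coefficient `alpha`; in the proposed strategy the subdomain edges would instead be weighted by the geometric cost functions, biasing the flow towards shape-preserving migrations.

```python
# Illustrative sketch of diffusive load balancing: each processor exchanges
# load with its neighbours in proportion to the load difference.

def diffuse(load, neighbours, alpha=0.25, steps=50):
    """First-order diffusion on a processor graph; returns balanced loads."""
    load = list(load)
    for _ in range(steps):
        flow = [0.0] * len(load)
        for i, nbrs in enumerate(neighbours):
            for j in nbrs:
                flow[i] += alpha * (load[j] - load[i])
        load = [l + f for l, f in zip(load, flow)]
    return load

# Four processors in a chain, all load initially on processor 0; the loads
# relax towards the average while the total load is conserved:
balanced = diffuse([8.0, 0.0, 0.0, 0.0], [[1], [0, 2], [1, 3], [2]])
```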
Abstract:
In this article we consider the application of a generalization of the symmetric version of the interior penalty discontinuous Galerkin finite element method to the numerical approximation of the compressible Navier--Stokes equations. In particular, we consider the a posteriori error analysis and adaptive mesh design for the underlying discretization method. Indeed, by employing a duality argument, (weighted) Type I a posteriori bounds are derived for the estimation of the error measured in terms of general target functionals of the solution; these error estimates involve the product of the finite element residuals with local weighting terms involving the solution of a certain dual problem that must be numerically approximated. This general approach leads to the design of economical finite element meshes specifically tailored to the computation of the target functional of interest, as well as providing efficient error estimation. Numerical experiments demonstrating the performance of the proposed approach will be presented.
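The structure of such (weighted) Type I bounds can be written generically. The notation below is the standard dual-weighted-residual form, given for orientation rather than as the paper's exact statement:

```latex
% Error in the target functional J, estimated by element residuals R(u_h)
% weighted by local terms involving the dual solution z, which must itself
% be approximated numerically (z_h below):
|J(u) - J(u_h)| \;\lesssim\; \sum_{K \in \mathcal{T}_h}
  \| R(u_h) \|_{K} \, \| z - z_h \|_{K}
```

Refining the mesh where the product of residual and dual weight is largest yields meshes tailored to the target functional of interest.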
Abstract:
A key driver of Australian sweetpotato productivity improvements and consumer demand has been industry adoption of disease-free planting material systems. On a farm isolated from the main Australian sweetpotato areas, virus-free germplasm is annually multiplied, with the resulting 'pathogen-tested' (PT) sweetpotato roots shipped to commercial Australian sweetpotato growers. They in turn plant their PT roots into specially designated plant beds, commencing in late winter. From these beds, they cut sprouts as the basis for their commercial fields. Along with other intensive agronomic practices, this system enables Australian producers to achieve the world's highest commercial yields (per hectare) of premium sweetpotatoes. Their industry organisation, ASPG (Australian Sweetpotato Growers Inc.), has identified productivity of mother plant beds as a key driver of crop performance. Growers and scientists are currently collaborating to investigate issues such as catastrophic plant bed losses; optimisation of irrigation and nutrient addition; rapidity and uniformity of initial plant bed harvests; optimal plant bed harvest techniques; virus re-infection of plant beds; and practical longevity of plant beds. A survey of 50 sweetpotato growers in Queensland and New South Wales identified substantial diversity in current plant bed systems, apparently influenced by growing district, scale of operation, time of planting, and machinery/labour availability. Growers identified the key areas for plant bed research as: optimising the size and grading specifications of PT roots supplied for the plant beds; changes in sprout density, vigour and performance through sequential cuttings of the plant bed; the optimal height above ground level to cut sprouts to maximise commercial crop and plant bed performance; and the use of structures and soil amendments in plant bed systems.
Our ongoing multi-disciplinary research program integrates detailed agronomic experiments, grower adaptive learning sites, product quality and consumer research, to enhance industry capacity for inspired innovation and commercial, sustainable practice change.