219 results for Adaptive algorithms
Abstract:
Background: Appropriately conducted adaptive designs (ADs) offer many potential advantages over conventional trials. They make better use of accruing data, potentially saving time, trial participants, and limited resources compared to conventional, fixed sample size designs. However, one can argue that ADs are not implemented as often as they should be, particularly in publicly funded confirmatory trials. This study explored barriers, concerns, and potential facilitators to the appropriate use of ADs in confirmatory trials among key stakeholders.
Methods: We conducted three cross-sectional, online parallel surveys between November 2014 and January 2015. The surveys were based upon findings drawn from in-depth interviews of key research stakeholders, predominantly in the UK, and targeted Clinical Trials Units (CTUs), public funders, and private sector organisations. Response rates were as follows: 30 (55%) UK CTUs, 17 (68%) private sector, and 86 (41%) public funders. A Rating Scale Model was used to rank barriers and concerns in order of perceived importance for prioritisation.
Results: Top-ranked barriers included the lack of bridge funding accessible to UK CTUs to support the design of ADs, limited practical implementation knowledge, preference for traditional mainstream designs, difficulties in marketing ADs to key stakeholders, time constraints to support ADs relative to competing priorities, lack of applied training, and insufficient access to case studies of undertaken ADs to facilitate practical learning and successful implementation. Associated practical complexities and inadequate data management infrastructure to support ADs were reported as more pronounced in the private sector. For funders of public research, the inadequate description of the rationale, scope, and decision-making criteria to guide the planned AD in researchers' grant proposals was viewed as a major obstacle.
Conclusions: Persistent and important perceptions of individual and organisational obstacles continue to hamper the use of ADs in confirmatory trials research. Stakeholder perceptions about barriers are largely consistent across sectors, with a few exceptions that reflect differences in organisations' funding structures, experiences, and characterisation of study interventions. Most barriers appear connected to a lack of practical implementation knowledge and applied training, and limited access to case studies to facilitate practical learning.
Keywords: Adaptive designs; flexible designs; barriers; surveys; confirmatory trials; Phase 3; clinical trials; early stopping; interim analyses
Abstract:
The pipe sizing of water networks via evolutionary algorithms is of great interest because it allows the selection of alternative, economical solutions that meet a set of design requirements. However, available evolutionary methods are numerous, and methodologies to compare their performance beyond obtaining a minimal solution for a given problem are currently lacking. A methodology to compare algorithms based on an efficiency rate (E) is presented here and applied to the pipe-sizing problem of four medium-sized benchmark networks (Hanoi, New York Tunnel, GoYang and R-9 Joao Pessoa). E numerically determines the performance of a given algorithm while considering both the quality of the obtained solution and the required computational effort. From the wide range of available evolutionary algorithms, four were selected to implement the methodology: a Pseudo-Genetic Algorithm (PGA), Particle Swarm Optimization (PSO), Harmony Search (HS) and a modified Shuffled Frog Leaping Algorithm (SFLA). After more than 500,000 simulations, a statistical analysis was performed based on the specific parameters each algorithm requires to operate, and finally, E was analyzed for each network and algorithm. The efficiency measure indicated that PGA is the most efficient algorithm for problems of greater complexity and that HS is the most efficient algorithm for less complex problems. However, the main contribution of this work is that the proposed efficiency rate provides a neutral strategy for comparing optimization algorithms and may be useful in the future for selecting the most appropriate algorithm for different types of optimization problems.
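The abstract does not spell out how the efficiency rate E is computed, so the sketch below is only illustrative: it assumes a hypothetical definition in which E rewards runs that come close to the best-known cost while penalising the average share of the evaluation budget they consume. The function name, tolerance, budget, and example numbers are assumptions, not the authors' formula.

```python
# Illustrative only: E is assumed here to be the share of runs reaching a
# near-optimal cost, discounted by the average fraction of the evaluation
# budget those runs consumed. This is not the paper's exact definition.
from dataclasses import dataclass

@dataclass
class Run:
    cost: float         # best network cost found in one optimisation run
    evaluations: int    # objective-function evaluations used by that run

def efficiency_rate(runs, best_known_cost, tolerance=0.01, budget=100_000):
    if not runs:
        return 0.0
    near_optimal = sum(r.cost <= best_known_cost * (1 + tolerance) for r in runs)
    quality = near_optimal / len(runs)                                # solution quality
    effort = sum(r.evaluations for r in runs) / (len(runs) * budget)  # relative effort
    return quality / max(effort, 1e-9)

# Hypothetical comparison of two algorithms on the same benchmark network.
pga_runs = [Run(1.02e6, 30_000), Run(1.05e6, 28_000), Run(1.01e6, 35_000)]
pso_runs = [Run(1.01e6, 70_000), Run(1.08e6, 80_000), Run(1.03e6, 75_000)]
for name, runs in [("PGA", pga_runs), ("PSO", pso_runs)]:
    print(name, efficiency_rate(runs, best_known_cost=1.0e6))
```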
Abstract:
This special issue is focused on the assessment of algorithms for the observation of Earth's climate from environmental satellites. Climate data records derived by remote sensing are increasingly a key source of insight into the workings of and changes in Earth's climate system. Producers of data sets must devote considerable effort and expertise to maximise the true climate signals in their products and minimise effects of data processing choices and changing sensors. A key choice is the selection of algorithm(s) for classification and/or retrieval of the climate variable. Within the European Space Agency Climate Change Initiative, science teams undertook systematic assessment of algorithms for a range of essential climate variables. The papers in the special issue report some of these exercises (for ocean colour, aerosol, ozone, greenhouse gases, clouds, soil moisture, sea surface temperature and glaciers). The contributions show that assessment exercises must be designed with care, considering issues such as the relative importance of different aspects of data quality (accuracy, precision, stability, sensitivity, coverage, etc.), the availability and degree of independence of validation data, and the limitations of validation in characterising some important aspects of data (such as long-term stability or spatial coherence). As well as requiring a significant investment of expertise and effort, systematic comparisons are found to be highly valuable. They reveal the relative strengths and weaknesses of different algorithmic approaches under different observational contexts, and help ensure that scientific conclusions drawn from climate data records are robust rather than influenced by observational artifacts.
Abstract:
This paper critically examines the application of the Predicted Mean Vote (PMV) in air-conditioned environments in hot-humid climate regions. Experimental studies were conducted in a climate chamber in Chongqing, China, from 2008 to 2010, yielding a total of 440 thermal responses from participants. Data analysis reveals that the PMV overestimates occupants' mean thermal sensation in warm environments (PMV > 0), with a mean bias of 0.296 on the ASHRAE thermal sensation scale. The Bland–Altman method was applied to assess the agreement between the PMV and the Actual Mean Vote (AMV) and reveals a lack of agreement between them. Habituation arising from long-term thermal experience of living in a specific region can stimulate psychological adaptation, which can neutralize occupants' actual thermal sensation by moderating the thermal sensitivity of the skin. A thermal sensation empirical model and a PMV-revised index are introduced for air-conditioned indoor environments in hot-humid regions. As a result of habituation, the upper limit of the effective comfort temperature (SET*) can be increased by 1.6 °C in the warm season relative to the existing international standard, offering considerable potential for energy savings from air-conditioning systems in summer.
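The Bland–Altman agreement check mentioned above follows a standard recipe (mean difference plus 95% limits of agreement). A minimal sketch is given below; the vote arrays are hypothetical placeholders, not the study's 440 measured responses.

```python
# Minimal Bland-Altman agreement check between predicted (PMV) and actual
# (AMV) thermal sensation votes. The data below are hypothetical.
import numpy as np

def bland_altman(pmv: np.ndarray, amv: np.ndarray):
    """Return the mean bias and 95% limits of agreement between two ratings."""
    diff = pmv - amv                 # per-observation disagreement
    bias = diff.mean()               # systematic over- or under-estimation
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

pmv = np.array([0.8, 1.2, 0.5, 1.6, 0.9])   # hypothetical predicted votes
amv = np.array([0.5, 0.9, 0.4, 1.1, 0.6])   # hypothetical actual votes
bias, limits = bland_altman(pmv, amv)
print(f"mean bias = {bias:.3f}, limits of agreement = {limits}")
```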
Abstract:
In order to gain insights into events and issues that may cause errors and outages in parts of IP networks, intelligent methods that capture and express causal relationships online (in real time) are needed. Whereas generalised rule induction has been explored for non-streaming data applications, its application and adaptation to streaming data are mostly undeveloped or based on periodic and ad-hoc training with batch algorithms. Some association rule mining approaches for streaming data do exist; however, they can only express binary causal relationships. This paper presents ongoing work on Online Generalised Rule Induction (OGRI), which aims to create expressive and adaptive rule sets in real time that can be applied to a broad range of applications, including network telemetry data streams.
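Since the paper describes work in progress, the sketch below is not the OGRI algorithm itself; it only illustrates the kind of incremental rule-statistic bookkeeping (support and confidence updated per arriving event) that any online rule-induction approach over telemetry streams would need. The class, the event schema, and all names are hypothetical.

```python
# Not the OGRI algorithm: a toy illustration of updating rule statistics
# (support and confidence) incrementally as telemetry events arrive.
from collections import defaultdict

class OnlineRuleStats:
    def __init__(self):
        self.antecedent_count = defaultdict(int)   # count(A)
        self.joint_count = defaultdict(int)        # count(A and B)

    def update(self, event: dict, antecedent: tuple, consequent: tuple):
        """antecedent and consequent are (attribute, value) pairs."""
        if event.get(antecedent[0]) == antecedent[1]:
            self.antecedent_count[antecedent] += 1
            if event.get(consequent[0]) == consequent[1]:
                self.joint_count[(antecedent, consequent)] += 1

    def confidence(self, antecedent: tuple, consequent: tuple) -> float:
        a = self.antecedent_count[antecedent]
        return self.joint_count[(antecedent, consequent)] / a if a else 0.0

stats = OnlineRuleStats()
stats.update({"link": "eth0", "status": "down"}, ("link", "eth0"), ("status", "down"))
print(stats.confidence(("link", "eth0"), ("status", "down")))   # 1.0 after one event
```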
Abstract:
The notion that large body size confers some intrinsic advantage to biological species has been debated for centuries. Using a phylogenetic statistical approach that allows the rate of body size evolution to vary across a phylogeny, we find a long-term directional bias toward increasing size in the mammals. This pattern holds separately in 10 of the 11 orders for which sufficient data are available and arises from a tendency for accelerated rates of evolution to produce increases, but not decreases, in size. On a branch-by-branch basis, increases in body size have been more than twice as likely as decreases, yielding what amounts to millions and millions of years of rapid and repeated increases in size away from the small ancestral mammal. These results are, to our knowledge, the first evidence from extant species that is compatible with Cope’s rule: the pattern of body size increase through time observed in the mammalian fossil record. We show that this pattern is unlikely to be explained by several nonadaptive mechanisms for increasing size and most likely represents repeated responses to new selective circumstances. By demonstrating that it is possible to uncover ancient evolutionary trends from a combination of a phylogeny and appropriate statistical models, we illustrate how data from extant species can complement paleontological accounts of evolutionary history, opening up new avenues of investigation for both.
Abstract:
Climate change poses new challenges to cities, and new, flexible forms of governance are required that can take into account the uncertainty and abruptness of change. The purpose of this paper is to discuss adaptive climate change governance for urban resilience. The paper identifies and reviews three traditions of literature on the idea of transitions and transformations, and assesses to what extent these transitions encompass elements of adaptive governance. It uses the open source Urban Transitions Project database to assess how urban experiments take into account principles of adaptive governance. The results show that the experiments give no explicit information on ecological knowledge; that the leadership of cities comes primarily from local authorities; and that evidence of partnerships and of anticipatory or planned adaptation is limited or absent. The analysis shows that neither technological, political nor ecological solutions alone are sufficient to further our understanding of the analytical aspects of transition thinking in urban climate governance. In conclusion, the paper argues that the future research agenda for urban climate governance needs to explore further the links between the three traditions in order to better identify contradictions, complementarities or compatibilities, and what this means in practice for creating and assessing urban experiments.
Abstract:
This paper proposes a novel adaptive multiple-modelling algorithm for non-linear and non-stationary systems. This simple modelling paradigm comprises K candidate sub-models, all of which are linear. With data available in an online fashion, the performance of all candidate sub-models is monitored based on the most recent data window, and the M best sub-models are selected from the K candidates. The weight coefficients of the selected sub-models are adapted via the recursive least squares (RLS) algorithm, while the coefficients of the remaining sub-models are left unchanged. These M model predictions are then optimally combined to produce the multi-model output. We propose to minimise the mean square error over a recent data window and to apply a sum-to-one constraint to the combination parameters, leading to a closed-form solution so that maximal computational efficiency can be achieved. In addition, at each time step, the model prediction is chosen from either the resulting multiple model or the best sub-model, whichever performs better. Simulation results are given in comparison with some typical alternatives, including the linear RLS algorithm and a number of online non-linear approaches, in terms of modelling performance and time consumption.
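The abstract does not give the closed-form combination step explicitly, but an equality-constrained least-squares fit over a recent data window has the standard shape sketched below; the variable names, the small ridge term, and the window handling are assumptions rather than the authors' implementation.

```python
# Sketch of combining M selected sub-model predictions under a sum-to-one
# constraint by least squares over a recent data window (assumed details).
import numpy as np

def combine_submodels(P: np.ndarray, y: np.ndarray, ridge: float = 1e-8) -> np.ndarray:
    """
    P : (N, M) predictions of the M selected sub-models over the window
    y : (N,)   observed outputs over the same window
    Returns weights w minimising ||y - P w||^2 subject to sum(w) = 1.
    """
    M = P.shape[1]
    R = P.T @ P + ridge * np.eye(M)       # regularised Gram matrix
    Rinv = np.linalg.inv(R)
    w_ls = Rinv @ P.T @ y                 # unconstrained least-squares weights
    ones = np.ones(M)
    # Lagrange-multiplier correction enforcing the sum-to-one constraint
    return w_ls + Rinv @ ones * (1.0 - ones @ w_ls) / (ones @ Rinv @ ones)

# Hypothetical usage: 3 selected sub-models over a 50-sample window.
rng = np.random.default_rng(0)
P = rng.normal(size=(50, 3))
y = P @ np.array([0.2, 0.5, 0.3]) + 0.01 * rng.normal(size=50)
w = combine_submodels(P, y)
print(w, w.sum())                         # the weights sum to 1
```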
Abstract:
In this paper, we develop a novel constrained recursive least squares algorithm for adaptively combining a set of given multiple models. With data available in an online fashion, the linear combination coefficients of the sub-models are adapted via the proposed algorithm. We propose to minimize the mean square error with a forgetting factor and to apply a sum-to-one constraint to the combination parameters. Moreover, an l1-norm constraint is also applied to the combination parameters, with the aim of achieving sparsity across the multiple models so that only a subset of models may be selected into the final model. A weighted l2-norm is then applied as an approximation to the l1-norm term, so that at each time step a closed-form solution for the model combination parameters is available. The contribution of this paper is the derivation of the proposed constrained recursive least squares algorithm, which is computationally efficient by exploiting matrix theory. The effectiveness of the approach has been demonstrated using both simulated and real time-series examples.
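As a hedged sketch of the kind of objective the abstract describes (notation assumed, not taken from the paper), the combination weights at time t could be obtained by minimising an exponentially weighted squared error plus a reweighted l2 penalty that approximates the l1 term, subject to the sum-to-one constraint:

$$
J(\mathbf{w}_t) \;=\; \sum_{i=1}^{t} \lambda^{\,t-i}\bigl(y_i - \mathbf{p}_i^{\mathsf{T}}\mathbf{w}_t\bigr)^2
\;+\; \gamma \sum_{k=1}^{K} \frac{w_{t,k}^{2}}{\lvert \hat{w}_{t-1,k}\rvert + \epsilon}
\qquad \text{subject to } \mathbf{1}^{\mathsf{T}}\mathbf{w}_t = 1,
$$

where lambda in (0, 1] is the forgetting factor, p_i stacks the K sub-model predictions at time i, gamma controls the sparsity pressure, and the reweighting by 1/(|w_{t-1,k}| + epsilon) makes the quadratic penalty mimic the l1 norm. Because this objective is quadratic in w_t with a single linear equality constraint, each time step admits a closed-form solution that can be updated recursively, which is consistent with the computational efficiency claimed above.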