65 results for sequential-move contest
Abstract:
Radiometric data in the visible domain acquired by satellite remote sensing have proven to be powerful for monitoring the states of the ocean, both physical and biological. With the help of these data it is possible to understand certain variations in biological responses of marine phytoplankton on ecological time scales. Here, we implement a sequential data-assimilation technique to estimate from a conventional nutrient–phytoplankton–zooplankton (NPZ) model the time variations of observed and unobserved variables. In addition, we estimate the time evolution of two biological parameters, namely, the specific growth rate and specific mortality of phytoplankton. Our study demonstrates that: (i) the series of time-varying estimates of specific growth rate obtained by sequential data assimilation improves the fitting of the NPZ model to the satellite-derived time series: the model trajectories are closer to the observations than those obtained by implementing static values of the parameter; (ii) the estimates of unobserved variables, i.e., nutrient and zooplankton, obtained from an NPZ model by implementation of a pre-defined parameter evolution can be different from those obtained on applying the sequences of parameters estimated by assimilation; and (iii) the maximum estimated specific growth rate of phytoplankton in the study area is more sensitive to the sea-surface temperature than would be predicted by temperature-dependent functions reported previously. The overall results of the study are potentially useful for enhancing our understanding of the biological response of phytoplankton in a changing environment.
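The abstract does not give the model equations or the filter; as a minimal sketch, assuming Michaelis–Menten nutrient uptake, Holling-type grazing, and a crude nudging-style parameter update (all parameter values and the update rule are invented for illustration, not taken from the paper):

```python
def npz_step(N, P, Z, mu, m, dt=0.1, k_N=0.5, g=0.6, k_P=0.3, mz=0.05):
    """One Euler step of a minimal NPZ model.
    mu: specific growth rate, m: specific mortality of phytoplankton.
    All other parameter values are illustrative placeholders."""
    uptake = mu * N / (k_N + N) * P      # nutrient-limited primary production
    grazing = g * P / (k_P + P) * Z      # zooplankton grazing on phytoplankton
    dN = -uptake + m * P + mz * Z        # losses are remineralised to nutrient
    dP = uptake - grazing - m * P
    dZ = 0.3 * grazing - mz * Z          # 30% grazing assimilation efficiency
    return N + dt * dN, P + dt * dP, Z + dt * dZ

def assimilate_mu(mu, P_model, P_obs, gain=2.0):
    """Nudging-style sequential update: move the growth-rate estimate in
    the direction that shrinks the model-observation mismatch in P."""
    return max(0.0, mu + gain * (P_obs - P_model))
```

A twin experiment, in which a "true" trajectory is generated with a known growth rate and the estimator recovers it from the simulated observations of P, is the standard sanity check for such a scheme.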
Abstract:
Earth system models are increasing in complexity and incorporating more processes than their predecessors, making them important tools for studying the global carbon cycle. However, their coupled behaviour has only recently been examined in any detail, and has yielded a very wide range of outcomes, with coupled climate-carbon cycle models that represent land-use change simulating total land carbon stores by 2100 that vary by as much as 600 Pg C given the same emissions scenario. This large uncertainty is associated with differences in how key processes are simulated in different models, and illustrates the necessity of determining which models are most realistic using rigorous model evaluation methodologies. Here we assess the state-of-the-art with respect to evaluation of Earth system models, with a particular emphasis on the simulation of the carbon cycle and associated biospheric processes. We examine some of the new advances and remaining uncertainties relating to (i) modern and palaeo data and (ii) metrics for evaluation, and discuss a range of strategies, such as the inclusion of pre-calibration, combined process- and system-level evaluation, and the use of emergent constraints, that can contribute towards the development of more robust evaluation schemes. An increasingly data-rich environment offers more opportunities for model evaluation, but it is also a challenge, as more knowledge about data uncertainties is required in order to determine robust evaluation methodologies that move the field of ESM evaluation from a "beauty contest" toward the development of useful constraints on model behaviour.
Abstract:
Roots are important to plants for a wide variety of processes, including nutrient and water uptake, anchoring and mechanical support, storage functions, and as the major interface between the plant and various biotic and abiotic factors in the soil environment. Therefore, understanding the development and architecture of roots holds potential for the manipulation of root traits to improve the productivity and sustainability of agricultural systems and to better understand and manage natural ecosystems. While lateral root development is a traceable process along the primary root and different stages can be found along this longitudinal axis of time and development, root system architecture is complex and difficult to quantify. Here, we comment on assays to describe lateral root phenotypes and propose ways to move forward regarding the description of root system architecture, also considering crops and the environment.
Abstract:
Earth system models (ESMs) are increasing in complexity by incorporating more processes than their predecessors, making them potentially important tools for studying the evolution of climate and associated biogeochemical cycles. However, their coupled behaviour has only recently been examined in any detail, and has yielded a very wide range of outcomes. For example, coupled climate–carbon cycle models that represent land-use change simulate total land carbon stores at 2100 that vary by as much as 600 Pg C, given the same emissions scenario. This large uncertainty is associated with differences in how key processes are simulated in different models, and illustrates the necessity of determining which models are most realistic using rigorous methods of model evaluation. Here we assess the state-of-the-art in evaluation of ESMs, with a particular emphasis on the simulation of the carbon cycle and associated biospheric processes. We examine some of the new advances and remaining uncertainties relating to (i) modern and palaeodata and (ii) metrics for evaluation. We note that the practice of averaging results from many models is unreliable and no substitute for proper evaluation of individual models. We discuss a range of strategies, such as the inclusion of pre-calibration, combined process- and system-level evaluation, and the use of emergent constraints, that can contribute to the development of more robust evaluation schemes. An increasingly data-rich environment offers more opportunities for model evaluation, but also presents a challenge. Improved knowledge of data uncertainties is still necessary to move the field of ESM evaluation away from a "beauty contest" towards the development of useful constraints on model outcomes.
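The "emergent constraint" strategy mentioned above can be sketched with a toy calculation: regress a projected quantity on an observable one across a model ensemble, then read off the projection at the real-world observed value. All numbers and names here are invented for illustration:

```python
import statistics

def emergent_constraint(x_models, y_models, x_obs):
    """Least-squares line through an ensemble of model results
    (observable x against projected quantity y), evaluated at the
    observed value x_obs to give a constrained central estimate."""
    n = len(x_models)
    mx = statistics.fmean(x_models)
    my = statistics.fmean(y_models)
    sxy = sum(x * y for x, y in zip(x_models, y_models)) - n * mx * my
    sxx = sum(x * x for x in x_models) - n * mx * mx
    return my + (sxy / sxx) * (x_obs - mx)
```

A full treatment would also propagate the regression and observational uncertainties, yielding a constrained distribution rather than a point estimate.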
Abstract:
The present study compared production and on-line comprehension of definite articles and third-person direct object clitic pronouns in Greek-speaking typically developing sequential bilingual (L2-TD) children and monolingual children with specific language impairment (L1-SLI). Twenty Turkish–Greek L2-TD children, 16 Greek L1-SLI children, and 31 L1-TD Greek children participated in a production task examining definite articles and clitic pronouns, and in an on-line comprehension task involving grammatical sentences with definite articles and clitics as well as sentences with grammatical violations induced by omitted articles and clitics. The results showed that the L2-TD children were sensitive to the grammatical violations despite low production. In contrast, the children with SLI were not sensitive to clitic omission in the on-line task, despite high production. These results support a dissociation between production and on-line comprehension in L2 children, and point to impaired grammatical representations and a lack of automaticity in children with SLI. They also suggest that on-line comprehension tasks may complement production tasks by differentiating between the language profiles of L2-TD children and children with SLI.
Abstract:
Organisations typically define and execute their selected strategy by developing and managing a portfolio of projects. The governance of this portfolio has proved to be a major challenge, particularly for large organisations. Executives and managers face even greater pressures when the nature of the strategic landscape is uncertain. This paper explores approaches for dealing with different levels of certainty in business IT projects and provides a contingent governance framework. Historically, business IT projects have relied on a structured sequential approach, also referred to as a waterfall method. There is a distinction between the development stages of a solution and the management stages of a project that delivers the solution, although these are often integrated in a business IT systems project. Prior research has demonstrated that the level of certainty varies between development projects. There can be uncertainty about what needs to be developed and also about how this solution should be developed. The move to agile development and management reflects a greater level of uncertainty, often on both dimensions, and this has led to the adoption of more iterative approaches. What has been less well researched is the impact of uncertainty on the governance of the change portfolio and the corresponding implications for business executives. This paper poses this research question and proposes a governance framework to address these aspects. The governance framework has been reviewed in the context of a major anonymous organisation, FinOrg. Findings are reported in this paper with a focus on the need to apply different approaches. In particular, the governance of uncertain business change is contrasted with the management approach for defined IT projects. Practical outputs from the paper include a consideration of some innovative approaches that can be used by executives. It also investigates the role of the business change portfolio group in evaluating and executing the appropriate level of governance. These results lead to recommendations for executives and also to proposed further research.
Abstract:
The most popular endgame tables (EGTs) documenting depth to mate (DTM) in chess endgames are those of Eugene Nalimov, but these do not recognise the FIDE 50-move rule (50mr). This paper marks the creation by the first author of EGTs for sub-6-man (s6m) chess and beyond which give DTM as affected by the ply count (pc). The results are put into the context of previous work recognising the 50mr and are compared with the original unmoderated DTM results. The work is also notable for being the first EGT generation work to use the functional programming language Haskell.
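As a rough illustration of how the ply count moderates DTM (a deliberate simplification: a real 50mr-aware EGT must also account for captures and pawn moves resetting the count within the mating line):

```python
def mate_claimable(dtm_plies, pc):
    """Under the FIDE 50-move rule, a draw may be claimed after 100 plies
    (50 moves by each side) without a capture or pawn move.  A mating
    line of dtm_plies plies that contains no such resetting move is
    only realisable if it fits within the remaining ply budget."""
    return dtm_plies <= 100 - pc
```

So a position whose unmoderated DTM is achievable from a fresh count may be a dead draw once many irreversible-move-free plies have already elapsed.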
Abstract:
Levels of mobility in the Roman Empire have long been assumed to be relatively high, as attested by epigraphy, demography, material culture and, most recently, isotope analysis and the skeletons themselves. Building on recent data from a range of Romano-British sites (Poundbury in Dorset, York, Winchester, Gloucester, Catterick and Scorton), this article explores the significance of the presence of migrants at these sites and the impact they may have had on their host societies. The authors explore the usefulness of diaspora theory, and in particular the concept of imperial and colonial diasporas, to illustrate the complexities of identities in later Roman Britain.
Abstract:
This paper presents a novel mobile sink area allocation scheme for consumer-based mobile robotic devices, with a proven application to robotic vacuum cleaners. In the home or office environment, rooms are physically separated by walls, and an automated robotic cleaner cannot make a decision about which room to move to in order to perform the cleaning task. Likewise, state-of-the-art cleaning robots do not move to other rooms without direct human interference. In a smart home monitoring system, sensor nodes may be deployed to monitor each separate room. In this work, a quad-tree-based data gathering scheme is proposed whereby the mobile sink physically moves through every room and logically links all separated sub-networks together. The proposed scheme sequentially collects data from the monitoring environment and transmits the information back to a base station. Using the sensor nodes' information, the base station can command a cleaning robot to move to a specific location in the home environment. The quad-tree-based data gathering scheme minimizes the data gathering tour length and time through the efficient allocation of data gathering areas. A calculated shortest-path data gathering tour can efficiently be allocated to the robotic cleaner to complete the cleaning task within a minimum time period. Simulation results show that the proposed scheme can effectively allocate and control the cleaning area for the robot vacuum cleaner without any direct interference from the consumer. The performance of the proposed scheme is then validated with a set of practical sequential data gathering tours in a typical office/home environment.
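The quad-tree allocation itself cannot be reconstructed from the abstract; as a sketch of the tour-planning step only, a nearest-neighbour heuristic over hypothetical room centre points:

```python
import math

def nearest_neighbour_tour(base, rooms):
    """Greedy tour: from the base station, repeatedly move to the nearest
    unvisited room centre.  A simple stand-in for the shortest-path data
    gathering tour described in the abstract."""
    tour, here, todo = [], base, list(rooms)
    while todo:
        nxt = min(todo, key=lambda r: math.dist(here, r))
        todo.remove(nxt)
        tour.append(nxt)
        here = nxt
    return tour

def tour_length(base, tour):
    pts = [base] + tour + [base]   # the sink returns to the base station
    return sum(math.dist(a, b) for a, b in zip(pts, pts[1:]))
```

Nearest-neighbour gives no optimality guarantee, but it keeps the per-decision cost low, which matters on a battery-powered robot.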
Abstract:
A general consistency in the sequential order of petroleum hydrocarbon reduction in previous biodegradation studies has led to the proposal of several molecularly based biodegradation scales. Few studies have investigated the biodegradation susceptibility of petroleum hydrocarbon products in soil media, however, and metabolic preferences can change with habitat type. A laboratory based study comprising gas chromatography–mass spectrometry (GC–MS) analysis of extracts of oil-treated soil samples incubated for up to 161 days was conducted to investigate the biodegradation of crude oil exposed to sandy soils of Barrow Island, home to both a Class "A" nature reserve and Australia's largest on-shore oil field. Biodegradation trends of the hydrocarbon-treated soils were largely consistent with previous reports, but some unusual behaviour was recognised both between and within hydrocarbon classes. For example, the n-alkanes persisted at trace levels from day 86 to 161 following the removal of typically more stable dimethyl naphthalenes and methyl phenanthrenes. The relative susceptibility to biodegradation of different di-, tri- and tetramethylnaphthalene isomers also showed several features distinct from previous reports. The unique biodegradation behaviour of Barrow Island soil likely reflects differences in microbial functioning with physicochemical variation in the environment. Correlation of molecular parameters, reduction rates of selected alkyl naphthalene isomers and CO2 respiration values with a delayed (61 d) oil-treated soil identified a slowing of biodegradation with microcosm incubation; a reduced function or population of incubated soil flora might also influence the biodegradation patterns observed.
Abstract:
The present article examines production and on-line processing of definite articles in Turkish-speaking sequential bilingual children acquiring English and Dutch as second languages (L2) in the UK and in the Netherlands, respectively. Thirty-nine 6–8-year-old L2 children and 48 monolingual (L1) age-matched children participated in two separate studies examining the production of definite articles in English and Dutch in conditions manipulating semantic context, that is, the anaphoric and the bridging contexts. Sensitivity to article omission was examined in the same groups of children using an on-line processing task involving article use in the same semantic contexts as in the production task. The results indicate that both L2 children and L1 controls are less accurate when definiteness is established by keeping track of the discourse referents (anaphoric) than when it is established via world knowledge (bridging). Moreover, despite variable production, all groups of children were sensitive to the omission of definite articles in the on-line comprehension task. This suggests that the errors of omission are not due to the lack of abstract syntactic representations, but could result from processes implicated in the spell-out of definite articles. The findings are in line with the idea that variable production in child L2 learners does not necessarily indicate lack of abstract representations (Haznedar and Schwartz, 1997).
Abstract:
Background It can be argued that adaptive designs are underused in clinical research. We have explored concerns related to inadequate reporting of such trials, which may influence their uptake. Through a careful examination of the literature, we evaluated the standards of reporting of group sequential (GS) randomised controlled trials, one form of a confirmatory adaptive design. Methods We undertook a systematic review, searching Ovid MEDLINE from 1 January 2001 to 23 September 2014, supplemented with trials from an audit study. We included parallel-group, confirmatory GS trials that were prospectively designed using a frequentist approach. Eligible trials were examined for compliance in their reporting against the CONSORT 2010 checklist. In addition, as part of our evaluation, we developed a supplementary checklist to explicitly capture group-sequential-specific reporting aspects, and investigated how these are currently being reported. Results Of the 284 screened trials, 68 (24%) were eligible. Most trials were published in "high impact" peer-reviewed journals. Examination of trials established that 46 (68%) were stopped early, predominantly either for futility or efficacy. Suboptimal reporting compliance was found in general items relating to: access to full trial protocols; methods to generate randomisation list(s); and details of randomisation concealment and its implementation. Benchmarked against the supplementary checklist, GS aspects were largely inadequately reported. Only 3 (7%) trials which stopped early reported use of statistical bias correction. Moreover, 52 (76%) trials failed to disclose the methods used to minimise the risk of operational bias due to the knowledge or leakage of interim results. The occurrence of changes to trial methods and outcomes could not be determined in most trials, due to inaccessible protocols and amendments.
Discussion and Conclusions There are issues with the reporting of GS trials, particularly those specific to the conduct of interim analyses. Suboptimal reporting of bias correction methods could imply that most GS trials stopping early are giving biased estimates of treatment effects. As a result, research consumers may question the credibility of findings to change practice when trials are stopped early. These issues could be alleviated through a CONSORT extension. Assurance of scientific rigour through transparent, adequate reporting is paramount to the credibility of findings from adaptive trials. Our systematic literature search was restricted to one database due to resource constraints.
Abstract:
Recruitment of patients to a clinical trial usually occurs over a period of time, resulting in the steady accumulation of data throughout the trial's duration. Yet, according to traditional statistical methods, the sample size of the trial should be determined in advance, and data collected on all subjects before analysis proceeds. For ethical and economic reasons, the technique of sequential testing has been developed to enable the examination of data at a series of interim analyses. The aim is to stop recruitment to the study as soon as there is sufficient evidence to reach a firm conclusion. In this paper we present the advantages and disadvantages of conducting interim analyses in phase III clinical trials, together with the key steps for the successful implementation of sequential methods in this setting. Examples are given of completed trials that have been carried out sequentially, and references to relevant literature and software are provided.
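Why interim analyses need purpose-built sequential boundaries can be shown with a short simulation: repeatedly testing accumulating data at the unadjusted fixed-sample critical value of 1.96 inflates the false positive rate well above the nominal 5% level. All numbers here (group sizes, number of looks) are illustrative:

```python
import math
import random

def trial_rejects(n_per_look=50, looks=5, z_crit=1.96, rng=random):
    """Simulate one trial under the null hypothesis (no treatment effect)
    and z-test the accumulating mean of N(0,1) data at each interim look
    against an unadjusted two-sided critical value.  Returns True if any
    look 'rejects', i.e. a false positive occurs."""
    total, n = 0.0, 0
    for _ in range(looks):
        for _ in range(n_per_look):
            total += rng.gauss(0.0, 1.0)
            n += 1
        z = (total / n) * math.sqrt(n)   # z-statistic for the running mean
        if abs(z) > z_crit:
            return True
    return False

def rejection_rate(sims=2000, **kw):
    """Monte Carlo estimate of the overall type I error rate."""
    return sum(trial_rejects(**kw) for _ in range(sims)) / sims
```

With a single look the rate sits near 0.05, while five unadjusted looks push it to roughly 0.14; group sequential designs restore the overall 5% level by spending the error rate across looks, e.g. via Pocock or O'Brien–Fleming boundaries.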