983 results for Operable Adaptive Diagnostic Scale (OADS)
Abstract:
The shallow water equations are solved using a mesh of polygons on the sphere, which adapts infrequently to the predicted future solution. Infrequent mesh adaptation reduces the cost of adaptation and load-balancing and will thus allow for more accurate mapping on adaptation. We simulate the growth of a barotropically unstable jet, adapting the mesh every 12 h. Using an adaptation criterion based largely on the gradient of the vorticity leads to a mesh with around 20 per cent of the cells of a uniform mesh that gives equivalent results. This is a similar proportion to previous studies of the same test case with mesh adaptation every 120 min. The prediction of the mesh density involves solving the shallow water equations on a coarse mesh in advance of the locally refined mesh in order to estimate where features requiring higher resolution will grow, decay or move to. The adaptation criterion consists of two parts: that resolved on the coarse mesh, and that which is not resolved and so is passively advected on the coarse mesh. This combination leads to a balance between resolving features controlled by the large-scale dynamics and maintaining fine-scale features.
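The kind of vorticity-gradient refinement criterion the abstract describes can be illustrated with a minimal one-dimensional finite-difference sketch. This is only an assumed illustration of the general idea; the paper's actual criterion operates on a polygonal spherical mesh and also includes an unresolved, passively advected component.

```python
import numpy as np

def refine_flags(vorticity, dx, threshold):
    """Flag cells for refinement where the magnitude of the vorticity
    gradient exceeds a threshold (1-D sketch; names and threshold are
    illustrative, not taken from the paper)."""
    grad = np.gradient(vorticity, dx)  # centred differences, one-sided at edges
    return np.abs(grad) > threshold

# A jet-like vorticity profile: refinement concentrates at the sharp edge.
zeta = np.array([0.0, 0.0, 0.1, 0.8, 0.9, 0.9])
flags = refine_flags(zeta, dx=1.0, threshold=0.3)
```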
Abstract:
A scale-invariant moving finite element method is proposed for the adaptive solution of nonlinear partial differential equations. The mesh movement is based on a finite element discretisation of a scale-invariant conservation principle incorporating a monitor function, while the time discretisation of the resulting system of ordinary differential equations is carried out using a scale-invariant time-stepping which yields uniform local accuracy in time. The accuracy and reliability of the algorithm are successfully tested against exact self-similar solutions where available, and otherwise against a state-of-the-art h-refinement scheme for solutions of a two-dimensional porous medium equation problem with a moving boundary. The monitor functions used are the dependent variable and a monitor related to the surface area of the solution manifold. (c) 2005 IMACS. Published by Elsevier B.V. All rights reserved.
Abstract:
Midlatitude cyclones are important contributors to boundary layer ventilation. However, it is uncertain how efficient such systems are at transporting pollutants out of the boundary layer, and variations between cyclones are unexplained. In this study 15 idealized baroclinic life cycles, with a passive tracer included, are simulated to identify the relative importance of two transport processes: horizontal divergence and convergence within the boundary layer, and large-scale advection by the warm conveyor belt. Results show that the amount of ventilation is insensitive to surface drag over a realistic range of values. This indicates that although boundary layer processes are necessary for ventilation, they do not control the magnitude of ventilation. A diagnostic for the mass flux out of the boundary layer has been developed to identify the synoptic-scale variables controlling the strength of ascent in the warm conveyor belt. A very high level of correlation (R² values exceeding 0.98) is found between the diagnostic and the actual mass flux computed from the simulations. This demonstrates that the large-scale dynamics control the amount of ventilation, and the efficiency with which midlatitude cyclones ventilate the boundary layer can be estimated using the new mass flux diagnostic. We conclude that meteorological analyses, such as ERA-40, are sufficient to quantify boundary layer ventilation by the large-scale dynamics.
Abstract:
The identification of signatures of natural selection in genomic surveys has become an area of intense research, stimulated by the increasing ease with which genetic markers can be typed. Loci identified as subject to selection may be functionally important, and hence (weak) candidates for involvement in disease causation. They can also be useful in determining the adaptive differentiation of populations, and exploring hypotheses about speciation. Adaptive differentiation has traditionally been identified from differences in allele frequencies among different populations, summarised by an estimate of F_ST. Low outliers relative to an appropriate neutral population-genetics model indicate loci subject to balancing selection, whereas high outliers suggest adaptive (directional) selection. However, the problem of identifying statistically significant departures from neutrality is complicated by confounding effects on the distribution of F_ST estimates, and current methods have not yet been tested in large-scale simulation experiments. Here, we simulate data from a structured population at many unlinked, diallelic loci that are predominantly neutral but with some loci subject to adaptive or balancing selection. We develop a hierarchical-Bayesian method, implemented via Markov chain Monte Carlo (MCMC), and assess its performance in distinguishing the loci simulated under selection from the neutral loci. We also compare this performance with that of a frequentist method, based on moment-based estimates of F_ST. We find that both methods can identify loci subject to adaptive selection when the selection coefficient is at least five times the migration rate. Neither method could reliably distinguish loci under balancing selection in our simulations, even when the selection coefficient is twenty times the migration rate.
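The outlier logic above rests on F_ST, which can be sketched with the simplest heterozygosity-based estimator, F_ST = (H_T − H_S)/H_T, for one diallelic locus. This is a minimal illustration only; the moment-based estimator the paper actually evaluates (and its sample-size corrections) is not reproduced here.

```python
def fst_simple(allele_freqs):
    """Simple F_ST = (H_T - H_S) / H_T for one diallelic locus.

    allele_freqs: frequency of one allele in each subpopulation.
    H_S: mean within-subpopulation expected heterozygosity.
    H_T: expected heterozygosity of the pooled (total) population.
    """
    n = len(allele_freqs)
    h_s = sum(2 * p * (1 - p) for p in allele_freqs) / n
    p_bar = sum(allele_freqs) / n
    h_t = 2 * p_bar * (1 - p_bar)
    return 0.0 if h_t == 0 else (h_t - h_s) / h_t

# Strongly differentiated locus -> high outlier (directional selection candidate):
high = fst_simple([0.9, 0.1])
# Nearly uniform frequencies -> low F_ST (neutral or balancing selection):
low = fst_simple([0.50, 0.52])
```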
Abstract:
Advances in hardware and software in the past decade have made it possible to capture, record and process fast data streams at a large scale. The research area of data stream mining has emerged as a consequence of these advances, in order to cope with the real-time analysis of potentially large and changing data streams. Examples of data streams include Google searches, credit card transactions, telemetric data and data from continuous chemical production processes. In some cases the data can be processed in batches by traditional data mining approaches. However, some applications require the data to be analysed in real time as soon as it is captured, for example when the data stream is infinite, fast changing, or simply too large to be stored. One of the most important data mining techniques on data streams is classification. This involves training the classifier on the data stream in real time and adapting it to concept drifts. Most data stream classifiers are based on decision trees. However, it is well known in the data mining community that there is no single optimal algorithm: an algorithm may work well on one or several datasets but badly on others. This paper introduces eRules, a new rule-based adaptive classifier for data streams, based on an evolving set of rules. eRules induces a set of rules that is constantly evaluated and adapted to changes in the data stream by adding new rules and removing old ones. It differs from the more popular decision tree based classifiers in that it tends to leave data instances unclassified rather than forcing a classification that could be wrong. The ongoing development of eRules aims to improve its accuracy further through dynamic parameter setting, which will also address the problem of changing feature domain values.
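The abstain-rather-than-guess behaviour attributed to eRules can be sketched with a minimal rule-matching loop. All names here are hypothetical illustrations; the actual eRules induction and adaptation algorithm is not reproduced.

```python
from dataclasses import dataclass

@dataclass
class Rule:
    conditions: dict  # feature name -> required value
    label: str

    def matches(self, instance):
        return all(instance.get(f) == v for f, v in self.conditions.items())

def classify(rules, instance):
    """Return the label of the first matching rule, or None (abstain).

    Abstaining mirrors the behaviour described in the abstract: an instance
    covered by no rule is left unclassified rather than forced into a class.
    """
    for rule in rules:
        if rule.matches(instance):
            return rule.label
    return None

rules = [Rule({"protocol": "tcp", "port": 80}, "web"),
         Rule({"protocol": "udp"}, "dns")]
label = classify(rules, {"protocol": "tcp", "port": 80})  # "web"
unknown = classify(rules, {"protocol": "icmp"})           # None: abstain
```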
Abstract:
In order to improve the quality of healthcare services, integrated large-scale medical information systems are needed that can adapt to the changing medical environment. In this paper, we propose a requirement-driven, hierarchically layered architecture for healthcare information systems. The system operates through a mapping mechanism between these layers and can thus organize functions dynamically, adapting to users' requirements. Furthermore, we introduce organizational semiotics methods to capture and analyze users' requirements through ontology charts and norms. Based on these results, the structure of the users' requirement pattern (URP) is established as the driving factor of our system. Our research contributes to the design of healthcare system architectures that can adapt to the changing medical environment.
Abstract:
Given the decision to include small-scale sinks projects implemented by low-income communities in the clean development mechanism of the Kyoto Protocol, the paper explores some of the basic governance conditions that such carbon forestry projects will have to meet if they are to be successfully put into practice. To date there are no validated small-scale sinks projects, and investors have shown little interest in financing such projects, possibly due to the risks and uncertainties associated with sinks projects. Some suggest, however, that carbon has the potential to become a serious commodity on the world market; governance over ownership, rights and responsibilities therefore merits discussion. Drawing on the interdisciplinary development literature, as well as on the literature on livelihoods and democratic decentralization in forestry, the paper explores how to adapt forest carbon projects to the realities encountered in the local context. It also highlights the importance of capitalizing on synergies with other rural development strategies, ensuring stakeholder participation by working with accountable, representative local organizations, and creating flexible and adaptive project designs.
Abstract:
The subgrid-scale spatial variability in cloud water content can be described by a parameter f called the fractional standard deviation, equal to the standard deviation of the cloud water content divided by the mean. This parameter is an input to schemes that calculate the impact of subgrid-scale cloud inhomogeneity on gridbox-mean radiative fluxes and microphysical process rates. A new regime-dependent parametrization of the spatial variability of cloud water content is derived from CloudSat observations of ice clouds. In addition to the dependencies on horizontal and vertical resolution and cloud fraction included in previous parametrizations, the new parametrization includes an explicit dependence on cloud type. The new parametrization is then implemented in the Global Atmosphere 6 (GA6) configuration of the Met Office Unified Model and used to model the effects of subgrid variability of both ice and liquid water content on radiative fluxes and autoconversion and accretion rates in three 20-year atmosphere-only climate simulations. These simulations show the impact of the new regime-dependent parametrization in diagnostic radiation calculations, in interactive radiation calculations, and in both interactive radiation calculations and a new warm microphysics scheme. The control simulation uses a globally constant f value of 0.75 to model the effect of cloud water content variability on radiative fluxes. The use of the new regime-dependent parametrization in the model results in a global mean f which is higher than the control's fixed value and a global distribution of f which is closer to CloudSat observations. When the new regime-dependent parametrization is used in radiative transfer calculations only, the magnitudes of short-wave and long-wave top-of-atmosphere cloud radiative forcing are reduced, increasing the existing global mean biases in the control. When also applied in a new warm microphysics scheme, the short-wave global mean bias is reduced.
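The definition of f given above (standard deviation of cloud water content divided by the mean) is straightforward to compute; a minimal sketch, with illustrative sample values rather than real CloudSat data:

```python
import math

def fractional_std(values):
    """Fractional standard deviation f: the (population) standard deviation
    of cloud water content across subgrid samples divided by their mean."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    return math.sqrt(var) / mean

# Hypothetical cloud water contents (arbitrary units) in subgrid columns:
f = fractional_std([0.2, 0.4, 0.6, 0.8])  # ~0.447
```

A homogeneous cloud gives f = 0; the control simulation described above instead assumes a globally constant f of 0.75.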
Abstract:
This paper investigates the effect on balance of a number of Schur product-type localization schemes which have been designed with the primary function of reducing spurious far-field correlations in forecast error statistics. The localization schemes studied comprise a non-adaptive scheme (where the moderation matrix is decomposed in a spectral basis), and two adaptive schemes, namely a simplified version of SENCORP (Smoothed ENsemble COrrelations Raised to a Power) and ECO-RAP (Ensemble COrrelations Raised to A Power). The paper shows, we believe for the first time, how the degree of balance (geostrophic and hydrostatic) implied by the error covariance matrices localized by these schemes can be diagnosed. Here it is considered that an effective localization scheme is one that reduces spurious correlations adequately but also minimizes disruption of balance (where the 'correct' degree of balance or imbalance is assumed to be possessed by the unlocalized ensemble). By varying free parameters that describe each scheme (e.g. the degree of truncation in the schemes that use the spectral basis, the 'order' of each scheme, and the degree of ensemble smoothing), it is found that a particular configuration of the ECO-RAP scheme is best suited to the convective-scale system studied. According to our diagnostics this ECO-RAP configuration still weakens geostrophic and hydrostatic balance, but overall this is less so than for other schemes.
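Schur product localization, the common core of the schemes compared above, damps far-field entries of the ensemble covariance by an elementwise product with a moderation matrix. The sketch below uses a simple Gaussian distance taper as the moderation matrix purely for illustration; the paper's SENCORP and ECO-RAP constructions, which build the moderation matrix from powers of smoothed ensemble correlations, are not reproduced.

```python
import numpy as np

def schur_localize(B, C):
    """Localize an ensemble covariance B via the Schur (elementwise,
    Hadamard) product with a moderation matrix C."""
    return B * C

def gaussian_taper(n, length):
    """Illustrative moderation matrix: correlation decaying with grid
    separation (an assumption for this sketch, not the paper's scheme)."""
    i = np.arange(n)
    d = np.abs(i[:, None] - i[None, :])
    return np.exp(-(d / length) ** 2)

rng = np.random.default_rng(0)
X = rng.standard_normal((8, 20))   # 8 state variables, 20 ensemble members
B = np.cov(X)                      # raw ensemble covariance (spurious far-field terms)
C = gaussian_taper(8, length=2.0)  # ones on the diagonal, decaying off it
B_loc = schur_localize(B, C)
# Variances are preserved; distant covariances are strongly damped.
```

Note that because C has ones on its diagonal, localization never alters the ensemble variances, only cross-covariances; the balance question studied in the paper arises precisely because this damping is applied without regard to geostrophic or hydrostatic constraints.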
Abstract:
Adaptive governance is the use of novel approaches within policy to support experimentation and learning. Social learning reflects the engagement of interdependent stakeholders within this learning. Much attention has focused on these concepts as a solution for resilience in governing institutions in an uncertain climate, resilience representing the ability of a system to absorb shock and to retain its function and form through reorganisation. However, there are still many questions as to how these concepts enable resilience, particularly in vulnerable, developing contexts. A case study from Uganda presents how these concepts promote resilient livelihood outcomes among rural subsistence farmers within a decentralised governing framework. This approach has the potential to highlight the dynamics and characteristics of a governance system which may manage change. The paper draws on the enabling characteristics of adaptive governance, including lower-scale dynamics of bonding and bridging ties and strong leadership. Central to these processes were learning platforms promoting knowledge transfer, leading to improved self-efficacy, innovation and livelihood skills. However, even though aspects of adaptive governance were identified as contributing to resilience in livelihoods, some barriers were identified. Reflexivity and multi-stakeholder collaboration were evident in governing institutions; however, limited self-organisation and vertical communication demonstrated few opportunities for shifts in governance, which was severely challenged by inequity, politicisation and elite capture. The paper concludes by outlining implications for climate adaptation policy: promoting the importance of mainstreaming adaptation alongside existing policy trajectories, highlighting the significance of collaborative spaces for stakeholders, and tackling inequality and corruption.
Abstract:
This study examines when incremental change is likely to trigger discontinuous change, using the lens of complex adaptive systems theory. Going beyond the simulations and case studies through which complex adaptive systems have been approached so far, we study the relationship between incremental organizational reconfigurations and discontinuous organizational restructurings using a large-scale database of U.S. Fortune 50 industrial corporations. We distinguish two types of escalation process in organizations: accumulation and perturbation. Under ordinary conditions, it is perturbation rather than accumulation that is more likely to trigger subsequent discontinuous change. Consistent with complex adaptive systems theory, organizations are more sensitive to both accumulation and perturbation in conditions of heightened disequilibrium. Contrary to expectations, highly interconnected organizations are not more liable to discontinuous change. We conclude with implications for further research, especially the need to attend to the potential role of managerial design and coping when transferring complex adaptive systems theory from natural systems to organizational systems.
Abstract:
Background Appropriately conducted adaptive designs (ADs) offer many potential advantages over conventional trials. They make better use of accruing data, potentially saving time, trial participants, and limited resources compared to conventional, fixed sample size designs. However, one can argue that ADs are not implemented as often as they should be, particularly in publicly funded confirmatory trials. This study explored barriers, concerns, and potential facilitators to the appropriate use of ADs in confirmatory trials among key stakeholders. Methods We conducted three cross-sectional, online parallel surveys between November 2014 and January 2015. The surveys were based upon findings drawn from in-depth interviews of key research stakeholders, predominantly in the UK, and targeted Clinical Trials Units (CTUs), public funders, and private sector organisations. Response rates were as follows: 30 (55%) UK CTUs, 17 (68%) private sector, and 86 (41%) public funders. A Rating Scale Model was used to rank barriers and concerns in order of perceived importance for prioritisation. Results Top-ranked barriers included the lack of bridge funding accessible to UK CTUs to support the design of ADs, limited practical implementation knowledge, preference for traditional mainstream designs, difficulties in marketing ADs to key stakeholders, time constraints to support ADs relative to competing priorities, lack of applied training, and insufficient access to case studies of undertaken ADs to facilitate practical learning and successful implementation. Associated practical complexities and inadequate data management infrastructure to support ADs were reported as more pronounced in the private sector. For funders of public research, the inadequate description of the rationale, scope, and decision-making criteria to guide the planned AD in grant proposals by researchers were all viewed as major obstacles.
Conclusions There are still persistent and important perceptions of individual and organisational obstacles hampering the use of ADs in confirmatory trials research. Stakeholder perceptions about barriers are largely consistent across sectors, with a few exceptions that reflect differences in organisations' funding structures, experiences and characterisation of study interventions. Most barriers appear connected to a lack of practical implementation knowledge and applied training, and limited access to case studies to facilitate practical learning. Keywords: Adaptive designs; flexible designs; barriers; surveys; confirmatory trials; Phase 3; clinical trials; early stopping; interim analyses
Abstract:
The landfall of Cyclone Catarina on the Brazilian coast in March 2004 became known as the first documented hurricane in the South Atlantic Ocean, promoting a new view on how large-scale features can contribute to tropical transition. The aim of this paper is to put the large-scale circulation associated with Catarina's transition in climate perspective. This is discussed in the light of a robust pattern of spatial correlations between thermodynamic and dynamic variables of importance for hurricane formation. A discussion on how transition mechanisms respond to the present-day circulation is presented. These associations help in understanding why Catarina was formed in a region previously thought to be hurricane-free. Catarina developed over a large-scale area of thermodynamically favourable air/sea temperature contrast. This aspect explains the paradox that such a rare system developed when the sea surface temperature was slightly below average. But, although thermodynamics played an important role, it is apparent that Catarina would not have formed without the key dynamic interplay triggered by a high-latitude blocking. The blocking was associated with an extreme positive phase of the Southern Annular Mode (SAM) both hemispherically and locally, and the nearby area where Catarina developed is found to be more cyclonic during the positive phase of the SAM. A conceptual model is developed and a 'South Atlantic index' is introduced as a useful diagnostic of potential conditions leading to tropical transition in the area, where large-scale indices indicate trends towards more favourable atmospheric conditions for tropical cyclone formation. Copyright (c) 2008 Royal Meteorological Society
Abstract:
In southern Bahia, Brazil, large land areas are used for the production of cocoa (Theobroma cacao), which is predominantly grown under the shade of native trees in an agroforestry system locally known as cabruca. As a dominant forest-like landscape element of the cocoa region, the cabrucas play an important role in the conservation of the region's biodiversity. The purpose of this review is to provide the scientific basis for an action plan to reconcile cocoa production and biodiversity conservation in southern Bahia. The available research collectively highlights the diversity of responses of different species and biological groups to both the habitat quality of the cabrucas themselves and to the general characteristics of the landscape, such as the relative extent and spatial configuration of different vegetation types within the landscape mosaic. We identify factors that directly or indirectly influence the occurrence of native species in the cabrucas and the wider landscape of the cocoa region, and develop recommendations for their conservation management. We show that the current scientific knowledge already provides a good basis for a biodiversity-friendly management of the cocoa region of southern Bahia, although more work is needed to refine some management recommendations, especially on shade canopy composition and density, and to verify their economic viability. The implementation of our recommendations should be accompanied by appropriate biological and socioeconomic monitoring, and the findings should inform a broad program of adaptive management of the cabrucas and the wider cocoa landscape.
Abstract:
Objectives. A large-scale survey of doses to patients undergoing the most frequent radiological examinations was carried out in health services in São Paulo (347 radiological examinations per 1,000 inhabitants), the most populous Brazilian state. Methods. A postal dosimetric kit with thermoluminescence dosimeters was used to evaluate the entrance surface dose (ESD) to patients. A stratified sampling technique applied to the national health database furnished important data on the distribution of equipment and the annual number of examinations. Chest, head (skull and sinus), and spine (cervical, thoracic, and lumbar) examinations were included in the trial. A total of 83 rooms and 868 patients were included, and 1,415 values of ESD were measured. Results. The data show large coefficients of variation in tube charge, giving rise to large variations in ESD values. Also, a series of high ESD values associated with unnecessary localizing fluoroscopy were detected. Diagnostic reference levels were determined, based on the 75th percentile (third quartile) of the ESD distributions. For adult patients, the diagnostic reference levels achieved are very similar to those obtained in international surveys. However, the situation is different for pediatric patients: the ESD values found in this survey are twice as large as the international recommendations for chest radiographs of children. Conclusions. Despite the reduced number of ESD values and rooms for the pediatric patient group, it is recommended that practices in chest examinations be revised and that specific national reference doses and image quality be established after a broader survey is carried out.
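The derivation of a diagnostic reference level as the 75th percentile (third quartile) of the measured ESD distribution can be sketched directly. The sample values below are hypothetical, and the linear-interpolation quantile convention is one common choice, not necessarily the one used in the survey.

```python
def diagnostic_reference_level(esd_values):
    """Diagnostic reference level: 75th percentile (third quartile) of the
    measured entrance surface dose (ESD) distribution."""
    s = sorted(esd_values)
    rank = 0.75 * (len(s) - 1)   # linear-interpolation percentile convention
    lo = int(rank)
    frac = rank - lo
    return s[lo] if frac == 0 else s[lo] + frac * (s[lo + 1] - s[lo])

# Hypothetical chest-examination ESD measurements in mGy:
esd = [0.10, 0.12, 0.15, 0.18, 0.25]
drl = diagnostic_reference_level(esd)  # 0.18
```

Rooms whose typical ESD exceeds the reference level (such as the high values from unnecessary localizing fluoroscopy noted above) are candidates for practice review.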