993 results for Adaptive Measurement
Abstract:
Planning a project with proper consideration of all necessary factors, and managing it to ensure successful implementation, presents many challenges. The initial stage of planning a project for bidding is costly, time consuming and usually yields poor accuracy in cost and effort predictions. At the same time, detailed information on previous projects may be buried in piles of archived documents, making it increasingly difficult to learn from previous experience. Project portfolios have been brought into this field with the aim of improving information sharing and management among different projects; however, the amount of information that can be shared is still limited to generic information. In this paper, we report a recently developed software system, COBRA, which automatically generates a project plan with effort estimates of time and cost based on data collected from previously completed projects. To maximise data sharing and management among different projects, we propose a method based on product-based planning from the PRINCE2 methodology. (Automated Project Information Sharing and Management System – COBRA) Keywords: project management, product based planning, best practice, PRINCE2
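The abstract does not detail COBRA's estimation procedure; as a rough illustration of the general idea of estimating effort and cost for a new product breakdown by analogy with products from completed projects, a minimal sketch follows. All names (HistoricalProduct, estimate_plan, the similarity weighting) are hypothetical and not taken from the paper.

```python
# Hypothetical sketch of analogy-based effort/cost estimation from records
# of completed projects (illustrative only; not COBRA's actual algorithm).
from dataclasses import dataclass

@dataclass
class HistoricalProduct:
    name: str
    effort_days: float   # recorded effort for this product
    cost: float          # recorded cost for this product

def estimate_product(name, history, similarity):
    """Estimate effort and cost for one product as the similarity-weighted
    mean over products recorded in previously completed projects."""
    weights = [(similarity(name, h.name), h) for h in history]
    total = sum(w for w, _ in weights) or 1.0
    effort = sum(w * h.effort_days for w, h in weights) / total
    cost = sum(w * h.cost for w, h in weights) / total
    return effort, cost

def estimate_plan(breakdown, history, similarity):
    """Aggregate estimates over a PRINCE2-style product breakdown structure."""
    pairs = [estimate_product(p, history, similarity) for p in breakdown]
    return sum(e for e, _ in pairs), sum(c for _, c in pairs)

# Tiny usage example with a trivial exact-match similarity measure
history = [HistoricalProduct("user manual", 12.0, 4800.0),
           HistoricalProduct("test plan", 8.0, 3200.0)]
same_name = lambda a, b: 1.0 if a == b else 0.0
effort, cost = estimate_plan(["user manual", "test plan"], history, same_name)
```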
Abstract:
A cell-by-cell anisotropic adaptive mesh Arbitrary Lagrangian-Eulerian (ALE) method for the solution of the Euler equations is described. An efficient approach to equipotential mesh relaxation on anisotropically refined meshes is developed. Results for two test problems are presented.
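Equipotential relaxation is mentioned but not defined in the abstract; in its simplest isotropic form it reduces to iteratively moving each interior node toward the mean of its neighbours (Laplacian-style smoothing). The sketch below shows only that basic form, not the anisotropic, cell-by-cell weighting developed in the paper; all names are illustrative.

```python
import numpy as np

def relax_mesh(x, y, neighbours, n_iter=50, omega=0.5, interior=None):
    """Simple isotropic mesh relaxation: each interior node is relaxed toward
    the mean of its neighbouring nodes (Jacobi-style Laplacian smoothing).
    x, y       : node coordinate arrays
    neighbours : list of neighbour-index lists, one per node
    interior   : boolean mask of movable nodes (boundary nodes stay fixed)
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    if interior is None:
        interior = np.ones(len(x), dtype=bool)
    for _ in range(n_iter):
        new_x, new_y = x.copy(), y.copy()
        for i in np.flatnonzero(interior):
            nb = neighbours[i]
            new_x[i] = (1 - omega) * x[i] + omega * np.mean(x[nb])
            new_y[i] = (1 - omega) * y[i] + omega * np.mean(y[nb])
        x, y = new_x, new_y
    return x, y
```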
Abstract:
The shallow water equations are solved using a mesh of polygons on the sphere, which adapts infrequently to the predicted future solution. Infrequent mesh adaptation reduces the cost of adaptation and load balancing, and thus allows more accurate mapping of the solution when the mesh does adapt. We simulate the growth of a barotropically unstable jet, adapting the mesh every 12 h. Using an adaptation criterion based largely on the gradient of the vorticity leads to a mesh with around 20 per cent of the cells of a uniform mesh that gives equivalent results. This is a similar proportion to previous studies of the same test case with mesh adaptation every 1–20 min. The prediction of the mesh density involves solving the shallow water equations on a coarse mesh in advance of the locally refined mesh, in order to estimate where features requiring higher resolution will grow, decay or move to. The adaptation criterion consists of two parts: that resolved on the coarse mesh, and that which is not resolved and so is passively advected on the coarse mesh. This combination leads to a balance between resolving features controlled by the large-scale dynamics and maintaining fine-scale features.
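The adaptation criterion is described only qualitatively here; as a rough, hypothetical illustration of a gradient-of-vorticity refinement flag (function names, the cell-size weighting and the threshold values are invented for the example, not taken from the paper), one might write:

```python
import numpy as np

def flag_cells_for_refinement(vorticity_grad_mag, dx, threshold=1e-9):
    """Flag cells whose vorticity-gradient magnitude (scaled by cell size)
    exceeds a threshold; flagged cells are refined, others may be coarsened.
    vorticity_grad_mag : per-cell |grad(vorticity)| on the coarse mesh
    dx                 : per-cell length scale
    threshold          : tuning parameter (hypothetical value)
    """
    indicator = np.asarray(vorticity_grad_mag) * np.asarray(dx)
    refine = indicator > threshold
    coarsen = indicator < 0.1 * threshold   # hysteresis band to avoid flip-flopping
    return refine, coarsen
```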
Abstract:
SMPS and DMS500 analysers were used to measure particulate size distributions in the exhaust of a fully annular aero gas turbine engine at two operating conditions, in order to compare the instruments and analyse sources of discrepancy. A number of different dilution ratios were used for the comparative analysis, and a Dekati hot diluter operating at a temperature of 623 K was also used to remove volatile PM prior to measurement. Additional work examined the effect of varying the sample line temperature. Explanations are offered for most of the trends observed, although a new, repeatable event identified in the range from 417 K to 423 K – where there was a three-order-of-magnitude increase in the nucleation mode of the sample – requires further study.
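The dilution correction referred to above is, in its simplest form, a multiplication of the measured size distribution by the dilution ratio; a minimal sketch (function and variable names are illustrative, and no instrument- or line-loss corrections are modelled) is:

```python
import numpy as np

def correct_for_dilution(measured_dndlogdp, dilution_ratio):
    """Scale a measured particle size distribution (dN/dlogDp per size bin)
    back to raw-exhaust concentrations using the applied dilution ratio."""
    return np.asarray(measured_dndlogdp) * dilution_ratio

# Example: a three-bin distribution measured at a dilution ratio of 100:1
raw = correct_for_dilution([1.2e4, 5.6e4, 3.1e4], dilution_ratio=100.0)
```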
Abstract:
This contribution closes this special issue of Hydrology and Earth System Sciences concerning the assessment of nitrogen dynamics in catchments across Europe within a semi-distributed Integrated Nitrogen model for multiple source assessment in Catchments (INCA). New developments in the understanding of the factors and processes determining the concentrations and loads of nitrogen are outlined. The ability of the INCA model to simulate the hydrological and nitrogen dynamics of different European ecosystems is assessed, and the results of the first scenario analyses investigating the impacts of deposition, climatic and land-use change on the nitrogen dynamics are summarised. Consideration is given to how well the model has performed as a generic tool for describing the nitrogen dynamics of European ecosystems across Arctic, Maritime, Continental and Mediterranean climates, to its role in new research initiatives, and to future research requirements.
Abstract:
Ecological risk assessments must increasingly consider the effects of chemical mixtures on the environment as anthropogenic pollution continues to grow in complexity. Yet testing every possible mixture combination is impractical; thus, there is an urgent need for models that can accurately predict mixture toxicity from single-compound data. Currently, two models are frequently used for this purpose: concentration addition (CA) and independent action (IA). The accuracy of the predictions generated by these models is currently debated and needs to be resolved before their use in risk assessments can be fully justified. The present study addresses this issue by determining whether the IA model adequately described the toxicity of binary mixtures of five pesticides and other environmental contaminants (cadmium, chlorpyrifos, diuron, nickel, and prochloraz), each with a dissimilar mode of action, on the reproduction of the nematode Caenorhabditis elegans. In three out of 10 cases, the IA model failed to describe mixture toxicity adequately, with significant synergism or antagonism being observed. In a further three cases, there was an indication of synergy, antagonism, and effect-level-dependent deviations, respectively, but these were not statistically significant. The extent of the significant deviations varied, but all were such that the predicted percentage effect on reproductive output would have been wrong by 18 to 35% (i.e., the concentration expected to cause a 50% effect led to an 85% effect). The presence of such a high number and variety of deviations has important implications for the use of existing mixture toxicity models in risk assessments, especially where all or part of the deviation is synergistic.
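For reference, the independent action model combines single-compound effects of dissimilarly acting chemicals as E_mix = 1 - prod_i(1 - E_i). A minimal sketch, assuming log-logistic single-compound concentration-response curves and purely illustrative parameter values, is:

```python
import numpy as np

def log_logistic_effect(conc, ec50, slope):
    """Fractional effect (0..1) of a single compound at concentration `conc`,
    assuming a log-logistic concentration-response curve."""
    return 1.0 / (1.0 + (ec50 / np.maximum(conc, 1e-30)) ** slope)

def independent_action(concs, ec50s, slopes):
    """Independent action (IA) prediction for a mixture of dissimilarly
    acting compounds: E_mix = 1 - prod_i(1 - E_i)."""
    effects = [log_logistic_effect(c, e, s)
               for c, e, s in zip(concs, ec50s, slopes)]
    return 1.0 - np.prod([1.0 - e for e in effects])

# Illustrative binary mixture (EC50s and slopes are made-up numbers)
e_mix = independent_action(concs=[2.0, 5.0], ec50s=[4.0, 10.0], slopes=[2.0, 1.5])
```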
Abstract:
The Iowa gambling task (IGT) is one of the most influential behavioral paradigms in reward-related decision making and has been, most notably, associated with ventromedial prefrontal cortex function. However, performance in the IGT relies on a complex set of cognitive subprocesses, in particular integrating information about the outcome of choices into a continuously updated decision strategy under ambiguous conditions. The complexity of the task has made it difficult for neuroimaging studies to disentangle the underlying neurocognitive processes. In this study, we used functional magnetic resonance imaging in combination with a novel adaptation of the task, which allowed us to separately examine activation associated with the moment of decision and with the evaluation of decision outcomes. Importantly, using whole-brain regression analyses with individual performance, in combination with the choice/outcome history of individual subjects, we aimed to identify the neural overlap between areas that are involved in the evaluation of outcomes and in the progressive discrimination of the relative value of available choice options, thus mapping the two fundamental cognitive processes that lead to adaptive decision making. We show that activation in right ventromedial and dorsolateral prefrontal cortex was predictive of adaptive performance, in both discriminating disadvantageous from advantageous decisions and confirming negative decision outcomes. We propose that these two prefrontal areas mediate shifting away from disadvantageous choices through their sensitivity to accumulating negative outcomes. These findings provide functional evidence of the underlying processes by which these prefrontal subregions drive adaptive choice in the task, namely through contingency-sensitive outcome evaluation.
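The outcome-driven updating of a decision strategy mentioned above is often formalised in the wider IGT modelling literature (not necessarily in this study) as a delta-rule learner with softmax choice over the four decks; a purely illustrative sketch, with invented payoff values, follows.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(values, temperature=1.0):
    """Choice probabilities from deck values (higher value, more likely)."""
    z = np.asarray(values) / temperature
    z -= z.max()                      # numerical stability
    p = np.exp(z)
    return p / p.sum()

def simulate_igt(payoffs, n_trials=100, alpha=0.2, temperature=2.0):
    """Delta-rule learner for a four-deck gambling task:
    V[d] <- V[d] + alpha * (reward - V[d]) after each choice."""
    values = np.zeros(4)
    choices = []
    for _ in range(n_trials):
        deck = rng.choice(4, p=softmax(values, temperature))
        reward = payoffs(deck)        # caller-supplied payoff function
        values[deck] += alpha * (reward - values[deck])
        choices.append(deck)
    return choices, values

# Example payoff scheme (illustrative, not the original IGT schedule)
choices, values = simulate_igt(lambda d: rng.normal([-25, -25, 25, 25][d], 50))
```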
Abstract:
In this article, we examine the case of a system that cooperates with a "direct" user to plan an activity that some "indirect" user, who does not interact with the system, should perform. The specific application we consider is the prescription of drugs: the direct user is the prescriber and the indirect user is the person responsible for carrying out the therapy. Relevant characteristics of the two users are represented in two user models. Explanation strategies are represented as planning operators whose preconditions encode the cognitive state of the indirect user; this allows the message to be tailored to the indirect user's characteristics. Expansion of optional subgoals and selection among candidate operators are made by applying decision criteria represented as metarules, which negotiate between the direct and indirect users' views while also taking into account the context in which the explanation is provided. After the message has been generated, the direct user may ask to add or remove some items, or to change the message style. The system defends the indirect user's needs as far as possible by mentioning the rationale behind the generated message. If needed, the plan is repaired and the direct user model is revised accordingly, so that the system progressively learns to generate messages suited to the preferences of the people with whom it interacts.
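As a purely illustrative sketch of the kind of structure described (planning operators whose preconditions are evaluated against the indirect user's model), with all names, beliefs and rules hypothetical rather than taken from the system:

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class UserModel:
    """Simplified user model: believed facts and stylistic preferences."""
    beliefs: set = field(default_factory=set)
    prefers_short_messages: bool = False

@dataclass
class ExplanationOperator:
    """A planning operator: applicable only if its precondition holds
    in the indirect user's model; its content contributes to the message."""
    name: str
    precondition: Callable[[UserModel], bool]
    content: str

def applicable(ops, indirect_user):
    """Select candidate operators whose preconditions hold for the indirect user."""
    return [op for op in ops if op.precondition(indirect_user)]

# Illustrative operators for a drug-prescription explanation
ops = [
    ExplanationOperator("explain-dosage",
                        lambda u: "knows-drug-name" in u.beliefs,
                        "Take one tablet twice a day, after meals."),
    ExplanationOperator("introduce-drug",
                        lambda u: "knows-drug-name" not in u.beliefs,
                        "Your doctor prescribed an anti-inflammatory drug."),
]
patient = UserModel(beliefs={"knows-drug-name"})
message = [op.content for op in applicable(ops, patient)]
```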