986 results for adaptive effectiveness
Abstract:
Pair Programming is a technique from the software development method eXtreme Programming (XP) in which two programmers work closely together to develop a piece of software. A similar approach has been used to develop a set of Assessment Learning Objects (ALOs). Three members of academic staff developed a set of ALOs for three different modules (two with overlapping content), in each case taking a pair programming approach to the development. In addition to demonstrating the efficiency of this approach in terms of staff time spent developing the ALOs, a statistical analysis of the outcomes for students who made use of the ALOs demonstrates the effectiveness of the ALOs produced via this method.
Abstract:
In this paper a cell by cell anisotropic adaptive mesh technique is added to an existing staggered mesh Lagrange plus remap finite element ALE code for the solution of the Euler equations. The quadrilateral finite elements may be subdivided isotropically or anisotropically and a hierarchical data structure is employed. An efficient computational method is proposed, which only solves on the finest level of resolution that exists for each part of the domain with disjoint or hanging nodes being used at resolution transitions. The Lagrangian, equipotential mesh relaxation and advection (solution remapping) steps are generalised so that they may be applied on the dynamic mesh. It is shown that for a radial Sod problem and a two-dimensional Riemann problem the anisotropic adaptive mesh method runs over eight times faster.
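As an illustration of the hierarchical data structure described above, the following is a minimal sketch (not the paper's code) of a quadrilateral cell that may be subdivided isotropically (2x2) or anisotropically (2x1 / 1x2), with the solver working only on the finest level that exists for each part of the domain. The class and method names are illustrative assumptions.

```python
# Hedged sketch of cell-by-cell isotropic/anisotropic refinement on a
# hierarchical quadrilateral mesh. Names and layout are assumptions, not
# the paper's implementation.

class QuadCell:
    def __init__(self, x0, y0, x1, y1, level=0):
        self.bounds = (x0, y0, x1, y1)
        self.level = level
        self.children = []           # empty => leaf (finest local resolution)

    def split(self, mode="iso"):
        """Subdivide: 'iso' -> 4 children; 'x' or 'y' -> 2 anisotropic children."""
        x0, y0, x1, y1 = self.bounds
        xm, ym = 0.5 * (x0 + x1), 0.5 * (y0 + y1)
        if mode == "iso":
            boxes = [(x0, y0, xm, ym), (xm, y0, x1, ym),
                     (x0, ym, xm, y1), (xm, ym, x1, y1)]
        elif mode == "x":            # refine only in the x-direction
            boxes = [(x0, y0, xm, y1), (xm, y0, x1, y1)]
        else:                        # refine only in the y-direction
            boxes = [(x0, y0, x1, ym), (x0, ym, x1, y1)]
        self.children = [QuadCell(*b, level=self.level + 1) for b in boxes]

    def leaves(self):
        """The solver visits only leaf cells: the finest level present locally."""
        if not self.children:
            return [self]
        return [leaf for c in self.children for leaf in c.leaves()]

root = QuadCell(0.0, 0.0, 1.0, 1.0)
root.split("iso")                    # isotropic 2x2 split of the root cell
root.children[0].split("x")          # anisotropic split of one child
print(len(root.leaves()))            # -> 5 leaf cells carry the solution
```

Where a refined leaf abuts a coarser neighbour, the mid-edge vertices of the finer cells become hanging nodes at the resolution transition, as the abstract describes.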
Abstract:
Planning a project with proper consideration of all necessary factors, and managing a project to ensure its successful implementation, face many challenges. The initial stage of planning a project for bidding is costly, time consuming and usually yields poor accuracy in cost and effort predictions. On the other hand, detailed information about previous projects may be buried in piles of archived documents, making it increasingly difficult to learn from previous experience. Project portfolios have been brought into this field with the aim of improving information sharing and management among different projects. However, the amount of information that can be shared is still limited to generic information. In this paper, we report a recently developed software system, COBRA, which automatically generates a project plan with effort estimation of time and cost based on data collected from previously completed projects. To maximise data sharing and management among different projects, we propose a method using product-based planning from the PRINCE2 methodology. (Automated Project Information Sharing and Management System - COBRA) Keywords: project management, product based planning, best practice, PRINCE2
Abstract:
A cell by cell anisotropic adaptive mesh Arbitrary Lagrangian Eulerian (ALE) method for the solution of the Euler equations is described. An efficient approach to equipotential mesh relaxation on anisotropically refined meshes is developed. Results for two test problems are presented.
Abstract:
Alternative meshes of the sphere and adaptive mesh refinement could be immensely beneficial for weather and climate forecasts, but it is not clear how mesh refinement should be achieved. A finite-volume model that solves the shallow-water equations on any mesh of the surface of the sphere is presented. The accuracy and cost effectiveness of four quasi-uniform meshes of the sphere are compared: a cubed sphere, reduced latitude–longitude, hexagonal–icosahedral, and triangular–icosahedral. On some standard shallow-water tests, the hexagonal–icosahedral mesh performs best and the reduced latitude–longitude mesh performs well only when the flow is aligned with the mesh. The inclusion of a refined mesh over a disc-shaped region is achieved using either gradual Delaunay, gradual Voronoi, or abrupt 2:1 block-structured refinement. These refined regions can actually degrade global accuracy, presumably because of changes in wave dispersion where the mesh is highly nonuniform. However, using gradual refinement to resolve a mountain in an otherwise coarse mesh can improve accuracy for the same cost. The model prognostic variables are height and momentum collocated at cell centers, and (to remove grid-scale oscillations of the A grid) the mass flux between cells is advanced from the old momentum using the momentum equation. Quadratic and upwind biased cubic differencing methods are used as explicit corrections to a fast implicit solution that uses linear differencing.
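For context, the shallow-water equations referred to above can be written in a standard rotating height-momentum (flux) form, consistent with the collocated height and momentum prognostic variables the abstract describes; the notation here is generic (h fluid depth, u velocity, b bottom height such as the mountain, f Coriolis parameter, g gravity) and not necessarily that of the paper:

```latex
\begin{align}
  \frac{\partial h}{\partial t} + \nabla\cdot(h\mathbf{u}) &= 0, \\
  \frac{\partial (h\mathbf{u})}{\partial t} + \nabla\cdot(h\mathbf{u}\otimes\mathbf{u})
    &= -\,g h\,\nabla(h + b) \;-\; f\,\hat{\mathbf{k}}\times(h\mathbf{u}),
\end{align}
```

The first equation conserves mass; the second advances momentum under pressure-gradient and Coriolis forcing, which is where the quadratic and upwind-biased cubic corrections to the implicit linear solution would act.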
Abstract:
The shallow water equations are solved using a mesh of polygons on the sphere, which adapts infrequently to the predicted future solution. Infrequent mesh adaptation reduces the cost of adaptation and load-balancing and will thus allow for more accurate mapping on adaptation. We simulate the growth of a barotropically unstable jet adapting the mesh every 12 h. Using an adaptation criterion based largely on the gradient of the vorticity leads to a mesh with around 20 per cent of the cells of a uniform mesh that gives equivalent results. This is a similar proportion to previous studies of the same test case with mesh adaptation every 1–20 min. The prediction of the mesh density involves solving the shallow water equations on a coarse mesh in advance of the locally refined mesh in order to estimate where features requiring higher resolution will grow, decay or move to. The adaptation criterion consists of two parts: that resolved on the coarse mesh, and that which is not resolved and so is passively advected on the coarse mesh. This combination leads to a balance between resolving features controlled by the large-scale dynamics and maintaining fine-scale features.
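The resolved part of an adaptation criterion "based largely on the gradient of the vorticity" can be sketched as follows: on the coarse mesh, mark for refinement the cells whose vorticity-gradient magnitude exceeds a fraction of the field maximum. This is a hedged, structured-grid stand-in for the paper's polygonal mesh; the function name, threshold and grid layout are illustrative assumptions.

```python
# Hedged sketch of a vorticity-gradient refinement criterion on a coarse
# rectangular grid (the paper uses polygons on the sphere; this is only a
# minimal illustration of the idea).

def vorticity_gradient_flags(zeta, dx, dy, frac=0.5):
    """Flag cells where |grad(vorticity)| > frac * (field maximum).

    zeta: 2-D list (ny rows x nx cols) of vorticity on the coarse mesh.
    Returns a same-shaped 2-D list of booleans (True = refine here).
    """
    ny, nx = len(zeta), len(zeta[0])
    mag = [[0.0] * nx for _ in range(ny)]
    for j in range(ny):
        for i in range(nx):
            # Central differences in the interior, one-sided at boundaries.
            ip, im = min(i + 1, nx - 1), max(i - 1, 0)
            jp, jm = min(j + 1, ny - 1), max(j - 1, 0)
            dzdx = (zeta[j][ip] - zeta[j][im]) / ((ip - im) * dx)
            dzdy = (zeta[jp][i] - zeta[jm][i]) / ((jp - jm) * dy)
            mag[j][i] = (dzdx ** 2 + dzdy ** 2) ** 0.5
    peak = max(max(row) for row in mag)
    return [[m > frac * peak for m in row] for row in mag]

# A sharp vorticity jump mid-domain should be the only flagged region.
field = [[0.0] * 6 for _ in range(4)]
for j in range(4):
    for i in range(3, 6):
        field[j][i] = 1.0
flags = vorticity_gradient_flags(field, dx=1.0, dy=1.0)
print(sum(f for row in flags for f in row))   # -> 8 cells straddle the jump
```

In the scheme the abstract describes, such flags would be computed on the coarse forecast run ahead of time, combined with the passively advected unresolved component, and used only every 12 h to rebuild and load-balance the mesh.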
Abstract:
The morphology of Acheulean handaxes continues to be a subject of debate amongst Lower Palaeolithic archaeologists, with some arguing that many handaxes are over-engineered for a subsistence function alone. This study aims to provide an empirical foundation for these debates by testing the relationship between a range of morphological variables, including symmetry, and the effectiveness of handaxes for butchery. Sixty handaxes were used to butcher 30 fallow deer by both a professional and a non-professional butcher. Regression analysis on the resultant data set indicates that while frontal symmetry may explain a small amount of variance in the effectiveness of handaxes for butchery, a large percentage of variance remains unexplained by symmetry or any of the other morphological variables under consideration.
Abstract:
The systems used for the procurement of buildings are organizational systems. They involve people in a series of strategic decisions, and a pattern of roles, responsibilities and relationships that combine to form the organizational structure of the project. To ensure the effectiveness of the building team, this organizational structure needs to be contingent upon the environment within which the construction project takes place. In addition, a changing environment means that the organizational structure within a project needs to be responsive and dynamic. These needs are often not satisfied in the construction industry, owing to the lack of analytical tools with which to analyse the environment and to design appropriate temporary organizations. This paper presents two techniques. The first, "Environmental Complexity Analysis", identifies the key variables in the environment of the construction project, classified as Financial, Legal, Technological, Aesthetic and Policy. It is proposed that their identification will set the parameters within which the project has to be managed, providing a basis for the project managers to define the relevant set of decision points that will be required for the project. The Environmental Complexity Analysis also identifies the project's requirements for control systems concerning Budget, Contractual, Functional, Quality and Time control. Environmental scanning needs to be carried out at regular points during the procurement process to ensure that the organizational structure remains adaptive to the changing environment. The second technique, "3R analysis", is a graphical technique for describing and modelling Roles, Responsibilities and Relationships. A list of steps is introduced that explains the recommended procedure for setting up a flexible organizational structure that is responsive to the environment of the project. This contrasts with the current trend towards predetermined procurement paths, which may not always be in the best interests of the client.
Abstract:
The Iowa gambling task (IGT) is one of the most influential behavioral paradigms in reward-related decision making and has been, most notably, associated with ventromedial prefrontal cortex function. However, performance in the IGT relies on a complex set of cognitive subprocesses, in particular integrating information about the outcome of choices into a continuously updated decision strategy under ambiguous conditions. The complexity of the task has made it difficult for neuroimaging studies to disentangle the underlying neurocognitive processes. In this study, we used functional magnetic resonance imaging in combination with a novel adaptation of the task, which allowed us to examine separately activation associated with the moment of decision or the evaluation of decision outcomes. Importantly, using whole-brain regression analyses with individual performance, in combination with the choice/outcome history of individual subjects, we aimed to identify the neural overlap between areas that are involved in the evaluation of outcomes and in the progressive discrimination of the relative value of available choice options, thus mapping the two fundamental cognitive processes that lead to adaptive decision making. We show that activation in right ventromedial and dorsolateral prefrontal cortex was predictive of adaptive performance, in both discriminating disadvantageous from advantageous decisions and confirming negative decision outcomes. We propose that these two prefrontal areas mediate shifting away from disadvantageous choices through their sensitivity to accumulating negative outcomes. These findings provide functional evidence of the underlying processes by which these prefrontal subregions drive adaptive choice in the task, namely through contingency-sensitive outcome evaluation.
Abstract:
Three experiments investigated the effectiveness of presenting procedural information through different media and their combinations. Experiment 1 examined the effectiveness of text, line drawings, text and line drawings, video, and video stills for learning a first aid task. The results showed an advantage of text and line drawings and of the video presentation over the other three conditions for both bandaging performance and answering questions about the task. Experiment 2 showed that the beneficial effect of the combination of text and pictures could not be accounted for simply in terms of a dual coding explanation. Rather, the effectiveness of the media and their combinations was influenced by the extent to which they conveyed action information. Finally, Experiment 3 showed no evidence of a contiguity effect: text and pictures were as effective when presented together on the same screen as when they were presented separately. Copyright © 2000 John Wiley & Sons, Ltd.
Abstract:
In this article, we examine the case of a system that cooperates with a “direct” user to plan an activity that some “indirect” user, not interacting with the system, should perform. The specific application we consider is the prescription of drugs. In this case, the direct user is the prescriber and the indirect user is the person who is responsible for performing the therapy. Relevant characteristics of the two users are represented in two user models. Explanation strategies are represented in planning operators whose preconditions encode the cognitive state of the indirect user; this allows tailoring the message to the indirect user's characteristics. Expansion of optional subgoals and selection among candidate operators are made by applying decision criteria represented as metarules, which negotiate between the direct and indirect users' views while also taking into account the context in which the explanation is provided. After the message has been generated, the direct user may ask to add or remove some items, or to change the message style. The system defends the indirect user's needs as far as possible by mentioning the rationale behind the generated message. If needed, the plan is repaired and the direct user model is revised accordingly, so that the system progressively learns to generate messages suited to the preferences of the people with whom it interacts.