17 results for PROGRAMMING APPROACH
in CentAUR: Central Archive at the University of Reading - UK
Abstract:
Bloom filters are a data structure for storing data in a compressed form. They offer excellent space and time efficiency at the cost of some loss of accuracy (so-called lossy compression). This work presents a yes-no Bloom filter, which is a data structure consisting of two parts: the yes-filter, which is a standard Bloom filter, and the no-filter, which is another Bloom filter whose purpose is to represent those objects that were recognised incorrectly by the yes-filter (that is, to recognise the false positives of the yes-filter). By querying the no-filter after an object has been recognised by the yes-filter, we get a chance of rejecting it, which improves the accuracy of data recognition in comparison with a standard Bloom filter of the same total length. A further increase in accuracy is possible if one chooses the objects to include in the no-filter so that it recognises as many false positives as possible but no true positives, thus producing the most accurate yes-no Bloom filter among all yes-no Bloom filters. This paper studies how optimization techniques can be used to maximize the number of false positives recognised by the no-filter, under the constraint that it recognise no true positives. To achieve this aim, an Integer Linear Program (ILP) is proposed for the optimal selection of false positives. In practice the problem size is normally large, making the optimal solution intractable. Exploiting the similarity of the ILP to the Multidimensional Knapsack Problem, an Approximate Dynamic Programming (ADP) model is developed that uses a reduced ILP for the value function approximation. Numerical results show that the ADP model performs best in comparison with a number of heuristics as well as the CPLEX built-in branch-and-bound (B&B) solver, and it is therefore the method we recommend for use in yes-no Bloom filters. In the wider context of the study of lossy compression algorithms, our research is an example of how the arsenal of optimization methods can be applied to improving the accuracy of compressed data.
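To make the two-filter query logic concrete, the following is a minimal sketch of a yes-no Bloom filter in Python. The class names, hashing scheme and parameters are illustrative assumptions; only the accept-then-possibly-reject query rule follows the description above, and the optimal selection of false positives for the no-filter (via the ILP or ADP model) is not shown.

```python
# Minimal sketch of a yes-no Bloom filter; names and parameters are illustrative.
import hashlib

class BloomFilter:
    def __init__(self, m, k):
        self.m, self.k = m, k            # m bits, k hash functions
        self.bits = [False] * m

    def _positions(self, item):
        # Derive k bit positions from k salted SHA-256 digests.
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits[p] = True

    def __contains__(self, item):
        return all(self.bits[p] for p in self._positions(item))

class YesNoBloomFilter:
    """Yes-filter stores the set; no-filter stores selected false positives."""
    def __init__(self, m_yes, m_no, k):
        self.yes = BloomFilter(m_yes, k)
        self.no = BloomFilter(m_no, k)

    def add(self, item):
        self.yes.add(item)

    def add_false_positive(self, item):
        # Items chosen (e.g. by an optimal-selection procedure) for later rejection;
        # they must be false positives of the yes-filter, never true positives.
        self.no.add(item)

    def __contains__(self, item):
        # Accept only if the yes-filter recognises the item and the
        # no-filter does not reject it.
        return item in self.yes and item not in self.no

ynbf = YesNoBloomFilter(m_yes=1024, m_no=256, k=4)
ynbf.add("stored-object")
ynbf.add_false_positive("known-false-positive")
print("stored-object" in ynbf, "known-false-positive" in ynbf)   # expected: True False
```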
Abstract:
Pair Programming is a technique from the software development method eXtreme Programming (XP) whereby two programmers work closely together to develop a piece of software. A similar approach has been used to develop a set of Assessment Learning Objects (ALOs). Three members of academic staff developed a set of ALOs for a total of three different modules (two with overlapping content). In each case a pair programming approach was taken to the development of the ALOs. In addition to demonstrating the efficiency of this approach in terms of the staff time spent developing the ALOs, a statistical analysis of the outcomes for students who made use of the ALOs is used to demonstrate the effectiveness of the ALOs produced via this method.
Abstract:
The member countries of the World Health Organization have endorsed its Global Strategy on Diet, Physical Activity, and Health. We assess the potential consumption impacts of these norms in the United States, France, and the United Kingdom using a mathematical programming approach. We find that adherence would involve large reductions in the consumption of fats and oils, accompanied by large rises in the consumption of fruits, vegetables, and cereals. Further, in the United Kingdom and the United States, but not France, sugar intakes would have to shrink considerably. Focusing on sub-populations within each country, we find that the least educated, not necessarily the poorest, would have to bear the highest burden of adjustment.
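As a rough illustration of this kind of mathematical programming model, the sketch below finds the diet closest (in total absolute change) to an observed diet while satisfying WHO-style nutrient norms, using a linearised L1 objective. The food groups, nutrient coefficients and norm levels are illustrative assumptions, not the study's data.

```python
# Toy diet-adjustment LP: minimise total change from the observed diet subject
# to stylised nutrient norms. All numbers are illustrative assumptions.
import numpy as np
from scipy.optimize import linprog

foods = ["fats_oils", "fruit_veg", "cereals", "sugar"]
observed = np.array([80.0, 250.0, 200.0, 60.0])       # g/day per food group
fat_kcal_per_g = np.array([9.0, 0.0, 0.3, 0.0])
sugar_kcal_per_g = np.array([0.0, 0.2, 0.0, 4.0])

n = len(foods)
# Variables: x (new intake, g/day) followed by d (absolute deviations, g/day).
c = np.concatenate([np.zeros(n), np.ones(n)])          # minimise total change

# |x - observed| <= d, linearised as two inequalities per food group.
A_dev = np.vstack([np.hstack([np.eye(n), -np.eye(n)]),
                   np.hstack([-np.eye(n), -np.eye(n)])])
b_dev = np.concatenate([observed, -observed])

# Stylised norms against a 2000 kcal reference intake:
#   fat energy <= 30%, sugar energy <= 10%, fruit & vegetables >= 400 g/day.
A_norm = np.vstack([np.hstack([fat_kcal_per_g, np.zeros(n)]),
                    np.hstack([sugar_kcal_per_g, np.zeros(n)]),
                    np.hstack([-(np.arange(n) == 1).astype(float), np.zeros(n)])])
b_norm = np.array([0.30 * 2000, 0.10 * 2000, -400.0])

res = linprog(c, A_ub=np.vstack([A_dev, A_norm]),
              b_ub=np.concatenate([b_dev, b_norm]),
              bounds=[(0, None)] * (2 * n), method="highs")
for food, old, new in zip(foods, observed, res.x[:n]):
    print(f"{food:10s} {old:6.0f} -> {new:6.0f} g/day")
```

With these toy numbers the solution reduces fats and oils and sugar while raising fruit and vegetables, which mirrors the qualitative pattern reported above.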
Abstract:
Promotion of adherence to healthy-eating norms has become an important element of nutrition policy in the United States and other developed countries. We assess the potential consumption impacts of adherence to a set of recommended dietary norms in the United States using a mathematical programming approach. We find that adherence to recommended dietary norms would involve significant changes in diets, with large reductions in the consumption of fats and oils along with large increases in the consumption of fruits, vegetables, and cereals. Compliance with norms recommended by the World Health Organization for energy derived from sugar would involve sharp reductions in sugar intakes. We also analyze how the dietary adjustments required vary across demographic groups. Most socio-demographic characteristics appear to have relatively little influence on the pattern of adjustment required to comply with norms. Income levels have little effect on required dietary adjustments. Education is the only characteristic to have a significant influence on the magnitude of the adjustments required. The least educated, rather than the poorest, have to bear the highest burden of adjustment. Our analysis suggests that fiscal measures such as nutrient-based taxes may not be as regressive as commonly believed. Dissemination of healthy-eating norms to the less educated will be a key challenge for nutrition policy.
Abstract:
The member countries of the World Health Organization (WHO) have recently endorsed its Global Strategy on Diet, Physical Activity and Health. The strategy emphasises the need to limit the consumption of saturated fats and trans-fatty acids, salt and sugars, and to increase the consumption of fruits and vegetables in order to combat the growing burden of non-communicable diseases. This paper attempts a broad quantitative assessment of the consumption impacts of these norms in OECD countries using a mathematical programming approach. We find that adherence to the WHO norms would involve a significant decrease in the consumption of vegetable oils (30%), dairy products (28%), sugar (24%), animal fats (30%) and meat (pig meat 13.5%, mutton and goat 14.5%), and a significant increase in the human consumption of cereals (31%), fruits (25%) and vegetables (21%). (c) 2005 Elsevier Ltd. All rights reserved.
Abstract:
The Team Formation Problem (TFP) has become a well-known problem in the OR literature over the last few years. In this problem, a group of individuals that together match a required set of skills must be chosen so as to maximise one or several positive social attributes. Specifically, the aim of the current research is two-fold. First, two new dimensions are added to the TFP by considering multiple projects and fractions of people's dedication. This new problem is named the Multiple Team Formation Problem (MTFP). Second, an optimization model consisting of a quadratic objective function, linear constraints and integer variables is proposed for the problem. The optimization model is solved by three algorithms: a Constraint Programming approach provided by a commercial solver, a Local Search heuristic and a Variable Neighbourhood Search metaheuristic. These three algorithms constitute the first attempt to solve the MTFP, with the Variable Neighbourhood Search metaheuristic being the most efficient in almost all cases. Applications of this problem commonly appear in real-life situations, particularly with the current and ongoing development of social network analysis. Therefore, this work opens multiple paths for future research.
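The following toy sketch illustrates one of the simpler solution strategies mentioned above, a local search over fractional dedications, for a tiny MTFP instance. The data, the affinity-based scoring rule and the move strategy are illustrative assumptions rather than the paper's formulation or algorithms.

```python
# Toy local search for a tiny MTFP instance; data and scoring are illustrative.
import itertools
import random

PEOPLE = {"ana": {"db", "ui"}, "bea": {"ml"}, "carl": {"db", "ml"}, "dian": {"ui"}}
AFFINITY = {frozenset(pair): round(random.random(), 2)
            for pair in itertools.combinations(PEOPLE, 2)}
PROJECTS = {"p1": {"db", "ml"}, "p2": {"ui", "ml"}}
FRACTIONS = [0.0, 0.25, 0.5, 0.75, 1.0]               # allowed dedication levels

def within_capacity(assign, person):
    # A person's dedications across all projects cannot exceed 100%.
    return sum(assign[person, proj] for proj in PROJECTS) <= 1.0 + 1e-9

def covers_skills(assign):
    # Every project's required skills must be covered by its assigned people.
    for proj, required in PROJECTS.items():
        covered = set()
        for person, skills in PEOPLE.items():
            if assign[person, proj] > 0:
                covered |= skills
        if not required <= covered:
            return False
    return True

def score(assign):
    # Pairwise affinity weighted by the smaller of the two dedications,
    # a simple stand-in for the quadratic objective mentioned in the abstract.
    return sum(AFFINITY[frozenset((a, b))] * min(assign[a, proj], assign[b, proj])
               for proj in PROJECTS
               for a, b in itertools.combinations(PEOPLE, 2))

def search(iterations=5000, seed=1):
    rng = random.Random(seed)
    assign = {(p, j): 0.0 for p in PEOPLE for j in PROJECTS}
    best, best_value = None, float("-inf")
    for _ in range(iterations):
        p, j = rng.choice(list(PEOPLE)), rng.choice(list(PROJECTS))
        old = assign[p, j]
        assign[p, j] = rng.choice(FRACTIONS)          # single-move neighbourhood
        if not within_capacity(assign, p):
            assign[p, j] = old                        # revert capacity violations
            continue
        if covers_skills(assign):                     # record best feasible point
            value = score(assign)
            if value > best_value:
                best, best_value = dict(assign), value
    return best, best_value

if __name__ == "__main__":
    solution, value = search()
    print(f"best affinity score found: {value:.2f}")
```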
Abstract:
BACKGROUND: Sex differences are present in many neuropsychiatric conditions that affect emotion and approach-avoidance behavior. One potential mechanism underlying such observations is testosterone in early development. Although much is known about the effects of testosterone in adolescence and adulthood, little is known in humans about how testosterone in fetal development influences later neural sensitivity to valenced facial cues and approach-avoidance behavioral tendencies. METHODS: With functional magnetic resonance imaging we scanned 25 8-11-year-old children while they viewed happy, fear, neutral, or scrambled faces. Fetal testosterone (FT) was measured via amniotic fluid sampled between 13 and 20 weeks of gestation. Behavioral approach-avoidance tendencies were measured via parental report on the Sensitivity to Punishment and Sensitivity to Rewards questionnaire. RESULTS: Increasing FT predicted enhanced selectivity for positively compared with negatively valenced facial cues in reward-related regions such as the caudate, putamen, and nucleus accumbens, but not the amygdala. Statistical mediation analyses showed that increasing FT predicts increased behavioral approach tendencies by biasing the caudate, putamen, and nucleus accumbens, but not the amygdala, to be more responsive to positively compared with negatively valenced cues. In contrast, FT was not predictive of behavioral avoidance tendencies, either through direct or neurally mediated paths. CONCLUSIONS: This work suggests that testosterone in humans acts as a fetal programming mechanism on the reward system and influences behavioral approach tendencies later in life. As a mechanism influencing atypical development, FT might be important across a range of neuropsychiatric conditions that asymmetrically affect the sexes, the reward system, emotion processing, and approach behavior.
Abstract:
Competency management is a very important part of a well-functioning organisation. Unfortunately, competency descriptions are not uniformly specified or defined across national, sectoral or organisational borders, leading to an opaque competency description market with a multitude of competency frameworks and competency benchmarks. An ontology is a formalised description of a domain, which enables automated reasoning engines to be built that, by utilising the interrelations between entities, can make “intelligent” choices in different situations within the domain. By introducing formalised competency ontologies, automated tools such as skill gap analysis, training suggestion generation, and job search and recruitment can be developed that compare and contrast different competency descriptions at the semantic level. The major problem with defining a common formalised ontology for competencies is that there are so many viewpoints on competencies and competency frameworks. Work within the TRACE project has focused on finding common trends within different competency frameworks in order to allow an intermediate competency description to be made, which other frameworks can reference. This research has shown that competencies can be divided into “knowledge”, “skills” and what we call “others”. An ontology has been created on this basis, with a simple structure of different “kinds” of “knowledges” and “skills” using semantic interrelations to define the basic semantic structure of the ontology. A prototype tool for performing skill gap analysis has been developed. Personal profiles can be produced using the tool, and a skill gap analysis is performed against a desired competency profile by using an ontologically based inference engine, which is able to list the closest fits and possible proficiency gaps.
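A minimal sketch of the kind of skill gap analysis described above is given below: competency profiles are simple dictionaries, and a small broader-than relation stands in for the ontology's semantic interrelations. The competency names, proficiency levels and matching rule are illustrative assumptions, not the TRACE ontology.

```python
# Toy skill gap analysis over a tiny competency "ontology"; all data illustrative.

# Proficiency levels: 1 = basic, 2 = working, 3 = expert.
DESIRED = {"skill:sql": 2, "skill:data_modelling": 2, "knowledge:gdpr": 1}
PERSONAL = {"skill:postgresql": 3, "knowledge:gdpr": 1}

# A broader/narrower relation: a narrower competency also satisfies its
# broader parent (e.g. PostgreSQL expertise counts as SQL expertise).
BROADER = {"skill:postgresql": "skill:sql"}

def effective_profile(profile):
    # Propagate proficiency from narrower to broader competencies.
    effective = dict(profile)
    for competency, level in profile.items():
        parent = BROADER.get(competency)
        if parent is not None:
            effective[parent] = max(effective.get(parent, 0), level)
    return effective

def skill_gaps(personal, desired):
    have = effective_profile(personal)
    return {competency: required - have.get(competency, 0)
            for competency, required in desired.items()
            if have.get(competency, 0) < required}

print(skill_gaps(PERSONAL, DESIRED))
# -> {'skill:data_modelling': 2}   (SQL is covered via the PostgreSQL entry)
```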
Abstract:
In Central Brazil, the long-term sustainability of beef cattle systems is under threat over vast tracts of farming areas, as more than half of the 50 million hectares of sown pastures are suffering from degradation. Overgrazing, practised to maintain high stocking rates, is regarded as one of the main causes. High stocking rates are deliberate and crucial decisions taken by the farmers, which appear paradoxical, even irrational, given the state of knowledge regarding the consequences of overgrazing. The phenomenon, however, appears inextricably linked with the objectives that farmers hold. In this research those objectives were elicited first and, from their ranking, two of them, ‘asset value of cattle’ (representing cattle ownership) and ‘present value of economic returns’, were chosen to develop an original bi-criteria Compromise Programming model to test various hypotheses postulated to explain the overgrazing behaviour. As part of the model, a pasture productivity index is derived to estimate the pasture recovery cost. Different scenarios based on farmers' attitudes towards overgrazing, pasture costs and capital availability were analysed. The results of the model runs show that the benefits from holding more cattle can outweigh the increased pasture recovery and maintenance costs. This result undermines the hypothesis that farmers practise overgrazing because they are unaware of, or uncaring about, overgrazing costs. An appropriate approach to the problem of pasture degradation requires information on the economics, and its interplay with farmers' objectives, for a wide range of pasture recovery and maintenance methods. Seen within the context of farmers' objectives, some level of overgrazing appears rational. Advocacy of the simple ‘no overgrazing’ rule is an insufficient strategy to maintain the long-term sustainability of the beef production systems in Central Brazil.
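The sketch below illustrates the general mechanics of a bi-criteria compromise programming model of this kind: candidate stocking rates are scored on the two criteria and the rate closest (in a normalised L1 sense) to the ideal point is selected. All functions, coefficients and weights are illustrative assumptions, not the paper's model or data.

```python
# Toy bi-criteria compromise programming over stocking rates; data illustrative.

def asset_value(rate):
    # Herd asset grows with the stocking rate (head/ha times value per head).
    return 400.0 * rate

def pv_returns(rate, years=10, discount=0.08):
    # Annual margin net of pasture recovery/maintenance costs, which rise
    # sharply once the stocking rate exceeds the pasture's carrying capacity.
    annuity = sum(1 / (1 + discount) ** t for t in range(1, years + 1))
    margin = 150.0 * rate
    recovery_cost = 120.0 * max(0.0, rate - 1.0) ** 2
    return (margin - recovery_cost) * annuity

rates = [0.5 + 0.1 * i for i in range(26)]            # candidate stocking rates
assets = [asset_value(r) for r in rates]
returns = [pv_returns(r) for r in rates]
ideal = (max(assets), max(returns))
worst = (min(assets), min(returns))

def distance(i, w_asset=0.5, w_returns=0.5):
    # L1 compromise metric: weighted, range-normalised distance to the ideal point.
    d_asset = (ideal[0] - assets[i]) / (ideal[0] - worst[0])
    d_returns = (ideal[1] - returns[i]) / (ideal[1] - worst[1])
    return w_asset * d_asset + w_returns * d_returns

best = min(range(len(rates)), key=distance)
print(f"compromise stocking rate: {rates[best]:.1f} head/ha")
```

With these toy numbers the compromise lies above the rate that maximises the present value of returns but below the rate that maximises the cattle asset, illustrating how some overgrazing can appear rational once both criteria are weighed.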
Abstract:
This paper presents a new method for the inclusion of nonlinear demand and supply relationships within a linear programming model. An existing method for this purpose is described first and its shortcomings are pointed out, before showing how the new approach overcomes those difficulties: it provides a more accurate and 'smooth' (rather than kinked) approximation of the nonlinear functions, and it deals with equilibrium under perfect competition rather than handling only the monopolistic situation. The workings of the proposed method are illustrated by extending a previously available sectoral model of UK agriculture.
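For contrast with the proposed method, the sketch below shows the standard piecewise-linear ('kinked') treatment of a nonlinear relationship in an LP, here a concave revenue function represented as a convex combination of breakpoints. The demand function, breakpoints and cost figure are illustrative assumptions, not the paper's model.

```python
# Standard piecewise-linear (lambda) approximation of a nonlinear revenue
# function inside an LP; all numbers are illustrative.
import numpy as np
from scipy.optimize import linprog

# Inverse demand p(q) = 10 - 0.5 q gives concave revenue R(q) = 10 q - 0.5 q^2.
breakpoints = np.linspace(0.0, 10.0, 6)               # q-values of the segments
revenue_at_bp = 10 * breakpoints - 0.5 * breakpoints**2
cost_per_unit = 4.0

# Represent q as a convex combination of breakpoints. Because we maximise a
# concave function, this plain lambda formulation is valid here without extra
# adjacency (SOS2) constraints.
n = len(breakpoints)
c = -(revenue_at_bp - cost_per_unit * breakpoints)     # linprog minimises
A_eq = np.ones((1, n))                                 # lambdas sum to one
b_eq = [1.0]

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, 1)] * n, method="highs")
q_opt = float(breakpoints @ res.x)
print(f"approximate optimal quantity: {q_opt:.2f}")    # true optimum is q = 6
```

The coarser the breakpoint grid, the more pronounced the kinks that the abstract refers to; the paper's contribution is a smoother and more accurate alternative to this construction.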
Abstract:
Milk supply from Mexican dairy farms does not meet demand, and small-scale farms can contribute toward closing the gap. Two multi-criteria programming techniques, goal programming and compromise programming, were used in a study of small-scale dairy farms in central Mexico. To build the goal and compromise programming models, 4 ordinary linear programming models were also developed, with objective functions to maximize metabolizable energy for milk production, to maximize margin of income over feed costs, to maximize metabolizable protein for milk production, and to minimize purchased feedstuffs. Neither multi-criteria approach was significantly better than the other; however, by applying both models it was possible to perform a more comprehensive analysis of these small-scale dairy systems. The multi-criteria programming models affirm findings from previous work and suggest that a forage strategy based on alfalfa, rye-grass, and corn silage would meet the nutrient requirements of the herd. Both models suggested that there is an economic advantage in rescheduling the calving season to the second and third calendar quarters to better synchronize the higher demand for nutrients with the period of high forage availability.
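As a minimal illustration of the goal programming side of such an analysis, the sketch below chooses areas of two forages to come as close as possible to an energy goal and an income goal under a land constraint. The forages, coefficients and goal levels are illustrative assumptions, not the study's data.

```python
# Toy weighted goal programming model for a small forage plan; data illustrative.
from scipy.optimize import linprog

# Decision vector: [alfalfa_ha, maize_silage_ha,
#                   energy_under, energy_over, income_under, income_over]
energy_per_ha = [120.0, 180.0]        # GJ metabolisable energy per hectare
income_per_ha = [900.0, 600.0]        # margin over feed costs per hectare
energy_goal, income_goal = 1700.0, 8000.0
land_available = 10.0

# Minimise under-achievement of both goals, normalised by the goal levels.
c = [0, 0, 1 / energy_goal, 0, 1 / income_goal, 0]
A_eq = [energy_per_ha + [1, -1, 0, 0],    # energy + under - over = goal
        income_per_ha + [0, 0, 1, -1]]    # income + under - over = goal
b_eq = [energy_goal, income_goal]
A_ub = [[1, 1, 0, 0, 0, 0]]               # total area cannot exceed the land
b_ub = [land_available]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 6, method="highs")
alfalfa, maize = res.x[:2]
print(f"alfalfa {alfalfa:.1f} ha, maize silage {maize:.1f} ha, "
      f"energy shortfall {res.x[2]:.0f} GJ, income shortfall {res.x[4]:.0f}")
```

Because the two goals cannot both be met on the available land, the deviation variables show which goal is sacrificed and by how much; compromise programming would instead minimise a normalised distance to the ideal point.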
Abstract:
This paper describes a technique that can be used as part of a simple and practical agile method for requirements engineering. It is based on disciplined goal-responsibility modelling but eschews formality in favour of a set of practicality objectives. The technique can be used together with Agile Programming to develop software in internet time. We illustrate the technique and introduce lazy refinement, responsibility composition and context sketching. Goal sketching has been used in a number of real-world developments.
Abstract:
Body Sensor Networks (BSNs) have recently been introduced for the remote monitoring of human activities in a broad range of application domains, such as health care, emergency management, fitness and behaviour surveillance. BSNs can be deployed in a community of people and can generate large amounts of contextual data that require a scalable approach for storage, processing and analysis. Cloud computing can provide a flexible storage and processing infrastructure to perform both online and offline analysis of the data streams generated in BSNs. This paper proposes BodyCloud, a SaaS approach for community BSNs that supports the development and deployment of Cloud-assisted BSN applications. BodyCloud is a multi-tier, application-level architecture that integrates a Cloud computing platform and BSN data stream middleware. BodyCloud provides programming abstractions that allow the rapid development of community BSN applications. This work describes the general architecture of the proposed approach and presents a case study on the real-time monitoring and analysis of the cardiac data streams of many individuals.
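As a toy illustration of the kind of programming abstraction described, the sketch below pushes per-wearer heart-rate samples into a shared store and runs a simple community-level analysis. The class names and the analysis are illustrative assumptions, not BodyCloud's actual API.

```python
# Toy community BSN stream store and analysis; names and logic are illustrative.
from collections import defaultdict
from statistics import mean

class CommunityStreamStore:
    """Stands in for the Cloud-side storage tier of a community BSN."""
    def __init__(self):
        self.samples = defaultdict(list)            # wearer id -> heart rates

    def push(self, wearer_id, heart_rate):
        self.samples[wearer_id].append(heart_rate)

def community_analysis(store, alert_threshold=100):
    # Per-wearer average heart rate, plus wearers whose latest reading is high.
    averages = {w: mean(hr) for w, hr in store.samples.items()}
    alerts = [w for w, hr in store.samples.items() if hr[-1] > alert_threshold]
    return averages, alerts

store = CommunityStreamStore()
for wearer, rate in [("w1", 72), ("w2", 110), ("w1", 75), ("w2", 108)]:
    store.push(wearer, rate)
print(community_analysis(store))
# prints per-wearer averages and the alert list, e.g. ({'w1': 73.5, 'w2': 109}, ['w2'])
```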