991 results for programming models
Abstract:
G-Rex is lightweight Java middleware that allows scientific applications deployed on remote computer systems to be launched and controlled as if they were running on the user's own computer. G-Rex is particularly suited to ocean and climate modelling applications because output from the model is transferred back to the user while the run is in progress, which prevents the accumulation of large amounts of data on the remote cluster. The G-Rex server is a RESTful Web application that runs inside a servlet container on the remote system, and the client component is a Java command-line program that can easily be incorporated into existing scientific workflow scripts. The NEMO and POLCOMS ocean models have been deployed as G-Rex services in the NERC Cluster Grid, and G-Rex is the core grid middleware in the GCEP and GCOMS e-science projects.
Abstract:
Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al. 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours on 40 processors and produces roughly 20 GB of output as 50000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command utilities for data pre-processing and post-processing prior to job resubmission. Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) output from the model is transferred back to the user while the run is in progress, to prevent it from accumulating on the remote system and to allow the user to monitor the model; (2) the client component is a command-line program that can easily be incorporated into existing model workflow scripts. G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat and is therefore easy for system administrators to install and maintain. G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al. 2008) and POLCOMS (Holt et al. 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al., 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al. 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al. 2008), which aims to simulate the world's coastal oceans. A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) the scientist prepares input files on his or her local machine; (2) using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource; (3) the scientist runs the relevant workflow script on his or her local machine, unmodified except that calls to run the model (e.g. with "mpirun") are simply replaced with calls to "GRexRun"; (4) the G-Rex middleware automatically handles the uploading of input files to the remote resource, and the downloading of output files back to the user (including their deletion from the remote system) during the run; (5) the scientist monitors the output files using familiar analysis and visualization tools on his or her own local machine. G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front-end to larger-scale grid resources such as the UK National Grid Service.
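Because G-Rex exposes a REST interface, a running job can in principle be inspected with nothing more than a basic HTTP client, as noted above. The sketch below is purely illustrative: the host name, endpoint paths and response fields are hypothetical placeholders, not the actual G-Rex API.

```python
# Hypothetical illustration of polling a REST-style job service such as the one
# described above. The host, paths and response fields are invented for this
# sketch and do NOT reflect the real G-Rex API.
import json
import time
import urllib.request

BASE_URL = "http://cluster.example.org:8080/grex"   # hypothetical server
JOB_ID = "nemo-run-42"                              # hypothetical job id

def get_status(base_url: str, job_id: str) -> dict:
    """Fetch the current status document for one job (assumed to be JSON)."""
    with urllib.request.urlopen(f"{base_url}/jobs/{job_id}/status") as resp:
        return json.load(resp)

def poll_until_finished(base_url: str, job_id: str, interval_s: float = 30.0) -> None:
    """Poll the job status until it reports completion, printing progress."""
    while True:
        status = get_status(base_url, job_id)
        print(f"job {job_id}: state={status.get('state')} "
              f"outputs_transferred={status.get('outputs_transferred')}")
        if status.get("state") in ("finished", "failed"):
            break
        time.sleep(interval_s)

if __name__ == "__main__":
    poll_until_finished(BASE_URL, JOB_ID)
```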
Abstract:
The aim of this review article is to provide an overview of the role of pigs as a biomedical model for humans. The usefulness and limitations of porcine models are discussed in terms of metabolic, cardiovascular, digestive and bone diseases in humans. Domestic pigs and minipigs are the main categories of pigs used as biomedical models. One drawback of minipigs is that they are in short supply and expensive compared with domestic pigs, which, in turn, cost more to house, feed and medicate. Different porcine breeds show different responses to the induction of specific diseases. For example, Ossabaw minipigs provide a better model than the Yucatan for the metabolic syndrome, as they exhibit obesity, insulin resistance and hypertension, all of which are absent in the Yucatan. Similar metabolic/physiological differences exist between domestic breeds (e.g. Meishan v. Pietrain). The modern commercial (e.g. Large White) domestic pig has been the preferred model for developmental programming because the 2- to 3-fold variation in body weight among littermates provides a natural form of foetal growth retardation not observed in ancient (e.g. Meishan) domestic breeds. Pigs have been increasingly used to study chronic ischaemia, therapeutic angiogenesis, hypertrophic cardiomyopathy and abdominal aortic aneurysm, as their coronary anatomy and physiology are similar to those of humans. Type I and type II diabetes can be induced in swine using dietary regimes and/or administration of streptozotocin. Pigs are a good and extensively used model for specific nutritional studies, as their protein and lipid metabolism is comparable with that of humans, although pigs are not as sensitive to protein restriction as rodents. Neonatal and weanling pigs have been used to examine the pathophysiology and prevention/treatment of microbial-associated diseases and immune system disorders. A porcine model mimicking various degrees of prematurity in infants receiving total parenteral nutrition has been established to investigate gut development, amino acid metabolism and non-alcoholic fatty liver disease. Endoscopic therapeutic methods for upper gastrointestinal tract bleeding are being developed. The bone remodelling cycle in pigs is histologically more similar to that of humans than that of rats or mice, and is used to examine the relationship between menopause and osteoporosis. Work has also been conducted on dental implants in pigs to consider loading; however, caution is needed, as porcine bone remodels slightly faster than human bone. We conclude that pigs are a valuable translational model to bridge the gap between classical rodent models and humans in developing new therapies to aid human health.
Abstract:
Integrated simulation models can be useful tools in farming systems research. This chapter reviews three commonly used approaches, i.e. linear programming, system dynamics and agent-based models. Applications of each approach are presented and their strengths and drawbacks discussed. We argue that, despite some challenges, mainly related to the integration of different approaches, model validation and the representation of human agents, integrated simulation models contribute important insights to the analysis of farming systems. They help unravel the complex and dynamic interactions and feedbacks among bio-physical, socio-economic and institutional components across scales and levels in farming systems. In addition, they can provide a platform for integrative research, and can support transdisciplinary research by functioning as learning platforms in participatory processes.
Abstract:
There is strong evidence from animal studies that prenatal stress has different effects on male and female offspring. In general, although not always, prenatal stress increases anxiety, depression and stress responses, both hypothalamic–pituitary–adrenal and cardiovascular, in female offspring rather than in males. Males are more likely to show learning and memory deficits. There have been few studies so far in humans that differentiate the effects of prenatal stress on male and female psychopathology. Some studies support the animal models, but the evidence is inconsistent. The mediating mechanisms for any sex-specific effects are little understood, but there is evidence that placental function can differ depending on the sex of the fetus. We suggest that there may be an evolutionary reason for any sex differences in the long-term effects of prenatal stress. In a stressful environment it may be adaptive for females, who are more likely to stay in one place and look after children, to be more vigilant, alert to danger and thus show more stress responsiveness. This can give rise to a more anxious or depressed phenotype. For males it may be more adaptive to go out and explore new environments, compete with other males, and be more aggressive. For this it may help to be less responsive to external stressors. More research is needed into sex differences in the effects of prenatal stress in humans to test these ideas.
Abstract:
The constrained compartmentalized knapsack problem can be seen as an extension of the constrained knapsack problem. However, the items are grouped into different classes, so that the overall knapsack has to be divided into compartments and each compartment is loaded with items from the same class. Moreover, building a compartment incurs a fixed cost and a fixed loss of capacity in the original knapsack, and the compartment sizes are bounded below and above. The objective is to maximize the total value of the items loaded in the overall knapsack minus the cost of the compartments. This problem has been formulated as an integer non-linear program, and in this paper we reformulate the non-linear model as an integer linear master problem with a large number of variables. Some heuristics based on the solution of the restricted master problem are investigated. A new and more compact integer linear model is also presented, which can be solved by a commercial branch-and-bound solver; this found optimal solutions for most instances of the constrained compartmentalized knapsack problem. On the other hand, the heuristics provide good solutions with low computational effort. (C) 2011 Elsevier B.V. All rights reserved.
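As a rough illustration of the structure just described, a simplified formulation can be written as follows. The notation is ours, not the paper's: classes k = 1, ..., K; items j in N_k with value p_{kj}, weight w_{kj} and copy bound b_{kj}; knapsack capacity W; a compartment of class k costs c_k, consumes a fixed capacity s_k and must hold a load between l_k and u_k. For simplicity this sketch allows at most one compartment per class (y_k binary), which keeps the model linear; allowing several compartments per class is, roughly, what leads to the non-linear formulation mentioned above.

$$
\begin{aligned}
\max\;& \sum_{k=1}^{K}\sum_{j\in N_k} p_{kj}\,x_{kj} \;-\; \sum_{k=1}^{K} c_k\,y_k \\
\text{s.t.}\;& l_k\,y_k \;\le\; \sum_{j\in N_k} w_{kj}\,x_{kj} \;\le\; u_k\,y_k, \qquad k=1,\dots,K,\\
& \sum_{k=1}^{K}\Big(\sum_{j\in N_k} w_{kj}\,x_{kj} + s_k\,y_k\Big) \;\le\; W,\\
& x_{kj}\in\{0,1,\dots,b_{kj}\},\quad y_k\in\{0,1\}, \qquad j\in N_k,\; k=1,\dots,K.
\end{aligned}
$$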
Abstract:
We investigate several two-dimensional guillotine cutting stock problems and their variants in which orthogonal rotations are allowed. We first present two dynamic programming based algorithms for the Rectangular Knapsack (RK) problem and its variants in which the patterns must be staged. The first algorithm solves the recurrence formula proposed by Beasley; the second algorithm, for staged patterns, also uses a recurrence formula. We show that if the items are not too small compared to the dimensions of the bin, then these algorithms require polynomial time. Using these algorithms we solved all instances of the RK problem found in the OR-LIBRARY, including one for which no optimal solution was previously known. We also consider the Two-dimensional Cutting Stock problem, and present a column generation based algorithm for it that uses the first algorithm mentioned above to generate the columns. We propose two strategies to tackle the residual instances. We also investigate a variant of this problem in which the bins have different sizes. Finally, we study the Two-dimensional Strip Packing problem, again with a column generation based algorithm, this time using the second algorithm mentioned above, in which staged patterns are imposed. In this case we solve instances for two-, three- and four-staged patterns. We report on computational experiments with the various algorithms proposed in this paper. The results indicate that these algorithms are suitable for solving real-world instances. We give a detailed description (pseudo-code) of all the algorithms presented here, so that the reader may easily implement them. (c) 2007 Elsevier B.V. All rights reserved.
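For orientation, the unstaged, unconstrained guillotine recurrence that the first algorithm is built around has the following general shape, where F(x, y) is the best value obtainable from a rectangle of size x by y and g(x, y) is the value of the most valuable single item that fits in it (or 0). This is the textbook form of the Beasley-style recurrence and glosses over the discretization of cut positions and the staging variants handled in the paper.

$$
F(x,y) \;=\; \max\Big\{\, g(x,y),\;
\max_{1 \le x' \le \lfloor x/2 \rfloor}\big[F(x',y) + F(x-x',\,y)\big],\;
\max_{1 \le y' \le \lfloor y/2 \rfloor}\big[F(x,y') + F(x,\,y-y')\big] \Big\}.
$$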
Abstract:
This article presents a well-known interior point method (IPM) used to solve the linear programming problems that appear as sub-problems in the solution of the long-term transmission network expansion planning problem. The linear programming problem appears when the transportation model is used and when the planning problem is to be solved using a constructive heuristic algorithm (CHA) or a branch-and-bound algorithm. This paper shows the application of the IPM in a CHA. Good performance of the IPM was obtained, so it can be used as a tool inside algorithms for solving the planning problem. Illustrative tests are shown using electrical systems known from the specialized literature. (C) 2005 Elsevier B.V. All rights reserved.
Abstract:
The aim of this work was to present organizational models for optimizing the reduction of crop residue generated by sugarcane cultivation. The first model selects the sugarcane varieties to be planted so as to meet the mill's requirements while minimizing the quantity of residue produced. The second model addresses the use of residue to produce energy: it selects the varieties and quantities to be planted in order to meet the mill's requirements, reduce the quantity of residue, and maximize energy production as far as possible. The use of linear programming was proposed for both models. The two models gave similar results in this study, and both may be used to define the varieties and areas to be cultivated. (C) 2001 Published by Elsevier B.V. Ltd.
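To illustrate the kind of structure the first model has (choose the area planted with each variety so the mill's cane demand is met while total residue is minimized), here is a toy linear program solved with an off-the-shelf solver. The varieties, yields and all other figures are invented for this sketch; it is not the formulation from the paper.

```python
# Toy sketch of a variety-selection LP: choose the area (ha) planted with each
# sugarcane variety so that the mill's cane demand is met while the total crop
# residue generated is minimized. All figures are invented for illustration.
from scipy.optimize import linprog

varieties = ["A", "B", "C"]
residue_t_per_ha = [12.0, 9.5, 11.0]   # objective: tonnes of residue per hectare
cane_t_per_ha = [85.0, 78.0, 90.0]     # cane yield per hectare
total_area_ha = 1000.0                 # land available
cane_demand_t = 70000.0                # mill requirement

# linprog minimizes c @ x subject to A_ub @ x <= b_ub.
c = residue_t_per_ha
A_ub = [
    [1.0, 1.0, 1.0],                   # total planted area <= land available
    [-y for y in cane_t_per_ha],       # -cane yield <= -demand (i.e. yield >= demand)
]
b_ub = [total_area_ha, -cane_demand_t]

result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3, method="highs")
print("areas (ha):", result.x, "residue (t):", result.fun)
```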
Abstract:
The increasing computing power of microcomputers has stimulated the building of direct-manipulation interfaces that allow graphical representation of Linear Programming (LP) models. This work discusses the components of such a graphical interface as the basis for a system to assist users in the process of formulating LP problems. In essence, this work proposes a methodology that divides the modelling task into three stages: specification of the Data Model, the Conceptual Model and the LP Model. The need for Artificial Intelligence techniques in problem conceptualisation and in supporting the model formulation task is illustrated.
Abstract:
A combined methodology consisting of successive linear programming (SLP) and a simple genetic algorithm (SGA) solves the reactive planning problem. The problem is divided into operating and planning subproblems. The operating subproblem, which is a nonlinear, ill-conditioned and nonconvex problem, consists of determining the voltage control settings and the adjustment of reactive sources. The planning subproblem consists of obtaining the optimal reactive source expansion considering the operational, economic and physical characteristics of the system. SLP solves the optimal reactive dispatch problem in the real variables, while the SGA determines the necessary adjustments of the binary and discrete variables present in the model. Once the set of candidate busbars has been defined, the implemented program gives the location and size of the reactive sources needed, if any, to satisfy the operating and security constraints.
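To make the division of labour concrete, the sketch below shows only the outer SGA loop over binary placement decisions. The inner evaluation, which the methodology above solves with SLP (optimal reactive dispatch), is replaced here by a toy placeholder, and all numbers and names are invented for illustration.

```python
# Minimal sketch of the outer genetic-algorithm loop: each individual is a
# binary vector saying which candidate busbars receive a new reactive source.
# The inner evaluation (solved with SLP in the methodology described above)
# is stubbed out with a placeholder cost; every figure here is invented.
import random

N_CANDIDATE_BUSES = 8
INSTALL_COST = [5.0, 4.0, 6.0, 3.5, 5.5, 4.5, 6.5, 3.0]  # hypothetical costs

def dispatch_cost(placement: list) -> float:
    """Placeholder for the SLP reactive-dispatch subproblem: returns a toy
    operating cost for a candidate placement (fewer sources -> higher cost)."""
    return 50.0 / (1 + sum(placement))

def fitness(placement: list) -> float:
    install = sum(c for c, bit in zip(INSTALL_COST, placement) if bit)
    return install + dispatch_cost(placement)          # lower is better

def evolve(pop_size: int = 30, generations: int = 50) -> list:
    pop = [[random.randint(0, 1) for _ in range(N_CANDIDATE_BUSES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]                 # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_CANDIDATE_BUSES)
            child = a[:cut] + b[cut:]                  # one-point crossover
            i = random.randrange(N_CANDIDATE_BUSES)    # single-bit mutation
            child[i] ^= 1
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print("best placement:", best, "fitness:", round(fitness(best), 2))
```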
Abstract:
This paper describes a program for the automatic generation of code for Intel's 8051 microcontroller. The code is generated from a place-transition Petri net specification. Our goal is to minimize programming time. The code generated by our program has been observed to match the net model exactly, and no changes to the generated code are needed for it to compile for the target architecture. © 2011 IFAC.
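For readers unfamiliar with the idea of generating code from a place-transition net, the toy sketch below shows the general pattern: each transition becomes an "if enabled, then fire" rule in an emitted scan loop. The net, the emitted pseudo-C and all names are invented for this sketch; the actual tool targets the 8051 and its generated code is not reproduced here.

```python
# Toy illustration of code generation from a place-transition Petri net: each
# transition becomes a guarded firing rule in a generated scan loop. The net
# below and the emitted pseudo-C are invented solely to show the general idea.
net = {
    "places":      {"p_idle": 1, "p_busy": 0, "p_done": 0},   # initial marking
    "transitions": {
        "t_start":  {"in": ["p_idle"], "out": ["p_busy"]},
        "t_finish": {"in": ["p_busy"], "out": ["p_done"]},
    },
}

def emit_scan_loop(net: dict) -> str:
    """Emit a C-like scan loop with one guarded firing rule per transition."""
    lines = ["while (1) {"]
    for name, t in net["transitions"].items():
        guard = " && ".join(f"{p} > 0" for p in t["in"])
        body = "; ".join([f"{p}--" for p in t["in"]] + [f"{p}++" for p in t["out"]])
        lines.append(f"    if ({guard}) {{ {body}; }}  /* {name} */")
    lines.append("}")
    return "\n".join(lines)

print(emit_scan_loop(net))
```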
Abstract:
This paper presents an interactive simulation environment for distance protection, developed with ATP and foreign models written in ANSI C. Files in COMTRADE format can be generated after an ATP simulation, and these files can be used to calibrate real relays. The performance of relay algorithms on real oscillography events can also be assessed by using ATP's POSTPROCESS PLOT FILE (PPF) option. The main purpose of the work is to develop a tool that allows the analysis of diverse fault cases and the performance of coordination studies, as well as the analysis of the relay's behaviour in the face of a real event. © 2011 IEEE.