942 results for Parallel programming model
Abstract:
Due to the ongoing trend towards increased product variety, fast-moving consumer goods such as food and beverages, pharmaceuticals, and chemicals are typically manufactured through so-called make-and-pack processes. These processes consist of a make stage, a pack stage, and intermediate storage facilities that decouple these two stages. In operations scheduling, complex technological constraints must be considered, e.g., non-identical parallel processing units, sequence-dependent changeovers, batch splitting, no-wait restrictions, material transfer times, minimum storage times, and finite storage capacity. The short-term scheduling problem is to compute a production schedule such that a given demand for products is fulfilled, all technological constraints are met, and the production makespan is minimised. A production schedule typically comprises 500–1500 operations. Due to the problem size and complexity of the technological constraints, the performance of known mixed-integer linear programming (MILP) formulations and heuristic approaches is often insufficient. We present a hybrid method consisting of three phases. First, the set of operations is divided into several subsets. Second, these subsets are iteratively scheduled using a generic and flexible MILP formulation. Third, a novel critical path-based improvement procedure is applied to the resulting schedule. We develop several strategies for the integration of the MILP model into this heuristic framework. Using these strategies, high-quality feasible solutions to large-scale instances can be obtained within reasonable CPU times using standard optimisation software. We have applied the proposed hybrid method to a set of industrial problem instances and found that the method outperforms state-of-the-art methods.
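The abstract does not disclose the authors' MILP formulation, but the overall phase structure (partition the operations into subsets, then schedule each subset in turn on non-identical parallel units) can be illustrated with a minimal greedy sketch. The function name, the longest-first rule, and the speed model below are illustrative assumptions, not the paper's method.

```python
# Hedged sketch of the hybrid method's phase structure (not the authors' MILP):
# operations are split into subsets, each subset is scheduled greedily on
# non-identical parallel units, and the makespan is the latest finish time.

def schedule_subsets(subsets, unit_speeds):
    """Iteratively place each subset's operations on the unit that would
    finish them earliest; returns (assignment, makespan)."""
    finish = [0.0] * len(unit_speeds)        # current finish time per unit
    assignment = []
    for subset in subsets:
        # within a subset, schedule longer operations first
        for duration in sorted(subset, reverse=True):
            # earliest-finish unit, accounting for non-identical speeds
            best = min(range(len(unit_speeds)),
                       key=lambda u: finish[u] + duration / unit_speeds[u])
            finish[best] += duration / unit_speeds[best]
            assignment.append((duration, best))
    return assignment, max(finish)
```

In the paper each subset is scheduled by an exact MILP rather than greedily, and a critical-path improvement pass follows; the sketch only conveys the decomposition idea.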
Abstract:
BACKGROUND: We developed a canine model of acute atopic dermatitis to evaluate the potential of compounds to treat pruritus and skin lesions induced in Dermatophagoides farinae (Df)-sensitized dogs. HYPOTHESIS/OBJECTIVES: The aim was to investigate the effectiveness of long-term recording activity monitors to assess pruritus induced by allergen challenges. ANIMALS: Thirty-two Df-sensitized laboratory dogs. METHODS: In two blinded crossover studies, 28 Df-sensitized dogs were challenged on 3 days with a Df slurry applied to clipped abdominal skin. Dogs were treated with a positive control (prednisolone 1 mg/kg once daily for 5 days, starting 1 day before challenge) or left untreated; all were fitted with activity monitors. To confirm pruritus, a parallel study with four dogs was conducted, filming the dogs before and during challenge and assessing the film for pruritic behaviour. RESULTS: The activity of dogs treated with prednisolone was significantly lower between 00.00 and 03.00 h and between 03.00 and 06.00 h compared with untreated dogs (repeated-measures ANCOVA; P < 0.0001). To determine whether the recorded night-time activity corresponded to pruritic manifestations, we compared activity monitor and video recordings of four dogs for two periods (16.30-20.30 and 24.00-03.00 h) before and during a Df challenge. The correlation between night-time activity monitor activity and observed pruritic behaviour was highly significant (test of correlation coefficient versus zero: r = 0.57, P < 0.0001). CONCLUSIONS AND CLINICAL IMPORTANCE: Determination of night-time activity with activity monitors after allergen challenge appears to be an objective and practical way to assess pruritus in this experimental model of canine atopic dermatitis.
Abstract:
The potential and adaptive flexibility of population dynamics P systems (PDP) for studying population dynamics suggests that they may be suitable for modelling complex fluvial ecosystems, characterized by a composition of dynamic habitats with many variables that interact simultaneously. Using as a model a reservoir occupied by the zebra mussel Dreissena polymorpha, we designed a computational model based on P systems to study the population dynamics of larvae, in order to evaluate management actions to control or eradicate this invasive species. The population dynamics of this species was simulated under different scenarios, ranging from the absence of water flow change, to a weekly variation with different flow rates, to the actual hydrodynamic situation of an intermediate flow rate. Our results show that PDP models can be very useful tools for modelling complex, partially desynchronized processes that work in parallel. This allows the study of complex hydroecological processes such as the one presented, where reproductive cycles, temperature and water dynamics are involved in the desynchronization of the population dynamics, both within areas and among them. The results obtained may be useful in the management of other reservoirs with similar hydrodynamic situations in which the presence of this invasive species has been documented.
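The interaction the abstract describes (temperature-gated reproduction versus flow-driven washout) can be conveyed by a much simpler stand-in than a PDP model. The sketch below is not the authors' P-system simulator; all rates, the temperature threshold, and the linear washout term are invented for illustration.

```python
# Minimal stand-in (not the authors' PDP model): weekly larval dynamics in a
# reservoir, where reproduction occurs only above a temperature threshold and
# outflow washes larvae out. All parameter values are illustrative assumptions.

def simulate_larvae(weeks, temp, flow, birth=0.5, death=0.1, threshold=15.0):
    """Return larval abundance per week for given temperature and flow
    series; flow acts as an additional washout mortality."""
    n = 100.0                                 # initial larval abundance
    series = []
    for t, f in zip(temp[:weeks], flow[:weeks]):
        births = birth * n if t >= threshold else 0.0
        losses = (death + f) * n              # natural death + washout
        n = max(n + births - losses, 0.0)
        series.append(n)
    return series
```

Even this toy model reproduces the qualitative management lever studied in the paper: raising the flow rate depresses larval abundance.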
Abstract:
Correct predictions of future blood glucose levels in individuals with Type 1 Diabetes (T1D) can be used to provide early warning of upcoming hypo-/hyperglycemic events and thus to improve the patient's safety. To increase prediction accuracy and efficiency, various approaches have been proposed which combine multiple predictors to produce superior results compared to single predictors. Three methods for model fusion are presented and comparatively assessed. Data from 23 T1D subjects under sensor-augmented pump (SAP) therapy were used in two adaptive data-driven models (an autoregressive model with output correction - cARX, and a recurrent neural network - RNN). Data fusion techniques based on i) Dempster-Shafer Evidential Theory (DST), ii) Genetic Algorithms (GA), and iii) Genetic Programming (GP) were used to merge the complementary performances of the prediction models. The fused output is used in a warning algorithm to issue alarms of upcoming hypo-/hyperglycemic events. The fusion schemes showed improved performance with lower root mean square errors, lower time lags, and higher correlation. In the warning algorithm, median daily false alarms (DFA) of 0.25% and 100% correct alarms (CA) were obtained for both event types. The detection times (DT) before occurrence of events were 13.0 and 12.1 min respectively for hypo-/hyperglycemic events. Compared to the cARX and RNN models, and a linear fusion of the two, the proposed fusion schemes represent a significant improvement.
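The DST/GA/GP schemes themselves are too involved to reproduce here, but the linear-fusion baseline that the paper compares against can be sketched directly: fit least-squares weights combining the two predictors' outputs on past data, then apply them to new predictions. Function names and the intercept term are assumptions for illustration.

```python
import numpy as np

# Sketch of the linear-fusion baseline mentioned in the abstract (not the
# proposed DST/GA/GP schemes): combine two glucose predictors with weights
# fitted by least squares on historical data.

def fit_fusion_weights(pred_a, pred_b, target):
    """Least-squares weights (and intercept) combining two predictor outputs."""
    X = np.column_stack([pred_a, pred_b, np.ones(len(target))])
    w, *_ = np.linalg.lstsq(X, target, rcond=None)
    return w

def fuse(pred_a, pred_b, w):
    """Apply fitted weights to new predictor outputs."""
    return w[0] * np.asarray(pred_a) + w[1] * np.asarray(pred_b) + w[2]
```

The paper's point is precisely that adaptive, nonlinear fusion (DST, GA, GP) outperforms this fixed linear combination.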
Abstract:
OBJECTIVE: Mechanical evaluation of a novel screw position used for repair in a type III distal phalanx fracture model and assessment of solar canal penetration (SCP). STUDY DESIGN: Experimental study. SAMPLE POPULATION: Disarticulated equine hooves (n = 24) and 24 isolated distal phalanges. METHODS: Hooves/distal phalanges cut in a sagittal plane were repaired with 1 of 2 different cortical screw placements in lag fashion. In group 1 (conventional screw placement), the screw was inserted halfway between the proximal border of the solar canal (SC) and the subchondral bone surface on a line parallel to the dorsal cortex, whereas in group 2, the screw was inserted more palmar/plantar, where a perpendicular line drawn from the group 1 position reached the palmar/plantar cortex. Construct strength was evaluated by 3-point bending to failure. SCP was assessed by CT imaging and macroscopically. RESULTS: Screws were significantly longer in group 2 and in forelimbs. Group 2 isolated distal phalanges had a significantly more rigid fixation compared with the conventional screw position (maximum point at failure 31%, bending stiffness 41% higher). Lumen reduction of the SC was observed in 13/52 specimens (all from group 2), of which 9 were forelimbs. CONCLUSIONS AND CLINICAL IMPORTANCE: More distal screw positioning compared with the conventionally recommended screw position for internal fixation of type III distal phalangeal fractures allows placement of a longer screw and renders a more rigid fracture fixation. The novel screw position, however, carries a higher risk of SCP.
Abstract:
Upwelling along the western coast of Africa south of the equator may be partitioned into three major areas, each having its own dynamics and history: (1) the eastern equatorial region, comprising the Congo Fan and the area of Mid-Angola; (2) the Namibia upwelling system, extending from the Walvis Ridge to Lüderitz; and (3) the Cape Province region, where upwelling is subdued. The highest nutrient contents in thermocline waters are in the northern region, the lowest in the southern one. Wind effects are at a maximum near the southern end of the Namibia upwelling system, and maximum productivity occurs near Walvis Bay, where the product between upwelling rate and nutrient content of upwelled waters is at a maximum. In the Congo/Angola region, opal tends to follow organic carbon quite closely in the Quaternary record. However, organic carbon has a strong precessional component, while opal does not. Despite relatively low opal content, sediments off Angola show the same patterns as those off the Congo; thus, they are part of the same regime. The spectrum shows nonlinear interference patterns between high- and low-latitude forcing, presumably tied to thermocline fertility and wind. On Walvis Ridge, as in the Congo-Angola region, the organic matter record behaves normally; that is, supply is high during glacial periods. In contrast, interglacial periods are favorable for opal deposition. The pattern suggests reduction in silicate content of the thermocline during glacial periods. The reversed phase (opal abundant during interglacials) persists during the entire Pleistocene and can be demonstrated deep into the Pliocene, not just on Walvis Ridge but all the way to the Oranje River and off the Cape Province. 
From comparison with other regions, it appears that silicate is diminished in the global thermocline, on average, whenever winds become strong enough to substantially shorten the residence time of silicate in upper waters (Walvis Hypothesis, solving the Walvis Paradox of reversed phase in opal deposition). The central discovery during Leg 175 was the documentation of a late Pliocene opal maximum for the entire Namibia upwelling system (early Matuyama Diatom Maximum [MDM]). The maximum is centered on the period between the end of the Gauss Chron and the beginning of the Olduvai Chron. A rather sharp increase in both organic matter deposition and opal deposition occurs near 3 Ma in the middle of the Gauss Chron, in association with a series of major cooling steps. As concerns organic matter, high production persists at least to 1 Ma, when there are large changes in variability, heralding subsequent pulsed production periods. From 3 to 2 Ma, organic matter and opal deposition run more or less parallel, but after 2 Ma opal goes out of phase with organic matter. Apparently, this is the point when silicate becomes limiting to opal production. Thus, the MDM conundrum is solved by linking planetary cooling to increased mixing and upwelling (ramping up to the MDM) and a general removal of silicate from the upper ocean through excess precipitation over global supply (ramping down from the MDM). The hypothesis concerning the origin of the Namibia opal acme or MDM is fundamentally the same as the Walvis Hypothesis, stating that glacial conditions result in removal of silicate from the thermocline (and quite likely from the ocean as a whole, given enough time). The Namibia opal acme, and other opal maxima in the latest Neogene in other regions of the ocean, marks the interval when a cooling ocean selectively removes the abundant silicate inherited from a warm ocean. When the excess silicate is removed, the process ceases. 
According to the data gathered during Leg 175, major upwelling started in the late part of the late Miocene. Presumably, this process contributed to the drawing down of carbon dioxide from the atmosphere, helping to prepare the way for Northern Hemisphere glaciation.
Abstract:
Myanmar maintained a multiple exchange rate system, and the parallel market exchange rate was left untamed. In the last two decades, the Myanmar kyat exchange rate of the parallel market has exhibited the sharpest fluctuations among Southeast Asian currencies in real terms. Since the move to a managed float regime in April 2012, the question arises of whether exchange rate policies will be effective in stabilizing the real exchange rate. This paper investigates the sources of fluctuations in the real effective exchange rate using Blanchard and Quah’s (1989) structural vector autoregression model. As nominal shocks can be created by exchange rate policies, a persistent impact of a nominal shock implies more room for exchange rate policies. Decomposition of the fluctuations into nominal and real shocks indicates that the impact of nominal shocks is small and quickly diminishes, implying that complementary sterilization is necessary for effective foreign exchange market interventions.
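The Blanchard-Quah identification cited above has a compact closed form for a bivariate VAR. A hedged sketch, for a VAR(1) y_t = A y_{t-1} + u_t with u_t = B e_t: B is chosen so that BB' equals the residual covariance and the long-run cumulative impact (I - A)^{-1} B is lower triangular, i.e. the second ("nominal") shock has no long-run effect on the first variable. The VAR(1) form and matrix values below are illustrative, not the paper's estimates.

```python
import numpy as np

# Blanchard-Quah (1989) long-run identification for a bivariate VAR(1):
# return B with B @ B.T == Sigma and (I - A)^(-1) @ B lower triangular,
# so the nominal shock has no permanent effect on the first variable.

def blanchard_quah(A, Sigma):
    I = np.eye(A.shape[0])
    F = np.linalg.inv(I - A)                  # long-run multiplier
    # Cholesky of the long-run covariance gives the lower-triangular target
    theta = np.linalg.cholesky(F @ Sigma @ F.T)
    return (I - A) @ theta                    # equals F^(-1) @ theta
```

Since (I - A) F = I, the product BB' = (I - A) F Sigma F' (I - A)' = Sigma, and F B = theta is lower triangular by construction, which is exactly the identifying restriction.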
Abstract:
Since the early days of logic programming, researchers in the field realized the potential for exploitation of parallelism present in the execution of logic programs. Their high-level nature, the presence of nondeterminism, and their referential transparency, among other characteristics, make logic programs interesting candidates for obtaining speedups through parallel execution. At the same time, the fact that the typical applications of logic programming frequently involve irregular computations, make heavy use of dynamic data structures with logical variables, and involve search and speculation, makes the techniques used in the corresponding parallelizing compilers and run-time systems potentially interesting even outside the field. The objective of this article is to provide a comprehensive survey of the issues arising in parallel execution of logic programming languages along with the most relevant approaches explored to date in the field. Focus is mostly given to the challenges emerging from the parallel execution of Prolog programs. The article describes the major techniques used for shared memory implementation of Or-parallelism, And-parallelism, and combinations of the two. We also explore some related issues, such as memory management, compile-time analysis, and execution visualization.
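Or-parallelism, one of the two main forms surveyed, can be conveyed with a toy example: when several clauses match a goal, each alternative is an independent branch that can be explored by a separate worker. The sketch below is not a Prolog engine; the tiny clause "database" and function names are invented for illustration.

```python
from concurrent.futures import ThreadPoolExecutor

# Illustrative sketch of Or-parallelism (not a real Prolog implementation):
# the alternative clauses matching a goal are explored by independent
# workers, and the solutions from all successful branches are collected.

CLAUSES = {                          # goal -> alternative candidate answers
    "color(X)": ["red", "green", "blue"],
}

def try_clause(answer, test):
    """One Or-branch: check a candidate answer against a constraint."""
    return answer if test(answer) else None

def or_parallel_solve(goal, test):
    """Explore every matching clause in parallel; keep successful branches."""
    with ThreadPoolExecutor() as pool:
        results = pool.map(lambda a: try_clause(a, test), CLAUSES[goal])
    return [r for r in results if r is not None]
```

The real difficulty the survey covers is not launching the branches but managing their bindings: each Or-branch conceptually needs its own copy of the variable bindings, which is what binding-array and copying schemes address.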
Abstract:
Enabling real end-user programming development is the next logical stage in the evolution of Internet-wide service-based applications. Even so, the vision of end users programming their own web-based solutions has not yet materialized. This will continue to be so unless both industry and the research community rise to the ambitious challenge of devising an end-to-end compositional model for developing a new age of end-user web application development tools. This paper describes a new composition model designed to empower programming-illiterate end users to create and share their own off-the-shelf rich Internet applications in a fully visual fashion. This paper presents the main insights and outcomes of our research and development efforts as part of a number of successful European Union research projects. A framework implementing this model was developed as part of the European Seventh Framework Programme FAST Project and the Spanish EzWeb Project and allowed us to validate the rationale behind our approach.
Abstract:
Membrane systems are parallel, bio-inspired systems that simulate the behaviour of membranes when processing information. As a branch of unconventional computing, P systems have proven effective in solving complex problems. A software technique is presented here that obtains good results when dealing with such problems. The rules application phase is studied and updated accordingly to obtain the desired results. Certain rules are candidates for elimination, which can improve the model's execution time.
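The rules application phase discussed above can be sketched minimally: within one P-system step, applicable rules are applied in a maximally parallel way, i.e. repeatedly until no rule can fire on the remaining multiset, with products becoming visible only in the next step. The rule encoding and function name are assumptions for illustration, not the paper's implementation.

```python
import random

# Minimal sketch of one maximally parallel rules-application step in a
# P system: `multiset` maps symbol -> count; each rule is a pair
# (consumed_symbol, list_of_produced_symbols). Encoding is illustrative.

def apply_step(multiset, rules, rng=random.Random(0)):
    produced = {}
    active = [r for r in rules if multiset.get(r[0], 0) > 0]
    while active:                              # maximal parallelism: fire
        lhs, rhs = rng.choice(active)          # rules nondeterministically
        multiset[lhs] -= 1                     # consume one occurrence
        for s in rhs:
            produced[s] = produced.get(s, 0) + 1
        active = [r for r in rules if multiset.get(r[0], 0) > 0]
    for s, k in produced.items():              # products appear next step
        multiset[s] = multiset.get(s, 0) + k
    return multiset
```

Optimizations of the kind the abstract alludes to would prune rules that can never fire before entering this loop, shrinking the `active` recomputation.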
Abstract:
Growing scarcity, increasing demand and poor management of water resources are causing intense competition for water, and consequently managers are facing more and more pressure in an attempt to satisfy users' requirements. In many regions agriculture is one of the most important users at river basin scale, since it concentrates high volumes of water consumption during relatively short periods (the irrigation season), with a significant economic, social and environmental impact. The interdisciplinary character of the related water resources problems requires, as established in the Water Framework Directive 2000/60/EC, an integrated and participative approach to water management, and assigns an essential role to economic analysis as a decision support tool. For this reason, a methodology is developed to analyse the economic and environmental implications of water resource management under different scenarios, with a focus on the agricultural sector. This research integrates both economic and hydrologic components in modelling, defining scenarios of water resource management with the goal of preventing critical situations, such as droughts. The model follows the Positive Mathematical Programming (PMP) approach, an innovative methodology successfully used for agricultural policy analysis in the last decade and also applied in several analyses regarding water use in agriculture. This approach has, among others, the very important capability of perfectly calibrating the baseline scenario using a very limited database. However, one important disadvantage is its limited capacity to simulate activities not observed during the reference period but which could be adopted if the scenario changed. To overcome this problem, the classical methodology is extended in order to simulate a more realistic farmers' response to new agricultural policies or modified water availability. In this way an economic model has been developed to reproduce the farmers' behaviour within two irrigation districts in the Tiber High Valley. This economic model is then integrated with SIMBAT, a hydrologic model developed for the Tiber basin which makes it possible to simulate the balance between the water volumes available at the Montedoglio dam and the water volumes required by the various irrigation users.
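The standard PMP calibration mentioned above (before the authors' extension) admits a short closed-form sketch: phase 1 recovers calibration duals from observed crop areas and gross margins, with the least profitable crop setting the land shadow price; phase 2 builds a quadratic cost so that the calibrated model reproduces the observed areas exactly. All numbers and names below are invented for illustration.

```python
# Hedged sketch of classical two-phase PMP calibration (not the authors'
# extended model). margins: gross margin per hectare of each crop;
# observed: observed area of each crop; total_land: the land constraint.

def pmp_calibrate(margins, observed):
    """Phase 1: the marginal (least profitable) crop sets the land dual rho;
    each other crop gets a calibration dual lambda_i = c_i - rho, turned
    into a quadratic cost slope d_i = lambda_i / x_i."""
    rho = min(margins)
    return rho, [(c - rho) / x for c, x in zip(margins, observed)]

def pmp_solve(margins, observed, total_land):
    """Phase 2: maximise sum(c_i x_i - 0.5 d_i x_i^2) on the land constraint.
    First-order conditions give x_i = (c_i - rho)/d_i for non-marginal crops;
    the marginal crop (d_i = 0) absorbs the residual land."""
    rho, d = pmp_calibrate(margins, observed)
    x = [(c - rho) / di if di > 0 else None for c, di in zip(margins, d)]
    residual = total_land - sum(v for v in x if v is not None)
    return [residual if v is None else v for v in x]
```

By construction the phase-2 optimum returns the observed areas, which is the "perfect calibration" property the abstract cites; the paper's contribution is relaxing this so that non-observed activities can enter under changed scenarios.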
Abstract:
The main purpose of robot calibration is the correction of possible errors in the robot parameters. This paper presents a method for the kinematic calibration of a parallel robot that is equipped with one camera in hand. In order to preserve the mechanical configuration of the robot, the camera is utilized to acquire incremental positions of the end effector from a spherical object that is fixed in the world reference frame. The positions of the end effector are related to incremental positions of the resolvers of the motors of the robot, and a kinematic model of the robot is used to find a new set of parameters which minimizes errors in the kinematic equations. Additionally, properties of the spherical object and intrinsic camera parameters are utilized to model the projection of the object in the image and to improve spatial measurements. Finally, the robotic system is designed to carry out tracking tasks, and the calibration of the robot is validated by means of integrating the errors of the visual controller.
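The essence of the calibration step (re-estimating kinematic parameters from camera-measured positions and resolver readings) can be sketched with a much simpler stand-in than the paper's parallel robot: a planar two-link arm, whose end-effector position is linear in the link lengths, so plain least squares applies. The arm model and function name are assumptions for illustration only.

```python
import math
import numpy as np

# Illustrative stand-in (a planar 2-link arm, not the paper's parallel robot):
# measured end-effector positions plus joint angles from the resolvers are
# used to re-estimate the kinematic parameters (link lengths) by least
# squares, since [x, y] = l1*[cos q1, sin q1] + l2*[cos(q1+q2), sin(q1+q2)].

def calibrate_links(joint_angles, positions):
    rows, rhs = [], []
    for (q1, q2), (x, y) in zip(joint_angles, positions):
        rows.append([math.cos(q1), math.cos(q1 + q2)])
        rows.append([math.sin(q1), math.sin(q1 + q2)])
        rhs += [x, y]
    params, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return params                      # estimated [l1, l2]
```

For a parallel mechanism the kinematic equations are nonlinear in the parameters, so the real method iterates a linearized version of this fit; the sketch shows only the core measurement-to-parameters step.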
Abstract:
Irregular computations pose some of the most interesting and challenging problems in automatic parallelization. Irregularity appears in certain kinds of numerical problems and is pervasive in symbolic applications. Such computations often use dynamic data structures, which make heavy use of pointers. This complicates all the steps of a parallelizing compiler, from independence detection to task partitioning and placement. Starting in the mid-1980s there has been significant progress in the development of parallelizing compilers for logic programming (and more recently, constraint programming), resulting in quite capable parallelizers. The typical applications of these paradigms frequently involve irregular computations and make heavy use of dynamic data structures with pointers, since logical variables represent in practice a well-behaved form of pointers. This arguably makes the techniques used in these compilers potentially interesting. In this paper, we introduce in a tutorial way some of the problems faced by parallelizing compilers for logic and constraint programs and provide pointers to some of the significant progress made in the area. In particular, this work has resulted in a series of achievements in the areas of inter-procedural pointer aliasing analysis for independence detection, cost models and cost analysis, cactus-stack memory management, techniques for managing speculative and irregular computations through task granularity control and dynamic task allocation (such as work-stealing schedulers), etc.
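The work-stealing scheduling mentioned at the end of the abstract can be sketched in a few lines. This is a single-threaded toy simulation with none of the lock-free machinery of a real scheduler: each worker owns a deque, pops tasks from its own end, and when idle steals from the opposite end of a victim's deque.

```python
from collections import deque

# Toy sketch of work stealing (single-threaded simulation, no concurrency
# control): owners pop LIFO from the bottom of their own deque for locality;
# idle workers steal FIFO from the top of a victim's deque.

class Worker:
    def __init__(self):
        self.tasks = deque()

    def push(self, task):
        self.tasks.append(task)              # owner end (bottom)

    def next_task(self, others):
        if self.tasks:
            return self.tasks.pop()          # LIFO pop: newest task first
        for victim in others:
            if victim.tasks:
                return victim.tasks.popleft()  # steal oldest task
        return None
```

Stealing from the opposite end is the key design point: the oldest tasks tend to be the largest subcomputations, so thieves get coarse-grained work while owners keep cache-warm recent tasks, which matches the task-granularity concerns the abstract raises.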
Abstract:
The authors, all from UPM, have each been involved in different academic or real cases on the subject, at different times owing to their different ages. Building on the precedent of E. Torroja and A. Páez in Madrid, Spain (probabilistic safety models for concrete, c. 1957, now discussed in ICOSSAR conferences), author J.M. Antón, involved since autumn 1967 in European steel construction within CECM, produced a mathematical model for superposition reductions of independent loads, and from it a load-coefficient pattern for codes (Rome, Feb. 1969) that was practically adopted for European construction; at JCSS (Lisbon, Feb. 1974) he suggested a unified treatment for concrete, steel and aluminium. That model represents each type of load with a Gumbel Type I distribution over 50 years, reduced to 1 year so that it can be added to other independent loads, the sum being set in Gumbel theory to a 50-year return period; parallel models exist. A complete reliability system was produced, including nonlinear effects such as buckling, phenomena considered to some extent in the current Construction Eurocodes derived from the Model Codes. The system was considered by the author in CEB in the presence of hydraulic effects from rivers, floods and the sea, with reference to actual practice. When drafting a road drainage norm for MOPU in Spain, the authors produced an optimization model giving a way to determine the return period, 10 to 50 years, for the hydraulic flows to be considered in road drainage. Satisfactory examples were a stream in south-east Spain modelled with a Gumbel Type I distribution, and a paper by Ven Te Chow on the Mississippi at Keokuk using Gumbel Type II; the model can be modernised with a wider range of extreme-value laws. In the MOPU drainage norm, the drafting commission also acted as an expert panel to set a table of return periods for the elements of road drainage, in effect a complex multi-criteria decision system. These ideas were used, e.g., in widely applied codes and presented at symposia and meetings, but not published in English-language journals; a condensed account of the authors' contributions is presented here. The authors are also involved in optimization for hydraulic and agricultural planning, and give modest hints of intended applications to agricultural and environmental planning, such as the selection of the criteria and utility functions involved in Bayesian, multi-criteria or mixed decision systems. Modest consideration is given to climate change, to production and commercial systems, and to other social and financial aspects.
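The Gumbel Type I return-period computation underlying the drainage discussion above has a standard closed form: fit location and scale, then invert the CDF at probability 1 - 1/T for the T-year design flow. The method-of-moments fit shown here is one common choice; the authors' exact fitting procedure is not specified in the abstract, and the numbers in the test are invented.

```python
import math

# Gumbel Type I design-flow sketch: method-of-moments fit, then inversion
# of the CDF F(x) = exp(-exp(-(x - mu)/beta)) at F = 1 - 1/T.

def gumbel_fit(sample_mean, sample_std):
    """Method-of-moments estimates of Gumbel scale and location."""
    beta = sample_std * math.sqrt(6) / math.pi
    mu = sample_mean - 0.5772 * beta          # Euler-Mascheroni constant
    return mu, beta

def return_level(mu, beta, T):
    """Flow exceeded on average once every T years."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))
```

This is the quantity the optimization model trades off against drainage cost when choosing a return period between 10 and 50 years.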
Abstract:
We discuss from a practical point of view a number of issues involved in writing distributed Internet and WWW applications using LP/CLP systems. We describe PiLLoW, a public-domain Internet and WWW programming library for LP/CLP systems that we have designed in order to simplify the process of writing such applications. PiLLoW provides facilities for accessing documents and code on the WWW; parsing, manipulating and generating HTML and XML structured documents and data; producing HTML forms; writing form handlers and CGI scripts; and processing HTML/XML templates. An important contribution of PiLLoW is to model HTML/XML code (and, thus, the content of WWW pages) as terms. The PiLLoW library has been developed in the context of the Ciao Prolog system, but it has been adapted to a number of popular LP/CLP systems, supporting most of its functionality. We also describe the use of concurrency and a high-level model of client-server interaction, Ciao Prolog's active modules, in the context of WWW programming. We propose a solution for client-side downloading and execution of Prolog code, using generic browsers. Finally, we also provide an overview of related work on the topic.
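The "HTML as terms" idea can be illustrated outside Prolog: a page is a nested term env(Tag, Attrs, Children), and rendering walks the term to emit markup. The Python encoding below (tuples standing in for Prolog terms) is an illustrative analogue, not PiLLoW's actual term representation.

```python
# Illustrative analogue of PiLLoW's HTML-as-terms idea (PiLLoW itself uses
# Prolog terms): a string is text; a tuple (tag, attrs, children) is an
# element term; rendering recursively walks the term to emit HTML.

def render(term):
    if isinstance(term, str):
        return term
    tag, attrs, children = term
    attr_text = "".join(f' {k}="{v}"' for k, v in attrs.items())
    body = "".join(render(c) for c in children)
    return f"<{tag}{attr_text}>{body}</{tag}>"
```

Representing pages as terms is what makes them first-class values: they can be built by unification, pattern-matched against templates, and transformed before being rendered, which is the library's central design point.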