900 results for Hard combinatorial scheduling


Relevance:

20.00%

Publisher:

Abstract:

With the shift towards many-core computer architectures, dataflow programming has been proposed as one potential solution for producing software that scales to a varying number of processor cores. Programming for parallel architectures is considered difficult, as the current popular programming languages are inherently sequential and introducing parallelism is typically left to the programmer. Dataflow, however, is inherently parallel, describing an application as a directed graph, where nodes represent calculations and edges represent data dependencies in the form of queues. These queues are the only allowed communication between the nodes, making the dependencies between the nodes explicit and thereby also the parallelism. Once a node has sufficient inputs available, it can, independently of any other node, perform calculations, consume inputs, and produce outputs. Dataflow models have existed for several decades and have become popular for describing signal processing applications, as the graph representation is a very natural representation within this field; digital filters are typically described with boxes and arrows in textbooks as well. Dataflow is also becoming more interesting in other domains, and in principle any application working on an information stream fits the dataflow paradigm. Such applications include network protocols, cryptography, and multimedia applications. As an example, the MPEG group standardized a dataflow language called RVC-CAL to be used within reconfigurable video coding. Describing a video coder as a dataflow network instead of with conventional programming languages makes the coder more readable, as it describes how the video data flows through the different coding tools. While dataflow provides an intuitive representation for many applications, it also introduces some new problems that need to be solved in order for dataflow to be more widely used. The explicit parallelism of a dataflow program is descriptive and enables improved utilization of the available processing units; however, the independent nodes also imply that some kind of scheduling is required. The need for efficient scheduling becomes even more evident when the number of nodes is larger than the number of processing units and several nodes run concurrently on one processor core. There exist several dataflow models of computation, with different trade-offs between expressiveness and analyzability. These vary from rather restricted but statically schedulable models, with minimal scheduling overhead, to dynamic models where each firing requires a firing rule to be evaluated. The model used in this work, namely RVC-CAL, is a very expressive language, and in the general case it requires dynamic scheduling; however, the strong encapsulation of dataflow nodes enables analysis, and the scheduling overhead can be reduced by using quasi-static, or piecewise static, scheduling techniques. The scheduling problem is concerned with finding the few scheduling decisions that must be made at run time, while most decisions are pre-calculated. The result is then an as-small-as-possible set of static schedules that are dynamically scheduled. To identify these dynamic decisions and to find the concrete schedules, this thesis shows how quasi-static scheduling can be represented as a model checking problem. This involves identifying the relevant information to generate a minimal but complete model to be used for model checking.
The model must describe everything that may affect scheduling of the application while omitting everything else in order to avoid state space explosion. This kind of simplification is necessary to make the state space analysis feasible. For the model checker to find the actual schedules, a set of scheduling strategies is defined which is able to produce quasi-static schedulers for a wide range of applications. The results of this work show that actor composition with quasi-static scheduling can be used to transform dataflow programs to fit many different computer architectures with different types and numbers of cores. This, in turn, enables dataflow to provide a more platform-independent representation, as one application can be fitted to a specific processor architecture without changing the actual program representation. Instead, the program representation is optimized by the development tools, in the context of design space exploration, to fit the target platform. This work focuses on representing the dataflow scheduling problem as a model checking problem and is implemented as part of a compiler infrastructure. The thesis also presents experimental results as evidence of the usefulness of the approach.
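To make the firing discipline concrete, here is a minimal sketch of a dynamically scheduled dataflow actor in Python. The Actor class, its queue layout, and the toy pipeline are illustrative assumptions for this summary, not RVC-CAL syntax or the thesis's compiler infrastructure.

```python
from collections import deque

class Actor:
    """A dataflow node: fires when its firing rule is satisfied,
    consuming input tokens and producing output tokens."""
    def __init__(self, name, inputs, outputs, consume, produce, fn):
        self.name = name
        self.inputs = inputs      # list of input queues (deques)
        self.outputs = outputs    # list of output queues (deques)
        self.consume = consume    # tokens required per input queue
        self.produce = produce    # tokens emitted per output queue
        self.fn = fn              # pure function: input tuples -> output tuples

    def can_fire(self):
        # Firing rule: every input queue holds enough tokens.
        return all(len(q) >= n for q, n in zip(self.inputs, self.consume))

    def fire(self):
        args = [tuple(q.popleft() for _ in range(n))
                for q, n in zip(self.inputs, self.consume)]
        results = self.fn(*args)
        for q, n, r in zip(self.outputs, self.produce, results):
            q.extend(r[:n])

# A trivial one-actor pipeline: pairwise sums over an input stream.
src = deque([1, 2, 3, 4])
out = deque()
adder = Actor("adder", [src], [out], [2], [1],
              lambda xs: [(xs[0] + xs[1],)])

# Dynamic scheduler: repeatedly fire any ready actor.
while adder.can_fire():
    adder.fire()
print(list(out))   # [3, 7]
```

A quasi-static scheduler would replace the loop's repeated can_fire() test with a pre-computed sequence of fire() calls, leaving only the genuinely data-dependent decisions at run time.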

Relevance:

20.00%

Publisher:

Abstract:

The objective of this project was to introduce a new software product to the pulp industry, a new market for the case company. An optimization-based scheduling tool has been developed to allow pulp operations to better control their production processes and improve both production efficiency and stability. Both the work here and earlier research indicate a potential for savings of around 1-5%. All the supporting data is available today, coming from distributed control systems, data historians, and other existing sources. The pulp mill model, together with the scheduler, allows what-if analyses of the impacts and timely feasibility of various external actions, such as planned maintenance of any particular mill operation. The visibility gained from the model also proves to be a real benefit. The aim is to satisfy demand and gain extra profit, while achieving the required customer service level. Research effort has been put both into understanding the minimum features needed to satisfy the scheduling requirements in the industry and into the overall existence of the market. A qualitative study was constructed to identify both the competitive situation and the requirements versus gaps in the market. It becomes clear that there is no such system in the marketplace today and that there is room to improve the target market's overall process efficiency through such a planning tool. This thesis also provides the case company with a better overall understanding of the different processes in this particular industry.
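As a toy illustration of such a what-if analysis (not the case company's product), the sketch below compares total production over a planning horizon for alternative maintenance windows; all figures and names are hypothetical.

```python
def schedule_output(daily_capacity, maintenance_days):
    """Total output over the horizon when production stops entirely
    on planned maintenance days."""
    return sum(0 if day in maintenance_days else tonnes
               for day, tonnes in enumerate(daily_capacity))

capacity = [1000.0] * 30                 # hypothetical 30-day horizon, t/day
for window in ({5, 6}, {20, 21, 22}):    # candidate maintenance slots
    print(f"maintenance on days {sorted(window)}: "
          f"{schedule_output(capacity, window):.0f} t")
```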

Relevance:

20.00%

Publisher:

Abstract:

Edible films based on gluten from four types of Brazilian wheat flour (two "semi-hard" and two "soft") were prepared, and their mechanical and barrier properties were compared with those of wheat gluten films made from vital gluten. Water vapor and oxygen permeability, tensile strength, percent elongation at break, solubility in water, and surface morphology were measured. The films from "semi-hard" wheat flours showed water vapor permeability and solubility in water similar to films from vital gluten, and better tensile strength than the films from "soft" flours and vital gluten. The films from vital gluten had higher elongation at break and oxygen permeability, and also lower solubility in water, than the films from the Brazilian "soft" wheat flours. Although vital gluten showed greater mechanical resistance, desirable for bakery products, Brazilian "semi-hard" wheat flours can be used instead of vital gluten for the purpose of developing gluten films, since they showed similar barrier and mechanical properties.

Relevance:

20.00%

Publisher:

Abstract:

Operational excellence of individual tramp shipping companies is important in today's market, where competition is intense, freight revenues are modest, capital costs are high due to the global financial crisis, and a tighter regulatory framework is generating additional costs and challenges for the industry. This thesis concentrates on tramp shipping, where a tramp operator in the form of an individual case company, specialized in short-sea shipping activities in the Baltic Sea region, is searching for ways to map its current fleet operations and better understand potential ways to improve the overall routing and scheduling decisions. The research problem is related to tramp fleet planning where several cargoes are carried on board at the same time, which are here systematically referred to as part cargoes. The purpose is to determine the pivotal dimensions and characteristics of these part cargo operations in tramp shipping, and to offer both the individual case company and the wider research community a better understanding of the potential risks and benefits related to the utilization of part cargo operations. A mixed-method research approach is utilized in this research, as the objectives are related to complex, real-life business practices in the field of supply chain management and, more specifically, maritime logistics. A quantitative analysis of different voyage scenarios is executed, including alternative voyage legs with varying cost structures and customer involvement. An online questionnaire designed and prepared by the case company's decision group provides the desired data on the predominant attitudes and views of the most important industrial customers regarding part cargo-related operations and the potential future utilization of this business model. The results gained from these quantitative methods are complemented by qualitative data collection tools, along with suitable secondary data sources. Based on the results and a logical analysis of the different data sources, a framework for characterizing the different aspects of part cargo operations is developed, utilizing both existing research and empirical investigation of the phenomenon. In conclusion, part cargoes can be part of viable fleet operations, and can even increase flexibility among the fleet to a certain extent. Naturally, several hindrances to this development are recognized as well, such as potential issues with information gathering and sharing, inefficient port activities, and increased transit times.
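A minimal sketch of the kind of voyage-scenario arithmetic described above, in Python; the cost components and all numbers are hypothetical, and a real analysis would also model port rotations, laytime, and schedule feasibility.

```python
def leg_contribution(freight_revenue, port_costs, fuel_costs,
                     extra_days, day_rate):
    """Contribution of a voyage leg: freight revenue minus voyage costs
    and the opportunity cost of the extra time the leg adds."""
    return freight_revenue - port_costs - fuel_costs - extra_days * day_rate

# Scenario A: sail the base voyage only.
base = leg_contribution(120_000, 15_000, 30_000, 0.0, 8_000)
# Scenario B: add a part cargo needing one extra port call and 1.5 days.
with_part = base + leg_contribution(25_000, 6_000, 4_000, 1.5, 8_000)
print(f"base: {base}, with part cargo: {with_part}")
```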

Relevance:

20.00%

Publisher:

Abstract:

Several irrigation treatments were evaluated on Sovereign Coronation table grapes at two sites over a 3-year period in the cool, humid Niagara Peninsula of Ontario. Trials were conducted in the Hippie (Beamsville, ON) and Lambert (Niagara-on-the-Lake, ON) vineyards from 2003 to 2005 with the objective of assessing the usefulness of the modified Penman-Monteith equation for accurately scheduling vine irrigation needs. The data (relative humidity, wind speed, solar radiation, and temperature) required to precisely calculate evapotranspiration (ET) were downloaded from the Ontario Weather Network. One of two ET values (either 100 or 150%) was used in combination with one of two crop coefficients (Kc; either fixed at 0.75, or 0.2 to 0.8 based upon increasing canopy volume) to calculate the amount of irrigation water required. The five irrigation treatments were: unirrigated control; 100ET x Kc = 0.75; 150ET x Kc = 0.75; 100ET x Kc = 0.2-0.8; and 150ET x Kc = 0.2-0.8. Transpiration, water potential (ψ), and soil moisture data were collected each growing season. Yield component data were collected, and berries from each treatment were analyzed for soluble solids (Brix), pH, titratable acidity (TA), anthocyanins, methyl anthranilate (MA), and total volatile esters (TVE). Irrigation showed a substantial positive effect on transpiration rate and soil moisture; the control treatment showed consistently lower transpiration and soil moisture over the 3 seasons. Transpiration appeared to accurately reflect the water status of Sovereign Coronation grapevines. Soil moisture also accurately reflected the level of irrigation. Moreover, irrigation had an impact on leaf ψ, which was more negative throughout the 3 seasons for vines that were not irrigated. Irrigation had a substantial positive effect on yield (kg/vine) and its various components (clusters/vine, cluster weight, and berries/cluster) in 2003 and 2005. Berry weights were higher under the irrigated treatments at both sites. Berry weight consistently appeared to be the main factor leading to these increased yields, as inconsistent responses were noted for some yield variables. Soluble solids were highest under the ET150 and ET100 treatments, both with Kc at 0.75. Both pH and TA were highest under the control treatments in 2003 and 2004, but highest under the irrigated treatments in 2005. Anthocyanins and phenols were highest under the control treatments in 2003 and 2004, but highest under the irrigated treatments in 2005. MA and TVE were highest under the ET150 treatments. Vine and soil water status measurements (soil moisture, leaf ψ, and transpiration) confirmed that irrigation was required in the summers of 2003 and 2005 due to dry weather in those years. They also partially supported the hypothesis that the Penman-Monteith equation is useful for calculating vineyard water needs. Both ET treatments gave clear evidence that irrigation could be effective in reducing water stress and improving vine performance, yield, and fruit composition. Use of properly scheduled irrigation was beneficial for Sovereign Coronation table grapes in the Niagara region. The findings herein should give growers some strong guidelines on when, how, and how much to irrigate their vineyards.
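The scheduling arithmetic behind these treatments is straightforward once ET is known. Below is a minimal Python sketch of the ET x Kc bookkeeping, assuming a daily ET value from station weather data; the function names and example numbers are illustrative, not the study's actual calculations.

```python
def kc_ramp(day, season_length, kc_min=0.2, kc_max=0.8):
    """Crop coefficient rising linearly with canopy growth
    (the 0.2-0.8 treatment); the alternative is a fixed Kc of 0.75."""
    return kc_min + (kc_max - kc_min) * day / season_length

def irrigation_depth(et_mm, et_fraction, kc):
    """Water to apply (mm): daily evapotranspiration scaled by the
    treatment level (100% or 150% of ET) and the crop coefficient."""
    return et_mm * et_fraction * kc

# Hypothetical mid-season day with ET = 5 mm from station weather data.
for frac in (1.0, 1.5):
    for kc in (0.75, kc_ramp(day=60, season_length=120)):
        print(f"ET{frac * 100:.0f} x Kc = {kc:.2f}: "
              f"apply {irrigation_depth(5.0, frac, kc):.2f} mm")
```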

Relevance:

20.00%

Publisher:

Abstract:

This qualitative study explored secondary teachers' perceptions of scheduling in relation to pedagogy, curriculum, and observation of student learning. Its objective was to determine the best way to organize scheduling for the delivery of Ontario's new 4-year curriculum. Six participants were chosen: two were teaching in a semestered timetable, one in a traditional timetable, and three had experience in both schedules. Participants related a "pressure cooker" lived experience, with weaker students in the semester system experiencing a particularly harsh environment. The inadequate amount of time for review in content-heavy courses, gap scheduling problems, catch-up difficulties for students missing classes, and the fast pace of semestering were identified as factors negatively impacting these students. Government testing adds to the pressure by shifting teachers' time and attention in the classroom from deeper learning to a superficial coverage of material, from curriculum as lived to curriculum as text to be covered. Scheduling choice should be available in public education to accommodate the needs of all students. Curriculum guidelines need to be revamped to reflect the content that teachers believe is necessary for successful course delivery. Applied-level courses need to be developed for students who are not academically inferior but learn differently.

Relevance:

20.00%

Publisher:

Abstract:

The design of a large and reliable DNA codeword library is a key problem in DNA-based computing. DNA codes, namely sets of fixed-length edit-metric codewords over the alphabet {A, C, G, T}, satisfy certain combinatorial constraints arising from biological and chemical restrictions on DNA strands. The primary constraints that we consider are the reverse-complement constraint and the fixed GC-content constraint, as well as the basic edit-distance constraint between codewords. We focus on exploring the theory underlying DNA codes and discuss several approaches to searching for optimal DNA codes. We use Conway's lexicode algorithm and an exhaustive search algorithm to produce provably optimal DNA codes for small parameter values, and we propose a genetic algorithm to search for sub-optimal DNA codes with relatively large parameter values, whose sizes can be regarded as reasonable lower bounds on the sizes of optimal DNA codes. Furthermore, we provide tables of bounds on the sizes of DNA codes with lengths from 1 to 9 and minimum distances from 1 to 9.
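Conway's lexicode algorithm mentioned above is a greedy scan of all words in lexicographic order. A minimal Python sketch follows; the precise way the reverse-complement constraint is combined with the distance constraint is an assumption for illustration and may differ from the thesis's formulation.

```python
from itertools import product

def edit_distance(a, b):
    """Standard Levenshtein distance via a single-row dynamic program."""
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1,
                                     prev + (ca != cb))
    return dp[-1]

COMP = str.maketrans("ACGT", "TGCA")
def revcomp(w):
    """Watson-Crick reverse complement of a DNA word."""
    return w.translate(COMP)[::-1]

def lexicode(n, d, gc):
    """Greedy lexicode: scan words of length n in lexicographic order,
    keeping a word if it meets the GC-content constraint and stays at
    edit distance >= d from every kept word and its reverse complement."""
    code = []
    for w in map("".join, product("ACGT", repeat=n)):
        if w.count("G") + w.count("C") != gc:
            continue
        if edit_distance(w, revcomp(w)) >= d and \
           all(edit_distance(w, c) >= d and
               edit_distance(w, revcomp(c)) >= d for c in code):
            code.append(w)
    return code

print(len(lexicode(4, 2, 2)))   # small parameters: all 4^n words scanned
```

Because every word of length n is examined, this brute-force form is only practical for the small parameter values for which optimality proofs are sought; the genetic algorithm targets the larger parameters.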

Relevance:

20.00%

Publisher:

Abstract:

The prediction of a protein's conformation helps in understanding its exhibited functions, allows for modeling, and allows for the possible synthesis of the studied protein. Our research is focused on a sub-problem of protein folding known as side-chain packing, whose computational complexity has been proven to be NP-hard. The motivation behind our study is to offer the scientific community a means to obtain conformation approximations for small to large proteins faster than currently available methods. As the size of proteins increases, current techniques become unusable due to the exponential nature of the problem. We investigated the capabilities of a hybrid genetic algorithm / simulated annealing technique to predict the low-energy conformational states of proteins of various sizes and to generate statistical distributions of the studied proteins' molecular ensembles for pKa predictions. Our algorithm produced errors relative to experimental results within acceptable margins and offered considerable speed-up, depending on the protein and on the rotameric state resolution used.
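For a flavor of the simulated-annealing half of such a hybrid, here is a plain SA loop over discrete rotamer assignments in Python; the toy energy function and move operator are illustrative stand-ins for a real rotamer library and energy model.

```python
import math, random

def anneal(energy, neighbor, state, t0=10.0, cooling=0.995, steps=20000):
    """Simulated annealing over discrete rotamer assignments: propose a
    single side-chain rotamer change, accept downhill moves always and
    uphill moves with Boltzmann probability at the current temperature."""
    e = energy(state)
    best, best_e, t = state, e, t0
    for _ in range(steps):
        cand = neighbor(state)
        ce = energy(cand)
        if ce <= e or random.random() < math.exp(-(ce - e) / t):
            state, e = cand, ce
            if e < best_e:
                best, best_e = state, e
        t *= cooling
    return best, best_e

# Hypothetical toy energy: penalize adjacent residues sharing a rotamer.
def toy_energy(rotamers):
    return sum(a == b for a, b in zip(rotamers, rotamers[1:]))

def flip_one(rotamers, n_rotamers=3):
    i = random.randrange(len(rotamers))
    out = list(rotamers)
    out[i] = random.randrange(n_rotamers)
    return out

print(anneal(toy_energy, flip_one, [0] * 12))
```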

Relevance:

20.00%

Publisher:

Abstract:

Département de linguistique et de traduction

Relevance:

20.00%

Publisher:

Abstract:

This thesis studies an integrated approach to timetabling and service network design for rail freight transportation. Rail transportation is organized around a two-level consolidation structure, in which the assignment of cars to blocks, and of blocks to services, are decisions that greatly complicate the management of operations. In this thesis, the two consolidation processes and the operating timetable are studied simultaneously. Solving this problem yields a profitable operating plan comprising the blocking policies, train routing and scheduling, train make-up, and traffic assignment. To describe the various rail activities at the tactical level, we extend the physical network and build a three-layer time-space network structure in which the time dimension captures the temporal impacts on operations, while the operations on trains, blocks, and cars are described by the different layers. Based on this network structure, we model the rail planning problem as a service network design problem. The proposed model is formulated as a mixed-integer mathematical program, which turns out to be very difficult to solve because of the large size of the instances considered and its intrinsic complexity. Three versions are studied: the simplified model (with direct services only), the complete model (with direct and multi-stop services), and a very large-scale complete model. Several heuristics are developed to obtain good solutions within reasonable computing times. First, a special case with direct services is analyzed. Exploiting a specific characteristic of the direct service network design problem, we develop a new tabu search algorithm built on a cycle-based neighborhood, in which the flow on the blocks is redistributed along cycles of the residual network. A slope-scaling algorithm is developed for the complete model, and we propose a new method, called ellipsoidal search, to further improve solution quality. Ellipsoidal search combines the good feasible solutions generated by the slope-scaling algorithm, pooling the characteristics of these good solutions to create an elite problem that is solved exactly with commercial software. The heuristic thus takes advantage of the convergence speed of the slope-scaling algorithm and of the solution quality of ellipsoidal search. Numerical tests illustrate the efficiency of the proposed heuristic; moreover, the algorithm represents an interesting alternative for solving the simplified model. Finally, we study the very large-scale complete model. A hybrid heuristic is developed by integrating the ideas of the algorithm described above with column generation. We propose a new slope-scaling procedure in which, compared with the previous one, only the approximation of the service-related costs is considered. The new slope-scaling approach thus separates the decisions associated with blocks and services, providing a natural decomposition of the problem. The numerical results obtained show that the algorithm is able to identify good-quality solutions in a context aimed at solving real-life instances.
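The slope-scaling idea referred to above approximates each arc's fixed cost by a per-unit surcharge derived from the previous iteration's flow. The Python sketch below shows the generic scheme on a toy fixed-charge network; it is a simplified illustration of the principle, not the thesis's algorithm, which operates on the three-layer time-space network with block and service layers.

```python
import heapq
from collections import defaultdict

def shortest_path(adj, cost, s, t):
    """Dijkstra under the current linearized arc costs; returns the list
    of arcs on a shortest s-t path (assumes t is reachable from s)."""
    dist, prev, pq = {s: 0.0}, {}, [(0.0, s)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue
        for v in adj[u]:
            nd = d + cost[u, v]
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, u = [], t
    while u != s:
        path.append((prev[u], u))
        u = prev[u]
    return path[::-1]

def slope_scaling(arcs, demands, iters=20):
    """Slope-scaling sketch for fixed-charge design: replace each arc's
    fixed cost f by the per-unit surcharge f / (previous flow), reroute
    every demand on a shortest path, and iterate."""
    adj = defaultdict(list)
    for u, v in arcs:
        adj[u].append(v)
    flow = {a: 1.0 for a in arcs}              # warm start avoids f / 0
    for _ in range(iters):
        cost = {a: c + f / flow[a] for a, (c, f) in arcs.items()}
        new_flow = dict.fromkeys(arcs, 0.0)
        for (s, t), qty in demands.items():
            for a in shortest_path(adj, cost, s, t):
                new_flow[a] += qty
        # Keep the previous linearization for arcs that lost all flow.
        flow = {a: new_flow[a] if new_flow[a] > 0 else flow[a] for a in arcs}
    return flow

# Toy network: arcs map (tail, head) -> (variable cost, fixed cost).
arcs = {("O", "A"): (1.0, 50.0), ("A", "D"): (1.0, 50.0),
        ("O", "D"): (3.0, 10.0)}
print(slope_scaling(arcs, {("O", "D"): 20.0}))
```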

Relevance:

20.00%

Publisher:

Abstract:

This paper provides a comparative analysis of corporate law and CSR and asks whether there are lessons for Australia from corporate law and CSR developments in France. It presents a summary of the provisions of the new French Act Number 2010-788, passed on 12 July 2010 and known as "Grenelle 2". Firstly, article 225 of the Grenelle 2 Law changes the Commercial Code to extend the reach of non-financial reporting and to ensure its pertinence. Secondly, article 227 of the Grenelle 2 Law amends certain provisions of the Commercial and Environmental Codes and incorporates into substantive law the liability of parent companies for their subsidiaries. Finally, article 224 of the Grenelle 2 Law reinforces the pressure on the market to act in a responsible manner: it modifies article 214-12 of the Monetary and Financial Code in order to compel institutional investors (mutual funds and fund management companies) to take social, environmental, and governance criteria into account in their investment policy.

Relevance:

20.00%

Publisher:

Abstract:

Controlled choice over public schools attempts to give options to parents while maintaining diversity, often enforced by setting feasibility constraints with hard upper and lower bounds for each student type. We demonstrate that assignments satisfying standard fairness and non-wastefulness properties might not exist, whereas constrained non-wasteful assignments that are fair for students of the same type always exist. We introduce a "controlled" version of the deferred acceptance algorithm with an improvement stage (CDAAI) that finds a Pareto-optimal assignment among such assignments. To achieve assignments that are fair across all types and non-wasteful, we propose interpreting the control constraints as soft bounds: flexible limits that regulate school priorities. In this setting, a modified version of the deferred acceptance algorithm (DAASB) finds an assignment that is Pareto optimal among fair assignments while eliciting true preferences. CDAAI and DAASB provide two alternative practical solutions, depending on the interpretation of the control constraints. JEL C78, D61, D78, I20.
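Both mechanisms build on the student-proposing deferred acceptance algorithm. Here is a minimal Python sketch of that core loop; the controlled-choice machinery that CDAAI and DAASB add (type bounds, the improvement stage, soft-bound priorities) is omitted, and the example data are hypothetical.

```python
def deferred_acceptance(student_prefs, school_priority, capacity):
    """Student-proposing deferred acceptance (Gale-Shapley): students
    apply down their preference lists; each school tentatively holds the
    highest-priority applicants up to capacity and rejects the rest."""
    next_choice = {s: 0 for s in student_prefs}
    held = {c: [] for c in capacity}            # tentative acceptances
    free = list(student_prefs)
    while free:
        s = free.pop()
        prefs = student_prefs[s]
        if next_choice[s] >= len(prefs):
            continue                            # s exhausts their list
        c = prefs[next_choice[s]]
        next_choice[s] += 1
        held[c].append(s)
        held[c].sort(key=school_priority[c].index)
        if len(held[c]) > capacity[c]:
            free.append(held[c].pop())          # reject lowest priority
    return held

students = {"s1": ["A", "B"], "s2": ["A", "B"], "s3": ["A"]}
schools = {"A": ["s3", "s1", "s2"], "B": ["s1", "s2", "s3"]}
print(deferred_acceptance(students, schools, {"A": 1, "B": 2}))
# {'A': ['s3'], 'B': ['s1', 's2']}
```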