906 results for "Lot-sizing and scheduling"


Relevance: 100.00%

Abstract:

The Battery Energy Storage System (BESS) offers formidable advantages in the generation, transmission, distribution and consumption of electrical energy. The technology is notably regarded by several operators around the world as a new device for injecting large amounts of renewable energy on the one hand and, on the other, as an essential component of large power grids. In addition, considerable benefits can be associated with deploying BESS technology both in smart grids and for reducing greenhouse gas emissions, reducing marginal losses, supplying certain consumers with emergency power, improving energy management, and increasing energy efficiency in networks. This thesis comprises three stages: Stage 1 concerns the use of the BESS to reduce electrical losses; Stage 2 uses the BESS as a spinning reserve to mitigate grid vulnerability; and Stage 3 introduces a new method for damping frequency oscillations through reactive power modulation, together with the use of the BESS to provide the primary frequency reserve. The first stage, on using the BESS for loss reduction, is itself divided into two sub-stages, the first devoted to optimal allocation and the second to optimal operation. In the first sub-stage, the NSGA-II (Non-dominated Sorting Genetic Algorithm II) was programmed on CASIR, IREQ's supercomputer, as a multi-objective evolutionary algorithm extracting a set of solutions for the optimal sizing and siting of multiple BESS units, taking the minimisation of power losses and the total installed BESS power capacity as the objective functions. This first sub-stage gives a satisfactory answer to the allocation question and also resolves the scheduling question for the Québec interconnection. For the second sub-stage, a number of the retained solutions were developed and implemented over a one-year interval, taking into account the parameters (time, capacity, efficiency, power factor) associated with the BESS charge and discharge cycles, with the reduction of marginal losses and energy efficiency as the main objectives. In the second stage, a new vulnerability index, well suited to modern grids equipped with BESS units, was introduced, formalised and studied. The NSGA-II algorithm was run again, with the minimisation of the proposed vulnerability index and energy efficiency as the main objectives. The results obtained show that the use of a BESS can, in some cases, prevent major grid outages. The third stage presents a new concept of adding virtual inertia to power grids through reactive power modulation, followed by the use of the BESS as a primary frequency reserve.
Finally, a generic BESS model, tied to the Québec interconnection, was proposed in a MATLAB environment. Simulation results confirm that the active and reactive power of the BESS system can be used for frequency regulation.
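
The first sub-stage relies on NSGA-II's non-dominated sorting to trade off power losses against installed BESS capacity. As a minimal sketch of that Pareto-ranking step, the code below ranks hypothetical (capacity, losses) candidates; the placeholder loss model and all numbers are invented for illustration and are not the thesis's network model.

```python
import random

def dominates(a, b):
    """True if solution a Pareto-dominates b: no worse in all objectives, strictly better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_front(points):
    """Return the first non-dominated front, the key ranking step inside NSGA-II."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Toy population of BESS allocations: both objectives are minimised,
# as in the thesis's bi-objective formulation (losses, installed capacity).
random.seed(1)
population = []
for _ in range(50):
    capacity = random.uniform(10, 200)                               # MW installed (hypothetical)
    losses = 80.0 / (1.0 + 0.02 * capacity) + random.uniform(0, 5)   # placeholder loss model
    population.append((capacity, losses))

for capacity, losses in sorted(non_dominated_front(population)):
    print(f"capacity = {capacity:6.1f} MW, losses = {losses:5.1f} MW")
```

A full NSGA-II additionally partitions the remaining points into successive fronts and applies crowding-distance selection; the front extraction above is the dominance test that drives both.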

Relevance: 100.00%

Abstract:

Following the workshop on new developments in daily licensing practice in November 2011, fourteen representatives from national consortia (from Denmark, Germany, the Netherlands and the UK) and publishers (Elsevier, SAGE and Springer) met in Copenhagen on 9 March 2012 to discuss provisions in licences to accommodate new developments. The one-day workshop aimed to: present background and ideas regarding the provisions the KE Licensing Expert Group developed; introduce and explain the provisions the invited publishers currently use; ascertain agreement on the wording for long-term preservation, continuous access and course packs; give insight and more clarity about the use of open access provisions in licences; discuss a roadmap for inclusion of the provisions in the publishers' licences; and result in a report to disseminate the outcome of the meeting. Participants of the workshop were: United Kingdom: Lorraine Estelle (Jisc Collections); Denmark: Lotte Eivor Jørgensen (DEFF), Lone Madsen (Southern University of Denmark), Anne Sandfær (DEFF/Knowledge Exchange); Germany: Hildegard Schaeffler (Bavarian State Library), Markus Brammer (TIB); The Netherlands: Wilma Mossink (SURF), Nol Verhagen (University of Amsterdam), Marc Dupuis (SURF/Knowledge Exchange); Publishers: Alicia Wise (Elsevier), Yvonne Campfens (Springer), Bettina Goerner (Springer), Leo Walford (Sage); Knowledge Exchange: Keith Russell. The main outcome of the workshop was that it would be valuable to have a standard set of clauses which could be used in negotiations; this would make concluding licences much easier and more efficient. The comments on the model provisions the Licensing Expert Group had drafted will be taken into account and the provisions will be reformulated. Data and text mining is a new development, and demand for access to allow for it is growing. It would be easier if there were a simpler way to access materials so they could be more easily mined. However, there are still outstanding questions on how the authors of articles that have been mined can be properly attributed.

Relevance: 100.00%

Abstract:

Zinc stable isotope measurements by MC-ICP-MS, validated by laboratory intercalibrations, were performed on wild oysters, suspended particles and filtered river/estuarine water samples to provide new constraints for the use of Zn isotopes as environmental tracers. The selected samples were representative of the long-range (400 km) transport of metal (Zn, Cd, etc.) contamination from former Zn-refining activities at Decazeville (i.e. δ66Zn > 1 ‰) and its phasing out, recorded over 30 years in wild oysters from the Gironde Estuary mouth (RNO/ROCCH sample bank). The study also addresses additional anthropogenic sources (urban and viticulture) and focuses on the geochemical reactivity of Zn in the turbidity gradient and the maximum turbidity zone (MTZ) of the fluvial Gironde Estuary. In this area, dissolved Zn showed strong removal onto suspended particulate matter (SPM) and progressive enrichment in heavy isotopes with increasing SPM concentrations, varying from δ66Zn = -0.02 ‰ at 2 mg/L to +0.90 ‰ at 1310 mg/L. These signatures were attributed to kinetically driven adsorption due to strongly increasing sorption sites in the turbidity gradient and MTZ of the estuary. Oysters from the estuary mouth, contaminated sediments from the Lot River and SPM entering the estuary showed parallel historical evolutions (1979-2010) for Zn/Cd ratios but not for δ66Zn values. Oysters had signatures varying from δ66Zn = 1.43 ‰ in 1983 to 1.18 ‰ in 2010 and were offset by δ66Zn = 0.6 - 0.7 ‰ compared with past (1988) and present SPM from the salinity gradient. Isotopic signatures in river-borne particles entering the Gironde Estuary under contrasting freshwater discharge regimes during 2003-2011 showed similar values (δ66Zn ≈ 0.35 ± 0.03 ‰; 1SD, n=15), i.e. they were neither related to former metal-refining activities, at least for the past decade, nor clearly affected by other anthropogenic sources. Therefore, the Zn isotopic signatures in Gironde oysters reflect the geochemical reactivity of Zn in the estuary rather than signatures of past metallurgical contamination in the watershed as recorded in contaminated river sediments. The study also shows that the isotopic composition of Zn is strongly fractionated by its geochemical reactivity in the Gironde Estuary, representative of meso-macrotidal estuarine systems.
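
The partitioning behind such signatures can be reasoned about with a simple closed-system isotope mass balance (a generic textbook relation, not the authors' model): bulk Zn is conserved between the dissolved and particle-adsorbed pools, so δ66Zn_total = f·δ66Zn_ads + (1-f)·δ66Zn_diss. The sketch below evaluates it with hypothetical numbers.

```python
def dissolved_delta(total_delta, f_adsorbed, capital_delta):
    """d66Zn of the dissolved pool (per mil) when a fraction f_adsorbed is sorbed
    to SPM, with capital_delta = d66Zn_ads - d66Zn_diss (positive when particles
    carry the heavy isotopes). Follows from the closed-system mass balance."""
    return total_delta - f_adsorbed * capital_delta

# Hypothetical inputs: bulk Zn at +0.3 per mil, particles 0.5 per mil heavier
# than solution; the dissolved pool gets lighter as more Zn is adsorbed.
for f in (0.1, 0.5, 0.9):
    print(f"f_adsorbed = {f:.1f}: dissolved d66Zn = {dissolved_delta(0.3, f, 0.5):+.2f} per mil")
```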

Relevance: 100.00%

Abstract:

Cadastral map showing lot lines, lot numbers, and block numbers.

Relevance: 100.00%

Abstract:

Cadastral map showing lot lines, lot numbers, and block numbers.

Relevance: 100.00%

Abstract:

In aircraft component maintenance shops, components are distributed amongst repair groups and their respective technicians based on the type of repair, on the technicians' skills and workload, and on the customer required dates. This distribution planning is typically done empirically, based on the group leader's past experience. Such a procedure provides no performance guarantees, frequently leading to undesirable delays in the delivery of aircraft components. Among others, a fundamental challenge faced by the group leaders is deciding how to distribute the components that arrive without customer required dates. This paper addresses the problems of prioritizing randomly arriving aircraft components (with or without pre-assigned customer required dates) and of optimally distributing them amongst the technicians of the repair groups. We propose a formula for prioritizing the list of repairs, pointing out the importance of selecting good estimators for the interarrival times between repair requests, the turn-around times and the man-hours for repair. In addition, a model for the assignment and scheduling problem is designed, and a preliminary algorithm along with a numerical illustration is presented.
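
The paper's actual prioritization formula is not reproduced in the abstract; the sketch below shows one hypothetical shape such a rule could take, scoring repairs from the quantities the abstract names (turn-around time, man-hours, customer required date) and assigning a default horizon to components that arrive without a required date. All field names and weights are assumptions.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class Repair:
    component: str
    man_hours: float                # estimated hours of technician work
    turn_around: timedelta          # estimated shop turn-around time
    due: Optional[datetime] = None  # customer required date, if any

def priority(r: Repair, now: datetime, default_horizon: timedelta = timedelta(days=30)) -> float:
    """Smaller score = more urgent. Repairs without a customer required date get a
    default horizon, one plausible answer to the abstract's open question of how
    to handle components that arrive without required dates."""
    due = r.due if r.due is not None else now + default_horizon
    slack = (due - now) - r.turn_around              # time to spare before the job runs late
    return slack.total_seconds() / 3600.0 - 0.5 * r.man_hours  # hypothetical weighting

now = datetime(2024, 1, 8)
queue = [
    Repair("fuel pump", 12.0, timedelta(days=5), due=now + timedelta(days=7)),
    Repair("actuator", 4.0, timedelta(days=3)),      # no customer required date
]
for r in sorted(queue, key=lambda r: priority(r, now)):
    print(r.component, round(priority(r, now), 1))
```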

Relevance: 100.00%

Abstract:

The ability to accurately predict the lifetime of building components is crucial to optimizing building design, material selection and the scheduling of required maintenance. This paper discusses a number of data mining methods that can be applied to the lifetime prediction of metallic components, and how different sources of service life information could be integrated to form the basis of the lifetime prediction model.
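
As an illustration of one such data mining method (not necessarily the paper's), the sketch below fits a random forest regressor to hypothetical service-life records for metallic components; the features (salinity, humidity, coating thickness), the generating relationship and the data are invented stand-ins for the integrated service-life sources the paper discusses.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Hypothetical records: [airborne salinity index, mean humidity %, coating thickness um]
# and the observed service life (years) of a metallic component.
rng = np.random.default_rng(0)
X = rng.uniform([0, 40, 20], [10, 90, 120], size=(200, 3))
# Placeholder relationship: life shortens with salinity/humidity, lengthens with coating.
y = 40 - 2.0 * X[:, 0] - 0.1 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(0, 2, 200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
print(f"R^2 on held-out data: {model.score(X_test, y_test):.2f}")
print("Predicted life (years):", model.predict([[5.0, 70.0, 60.0]]).round(1))
```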

Relevance: 100.00%

Abstract:

This report presents the current state of product and process modelling in the building industry in general, and in construction planning and scheduling in particular, and describes what has been achieved by the Construction Planning Workbench (CPW) project.

Relevance: 100.00%

Abstract:

As part of an ARC Discovery project to write a history of Australian television from the point of view of audiences, I looked for Australian television fan communities. It transpired that the most productive communities exist around imported programming like the BBC's Doctor Who. This program is an Australian television institution, and I was thus interested in finding out whether it should be included in an audience-centred history of Australian television. Research in archives of fan materials showed that the program has been made distinctively Australian through censorship and scheduling practices. There are uniquely Australian social practices built around it. Also, its very Britishness has become part of its being, in a sense, Australian. Through all of this, there is a clear awareness that this Australian institution originates somewhere else: that for these fans Australia is always secondary, relying on other countries to produce its myths for it, no matter how much it might reshape them.

Relevance: 100.00%

Abstract:

Background: Heavy vehicle transportation continues to grow internationally, yet crash rates are high and the risk of injury and death extends to all road users. The work environment of the heavy vehicle driver poses many challenges; conditions such as scheduling and payment are proposed risk factors for crashes, yet their precise effect remains to be quantified. Other risk factors such as sleep disorders, including obstructive sleep apnoea, have been shown to increase crash risk in motor vehicle drivers; however, the risk of heavy vehicle crashes arising from these and related health conditions needs detailed investigation. Methods and design: The proposed case-control study will recruit 1034 long-distance heavy vehicle drivers: 517 who have crashed and 517 who have not. All participants will be interviewed at length regarding their driving and crash history, typical workloads, scheduling and payment, trip history over several days, sleep patterns, health, and substance use. All participants will be administered a nasal flow monitor for the detection of obstructive sleep apnoea. Discussion: Significant attention has been paid to the enforcement of legislation aiming to deter problems such as excess loading, speeding and substance use; however, there is inconclusive evidence as to the direction and strength of association of many other postulated risk factors for heavy vehicle crashes. The influence of factors such as remuneration and scheduling on crash risk is unclear, as is the association between sleep apnoea and the risk of a heavy vehicle driver crashing. Contributory factors such as sleep quality and quantity, body mass and health status will be investigated. Quantifying the effect of these factors on the heavy vehicle driver will inform policy development aimed at safer driving practices and a reduction in heavy vehicle crashes, protecting the lives of many on the road network.
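
Case-control designs like this one typically quantify effects as odds ratios. Purely as a worked illustration of that calculation, the sketch below computes an odds ratio with a Woolf 95% confidence interval from a 2x2 table; the counts are hypothetical, not study results.

```python
import math

def odds_ratio_ci(exposed_cases, unexposed_cases, exposed_controls, unexposed_controls):
    """Odds ratio with Woolf 95% confidence interval from a 2x2 case-control table."""
    a, b, c, d = exposed_cases, unexposed_cases, exposed_controls, unexposed_controls
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, lo, hi

# Hypothetical counts: sleep apnoea among 517 crashed and 517 non-crashed drivers.
or_, lo, hi = odds_ratio_ci(120, 397, 80, 437)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```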

Relevance: 100.00%

Abstract:

This paper presents a method for calculating the in-bucket payload volume on a dragline for the purpose of estimating the material's bulk density in real time. Knowledge of the bulk density can provide instant feedback to mine planning and scheduling to improve blasting, and in turn provide a more uniform bulk density across the excavation site. Furthermore, costs and emissions in dragline operation, maintenance and downstream material processing can be reduced. The main challenge is to determine an accurate position and orientation of the bucket under the constraint of real-time performance. The proposed solution uses a range, bearing and tilt sensor to locate and scan the bucket between the lift and dump stages of the dragline cycle. Various scanning strategies are investigated for their benefits in this real-time application. The bucket is segmented from the scene using cluster analysis, while the pose of the bucket is calculated using the iterative closest point (ICP) algorithm. Payload points are segmented from the bucket by a fixed-distance neighbour clustering method to preserve boundary points and exclude low-density clusters introduced by overhead chains and the spreader bar. A height grid is then used to represent the payload, from which the volume can be calculated by summing over the grid cells. We show volumes calculated on a scaled system with an accuracy greater than 95 per cent.
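
As a minimal sketch of the final volume step described above, the code below bins payload points (assumed already segmented and expressed in the bucket frame) into a height grid and sums height times cell area over the cells; the grid resolution and the synthetic point cloud are assumptions, and the upstream segmentation and ICP pose estimation are omitted.

```python
import numpy as np

def payload_volume(points, cell=0.1):
    """Estimate the volume (m^3) of a payload point cloud: bin points into an XY
    grid, keep the maximum height per cell, and sum height * cell area."""
    xy = np.floor(points[:, :2] / cell).astype(int)
    xy -= xy.min(axis=0)                                      # shift indices to start at zero
    grid = np.zeros(xy.max(axis=0) + 1)
    np.maximum.at(grid, (xy[:, 0], xy[:, 1]), points[:, 2])   # max height per cell
    return grid.sum() * cell * cell

# Synthetic half-ellipsoid heap, roughly bucket-shaped (hypothetical data).
rng = np.random.default_rng(0)
pts = rng.uniform([-1, -0.6, 0], [1, 0.6, 0], size=(5000, 3))
pts[:, 2] = np.clip(1 - (pts[:, 0] ** 2 + (pts[:, 1] / 0.6) ** 2), 0, None)
print(f"estimated volume: {payload_volume(pts):.2f} m^3")
```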

Relevance: 100.00%

Abstract:

This paper presents a method for measuring the in-bucket payload volume on a dragline excavator for the purpose of estimating the material's bulk density in real time. Knowledge of the payload's bulk density can provide feedback to mine planning and scheduling to improve blasting and therefore provide a more uniform bulk density across the excavation site. This allows a single optimal bucket size to be used for maximum overburden removal per dig, in turn reducing costs and emissions in dragline operation and maintenance. The proposed solution uses a range and bearing laser to locate and scan full buckets between the lift and dump stages of the dragline cycle. The bucket is segmented from the scene using cluster analysis, and the pose of the bucket is calculated using the Iterative Closest Point (ICP) algorithm. Payload points are identified using a known model and subsequently converted into a height grid for volume estimation. Results from both scaled and full-scale implementations show that this method can achieve an accuracy above 95%.
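
Both of these papers rest on ICP for bucket pose estimation. The sketch below is a textbook point-to-point ICP (nearest-neighbour correspondences plus an SVD/Kabsch rigid alignment), shown as an illustration rather than the authors' implementation; the toy bucket model is random data.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp(source, target, iterations=20):
    """Point-to-point ICP: return rotation R and translation t aligning source to target."""
    R, t = np.eye(3), np.zeros(3)
    src = source.copy()
    tree = cKDTree(target)
    for _ in range(iterations):
        # 1. Nearest-neighbour correspondences.
        _, idx = tree.query(src)
        matched = target[idx]
        # 2. Optimal rigid transform for these correspondences (Kabsch/SVD).
        mu_s, mu_t = src.mean(axis=0), matched.mean(axis=0)
        H = (src - mu_s).T @ (matched - mu_t)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
        R_step = Vt.T @ D @ U.T
        t_step = mu_t - R_step @ mu_s
        # 3. Apply the step and accumulate the total transform.
        src = src @ R_step.T + t_step
        R, t = R_step @ R, R_step @ t + t_step
    return R, t

# Toy check: recover a known rotation/translation of a random cloud (hypothetical bucket model).
rng = np.random.default_rng(0)
model = rng.uniform(-1, 1, size=(300, 3))
angle = np.radians(10)
Rz = np.array([[np.cos(angle), -np.sin(angle), 0],
               [np.sin(angle),  np.cos(angle), 0],
               [0, 0, 1]])
R, t = icp(model @ Rz.T + [0.2, 0.1, 0.0], model)
print("recovered rotation:\n", R.round(3))
```

In practice the real-time constraint mentioned in both abstracts is what drives the choice of scanning strategy and the number of ICP iterations that can be afforded per dragline cycle.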

Relevance: 100.00%

Abstract:

We suspect that the array of silly names used to refer to temporary staff worldwide may be indicative of the extent to which these nurses have been relegated to, and, we would argue, remain in, a type of underclass: relatively unsupported by employers in terms of professional practice and ipso facto excluded from contributing professionally to teamwork, practice development, clinical governance and evidence-based practice. This may be acceptable to some, but in a climate of risk aversion, and in the interests of strategic planning, we would suggest it is an accident waiting to happen. The recent UK Royal College of Nursing (RCN) survey of bank and agency nurses (Ball & Pike, 2006) brings a welcome focus to a group of nurses that makes a significant contribution to the smooth running of health services in many countries.

Relevance: 100.00%

Abstract:

Web service technology is increasingly being used to build various e-Applications, in domains such as e-Business and e-Science. Characteristic benefits of web service technology are its interoperability, decoupling and just-in-time integration. Using web service technology, an e-Application can be implemented by web service composition: composing existing individual web services in accordance with the business process of the application. This means the application is provided to customers in the form of a value-added composite web service. An important and challenging issue of web service composition is how to meet Quality-of-Service (QoS) requirements. These include customer-focused attributes such as response time, price, throughput and reliability, as well as how best to deliver QoS for the composite, which in turn best fulfils customers' expectations and achieves their satisfaction. Fulfilling these QoS requirements, i.e. addressing the QoS-aware web service composition problem, is the focus of this project.

From a computational point of view, QoS-aware web service composition can be transformed into diverse optimisation problems. These problems are complex, large-scale, highly constrained and multi-objective. We therefore use genetic algorithms (GAs) to address QoS-based service composition problems. More precisely, this study addresses three important subproblems of QoS-aware web service composition: QoS-based web service selection for a composite web service accommodating constraints on inter-service dependence and conflict; QoS-based resource allocation and scheduling for multiple composite services on hybrid clouds; and performance-driven composite service partitioning for decentralised execution. Based on operations research theory, we model the three problems as a constrained optimisation problem, a resource allocation and scheduling problem, and a graph partitioning problem, respectively. We then present novel GAs to address these problems, conduct experiments to evaluate their performance, and perform verification experiments to show their correctness.

The major outcomes from the first problem are three novel GAs: a penalty-based GA, a min-conflict hill-climbing repairing GA, and a hybrid GA. These GAs adopt different strategies to handle constraints on inter-service dependence and conflict, an important factor that has been largely ignored by existing algorithms and that can lead to the generation of infeasible composite services. Experimental results demonstrate the effectiveness of our GAs for the QoS-based web service selection problem with constraints on inter-service dependence and conflict, as well as their better scalability than the existing integer programming-based method for large-scale web service selection problems.

The second problem has resulted in two GAs: a random-key GA and a cooperative coevolutionary GA (CCGA). Experiments demonstrate the good scalability of the two algorithms. In particular, the CCGA scales well as the number of composite services involved in a problem increases, an ability no other algorithm demonstrates.

The third problem results in a novel GA for composite service partitioning for decentralised execution. Compared with existing heuristic algorithms, the new GA is more suitable for large-scale composite web service program partitioning problems, and it outperforms them by generating a better deployment topology for a composite web service for decentralised execution. These effective and scalable GAs can be integrated into QoS-based management tools to facilitate the delivery of feasible, reliable and high-quality composite web services.
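
As an illustration of the penalty-based constraint handling described for the first subproblem, the sketch below scores candidate service selections by an aggregated QoS cost plus a penalty per violated dependence or conflict constraint. The candidate services, QoS numbers and penalty weight are hypothetical, and for brevity the selections are enumerated exhaustively rather than evolved by a GA; in the thesis this kind of objective would drive the GA's selection pressure.

```python
import itertools

# Hypothetical candidates per abstract task: (name, response_time_ms, price).
candidates = {
    "pay":  [("payA", 120, 0.05), ("payB", 80, 0.09)],
    "ship": [("shipA", 200, 0.02), ("shipB", 150, 0.04)],
}
dependencies = [("payB", "shipB")]   # selecting payB requires shipB
conflicts = [("payA", "shipB")]      # payA and shipB cannot co-occur

def fitness(selection, penalty=1000.0):
    """Penalty-based objective: aggregated QoS cost plus a fixed penalty for each
    violated inter-service dependence or conflict constraint (lower is better)."""
    names = {s[0] for s in selection}
    cost = sum(rt + 1000 * price for _, rt, price in selection)  # weighted QoS aggregate
    violations = sum(1 for a, b in dependencies if a in names and b not in names)
    violations += sum(1 for a, b in conflicts if a in names and b in names)
    return cost + penalty * violations

best = min(itertools.product(*candidates.values()), key=fitness)
print([s[0] for s in best], fitness(best))
```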

Relevance: 100.00%

Abstract:

The increasing stock of aging office buildings will see significant growth in retrofitting projects in Australian capital cities. Stakeholders in refitting works will also need to take on the sustainability challenge and realize tangible outcomes through project delivery. Traditionally, decision making for aged buildings facing such alternatives has been economically driven and ad hoc. This leads to a tendency either to delay refitting for as long as possible, causing building conditions to deteriorate, or simply to demolish and rebuild at an unjustified financial burden. The technologies involved are often limited to a typical strip-clean and repartition with dry walls and office cubicles. Changing business operational patterns, the efficiency of office space, and the demand for an improved workplace environment will require more innovative and intelligent approaches to refurbishing office buildings. For example, such projects may need to respond to political, social, environmental and financial implications. There is a need for the total consideration of building structural assessment; modelling of operating and maintenance costs; new architectural and engineering designs that maximise the utility of the existing structure and the resulting productivity improvement; and specific construction management procedures, including procurement methods, work flow and scheduling, and occupational health and safety. Recycling potential and conformance to codes may be other major issues. This paper introduces examples of Australian research projects which provide a more holistic approach to the decision making of refurbishing office space, using appropriate building technologies and products, assessment of residual service life, floor space optimisation and project procurement in order to bring about sustainable outcomes. The paper also discusses a specific case study on the critical factors that influence key building components for these projects, and on issues for integrated decision support when dealing with the refurbishment, and indeed the “re-life”, of office buildings.