957 results for load-balancing scheduling
Abstract:
Fire design is an essential part of the overall design procedure of structural steel members and systems. Conventionally, increased fire rating is provided simply by adding more plasterboards to light gauge steel frame (LSF) stud walls, which is inefficient. However, Kolarkar & Mahendran (2008) recently developed a new composite wall panel system in which the insulation was located externally between the plasterboards on both sides of the steel wall frame. Numerical and experimental studies were undertaken to investigate the structural and fire performance of LSF walls using the new composite panels under axial compression. This paper presents the details and results of the numerical studies of the new LSF walls. It also includes brief details of the experimental studies. Experimental and numerical results were compared to validate the developed numerical model. The paper also describes the structural and fire performance of the new LSF wall system in comparison to traditional wall systems using cavity insulation.
Abstract:
The objective of this study was to evaluate the feasibility and potential of a hybrid scaffold system for the repair of large, high-load-bearing osteochondral defects. The implants were made of medical-grade PCL (mPCL) for the bone compartment, whereas fibrin glue was used for the cartilage part. Both matrices were seeded with allogenic bone marrow-derived mesenchymal cells (BMSC) and implanted in the defect (4 mm diameter × 5 mm depth) on the medial femoral condyle of adult New Zealand White rabbits. Empty scaffolds were used on the control side. Cell survival was tracked via fluorescent labeling. The regeneration process was evaluated by several techniques at 3 and 6 months post-implantation. Mature trabecular bone regularly formed in the mPCL scaffold at both 3 and 6 months post-operation. Micro-computed tomography showed progression of mineralization from the host–tissue interface towards the inner region of the grafts. At the 3-month time point, the specimens showed good cartilage repair. In contrast, the majority of 6-month specimens revealed poor remodeling and fissured integration with the host cartilage, while other samples maintained a good cartilage appearance. In vivo viability of the transplanted cells was demonstrated for a duration of 5 weeks. The results demonstrated that the mPCL scaffold is a potential matrix for osteochondral bone regeneration and that fibrin glue does not possess the physical properties required for cartilage regeneration in a large, high-load-bearing defect site. Keywords: Osteochondral tissue engineering; Scaffold; Bone marrow-derived precursor cells; Fibrin glue
Abstract:
There has been considerable research conducted over the last 20 years focused on predicting motor vehicle crashes on transportation facilities. The range of statistical models commonly applied includes binomial, Poisson, Poisson-gamma (or negative binomial), zero-inflated Poisson and negative binomial models (ZIP and ZINB), and multinomial probability models. Given the range of possible modeling approaches and the host of assumptions with each modeling approach, making an intelligent choice for modeling motor vehicle crash data is difficult. There is little discussion in the literature comparing different statistical modeling approaches, identifying which statistical models are most appropriate for modeling crash data, and providing a strong justification from basic crash principles. In the recent literature, it has been suggested that the motor vehicle crash process can successfully be modeled by assuming a dual-state data-generating process, which implies that entities (e.g., intersections, road segments, pedestrian crossings, etc.) exist in one of two states: perfectly safe and unsafe. As a result, the ZIP and ZINB are two models that have been applied to account for the preponderance of “excess” zeros frequently observed in crash count data. The objective of this study is to provide defensible guidance on how to appropriately model crash data. We first examine the motor vehicle crash process using theoretical principles and a basic understanding of the crash process. It is shown that the fundamental crash process follows a Bernoulli trial with unequal probability of independent events, also known as Poisson trials. We examine the evolution of statistical models as they apply to the motor vehicle crash process, and indicate how well they statistically approximate the crash process. We also present the theory behind dual-state process count models, and note why they have become popular for modeling crash data.
A simulation experiment is then conducted to demonstrate how crash data give rise to the “excess” zeros frequently observed in crash data. It is shown that the Poisson and other mixed probabilistic structures are approximations assumed for modeling the motor vehicle crash process. Furthermore, it is demonstrated that under certain (fairly common) circumstances excess zeros are observed, and that these circumstances arise from low exposure and/or inappropriate selection of time/space scales and not an underlying dual-state process. In conclusion, carefully selecting the time/space scales for analysis, including an improved set of explanatory variables and/or unobserved heterogeneity effects in count regression models, or applying small-area statistical methods (observations with low exposure) represent the most defensible modeling approaches for datasets with a preponderance of zeros.
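The abstract's simulation argument can be reproduced in miniature. The sketch below is not the paper's experiment; site counts, exposures and per-trial probabilities are illustrative assumptions. It draws crash counts as Poisson trials (independent Bernoulli events with small, unequal probabilities) across sites with widely varying exposure, then compares the observed fraction of zeros with the zeros predicted by a single Poisson fitted to the pooled mean:

```python
import math
import random

random.seed(42)

def site_count(p_list):
    # One site's crash count: a sum of independent Bernoulli trials with
    # unequal probabilities ("Poisson trials"), as in the abstract.
    return sum(1 for p in p_list if random.random() < p)

counts = []
for _ in range(3000):
    # Heterogeneous exposure: most sites see few trials (low exposure).
    exposure = random.choice([20, 50, 1000])
    p_list = [random.uniform(0.0002, 0.004) for _ in range(exposure)]
    counts.append(site_count(p_list))

mean = sum(counts) / len(counts)
observed_zero_frac = counts.count(0) / len(counts)
poisson_zero_frac = math.exp(-mean)  # P(0) under one Poisson at the pooled mean

print(f"mean count: {mean:.2f}")
print(f"observed P(0): {observed_zero_frac:.3f} vs Poisson-fit P(0): {poisson_zero_frac:.3f}")
```

With many low-exposure sites, the observed zero fraction exceeds the single-Poisson prediction, i.e. "excess" zeros appear without any dual-state process.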
Abstract:
System analysis within the traction power system is vital to the design and operation of an electrified railway. Loads in traction power systems are often characterised by their mobility, wide range of power variations, regeneration and service dependence. In addition, the feeding systems may take different forms in AC electrified railways. Comprehensive system studies are usually carried out by computer simulation. A number of traction power simulators are available; they allow calculation of the electrical interaction among trains and deterministic solutions of the power network. In this paper, a different approach is presented that enables load-flow analysis of various feeding systems and service demands in AC railways by adopting probabilistic techniques. It is intended to provide a different viewpoint on the load condition. Simulation results are given to verify the probabilistic-load-flow models.
Abstract:
In rural low-voltage networks, distribution lines are usually highly resistive. When many distributed generators are connected to such lines, power sharing among them is difficult using conventional droop control, as the real and reactive power are strongly coupled with each other. A high droop gain can alleviate this problem but may drive the system to instability. To overcome this, two droop control methods are proposed for accurate load sharing with a frequency droop controller. The first method requires no communication among the distributed generators and regulates the output voltage and frequency to ensure acceptable load sharing. For this purpose, the droop equations are modified with a transformation matrix based on the line R/X ratio. The second proposed method, with minimal low-bandwidth communication, modifies the reference frequency of the distributed generators based on the active and reactive power flow in the lines connected to the points of common coupling. The performance of these two proposed controllers is compared, through time-domain simulation of a test system, with that of a controller that relies on an expensive high-bandwidth communication system. The magnitudes of the power-sharing errors under the three droop control schemes are evaluated and tabulated.
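The first method's decoupling idea can be sketched as follows. The paper's exact transformation matrix is not reproduced here; the sketch assumes a common orthogonal transformation based on the line impedance angle arctan(X/R), with illustrative droop gains and nominal values:

```python
import math

def transformed_droop(P, Q, R, X, f0=50.0, V0=230.0, m=1e-4, n=1e-3):
    # Orthogonal transformation based on the line impedance angle.
    # For a highly resistive line (R >> X), theta is small, so the
    # transformed "frequency-controlling" power is dominated by -Q
    # and the "voltage-controlling" power by P, restoring decoupling.
    theta = math.atan2(X, R)
    Pt = P * math.sin(theta) - Q * math.cos(theta)
    Qt = P * math.cos(theta) + Q * math.sin(theta)
    # Conventional droop laws applied to the transformed powers
    f = f0 - m * Pt
    V = V0 - n * Qt
    return f, V

# Illustrative operating point on a resistive rural feeder
f, V = transformed_droop(P=5000.0, Q=1000.0, R=0.6, X=0.1)
print(f"f = {f:.3f} Hz, V = {V:.1f} V")
```

For an inductive line (X >> R) the transformation reduces to the conventional P-f / Q-V droop, which is the design intent of such matrices.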
Abstract:
A schedule coordination problem involving two train services provided by different operators is modeled as an optimization of revenue intake. Coordination is achieved through negotiated adjustment of the commencement times of the train services. The problem is subject to constraints on passenger demands and the idle costs of rolling stock from both operators. This paper models the operators as software agents with the flexibility to adopt one of two (and potentially more) proposed negotiation strategies. Empirical results show that the combination of strategies the agents employ has a significant impact on solution quality and negotiation time.
Abstract:
With the recent regulatory reforms in a number of countries, railway resources are no longer managed by a single party but are distributed among different stakeholders. To facilitate the operation of train services, a train service provider (SP) has to negotiate with the infrastructure provider (IP) for a train schedule and the associated track access charge. This paper models the SP and IP as software agents and the negotiation as a prioritized fuzzy constraint satisfaction (PFCS) problem. Computer simulations have been conducted to demonstrate the effects on the train schedule when the SP has different optimization criteria. The results show that by assigning different priorities to the fuzzy constraints, agents can represent SPs with different operational objectives.
Abstract:
Probabilistic load flow techniques have been adopted in AC electrified railways to study the load demand under various train service conditions. This paper highlights the differences in probabilistic load flow analysis between the usual power systems and power supply systems in AC railways; discusses the possible difficulties in problem formulation and presents the link between train movement and the corresponding power demand for load flow calculation.
Abstract:
Power load flow analysis is essential for system planning, operation, development and maintenance, and its application to railway supply systems is no exception. Railway power supply systems are distinctive in their load pattern and mobility, as well as their feeding system structure. An attempt has been made to apply probabilistic load flow (PLF) techniques to electrified railways in order to examine the loading on the feeding substations and the voltage profiles of the trains. This study formulates a simple and reliable model to support the calculations required for probabilistic load flow analysis in railway systems with an autotransformer (AT) feeding system, and describes the development of a software suite to realise the computation.
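A minimal Monte Carlo flavour of such a probabilistic load-flow study can be sketched as follows. All numbers (train counts, operating-mode probabilities, power levels) are illustrative assumptions rather than values from the study, and a full PLF would solve the AT feeding network for each sample instead of simply summing demands:

```python
import random
import statistics

random.seed(1)

def substation_load_sample(n_trains_max=6, p_accel=0.4):
    # One snapshot of a feeding section: a random number of trains,
    # each in a random operating mode with an assumed power demand (MW).
    n = random.randint(0, n_trains_max)
    total = 0.0
    for _ in range(n):
        r = random.random()
        if r < p_accel:          # accelerating: heavy demand
            total += random.uniform(3.0, 6.0)
        elif r < 0.8:            # cruising: moderate demand
            total += random.uniform(0.5, 2.0)
        # else coasting/regenerating: treated as zero net demand here
    return total

samples = [substation_load_sample() for _ in range(20000)]
mean_mw = statistics.mean(samples)
p95_mw = sorted(samples)[int(0.95 * len(samples))]
print(f"mean substation load {mean_mw:.2f} MW, 95th percentile {p95_mw:.2f} MW")
```

The output distribution, rather than a single deterministic figure, is what lets the planner size substations against service variability.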
Abstract:
Emotions play a central role in mediation as they help to define the scope and direction of a conflict. When a party to mediation expresses (and hence entrusts) their emotions to those present in a mediation, a mediator must do more than simply listen - they must attend to these emotions. Mediator empathy is an essential skill for communicating to a party that their feelings have been heard and understood, but it can lead mediators into trouble. Whilst there might exist a theoretical divide between the notions of empathy and sympathy, the very best characteristics of mediators (caring and compassionate nature) may see empathy and sympathy merge - resulting in challenges to mediator neutrality. This article first outlines the semantic difference between empathy and sympathy and the role that intrapsychic conflict can play in the convergence of these behavioural phenomena. It then defines emotional intelligence in the context of a mediation, suggesting that only the most emotionally intelligent mediators are able to emotionally connect with the parties, but maintain an impression of impartiality – the quality of remaining ‘attached yet detached’ to the process. It is argued that these emotionally intelligent mediators have the common qualities of strong self-awareness and emotional self-regulation.
Abstract:
A Flat Bed Rail Wagon (FBRW) has been proposed as an alternative solution for replacing bridges on low traffic volume roads. This paper investigates impediments to load transfer from the cross girders to the main girder arising from visually identifiable structural flaws. Specifically, the effect of large openings in close proximity to the connection between the main girder and a cross girder of an FBRW was examined. It is clear that openings locally reduce the section modulus of the secondary members; however, it was unclear how these reductions would affect load transfer to the main girder. The results are presented through grillage modelling, in which the loads applied to the FBRW were distributed through the cross girders to the main girder.
Abstract:
This paper proposes a novel peak load management scheme for rural areas. The scheme transfers certain customers onto local nonembedded generators during peak load periods to alleviate network under voltage problems. This paper develops and presents this system by way of a case study in Central Queensland, Australia. A methodology is presented for determining the best location for the nonembedded generators as well as the number of generators required to alleviate network problems. A control algorithm to transfer and reconnect customers is developed to ensure that the network voltage profile remains within specification under all plausible load conditions. Finally, simulations are presented to show the performance of the system over a typical maximum daily load profile with large stochastic load variations.
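The transfer/reconnect logic described above can be sketched as a simple hysteresis rule on the local voltage. The thresholds below are illustrative assumptions, not the paper's settings; the gap between them prevents customers from oscillating between the network and the generator:

```python
def control_step(voltage_pu, on_generator, v_low=0.94, v_restore=0.98):
    # Hysteresis rule: transfer customers to the local non-embedded
    # generator on undervoltage; reconnect to the network only once
    # the voltage has recovered past a higher restore level.
    if not on_generator and voltage_pu < v_low:
        return True      # transfer to generator
    if on_generator and voltage_pu > v_restore:
        return False     # reconnect to network
    return on_generator  # otherwise hold the current state

# Walk through a sagging-then-recovering voltage profile (per unit)
states = []
s = False
for v in [1.0, 0.95, 0.93, 0.95, 0.97, 0.99, 1.0]:
    s = control_step(v, s)
    states.append(s)
print(states)  # → [False, False, True, True, True, False, False]
```

Note how the customer stays on the generator at 0.95 and 0.97 pu: recovery alone is not enough until the restore threshold is crossed.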
Abstract:
In cloud computing, the resource allocation and scheduling of multiple composite web services is an important challenge. This is especially so in a hybrid cloud, where some free resources may be available from private clouds alongside fee-paying resources from public clouds. Meeting this challenge involves two classical computational problems. One is assigning resources to each of the tasks in the composite web service. The other is scheduling the allocated resources when each resource may be used by more than one task and may be needed at different points in time. In addition, Quality-of-Service issues such as execution time and running cost must be considered. Existing approaches to resource allocation and scheduling in public clouds and grid computing are not applicable to this new problem. This paper presents a random-key genetic algorithm that solves this new resource allocation and scheduling problem. Experimental results demonstrate the effectiveness and scalability of the algorithm.
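A random-key genetic algorithm of the kind the abstract names can be sketched as follows. The task set, decoding rule and GA parameters are illustrative assumptions; the paper's algorithm additionally handles composite-service structure and QoS costs. The defining trait is that each chromosome is a vector of continuous keys in [0, 1) whose sorted order decodes into a scheduling priority, so standard real-valued crossover and mutation always yield feasible schedules:

```python
import random

random.seed(7)

TASKS = [3, 2, 7, 1, 5, 4, 6, 2]   # illustrative task durations
N_RESOURCES = 3

def decode(keys):
    # Random-key decoding: sort tasks by key to get a priority order,
    # then greedily place each task on the earliest-available resource.
    order = sorted(range(len(TASKS)), key=lambda i: keys[i])
    finish = [0.0] * N_RESOURCES
    for i in order:
        r = finish.index(min(finish))
        finish[r] += TASKS[i]
    return max(finish)              # makespan: lower is better

def evolve(pop_size=30, generations=60, elite=5, mut=0.1):
    pop = [[random.random() for _ in TASKS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=decode)
        nxt = pop[:elite]                       # keep the elite unchanged
        while len(nxt) < pop_size:
            a, b = random.sample(pop[:15], 2)   # parents from the better half
            child = [x if random.random() < 0.5 else y for x, y in zip(a, b)]
            if random.random() < mut:
                child[random.randrange(len(child))] = random.random()
            nxt.append(child)
        pop = nxt
    pop.sort(key=decode)
    return decode(pop[0])

best = evolve()
print("best makespan:", best)
```

With total work 30 over 3 resources, the makespan lower bound here is 10, which the sketch approaches on this tiny instance.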