970 results for scheduling techniques, optimise rail operations


Relevance:

20.00%

Publisher:

Abstract:

There are 23,500 level crossings in Australia. In these environments it is important to understand which human factors issues are present and how road users and pedestrians engage with crossings. A series of on-site observations was performed over a 2-day period at a 3-track active crossing. This was followed by 52 interviews with local business owners and members of the public. Data were captured using a manual coding scheme for recording and categorising violations. Over 700 separate road user and pedestrian violations were recorded, with representations in multiple categories. Time stamping revealed that the crossing was active for 59% of the time in some morning periods. Further, trains could take up to 4 minutes to arrive following the crossing's first activation. Many pedestrians jaywalked under side rails and around active boom gates. In numerous cases pedestrians put themselves at risk in order to beat or catch the approaching train, ignoring signs to stop walking while the lights were flashing. Analysis of interview data identified themes associated with congestion, safety, and violations. This work offers insight into context-specific issues associated with active level crossing protection.

Relevance:

20.00%

Publisher:

Abstract:

The problem of scheduling divisible loads in distributed computing systems in the presence of processor release times is considered. The objective is to find the optimal sequence of load distribution and the optimal load fractions assigned to each processor in the system such that the processing time of the entire load is minimized. This is a difficult combinatorial optimization problem, and hence a genetic algorithm approach is presented for its solution.
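
As a rough illustration of the approach described in this abstract, the sketch below runs a simple genetic algorithm over candidate distribution sequences and load fractions under a simplified single-round divisible-load model with processor release times. The processor parameters (SPEEDS, LINKS, RELEASE, TOTAL_LOAD) and the GA settings are hypothetical placeholders, not values from the paper.

```python
import random

# Illustrative processor data (hypothetical values): computation speed,
# link speed to the master, and release time for each processor.
SPEEDS  = [1.0, 0.8, 1.2, 0.6]    # work units processed per second
LINKS   = [2.0, 1.5, 2.5, 1.0]    # work units transmitted per second
RELEASE = [0.0, 3.0, 1.0, 2.0]    # time at which the processor becomes free
TOTAL_LOAD = 100.0                # total divisible load

def makespan(order, fractions):
    """Finish time of the whole load for a given distribution sequence and
    load fractions (simplified single-round divisible-load model)."""
    t_comm = 0.0                  # master sends to processors one after another
    finish = 0.0
    for idx, p in enumerate(order):
        load = fractions[idx] * TOTAL_LOAD
        t_comm += load / LINKS[p]             # transmission to processor p completes
        start = max(t_comm, RELEASE[p])       # wait for the processor's release time
        finish = max(finish, start + load / SPEEDS[p])
    return finish

def random_individual(n):
    order = random.sample(range(n), n)        # random distribution sequence
    raw = [random.random() for _ in range(n)]
    s = sum(raw)
    return order, [r / s for r in raw]        # fractions normalised to sum to 1

def crossover(a, b):
    """Order crossover on the sequence, arithmetic blend on the fractions."""
    n = len(a[0])
    cut = random.randint(1, n - 1)
    head = a[0][:cut]
    child_order = head + [p for p in b[0] if p not in head]
    child_frac = [(x + y) / 2 for x, y in zip(a[1], b[1])]
    s = sum(child_frac)
    return child_order, [f / s for f in child_frac]

def mutate(ind, rate=0.2):
    order, frac = ind[0][:], ind[1][:]
    if random.random() < rate:                # swap two positions in the sequence
        i, j = random.sample(range(len(order)), 2)
        order[i], order[j] = order[j], order[i]
    if random.random() < rate:                # perturb one fraction and renormalise
        k = random.randrange(len(frac))
        frac[k] = max(1e-6, frac[k] + random.uniform(-0.1, 0.1))
        s = sum(frac)
        frac = [f / s for f in frac]
    return order, frac

def genetic_search(pop_size=60, generations=200):
    n = len(SPEEDS)
    pop = [random_individual(n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: makespan(*ind))
        survivors = pop[: pop_size // 2]      # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            children.append(mutate(crossover(a, b)))
        pop = survivors + children
    best = min(pop, key=lambda ind: makespan(*ind))
    return best, makespan(*best)

if __name__ == "__main__":
    (order, fractions), t = genetic_search()
    print("sequence:", order, "fractions:", [round(f, 3) for f in fractions])
    print("processing time:", round(t, 3))
```

Each chromosome pairs a permutation (the distribution sequence) with a normalised fraction vector, so the crossover and mutation operators must preserve both the permutation property and the unit-sum constraint.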

Relevance:

20.00%

Publisher:

Abstract:

A key trait of Free and Open Source Software (FOSS) development is its distributed nature. Nevertheless, two project-level operations, the fork and the merge of program code, are among the least well understood events in the lifespan of a FOSS project. Some projects have explicitly adopted these operations as the primary means of concurrent development. In this study, we examine the effect of highly distributed software development, as found in the Linux kernel project, on the collection and modelling of software development data. We find that distributed development calls for sophisticated temporal modelling techniques where several versions of the source code tree can exist at once. Attention must be turned towards the methods of quality assurance and peer review that projects employ to manage these parallel source trees. Our analysis indicates that two new metrics, fork rate and merge rate, could be useful for determining the role of distributed version control systems in FOSS projects. The study presents a preliminary data set consisting of version control and mailing list data.
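
A minimal sketch of how fork and merge rates might be extracted from a local git history is shown below. The per-commit definitions used here (a merge is a commit with more than one parent, a fork is a commit that appears as the parent of more than one commit) are illustrative assumptions, not necessarily the definitions used in the study.

```python
import subprocess
from collections import Counter

def fork_and_merge_rates(repo_path):
    """Rough per-commit fork and merge rates for a local git repository."""
    # `git rev-list --all --parents` prints each commit hash followed by
    # the hashes of its parents, one commit per line.
    out = subprocess.run(
        ["git", "-C", repo_path, "rev-list", "--all", "--parents"],
        capture_output=True, text=True, check=True,
    ).stdout.splitlines()

    child_count = Counter()   # how many commits list this hash as a parent
    merges = 0
    total = 0
    for line in out:
        hashes = line.split()
        total += 1
        parents = hashes[1:]
        if len(parents) > 1:  # more than one parent: merge commit
            merges += 1
        for p in parents:
            child_count[p] += 1

    forks = sum(1 for c in child_count.values() if c > 1)  # branch points
    return forks / total, merges / total

if __name__ == "__main__":
    fork_rate, merge_rate = fork_and_merge_rates(".")
    print(f"fork rate:  {fork_rate:.4f} per commit")
    print(f"merge rate: {merge_rate:.4f} per commit")
```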

Relevance:

20.00%

Publisher:

Abstract:

Close to one half of the LHC events are expected to be due to elastic or inelastic diffractive scattering. Still, predictions based on extrapolations of experimental data at lower energies differ by large factors in estimating the relative rate of diffractive event categories at the LHC energies. By identifying diffractive events, detailed studies on proton structure can be carried out. The combined forward physics objects (rapidity gaps, forward multiplicity, and transverse energy flows) can be used to efficiently classify proton-proton collisions. Data samples recorded by the forward detectors, with a simple extension, will allow first estimates of the single diffractive (SD), double diffractive (DD), central diffractive (CD), and non-diffractive (ND) cross sections. The approach, which uses the measurement of inelastic activity in forward and central detector systems, is complementary to the detection and measurement of leading beam-like protons. In this investigation, three different multivariate analysis approaches are assessed in classifying forward physics processes at the LHC. It is shown that with gene expression programming, neural networks and support vector machines, diffraction can be efficiently identified within a large sample of simulated proton-proton scattering events. The event characteristics are visualized by using the self-organizing map algorithm.
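
The sketch below shows one of the three classifier families mentioned in this abstract, a support vector machine, applied to a toy stand-in for the simulated event sample. The three input observables (rapidity gap width, forward multiplicity, forward transverse-energy flow) and their class-dependent distributions are invented for illustration; an actual study would use generator-level simulation.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)

def toy_events(n, gap_mean, mult_mean, et_mean, label):
    """Generate n toy events for one process class (hypothetical distributions)."""
    X = np.column_stack([
        rng.normal(gap_mean, 0.8, n),   # largest rapidity gap width
        rng.poisson(mult_mean, n),      # forward charged multiplicity
        rng.normal(et_mean, 2.0, n),    # forward transverse-energy flow [GeV]
    ])
    return X, np.full(n, label)

# Invented class-dependent parameters for the four event categories.
samples = [
    toy_events(2000, 4.0,  5,  3.0, "SD"),
    toy_events(2000, 3.0,  8,  5.0, "DD"),
    toy_events(1000, 5.0,  3,  2.0, "CD"),
    toy_events(5000, 0.5, 25, 15.0, "ND"),
]
X = np.vstack([s[0] for s in samples])
y = np.concatenate([s[1] for s in samples])

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

# RBF-kernel support vector machine; the gene expression programming and
# neural-network variants compared in the study would use the same features.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```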

Relevance:

20.00%

Publisher:

Abstract:

Vegetative cells and zygotes of Saccharomyces carlsbergensis fixed in iodine formaldehyde acetic acid solution and stained after acid hydrolysis in hæmatoxylin, Feulgen and Giemsa show a remarkable similarity in the size and orientation of the structures in the nuclear matrix with reference to the nuclear membrane. The nucleolus described by Guilliermond may either be the chromocenter or the nucleolar equivalent.

Relevance:

20.00%

Publisher:

Abstract:

The swelling pressure of soil depends upon various soil parameters such as mineralogy, clay content, Atterberg limits, dry density, moisture content, and initial degree of saturation, along with structural and environmental factors. It is very difficult to model and analyze swelling pressure effectively taking all of these aspects into consideration. Various statistical/empirical methods have been attempted to predict the swelling pressure based on index properties of soil. In this paper, the computational intelligence techniques artificial neural network (ANN) and support vector machine (SVM) have been used to develop models, based on the set of available experimental results, to predict swelling pressure from the inputs: natural moisture content, dry density, liquid limit, plasticity index, and clay fraction. The generalization of the models to data outside the training set, which is required for successful application of a model, is discussed. A detailed study of the relative performance of the computational intelligence techniques has been carried out based on different statistical performance criteria.
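
A minimal sketch of the kind of model comparison described in this abstract, assuming scikit-learn implementations of an ANN (multi-layer perceptron) and an RBF-kernel SVM regressor; the data set here is synthetic, with invented input ranges and an invented target relation standing in for the experimental results.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(1)
n = 300

# Synthetic stand-in for the experimental database: the five inputs named in
# the abstract with plausible ranges, and a made-up nonlinear relation for the
# swelling pressure target (kPa). Replace with the measured data in practice.
moisture  = rng.uniform(5, 35, n)      # natural moisture content [%]
dry_dens  = rng.uniform(1.2, 2.0, n)   # dry density [g/cm^3]
liq_limit = rng.uniform(30, 90, n)     # liquid limit [%]
plast_idx = rng.uniform(10, 60, n)     # plasticity index [%]
clay_frac = rng.uniform(10, 70, n)     # clay fraction [%]
X = np.column_stack([moisture, dry_dens, liq_limit, plast_idx, clay_frac])
y = (2.0 * clay_frac + 3.0 * plast_idx) * dry_dens - 4.0 * moisture \
    + 0.5 * liq_limit + rng.normal(0, 15, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)

models = {
    "ANN (MLP)": make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=1)),
    "SVM (RBF SVR)": make_pipeline(
        StandardScaler(), SVR(kernel="rbf", C=100.0, epsilon=5.0)),
}

# Fit both models and compare them on held-out data using two of the usual
# statistical performance criteria (R^2 and RMSE).
for name, model in models.items():
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    print(f"{name:14s}  R^2 = {r2_score(y_te, pred):.3f}  RMSE = {rmse:.1f} kPa")
```

In practice the same pipelines would be fitted to the measured data set and compared on held-out samples, which is the generalization check the abstract refers to.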

Relevance:

20.00%

Publisher:

Abstract:

The study investigates whether there is an association between different combinations of emphasis on generic strategies (product differentiation and cost efficiency) and the perceived usefulness of management accounting techniques. Previous research has found that cost leadership is associated with traditional accounting techniques and product differentiation with a variety of modern management accounting approaches. The present study focuses on the possible existence of a strategy that mixes these generic strategies. The empirical results suggest that (a) there is no difference in attitudes towards the usefulness of traditional management accounting techniques between companies that adhere to a single strategy and those that adhere to a mixed strategy; (b) there is no difference in attitudes towards modern and traditional techniques between companies that adhere to a single strategy, whether this is product differentiation or cost efficiency; and (c) companies that favour a mixed strategy seem to have a more positive attitude towards modern techniques than companies adhering to a single strategy.