332 results for decomposition techniques


Relevance: 20.00%

Abstract:

Load in distribution networks is normally measured at the 11 kV supply points, so little or no information is available about the types of customers on a feeder or their contributions to its load. This paper proposes statistical methods to decompose an unknown distribution feeder load into its customer load sector/subsector profiles. The approach should assist electricity suppliers in economic load management, strategic planning and future network reinforcement.

Relevance: 20.00%

Abstract:

In Australia, railway systems play a vital role in transporting the sugarcane crop from farms to mills. The sugarcane transport system is very complex and uses daily schedules, consisting of a set of locomotive runs, to satisfy the requirements of the mill and the harvesters. The total cost of sugarcane transport operations is very high; over 35% of the total cost of sugarcane production in Australia is incurred in cane transport. Efficient schedules for sugarcane transport can reduce this cost and limit the negative effects that the transport system can have on raw sugar production. Formulating the train scheduling problem as a blocking parallel-machine job shop scheduling (BPMJSS) problem offers several benefits: it prevents two trains from occupying one section at the same time; it keeps train activities (operations) in sequence during each run (trip) through precedence constraints; it passes trains through a section in the correct order (priorities of passing trains) through disjunctive constraints; and it resolves rail conflicts between passing trains through blocking constraints and parallel machine scheduling. The sugarcane rail operations are therefore formulated as a BPMJSS problem. Mixed integer programming and constraint programming approaches are used to describe the BPMJSS problem, and the model is solved by integrating constraint programming, mixed integer programming and search techniques. Optimality performance is tested with the Optimization Programming Language (OPL) and CPLEX software on small and large instances against specific criteria, and a real-life problem is used to verify and validate the approach. Constructive heuristics and new metaheuristics, including simulated annealing and tabu search, are proposed to solve this complex, NP-hard scheduling problem and produce a more efficient scheduling system. Innovative hybrid and hyper metaheuristic techniques, coded in C#, are developed to improve solution quality and CPU time. Hybrid techniques integrate heuristic and metaheuristic techniques consecutively, while hyper techniques are a complete integration of different metaheuristic techniques, heuristic techniques, or both.
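Among the metaheuristics named above, simulated annealing is the simplest to sketch. The following toy is our illustration, not the thesis's BPMJSS solver: the single-machine weighted-completion-time cost and the durations and weights are invented, but the accept/cool loop is the same mechanism that drives the larger hybrid schemes.

```python
import math
import random

def simulated_annealing(init_seq, cost, t0=1000.0, cooling=0.95, iters=2000):
    """Generic simulated annealing over a permutation of runs."""
    current = list(init_seq)
    best = list(current)
    t = t0
    for _ in range(iters):
        # Neighbour move: swap two randomly chosen positions.
        i, j = random.sample(range(len(current)), 2)
        cand = list(current)
        cand[i], cand[j] = cand[j], cand[i]
        delta = cost(cand) - cost(current)
        # Always accept improvements; accept worse moves with
        # Boltzmann probability exp(-delta / t).
        if delta < 0 or random.random() < math.exp(-delta / t):
            current = cand
            if cost(current) < cost(best):
                best = list(current)
        t *= cooling  # geometric cooling schedule
    return best

# Invented toy cost: weighted completion time of five runs on one line.
durations = [3, 5, 2, 7, 4]
weights = [2, 1, 3, 1, 2]

def cost(seq):
    t, total = 0, 0
    for run in seq:
        t += durations[run]
        total += weights[run] * t
    return total

print(simulated_annealing(range(5), cost))
```

A tabu search would replace the probabilistic acceptance with a memory of recently visited swaps; the hybrid techniques described above chain such heuristic and metaheuristic stages consecutively.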

Relevance: 20.00%

Abstract:

A fixed bed pyrolysis system has been designed and fabricated for obtaining liquid fuel from Mahogany seeds. The major components of the system are the fixed bed pyrolysis reactor, a liquid condenser and liquid collectors. The Mahogany seed in particle form is pyrolysed in an externally heated fixed bed reactor, 10 cm in diameter and 36 cm high, with nitrogen as the carrier gas. The reactor is heated by a biomass-fired cylindrical heater from 450°C to 600°C. The products are oil, char and gas. The reactor bed temperature, running time and feed particle size are considered as process parameters. A maximum liquid yield of 54 wt% of biomass feed is obtained with a particle size of 1.18 mm at a reactor bed temperature of 550°C with a running time of 90 minutes. The oil is found to possess a favourable flash point and reasonable density and viscosity. The higher calorific value is found to be 39.9 MJ/kg, which is higher than that of other biomass-derived pyrolysis oils.

Relevance: 20.00%

Abstract:

The overall aim of this project was to contribute to existing knowledge on methods for measuring the characteristics of airborne nanoparticles and controlling occupational exposure to them, and to gather data on nanoparticle emission and transport in various workplaces. The scope of this study involved investigating the characteristics and behaviour of particles arising from the operation of six nanotechnology processes, subdivided into nine processes for measurement purposes. It did not include toxicological evaluation of the aerosol, and therefore no direct conclusion was drawn regarding the health effects of exposure to these particles. Our research included real-time measurement of sub- and supermicrometre particle number and mass concentration, count median diameter, and alveolar deposited surface area using condensation particle counters, an optical particle counter, a DustTrak photometer, a scanning mobility particle sizer, and a nanoparticle surface area monitor, respectively. Off-line particle analysis included scanning and transmission electron microscopy, energy-dispersive x-ray spectrometry, and thermal optical analysis of elemental carbon. Sources of both fibrous and non-fibrous particles were included.

Relevance: 20.00%

Abstract:

The performance of techniques for evaluating multivariate volatility forecasts is not yet as well understood as that of their univariate counterparts. This paper evaluates the efficacy of a range of traditional statistical methods for multivariate forecast evaluation together with methods based on underlying considerations of economic theory. It is found that a statistical method based on likelihood theory and an economic loss function based on portfolio variance are the most effective means of identifying optimal forecasts of conditional covariance matrices.
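To make the portfolio-variance loss concrete: under one common construction (an assumption here, not necessarily the authors' exact formulation), each candidate covariance forecast H is mapped to its global minimum-variance portfolio w = H⁻¹1 / (1ᵀH⁻¹1), and the forecast whose portfolio achieves the lowest realised variance is preferred.

```python
import numpy as np

def min_variance_weights(H):
    """Global minimum-variance weights implied by a covariance forecast H."""
    ones = np.ones(H.shape[0])
    w = np.linalg.solve(H, ones)   # proportional to H^{-1} 1
    return w / w.sum()

def portfolio_variance_loss(H_forecast, returns):
    """Realised variance of the portfolio built from the forecast;
    better covariance forecasts should score lower."""
    w = min_variance_weights(H_forecast)
    return (returns @ w).var(ddof=1)

# Toy comparison on simulated returns: the true covariance should
# (usually) beat a naive identity forecast.
rng = np.random.default_rng(0)
true_cov = np.array([[1.0, 0.6], [0.6, 2.0]])
returns = rng.multivariate_normal([0.0, 0.0], true_cov, size=500)
print(portfolio_variance_loss(true_cov, returns),
      portfolio_variance_loss(np.eye(2), returns))
```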

Relevance: 20.00%

Abstract:

The application of nanotechnology products has increased significantly in recent years. With their broad range of applications, including electronics, food and agriculture, power and energy, scientific instruments, clothing, cosmetics, buildings, and biomedicine and health (Catanzariti, 2008), nanomaterials are an indispensable part of human life.

Relevance: 20.00%

Abstract:

Complex flow datasets are often difficult to represent in detail using traditional vector visualisation techniques such as arrow plots and streamlines. This is particularly true when the flow regime changes in time. Texture-based techniques, which rely on the advection of dense textures, are novel techniques for visualising such complex, time-dependent flows. In this paper, we review two popular texture-based techniques and their application to flow datasets sourced from real research projects: Line Integral Convolution (LIC) and Image-Based Flow Visualisation (IBFV). We evaluate these techniques and report on their visualisation effectiveness (compared with traditional techniques), their ease of implementation, and their computational overhead.
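As a concrete sketch of the LIC idea (our toy code, not the paper's implementation; grid size, step rule and streamline length are arbitrary choices): each output pixel averages a white-noise texture along a streamline traced forward and backward through the field, so pixels on the same streamline receive correlated values and the flow structure emerges as streaks.

```python
import numpy as np

def lic(vx, vy, noise, length=20):
    """Minimal Line Integral Convolution on a regular grid.

    vx, vy -- 2-D vector field components; noise -- input texture.
    Each output pixel averages the noise along a streamline traced
    `length` steps forward and backward from that pixel.
    """
    h, w = noise.shape
    out = np.zeros_like(noise)
    for y in range(h):
        for x in range(w):
            total, count = 0.0, 0
            for sign in (1.0, -1.0):
                px, py = float(x), float(y)
                for _ in range(length):
                    i, j = int(round(py)), int(round(px))
                    if not (0 <= i < h and 0 <= j < w):
                        break  # streamline left the grid
                    total += noise[i, j]
                    count += 1
                    mag = np.hypot(vx[i, j], vy[i, j]) or 1.0
                    px += sign * vx[i, j] / mag  # unit-speed advection
                    py += sign * vy[i, j] / mag
            out[y, x] = total / max(count, 1)
    return out

# Example: circular flow convolved with white noise.
ys, xs = np.mgrid[0:64, 0:64]
image = lic(-(ys - 32.0), xs - 32.0, np.random.default_rng(1).random((64, 64)))
```

IBFV reaches a similar visual result by repeatedly advecting and blending the texture itself, which maps naturally onto graphics hardware and is typically much faster.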

Relevance: 20.00%

Abstract:

Prostate cancer (CaP) is the second leading cause of cancer-related death in North American males and the most common newly diagnosed cancer in men worldwide. Biomarkers are widely used both for early detection and in prognostic tests for cancer. The current, commonly used biomarker for CaP is serum prostate specific antigen (PSA). However, the specificity of this biomarker is low, as its serum level is increased not only in CaP but also in various other diseases, with age, and even with body mass index. Human body fluids provide an excellent resource for biomarker discovery, with the advantage over tissue/biopsy samples of ease of access, owing to the less invasive nature of collection. However, their analysis presents challenges in terms of variability and validation. Blood and urine are the two human body fluids most commonly used for CaP research, but their proteomic analyses are limited by the large dynamic range of protein abundance, which makes detection of low abundance proteins difficult, and, in the case of urine, by the high salt concentration. To overcome these challenges, different techniques for removal of high abundance proteins and enrichment of low abundance proteins are used; their applications and limitations are discussed in this review. A number of innovative proteomic techniques have improved the detection of biomarkers. They include two-dimensional differential gel electrophoresis (2D-DIGE), quantitative mass spectrometry (MS) and functional proteomic studies, i.e., investigating the association of post-translational modifications (PTMs) such as phosphorylation, glycosylation and protein degradation. The recent development of quantitative MS techniques such as stable isotope labeling with amino acids in cell culture (SILAC), isobaric tags for relative and absolute quantitation (iTRAQ) and multiple reaction monitoring (MRM) has allowed proteomic researchers to quantitatively compare data from different samples. 2D-DIGE has greatly improved the statistical power of classical 2D gel analysis by introducing an internal control. This chapter reviews novel CaP biomarkers and discusses current trends in biomarker research from two angles: the source of biomarkers (particularly human body fluids such as blood and urine), and emerging proteomic approaches for biomarker research.

Relevance: 20.00%

Abstract:

Detailed representations of complex flow datasets are often difficult to generate using traditional vector visualisation techniques such as arrow plots and streamlines. This is particularly true when the flow regime changes in time. Texture-based techniques, which are based on the advection of dense textures, are novel techniques for visualising such flows. We review two popular texture-based techniques and their application to flow datasets sourced from active research projects. The techniques investigated were Line Integral Convolution (LIC) [1] and Image-Based Flow Visualisation (IBFV) [18]. We evaluated these and report on their effectiveness from a visualisation perspective, as well as their ease of implementation and computational overheads.

Relevance: 20.00%

Abstract:

The mining environment presents a challenging prospect for stereo vision. Our objective is to produce a stereo vision sensor suited to close-range scenes consisting mostly of rocks. This sensor should produce a dense depth map within real-time constraints. Speed and robustness are of foremost importance for this application. This paper compares a number of stereo matching algorithms in terms of robustness and suitability for fast implementation. These include traditional area-based algorithms and algorithms based on non-parametric transforms, notably the rank and census transforms. Our experimental results show that the rank and census transforms are robust with respect to radiometric distortion and introduce less computational complexity than conventional area-based matching techniques.
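The robustness result is easy to motivate: the census transform encodes only whether each neighbour is darker than the centre pixel, so any monotonic change in brightness between the two cameras leaves the bit strings, and hence the Hamming matching cost, unchanged. Below is a minimal sketch (our toy code; wrap-around borders via np.roll and a synthetic 2-pixel shift stand in for a real stereo pair):

```python
import numpy as np

def census_transform(img, win=3):
    """Encode each pixel as a bit string recording whether each
    neighbour in a win x win window is darker than the centre."""
    r = win // 2
    out = np.zeros(img.shape, dtype=np.uint64)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            if dy == 0 and dx == 0:
                continue
            neighbour = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            out = (out << np.uint64(1)) | (neighbour < img).astype(np.uint64)
    return out

def hamming(a, b):
    """Per-pixel Hamming distance between two census images."""
    x = a ^ b
    count = np.zeros(a.shape, dtype=np.uint64)
    while x.any():
        count += x & np.uint64(1)
        x >>= np.uint64(1)
    return count

# Toy "stereo pair": the right image is the left shifted by 2 pixels,
# so the matching cost at disparity 2 is zero everywhere.
left = np.random.default_rng(0).integers(0, 255, (32, 32)).astype(np.uint8)
right = np.roll(left, 2, axis=1)
cost_d2 = hamming(census_transform(left),
                  np.roll(census_transform(right), -2, axis=1))
```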

Relevance: 20.00%

Abstract:

Purpose. To compare the radiological records of 90 consecutive patients who underwent cemented total hip arthroplasty (THA) with or without use of the Rim Cutter to prepare the acetabulum. Methods. The acetabulum of 45 patients was prepared using the Rim Cutter; the device was not used in the other 45 patients. Postoperative radiographs were evaluated using a digital templating system to measure (1) the positions of the operated hips with respect to the normal, contralateral hips (the centre of rotation of the socket, the height of the centre of rotation above the teardrop, and lateralisation of the centre of rotation from the teardrop) and (2) the uniformity and width of the cement mantle in the 3 DeLee-Charnley acetabular zones, and the number of radiolucencies in these zones. Results. The study group showed improved radiological parameters: hips were closer to the anatomic centre of rotation both vertically (1.5 vs. 3.7 mm, p<0.001) and horizontally (1.8 vs. 4.4 mm, p<0.001) and had consistently thicker and more uniform cement mantles (p<0.001). There were 2 radiolucent lines in the control group but none in the study group. Conclusion. The Rim Cutter resulted in more accurate placement of the centre of rotation of a cemented prosthetic socket and produced a thicker, more congruent cement mantle with fewer radiolucent lines.

Relevance: 20.00%

Abstract:

Sound tagging has been studied for years. Among all sound types, music, speech, and environmental sound are the three most active research areas. This survey provides an overview of the state-of-the-art development in these areas. We begin by discussing the meaning of tagging in the different sound areas, and introduce some examples of sound tagging applications to illustrate the significance of this research. Typical tagging techniques include manual, automatic, and semi-automatic approaches. After reviewing work in music, speech and environmental sound tagging, we compare the three areas and state the research progress to date. Research gaps are identified for each area, and the common features and distinctions between the areas are discussed. Published datasets, tools used by researchers, and evaluation measures frequently applied in the analysis are listed. Finally, we summarise the worldwide distribution of countries dedicated to sound tagging research over the years.

Relevance: 20.00%

Abstract:

The majority of distribution utilities do not have accurate information on the constituents of their loads. This information is very useful for managing and planning the network adequately and economically. Customer loads are normally categorised into three main sectors: 1) residential; 2) industrial; and 3) commercial. In this paper, penalized least-squares regression and Euclidean distance methods are developed for this application to identify and quantify the makeup of a feeder load with unknown sectors/subsectors. This process is done on a monthly basis to account for seasonal and other load changes. The error between the actual and estimated load profiles is used as a benchmark of accuracy. This approach has been shown to be accurate in identifying customer types in unknown load profiles, and is used in cross-validation of the results and initial assumptions.
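A minimal sketch of the penalized least-squares step (ours, using a ridge penalty and invented two-sector toy profiles; the paper's exact estimator and its Euclidean distance matching are not reproduced here): regress the measured feeder load on candidate sector profiles, and score the fit with the profile error used as the accuracy benchmark above.

```python
import numpy as np

def decompose_feeder_load(feeder, profiles, lam=1.0):
    """Ridge-penalized least squares: find sector weights w with
    profiles @ w ~ feeder, discouraging extreme weights via lam."""
    S = profiles.shape[1]
    w = np.linalg.solve(profiles.T @ profiles + lam * np.eye(S),
                        profiles.T @ feeder)
    return np.clip(w, 0.0, None)  # negative sector shares are not physical

def profile_error(feeder, profiles, w):
    """RMS error between actual and reconstructed load profiles."""
    return np.sqrt(np.mean((feeder - profiles @ w) ** 2))

# Invented toy: a feeder that is 60% residential, 40% commercial,
# sampled at 48 half-hourly points.
t = np.linspace(0.0, 2.0 * np.pi, 48)
P = np.column_stack([1.0 + 0.5 * np.sin(t - 1.0),   # residential shape
                     1.0 + 0.5 * np.sin(t + 2.0)])  # commercial shape
feeder = P @ np.array([0.6, 0.4])
w = decompose_feeder_load(feeder, P, lam=0.01)
print(w, profile_error(feeder, P, w))
```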