979 results for ranking method
Abstract:
For wind farm optimizations on land belonging to different owners, the traditional penalty method is highly dependent on the type of wind farm land division, and its application can be cumbersome when the divisions are complex. To overcome this disadvantage, a new method is proposed in this paper for the first time. Unlike the penalty method, which requires adding a penalizing term when evaluating the fitness function, the new method repairs infeasible solutions before fitness evaluation. To assess its effectiveness, the optimization results of the different methods are compared for three types of wind farm division. Different wind scenarios are also incorporated during optimization: (i) constant wind speed and direction; (ii) varying wind speed and direction; and (iii) the more realistic Weibull distribution. Results show that the performance of the new method varies across land plots in the tested cases. Nevertheless, optimum or near-optimum results can be obtained in all cases with a sequential land-plot study using the new method. It is concluded that satisfactory results can be achieved with the proposed method. In addition, it offers flexibility in managing the wind farm design: it not only frees users from defining the penalty parameter but also imposes no limitations on the wind farm division.
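The repair idea can be sketched in a few lines. The abstract does not specify the repair operator, so the axis-aligned rectangular land plots and the nearest-feasible-point projection below are illustrative assumptions, not the paper's method:

```python
# Sketch: before fitness evaluation, any turbine falling outside the
# permitted land plots is moved to the nearest point inside an allowed plot.
# Plots are hypothetical axis-aligned rectangles (xmin, ymin, xmax, ymax).

def clamp(v, lo, hi):
    return max(lo, min(hi, v))

def repair(turbines, plots):
    """Project each (x, y) turbine onto the nearest point of the nearest plot."""
    repaired = []
    for x, y in turbines:
        # nearest point inside each rectangle; keep the closest candidate
        best = min(
            ((clamp(x, p[0], p[2]), clamp(y, p[1], p[3])) for p in plots),
            key=lambda q: (q[0] - x) ** 2 + (q[1] - y) ** 2,
        )
        repaired.append(best)
    return repaired

plots = [(0, 0, 50, 50), (100, 0, 150, 50)]   # two disjoint land plots
layout = [(25, 25), (75, 25), (160, 10)]      # second and third turbines are infeasible
print(repair(layout, plots))                  # → [(25, 25), (50, 25), (150, 10)]
```

Because the repaired layout is always feasible, the fitness function needs no penalty term and no penalty parameter has to be tuned, which is the flexibility the abstract claims.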
Abstract:
With the extensive use of rating systems on the web, and their significance in users' decision-making processes, the need for more accurate aggregation methods has emerged. The naïve aggregation method, using the simple mean, is no longer adequate for providing accurate reputation scores for items [6]; hence, several studies have been conducted to provide more accurate alternative aggregation methods. Most current reputation models do not consider the distribution of ratings across the different possible rating values. In this paper, we propose a novel reputation model that generates more accurate reputation scores for items by deploying the normal distribution over ratings. Experiments show promising results for our proposed model over state-of-the-art ones on both sparse and dense datasets.
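One plausible reading of "deploying the normal distribution over ratings" is to down-weight ratings far from the consensus using the normal density centred at the sample mean. The sketch below illustrates that reading only; it is not the paper's exact model:

```python
# Hedged sketch: each rating is weighted by the normal pdf centred at the
# sample mean, so outliers contribute less than in the naive mean.
import math

def normal_weighted_score(ratings):
    n = len(ratings)
    mean = sum(ratings) / n
    var = sum((r - mean) ** 2 for r in ratings) / n or 1e-9  # guard zero variance
    weights = [math.exp(-(r - mean) ** 2 / (2 * var)) for r in ratings]
    return sum(w * r for w, r in zip(weights, ratings)) / sum(weights)

# A single 1-star outlier pulls the naive mean down more than the weighted score.
ratings = [5, 5, 4, 5, 1]
print(sum(ratings) / len(ratings))               # naive mean: 4.0
print(round(normal_weighted_score(ratings), 2))  # ≈ 4.55, closer to the consensus
```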
Abstract:
Not a lot is known about most mental illness. Its triggers can rarely be established, nor can its aetiological dynamics, so it is hardly surprising that the accepted treatments for most mental illnesses are really strategies to manage the most overt symptoms. But with such a dearth of knowledge, how can worthy decisions be made about psychiatric interventions, especially given time and budgetary restrictions? This paper introduces a method extrapolated from Salutogenics, the psycho-social theory of health introduced by Antonovsky in 1987. The method takes a normative stance (that psychiatric health care is for the betterment of psychiatric patients) and applies it to any context where there is a dearth of workable knowledge. In lieu of guiding evidence, the method identifies reasonable alternatives on the fly, enabling rational decisions to be made quickly with limited resources.
Abstract:
This paper presents a finite element modelling method for thin layer mortared masonry systems. In this method, the mortar layers, including the interfaces, are represented using a zero-thickness interface element, and the masonry units are modelled using an elasto-plastic, damaging solid element. The interface element is formulated using two regimes: i) shear-tension and ii) shear-compression. In the shear-tension regime, joint failure is considered through an elliptical failure criterion, and in shear-compression it is considered through a Mohr-Coulomb type failure criterion. An explicit integration scheme is used within an implicit finite element framework for the formulation of the interface element. The model is calibrated with an experimental dataset from thin layer mortared masonry prisms subjected to uniaxial compression, a triplet subjected to shear loads, and a beam subjected to flexural loads, and is then used to predict the response of thin layer mortared masonry wallettes under orthotropic loading. The model is found to simulate quite adequately the behaviour of a thin layer mortared masonry shear wall tested under pre-compression and in-plane shear. The model is also shown to reproduce the failure of masonry panels under uniform biaxial states of stress.
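The two-regime failure check for the joint can be illustrated compactly. The tensile strength, cohesion, friction angle, and the exact elliptical form below are illustrative assumptions, not the paper's calibrated surface:

```python
# Sketch of the two-regime joint failure check: an elliptical criterion in
# shear-tension and a Mohr-Coulomb criterion in shear-compression.
# ft (tensile strength), c (cohesion), phi (friction angle) are illustrative.
import math

def joint_fails(sigma, tau, ft=0.3, c=0.35, phi=math.radians(35)):
    """sigma: normal stress (tension positive), tau: shear stress, in MPa."""
    if sigma >= 0.0:
        # shear-tension regime: ellipse through (ft, 0) and (0, c)
        return (sigma / ft) ** 2 + (tau / c) ** 2 >= 1.0
    # shear-compression regime: Mohr-Coulomb line |tau| = c - sigma * tan(phi)
    return abs(tau) >= c - sigma * math.tan(phi)

print(joint_fails(0.1, 0.1))    # well inside the ellipse -> False
print(joint_fails(-0.5, 0.8))   # exceeds the Mohr-Coulomb line -> True
```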
Abstract:
We incorporated a new Riemannian fluid registration algorithm into a general MRI analysis method called tensor-based morphometry to map the heritability of brain morphology in MR images from 23 monozygotic and 23 dizygotic twin pairs. All 92 3D scans were fluidly registered to a common template. Voxelwise Jacobian determinants were computed from the deformation fields to assess local volumetric differences across subjects. Heritability maps were computed from the intraclass correlations and their significance was assessed using voxelwise permutation tests. Lobar volume heritability was also studied using the ACE genetic model. The performance of this Riemannian algorithm was compared to a more standard fluid registration algorithm: 3D maps from both registration techniques displayed similar heritability patterns throughout the brain. Power improvements were quantified by comparing the cumulative distribution functions of the p-values generated from both competing methods. The Riemannian algorithm outperformed the standard fluid registration.
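The voxelwise Jacobian-determinant step is generic and can be sketched on a toy 2D mapping; the analytic map and central-difference scheme below are illustrative and stand in for the deformation fields the registration produces:

```python
# Sketch of the Jacobian-determinant step: for a mapping phi from template to
# subject space, det(J) at a voxel measures local volume change
# (det > 1: expansion, det < 1: contraction). The analytic map is illustrative.

def jacobian_det_2d(phi, x, y, h=1e-5):
    """Central-difference Jacobian determinant of phi: R^2 -> R^2 at (x, y)."""
    dux_dx = (phi(x + h, y)[0] - phi(x - h, y)[0]) / (2 * h)
    dux_dy = (phi(x, y + h)[0] - phi(x, y - h)[0]) / (2 * h)
    duy_dx = (phi(x + h, y)[1] - phi(x - h, y)[1]) / (2 * h)
    duy_dy = (phi(x, y + h)[1] - phi(x, y - h)[1]) / (2 * h)
    return dux_dx * duy_dy - dux_dy * duy_dx

# A uniform 10% expansion along both axes: det(J) = 1.1 * 1.1 = 1.21
phi = lambda x, y: (1.1 * x, 1.1 * y)
print(round(jacobian_det_2d(phi, 0.5, 0.5), 4))   # 1.21
```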
Abstract:
In structural brain MRI, group differences or changes in brain structures can be detected using Tensor-Based Morphometry (TBM). This method consists of two steps: (1) a non-linear registration step that aligns all of the images to a common template, and (2) a subsequent statistical analysis. The numerous registration methods that have recently been developed differ in their detection sensitivity when used for TBM, and detection power is paramount in epidemiological studies or drug trials. We therefore developed a new fluid registration method that computes the mappings and performs statistics on them in a consistent way, providing a bridge between TBM registration and statistics. We used the Log-Euclidean framework to define a new regularizer that is a fluid extension of the Riemannian elasticity, which assures diffeomorphic transformations. This regularizer constrains the symmetrized Jacobian matrix, also called the deformation tensor. We applied our method to an MRI dataset from 40 fraternal and identical twins, revealing voxelwise measures of average volumetric differences in brain structure for subjects with different degrees of genetic resemblance.
Abstract:
An automated method for extracting brain volumes from three commonly acquired three-dimensional (3D) MR images (proton density, T1-weighted, and T2-weighted) of the human head is described. The procedure is divided into four levels: preprocessing, segmentation, scalp removal, and postprocessing. A user-provided reference point is the sole operator-dependent input required. The method's parameters were first optimized, then fixed and applied to 30 repeat data sets from 15 normal older adult subjects to investigate its reproducibility. Percent differences between total brain volumes (TBVs) for the subjects' repeated data sets ranged from 0.5% to 2.2%. We conclude that the method is both robust and reproducible and has the potential for wide application.
Abstract:
As connectivity analyses become more popular, claims are often made about how the brain's anatomical networks depend on age, sex, or disease, yet it is unclear how such results depend on the tractography methods used to compute the fiber networks. We applied 11 tractography methods to high angular resolution diffusion images of the brain (4-Tesla, 105-gradient HARDI) from 536 healthy young adults. We parcellated 70 cortical regions, yielding 70×70 connectivity matrices encoding fiber density. We computed popular graph theory metrics, including network efficiency and characteristic path length. Both metrics were robust to the number of spherical harmonics used to model diffusion (4th-8th order). Age effects were detected only for networks computed with the probabilistic Hough transform method, which excludes smaller fibers. Sex and total brain volume affected networks measured with deterministic, tensor-based fiber tracking but not with the Hough method. Each tractography method includes different fibers, which affects inferences made about the reconstructed networks.
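The two graph metrics named above are standard and can be sketched on a toy binary connectivity matrix; the 4-node example below is illustrative, not the study's 70-region data:

```python
# Characteristic path length (mean shortest path over ordered node pairs)
# and global efficiency (mean inverse shortest path) on a binary adjacency
# matrix, using breadth-first search for hop counts.
from collections import deque

def shortest_paths(adj, src):
    """BFS hop counts from src over an adjacency matrix (list of lists)."""
    n = len(adj)
    dist = [None] * n
    dist[src] = 0
    q = deque([src])
    while q:
        u = q.popleft()
        for v in range(n):
            if adj[u][v] and dist[v] is None:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def path_length_and_efficiency(adj):
    n = len(adj)
    d = [shortest_paths(adj, i) for i in range(n)]
    pairs = [(i, j) for i in range(n) for j in range(n) if i != j]
    cpl = sum(d[i][j] for i, j in pairs) / len(pairs)        # assumes connected graph
    eff = sum(1.0 / d[i][j] for i, j in pairs) / len(pairs)
    return cpl, eff

adj = [[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]]  # 4-node path graph
print(path_length_and_efficiency(adj))
```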
Abstract:
Modal flexibility is a widely accepted technique for detecting structural damage from vibration characteristics. Its application to long span, large diameter cables, such as suspension bridge main cables, has not received much attention. This paper uses the modal flexibility method, incorporating two damage indices (DIs) based on lateral and vertical modes, to localize damage in such cables. The competence of these DIs in damage detection is tested with numerically obtained vibration characteristics of a suspended cable in both intact and damaged states. Three single damage cases and one multiple damage case are considered. The impact of random measurement noise in the modal data on the damage localization capability of the two DIs is then examined. Long span, large diameter cables are characterized by two critical cable parameters, namely bending stiffness and sag-extensibility. The influence of these parameters on the damage localization capability of the two DIs is evaluated through a parametric study with two single damage cases. Results confirm that the damage index based on lateral vibration modes can successfully detect and locate damage in suspended cables with 5% noise in the modal data across a range of cable parameters. This simple approach can therefore be extended to timely damage detection in the cables of suspension bridges, thereby enhancing their service during their life spans.
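The modal flexibility matrix underlying such damage indices is F = Σᵢ φᵢφᵢᵀ/ωᵢ², built from mass-normalised mode shapes; the two-mode, three-DOF data below are illustrative, not the paper's cable model:

```python
# Sketch of the modal flexibility matrix and a simple damage index: the
# change in diagonal flexibility between intact and damaged states peaks
# near the damaged degree of freedom.

def flexibility(freqs, modes):
    """freqs: natural frequencies (rad/s); modes: mass-normalised mode shapes."""
    n = len(modes[0])
    F = [[0.0] * n for _ in range(n)]
    for w, phi in zip(freqs, modes):
        for a in range(n):
            for b in range(n):
                F[a][b] += phi[a] * phi[b] / w ** 2
    return F

def damage_index(F_intact, F_damaged):
    """Change in diagonal flexibility; its peak flags the damaged DOF."""
    return [Fd[i] - Fi[i] for i, (Fi, Fd) in enumerate(zip(F_intact, F_damaged))]

modes = [[0.5, 1.0, 0.5], [1.0, 0.0, -1.0]]
F_i = flexibility([10.0, 25.0], modes)
# damage lowers the first frequency (reduced stiffness near mid-span)
F_d = flexibility([9.0, 25.0], modes)
print(damage_index(F_i, F_d))   # largest change at the middle DOF
```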
Abstract:
Pure-phase Cu2ZnSnS4 (CZTS) nanoparticles were successfully synthesized via a polyacrylic acid (PAA) assisted one-pot hydrothermal route. The morphology, crystal structure, composition, and optical properties, as well as the photoactivity of the as-synthesized CZTS nanoparticles, were characterized by X-ray diffraction, Raman spectroscopy, scanning electron microscopy, transmission electron microscopy, X-ray photoelectron spectroscopy, UV-visible absorption spectroscopy, and photoelectrochemical measurement. The influence of various synthesis conditions, such as reaction temperature, reaction duration, and the amount of PAA in the precursor solution, on the formation of the CZTS compound was systematically investigated. The results show that the crystal phase, morphology, and particle size of CZTS can be tailored by controlling the reaction conditions. A formation mechanism of CZTS in the hydrothermal reaction is proposed based on the time-dependent phase evolution of CZTS, which showed that metal sulfides (e.g., Cu2S, SnS2 and ZnS) formed first during the hydrothermal reaction, before the CZTS compound formed through nucleation. The band gap of the as-synthesized CZTS nanoparticles is 1.49 eV. A thin film electrode based on the synthesized CZTS nanoparticles in a three-electrode photoelectrochemical cell generated a pronounced photocurrent under illumination from a red light-emitting diode (LED, 627 nm), indicating the photoactivity of the semiconductor material.
Abstract:
The size and arrangement of stromal collagen fibrils (CFs) influence the optical properties of the cornea and hence its function. The spatial arrangement of the collagen in relation to fibril diameter remains an open question. In the present study, we introduce a new parameter, edge-fibrillar distance (EFD), to measure how two collagen fibrils are spaced with respect to their closest edges, and characterise their spatial distribution through the normalized standard deviation of EFD (NSDEFD), assessed following the application of two commercially available multipurpose solutions (MPS): ReNu and Hippia. The corneal buttons were soaked separately in ReNu and Hippia MPS for five hours, fixed overnight in 2.5% glutaraldehyde containing cuprolinic blue, and processed for transmission electron microscopy. The electron micrographs were processed using a user-coded ImageJ plugin. Statistical analysis was performed to compare the image-processed equivalent diameter (ED), inter-fibrillar distance (IFD), and EFD of the CFs of treated versus normal corneas. The ReNu-soaked cornea showed a partly degenerated epithelium with loose hemidesmosomes and Bowman's collagen. In contrast, the epithelium of the cornea soaked in Hippia was degenerated or lost, but showed closely packed Bowman's collagen. Soaking the corneas in both MPS caused a statistically significant decrease in anterior collagen fibril ED and significant changes in IFD and EFD compared with untreated corneas (p < 0.05 for all comparisons). The EFD measurement directly conveys the gap between the peripheries of the collagen bundles and their spatial distribution; combined with ED, it shows how the corneal collagen bundles are spaced in relation to their diameters. The spatial distribution parameter NSDEFD indicated that the fibrils of ReNu-treated corneas were the most uniformly distributed spatially, followed by normal and Hippia-treated corneas.
The EFD measurement, with its relatively lower standard deviation, and NSDEFD, a characteristic of uniform CF distribution, can serve as additional parameters for evaluating collagen organization and assessing the effects of various treatments on corneal health and transparency.
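The geometry of the proposed parameters is simple to sketch if fibril cross-sections are idealised as circles: EFD is the centre distance minus both radii, and NSDEFD is the standard deviation of EFD normalised by its mean. The circular idealisation and the coordinates below are illustrative assumptions:

```python
# Sketch of EFD (closest edge-to-edge gap between two fibrils) and NSDEFD
# (normalized standard deviation of EFD). Fibrils are idealised as circles
# given by (x, y, diameter).
import math

def efd(f1, f2):
    (x1, y1, d1), (x2, y2, d2) = f1, f2
    return math.hypot(x2 - x1, y2 - y1) - (d1 + d2) / 2

def nsdefd(efds):
    mean = sum(efds) / len(efds)
    sd = math.sqrt(sum((e - mean) ** 2 for e in efds) / len(efds))
    return sd / mean   # smaller value -> more uniform spatial distribution

fibrils = [(0, 0, 30), (60, 0, 30), (60, 50, 30)]
gaps = [efd(fibrils[i], fibrils[j]) for i in range(3) for j in range(i + 1, 3)]
print([round(g, 1) for g in gaps])   # edge gaps for the three fibril pairs
print(round(nsdefd(gaps), 3))
```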
Abstract:
Modelling fluvial processes is an effective way to reproduce basin evolution and to recreate riverbed morphology. However, due to the complexity of alluvial environments, deterministic modelling of fluvial processes is often impossible. To address the related uncertainties, we derive a stochastic fluvial process model on the basis of the convective Exner equation that uses the statistics (mean and variance) of river velocity as input parameters. These statistics allow for quantifying the uncertainty in riverbed topography, river discharge and position of the river channel. In order to couple the velocity statistics and the fluvial process model, the perturbation method is employed with a non-stationary spectral approach to develop the Exner equation as two separate equations: the first one is the mean equation, which yields the mean sediment thickness, and the second one is the perturbation equation, which yields the variance of sediment thickness. The resulting solutions offer an effective tool to characterize alluvial aquifers resulting from fluvial processes, which allows incorporating the stochasticity of the paleoflow velocity.
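The deterministic skeleton of the Exner balance, ∂η/∂t = −(1/(1−λ)) ∂q_s/∂x, can be sketched with a simple explicit step; the stochastic model splits η into mean and perturbation equations, and this sketch shows only the mean-field balance. The velocity-to-transport closure q_s = a·u^b and all parameter values are illustrative assumptions:

```python
# One explicit upwind step of a 1D Exner equation for bed elevation eta,
# with porosity lambda and an illustrative transport closure q_s = a * u^b.

def exner_step(eta, u, dx, dt, porosity=0.4, a=1e-4, b=3.0):
    """Return updated bed elevations after one time step."""
    qs = [a * ui ** b for ui in u]                 # sediment transport rate
    new = eta[:]
    for i in range(1, len(eta)):
        dqdx = (qs[i] - qs[i - 1]) / dx            # upwind spatial gradient
        new[i] = eta[i] - dt * dqdx / (1.0 - porosity)
    return new

eta = [0.0] * 5
u = [1.0, 1.2, 1.4, 1.2, 1.0]   # flow accelerates then decelerates along the reach
print([round(z, 5) for z in exner_step(eta, u, dx=10.0, dt=100.0)])
# erosion where the flow accelerates, deposition where it decelerates
```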
Abstract:
This study explores the potential use of empty fruit bunch (EFB) residues from palm oil processing as an alternative feedstock for microbial oil production. EFB is a readily available lignocellulosic biomass that provides cheaper substrates for oil production than pure sugars. In this study, potential oleaginous microorganisms were selected using a multi-criteria analysis (MCA) framework that combined the Analytical Hierarchy Process (AHP) with the Preference Ranking Organization Method for Enrichment Evaluation (PROMETHEE), aided by Geometrical Analysis for Interactive Aid (GAIA). The MCA framework was used to evaluate several strains of microalgae (Chlorella protothecoides and Chlorella zofingiensis), yeasts (Cryptococcus albidus and Rhodotorula mucilaginosa), and fungi (Aspergillus oryzae and Mucor plumbeus) on glucose, xylose, and glycerol. Based on the PROMETHEE rankings and the GAIA plane, the fungal strains A. oryzae and M. plumbeus and the yeast strain R. mucilaginosa showed great promise for oil production from lignocellulosic hydrolysates. The study further cultivated A. oryzae, M. plumbeus, and R. mucilaginosa on EFB hydrolysates for oil production. EFB was pretreated with dilute sulfuric acid, followed by enzymatic saccharification of the solid residue. The hydrolysates tested in this study were detoxified liquid hydrolysate (LH) and enzymatic hydrolysate (EH).
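PROMETHEE II ranks alternatives by net outranking flow, the difference between how strongly each alternative outranks the others and how strongly it is outranked. The sketch below uses the "usual" (strict) preference function; the strains' criteria, scores, and weights are illustrative assumptions, since the abstract does not list them:

```python
# Sketch of a PROMETHEE II net-flow ranking with the "usual" preference
# function: a alternative earns a criterion's weight wherever it strictly
# beats another alternative on that criterion.

def promethee_net_flows(scores, weights):
    """scores[a][c]: performance of alternative a on criterion c (maximised)."""
    n = len(scores)

    def pref(a, b):  # weighted count of criteria where a strictly beats b
        return sum(w for sa, sb, w in zip(scores[a], scores[b], weights) if sa > sb)

    phi = []
    for a in range(n):
        plus = sum(pref(a, b) for b in range(n) if b != a) / (n - 1)    # outranking
        minus = sum(pref(b, a) for b in range(n) if b != a) / (n - 1)   # outranked
        phi.append(plus - minus)
    return phi

# three hypothetical strains scored on (oil yield, growth rate, substrate range)
scores = [[0.6, 0.9, 0.7], [0.8, 0.5, 0.9], [0.4, 0.6, 0.5]]
weights = [0.5, 0.2, 0.3]
flows = promethee_net_flows(scores, weights)
print(flows.index(max(flows)))   # index of the top-ranked alternative: 1
```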
Abstract:
This research investigated the use of DNA fingerprinting to characterise the bacterium Streptococcus pneumoniae (pneumococcus), and hence gain insight into the development of new vaccines and antibiotics. Different bacterial DNA fingerprinting methods were studied, and a novel method that characterises the different cell coatings pneumococci produce was developed and validated. This method was used to study the epidemiology of pneumococci in Queensland before and after the introduction of the current pneumococcal vaccine. The study demonstrated that pneumococcal disease is highly prevalent in children under four years, that the bacterium can 'switch' its cell coating to evade the vaccine, and that some DNA fingerprinting methods are more discriminatory than others. This has an impact on understanding which strains are more prone to cause invasive disease. The research findings have been published in high-impact, internationally refereed journals.