882 results for INDIVIDUAL-BASED MODEL


Relevance: 50.00%

Abstract:

We propose a simple speech/music discriminator that uses features based on the HILN (Harmonics, Individual Lines and Noise) model. We tested the strength of the feature set on a standard database of 66 files and obtained an accuracy of around 97%. We also tested it on sung queries and polyphonic music and obtained very good results. The algorithm is currently being used to discriminate between sung queries and played queries (using an instrument such as a flute) for a Query by Humming (QBH) system under development in the lab.
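
A minimal sketch of the idea, not the paper's HILN decomposition: one crude frame-level feature measuring how much spectral energy is concentrated in a few strong peaks, thresholded to separate harmonically rich music from speech. The frame length, peak count and threshold are illustrative assumptions.

```python
import numpy as np

def harmonic_energy_ratio(frame, n_peaks=10):
    """Crude harmonicity feature: fraction of the frame's spectral energy held
    by its strongest spectral bins.  Sustained, harmonically rich music tends
    to score higher and more consistently than speech."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame)))) ** 2
    return np.sort(spectrum)[-n_peaks:].sum() / (spectrum.sum() + 1e-12)

def classify(signal, frame_len=2048, hop=1024, threshold=0.6):
    """Label a clip 'music' when the median frame-level ratio is high."""
    ratios = [harmonic_energy_ratio(signal[i:i + frame_len])
              for i in range(0, len(signal) - frame_len, hop)]
    return 'music' if np.median(ratios) > threshold else 'speech'

# toy usage: a pure tone (music-like) versus white noise (speech/noise-like)
sr = 16000
t = np.arange(sr) / sr
print(classify(np.sin(2 * np.pi * 440.0 * t)))   # -> music
print(classify(np.random.randn(sr)))             # -> speech
```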

Relevance: 50.00%

Abstract:

Sub-pixel classification is essential for the successful description of many land-cover (LC) features whose spatial extent is smaller than the size of the image pixels. A commonly used approach to sub-pixel classification is the linear mixture model (LMM). Even though LMMs have shown acceptable results, in practice purely linear mixtures do not exist. A non-linear mixture model may therefore better describe the resultant mixture spectra for a given endmember (pure pixel) distribution. In this paper, we propose a new methodology for inferring LC fractions by a process called the automatic linear-nonlinear mixture model (AL-NLMM). AL-NLMM is a three-step process in which the endmembers are first derived by an automated algorithm. These endmembers are used by the LMM in the second step, which provides abundance estimation in a linear fashion. Finally, the abundance values, along with training samples representing the actual proportions, are fed as input to a multi-layer perceptron (MLP) architecture to train the neurons, which further refines the abundance estimates to account for the non-linear nature of the mixing classes of interest. AL-NLMM is validated on computer-simulated hyperspectral data of 200 bands. Validation of the output showed an overall RMSE of 0.0089±0.0022 with the LMM and 0.0030±0.0001 with the MLP-based AL-NLMM when compared to the actual class proportions, indicating that the individual class abundances obtained from AL-NLMM are very close to the real observations.
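
A small sketch of the LMM-then-MLP refinement idea under simplifying assumptions (3 synthetic endmembers, 50 bands, a mild bilinear term standing in for the non-linear mixing); the paper's automated endmember extraction and 200-band data set are not reproduced.

```python
import numpy as np
from scipy.optimize import nnls
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical setup: 3 known endmembers over 50 bands (the paper derives
# endmembers automatically and uses 200-band simulated hyperspectral data).
n_bands, n_end, n_pix = 50, 3, 500
endmembers = rng.random((n_bands, n_end))

true_abund = rng.dirichlet(np.ones(n_end), size=n_pix)            # actual proportions
linear_part = true_abund @ endmembers.T
nonlinear_part = 0.1 * (true_abund[:, [0]] * true_abund[:, [1]])  # mild bilinear mixing
pixels = linear_part + nonlinear_part + 0.01 * rng.standard_normal((n_pix, n_bands))

# Step 2: linear mixture model -- per-pixel non-negative least squares
lmm_abund = np.array([nnls(endmembers, p)[0] for p in pixels])

# Step 3: an MLP refines the LMM abundances toward the training proportions,
# absorbing the non-linear mixing that the LMM cannot represent
mlp = MLPRegressor(hidden_layer_sizes=(32,), max_iter=3000, random_state=0)
mlp.fit(lmm_abund, true_abund)
refined = mlp.predict(lmm_abund)

rmse = lambda a, b: np.sqrt(np.mean((a - b) ** 2))
print('LMM RMSE     :', round(rmse(lmm_abund, true_abund), 4))
print('AL-NLMM RMSE :', round(rmse(refined, true_abund), 4))
```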

Relevance: 50.00%

Abstract:

We address the problem of multi-instrument recognition in polyphonic music signals. Individual instruments are modeled within a stochastic framework using Student's-t mixture models (tMMs). We impose a mixture of these instrument models on the polyphonic signal model. No a priori knowledge is assumed about the number of instruments in the polyphony. The mixture weights are estimated in a latent-variable framework from the polyphonic data using an Expectation Maximization (EM) algorithm derived for the proposed approach. The weights are shown to indicate instrument activity. The output of the algorithm is an Instrument Activity Graph (IAG), from which it is possible to determine which instruments are active at a given time. An average F-ratio of 0.75 is obtained for polyphonies containing 2-5 instruments, on an experimental test set of 8 instruments: clarinet, flute, guitar, harp, mandolin, piano, trombone and violin.
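
To illustrate the weight-only EM step, the sketch below uses fixed Gaussian densities as stand-ins for the per-instrument Student's-t mixture models and re-estimates only the mixture weights from the "polyphonic" frames; the resulting weights play the role of instrument activities. The feature space, instrument count and data are all illustrative assumptions.

```python
import numpy as np
from scipy.stats import multivariate_normal

# Hypothetical stand-in: each "instrument model" is a fixed Gaussian over a
# 2-D feature space (the paper fits Student's-t mixture models per instrument);
# only the mixture weights are re-estimated from the polyphonic frames.
rng = np.random.default_rng(1)
instrument_models = [multivariate_normal(mean=m, cov=np.eye(2))
                     for m in ([0, 0], [4, 0], [0, 4], [4, 4])]

# polyphonic frames: a blend of instruments 0 and 2 (instruments 1 and 3 silent)
frames = np.vstack([instrument_models[0].rvs(300, random_state=rng),
                    instrument_models[2].rvs(100, random_state=rng)])

def estimate_activity(frames, models, n_iter=50):
    """EM over mixture weights only: the E-step computes responsibilities under
    the fixed per-instrument densities, the M-step re-normalises their sums."""
    K = len(models)
    w = np.full(K, 1.0 / K)
    dens = np.column_stack([m.pdf(frames) for m in models])   # (N, K), fixed
    for _ in range(n_iter):
        resp = w * dens
        resp /= resp.sum(axis=1, keepdims=True)                # E-step
        w = resp.mean(axis=0)                                  # M-step
    return w

weights = estimate_activity(frames, instrument_models)
print(np.round(weights, 3))   # high weights only for the two active instruments
```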

Relevance: 50.00%

Abstract:

STEEL, the Caltech-created nonlinear large-displacement analysis software, is currently used by a large number of researchers at Caltech. However, due to its complexity and lack of visualization tools (such as pre- and post-processing capabilities), rapid creation and analysis of models using this software was difficult. SteelConverter was created as a means to facilitate model creation through the use of the industry-standard finite element solver ETABS. This software allows users to create models in ETABS and intelligently convert model information such as geometry, loading, releases, fixity, etc., into a format that STEEL understands. Models that would take several days to create and verify now take several hours or less. The productivity of the researcher, as well as the level of confidence in the model being analyzed, is greatly increased.

It has always been a major goal of Caltech to spread the knowledge created here to other universities. However, due to the complexity of STEEL it was difficult for researchers or engineers from other universities to conduct analyses. While SteelConverter did help researchers at Caltech improve their research, sending SteelConverter and its documentation to other universities was less than ideal. Issues of version control, individual computer requirements, and the difficulty of releasing updates made a more centralized solution preferred. This is where the idea for Caltech VirtualShaker was born. Through the creation of a centralized website where users could log in, submit, analyze, and process models in the cloud, all of the major concerns associated with the utilization of SteelConverter were eliminated. Caltech VirtualShaker allows users to create profiles where defaults associated with their most commonly run models are saved, and allows them to submit multiple jobs to an online virtual server to be analyzed and post-processed. The creation of this website not only allowed for more rapid distribution of this tool, but also created a means for engineers and researchers with no access to powerful computer clusters to run computationally intensive analyses without the excessive cost of building and maintaining a computer cluster.

In order to increase confidence in the use of STEEL as an analysis system, as well as to verify the conversion tools, a series of comparisons was made between STEEL and ETABS. Six models of increasing complexity, ranging from a cantilever column to a twenty-story moment frame, were analyzed to determine the ability of STEEL to accurately calculate basic model properties, such as elastic stiffness and damping through a free vibration analysis, as well as more complex structural properties, such as overall structural capacity through a pushover analysis. These analyses showed very strong agreement between the two software packages on every aspect of each analysis. However, they also showed the ability of the STEEL analysis algorithm to converge at significantly larger drifts than ETABS when using the more computationally expensive and structurally realistic fiber hinges. Following the ETABS analysis, it was decided to repeat the comparisons in Perform, a software package more capable of conducting highly nonlinear analysis. These analyses again showed very strong agreement between the two software packages in every aspect of each analysis through instability. However, due to some limitations in Perform, free vibration analyses for the three-story one-bay chevron-brace frame, two-bay chevron-brace frame, and twenty-story moment frame could not be conducted. With the current trend toward ultimate-capacity analysis, the ability to use fiber-based models allows engineers to gain a better understanding of a building's behavior under these extreme load scenarios.

Following this, a final study was conducted on Hall's U20 structure [1], in which the structure was analyzed in all three software packages and the results compared. The pushover curves from each package were compared and the differences caused by variations in software implementation explained. From this, conclusions can be drawn on the effectiveness of each analysis tool when attempting to analyze structures through the point of geometric instability. The analyses show that while ETABS was capable of accurately determining the elastic stiffness of the model, following the onset of inelastic behavior the analysis tool failed to converge. However, for the small number of time steps over which the ETABS analysis was converging, its results exactly matched those of STEEL, leading to the conclusion that ETABS is not an appropriate analysis package for analyzing a structure through the point of collapse when using fiber elements throughout the model. The analyses also showed that while Perform was capable of calculating the response of the structure accurately, restrictions in the material model resulted in a pushover curve that did not match that of STEEL exactly, particularly post-collapse. However, such problems could be alleviated by choosing a simpler material model.

Relevance: 50.00%

Abstract:

As a comparatively newly invented PKM with over-constraints in its kinematic chains, the Exechon has attracted extensive attention from the research community. In contrast to its well-recognized kinematics, analysis of the stiffness characteristics of the Exechon still remains a challenge due to its structural complexity. In order to achieve a thorough understanding of the stiffness characteristics of the Exechon PKM, this paper proposes an analytical kinetostatic model based on the substructure synthesis technique. The whole PKM system is decomposed into a moving platform subsystem, three limb subsystems and a fixed base subsystem, which are connected to one another sequentially through the corresponding joints. Each limb body is modeled as a spatial beam with a uniform cross-section constrained by two sets of lumped springs. The equilibrium equation of each individual limb assemblage is derived through a finite element formulation and combined with that of the moving platform, derived with the Newtonian method, to construct the governing kinetostatic equations of the system after introducing the deformation compatibility conditions between the moving platform and the limbs. By extracting the 6 × 6 block matrix from the inverse of the governing compliance matrix, the stiffness of the moving platform is formulated. The stiffness of the Exechon PKM at a typical configuration, as well as throughout the workspace, is computed quickly with a piece-by-piece partition algorithm. The numerical simulations reveal a strong position dependency of the PKM's stiffness, which is symmetric about a work plane owing to the structural features. In the last stage, the effects of design variables such as structural, dimensional and stiffness parameters on system rigidity are investigated, with the purpose of providing useful information for the structural optimization and performance enhancement of the Exechon PKM. It is worth mentioning that the proposed stiffness modeling methodology can also be applied to other over-constrained PKMs and can efficiently evaluate the global rigidity over the workspace with minor revisions.
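
The block-extraction step can be illustrated on a toy governing matrix: invert it, read off the 6 × 6 compliance block associated with the platform coordinates, and invert that block to obtain the platform stiffness (equivalently, a Schur complement over the limb coordinates). The matrix below is a random positive-definite stand-in, not the Exechon's kinetostatic model.

```python
import numpy as np

# Toy stand-in for the governing kinetostatic system, NOT the Exechon model:
# the first 6 DOFs belong to the moving platform, the rest to the limbs.
rng = np.random.default_rng(2)
n_p, n_l = 6, 18
n = n_p + n_l
A = rng.standard_normal((n, n))
K_gov = A @ A.T + n * np.eye(n)            # symmetric positive-definite stand-in

C_gov = np.linalg.inv(K_gov)               # governing compliance matrix
C_pp = C_gov[:n_p, :n_p]                   # 6 x 6 compliance block of the platform
K_platform = np.linalg.inv(C_pp)           # stiffness seen at the moving platform

# cross-check: the same result follows from the Schur complement over limb DOFs
K_pp, K_pl, K_ll = K_gov[:n_p, :n_p], K_gov[:n_p, n_p:], K_gov[n_p:, n_p:]
K_schur = K_pp - K_pl @ np.linalg.solve(K_ll, K_pl.T)
print(np.allclose(K_platform, K_schur))    # True
```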

Relevance: 50.00%

Abstract:

This master's thesis presents a new unsupervised approach for detecting and segmenting urban areas in hyperspectral images. The proposed method requires three steps. First, in order to reduce the computational cost of our algorithm, a color image of the spectral content is estimated. To this end, a non-linear dimensionality-reduction step, based on two complementary but conflicting criteria of good visualization, namely accuracy and contrast, is carried out to produce a color display of each hyperspectral image. Then, to discriminate urban areas from non-urban areas, the second step consists of extracting a few discriminative (and complementary) features from this color hyperspectral image. To this end, we extracted a series of discriminative parameters describing the characteristics of an urban area, which is mainly composed of man-made objects with simple, regular geometric shapes. We used textural features based on grey levels, gradient magnitude, or parameters derived from the co-occurrence matrix, combined with structural features based on the local orientation of the image gradient and the local detection of line segments. To further reduce the computational complexity of our approach and to avoid the "curse of dimensionality" that arises when clustering high-dimensional data, we decided, in the last step, to classify each textural or structural feature individually with a simple K-means procedure and then to combine these coarse segmentations, obtained at low cost, with an efficient segmentation-map fusion model. The experiments reported here show that this strategy is visually effective and compares favorably with other methods for detecting and segmenting urban areas from hyperspectral images.
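
A minimal sketch of the final stage under simplifying assumptions: each feature image is clustered individually with K-means (k = 2) and the coarse label maps are fused here by a simple majority vote, which only stands in for the segmentation-map fusion model described above. The feature images, image size and synthetic ground truth are all illustrative.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
h, w, n_feat = 64, 64, 4

# synthetic "urban" block in the image centre, visible (noisily) in each feature
truth = np.zeros((h, w), dtype=int)
truth[16:48, 16:48] = 1
features = [truth + 0.5 * rng.standard_normal((h, w)) for _ in range(n_feat)]

seg_maps = []
for f in features:
    # cluster each feature individually with a simple K-means procedure
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
        f.reshape(-1, 1)).reshape(h, w)
    # align label polarity: call the higher-mean cluster "urban" (label 1)
    if f[labels == 1].mean() < f[labels == 0].mean():
        labels = 1 - labels
    seg_maps.append(labels)

fused = (np.mean(seg_maps, axis=0) >= 0.5).astype(int)   # majority-vote fusion
print('pixel agreement with ground truth:', (fused == truth).mean())
```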

Relevance: 50.00%

Abstract:

A mathematical model has been developed which describes the hot deformation and recrystallization behavior of austenite using a single internal variable: dislocation density. The dislocation density is incorporated into equations describing the rates of recovery and recrystallization. In each case no distinction is made between static and dynamic events, and the model is able to simulate multi-deformation processes. The model is statistically based and tracks individual populations of the dislocation density during the work-hardening and softening phases. After tuning with available data, the model gave an accurate prediction of the stress–strain behavior and the static recrystallization kinetics for C–Mn steels. The model correctly predicted the sensitivity of the post-deformation recrystallization behavior to process variables such as strain, strain rate and temperature, even though data for this were not explicitly incorporated in the tuning data set. In particular, the post-dynamic recrystallization (generally termed metadynamic recrystallization) was shown to be largely independent of strain and temperature, but a strong function of strain rate, as observed in published experimental work.
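
As a rough illustration of a single-internal-variable formulation, the sketch below integrates a Kocks-Mecking-type evolution law (storage minus dynamic recovery) for the dislocation density during straining and converts it to flow stress through the Taylor equation. The coefficients are assumed, and the paper's statistical, multi-population treatment and recrystallization kinetics are not reproduced.

```python
import numpy as np

k1, k2 = 3.0e8, 8.0                 # storage and dynamic-recovery coefficients (assumed)
alpha, mu, b = 0.3, 8.1e4, 2.5e-10  # Taylor constant, shear modulus [MPa], Burgers vector [m]

rho = 1.0e12                        # initial dislocation density [m^-2]
d_eps = 1.0e-4                      # strain increment
stress = []
for i in range(5000):               # integrate up to a strain of 0.5
    rho += (k1 * np.sqrt(rho) - k2 * rho) * d_eps   # d(rho)/d(eps) = k1*sqrt(rho) - k2*rho
    stress.append(alpha * mu * b * np.sqrt(rho))    # sigma = alpha * mu * b * sqrt(rho)

print(f'flow stress ~ {stress[-1]:.0f} MPa at a strain of {5000 * d_eps:.1f}')
```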

Relevance: 50.00%

Abstract:

Universities commonly use individual teaching development as one of a suite of strategies to improve teaching and learning outcomes. This paper outlines an individual teaching development programme based on the tenets of solution-focused brief therapy (SFBT). The programme was trialled with a senior lecturer of a large third-year subject in an Australian university. The approach resulted in evidence of positive changes in teaching. The potential for wider application of this approach is considered.

Relevance: 50.00%

Abstract:

In group decision making (GDM) problems, it is natural for decision makers (DMs) to provide different preferences and evaluations owing to varying domain knowledge and cultural values. When the number of DMs is large, a higher degree of heterogeneity is expected, and it is difficult to translate heterogeneous information into one unified preference without loss of context. In this aspect, the current GDM models face two main challenges, i.e., handling the complexity pertaining to the unification of heterogeneous information from a large number of DMs, and providing optimal solutions based on unification methods. This paper presents a new consensus-based GDM model to manage heterogeneous information. In the new GDM model, an aggregation of individual priority (AIP)-based aggregation mechanism, which is able to employ flexible methods for deriving each DM's individual priority and to avoid information loss caused by unifying heterogeneous information, is utilized to aggregate the individual preferences. To reach a consensus more efficiently, different revision schemes are employed to reward/penalize the cooperative/non-cooperative DMs, respectively. The temporary collective opinion used to guide the revision process is derived by aggregating only those non-conflicting opinions at each round of revision. In order to measure the consensus in a robust manner, a position-based dissimilarity measure is developed. Compared with the existing GDM models, the proposed GDM model is more effective and flexible in processing heterogeneous information. It can be used to handle different types of information with different degrees of granularity. Six types of information are exemplified in this paper, i.e., ordinal, interval, fuzzy number, linguistic, intuitionistic fuzzy set, and real number. The results indicate that the position-based consensus measure is able to overcome possible distortions of the results in large-scale GDM problems.
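
A heavily simplified sketch of a position-based dissimilarity between an individual priority vector and the collective one, approximated here by a normalised Spearman-footrule distance over ranking positions. The paper's actual measure, AIP aggregation details and revision schemes are not given in the abstract, so everything below is an illustrative assumption.

```python
import numpy as np

def positions(priority):
    """Rank positions of the alternatives, best (highest priority) = 0."""
    return np.argsort(np.argsort(-np.asarray(priority)))

def position_dissimilarity(p1, p2):
    """Normalised footrule distance between the two ranking position vectors."""
    n = len(p1)
    max_dist = n * n // 2                     # maximum possible footrule distance
    return np.abs(positions(p1) - positions(p2)).sum() / max_dist

# four DMs, individual priority vectors over four alternatives (AIP step assumed done)
priorities = np.array([[0.40, 0.30, 0.20, 0.10],
                       [0.35, 0.35, 0.20, 0.10],
                       [0.10, 0.20, 0.30, 0.40],     # conflicting opinion
                       [0.45, 0.25, 0.20, 0.10]])

collective = priorities.mean(axis=0)          # simple AIP-style aggregation
consensus = [1 - position_dissimilarity(p, collective) for p in priorities]
print(np.round(consensus, 2))                 # the conflicting DM scores lowest
```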

Relevance: 50.00%

Abstract:

Background: Because of the ethical and medico-legal issues involved in training cutaneous surgical skills on living patients, human cadavers and living animals, it is necessary to seek alternative and effective forms of training simulation. Aims: To propose and describe an alternative methodology for teaching and learning the principles of cutaneous surgery in a medical undergraduate program by using a chicken-skin bench model. Materials and Methods: One instructor for every four students, teaching materials on cutaneous surgical skills, chicken trunks, wings, or thighs, a rigid platform support, needled threads, needle holders, surgical blades with scalpel handles, rat-tooth tweezers, scissors, and marking pens were necessary for the training simulation. Results: A proposal for simulation-based training on incision, suture, biopsy, and reconstruction techniques using a chicken-skin bench model, distributed over several sessions and with increasing levels of difficulty, was structured. Both feedback and objective evaluations, always directed to individual students, were also outlined. Conclusion: The proposed teaching methodology for the principles of cutaneous surgery using a chicken-skin bench model, which is versatile, portable, easy to assemble, and inexpensive, is an alternative and complementary option to the armamentarium of methods based on other bench models described. © Indian Journal of Dermatology 2013.

Relevance: 50.00%

Abstract:

In order to achieve host cell entry, the apicomplexan parasite Neospora caninum relies on the contents of distinct organelles, named micronemes, rhoptries and dense granules, which are secreted at defined time points during and after host cell entry. It was shown previously that a vaccine composed of a mixture of three recombinant antigens, corresponding to the two microneme antigens NcMIC1 and NcMIC3 and the rhoptry protein NcROP2, prevented disease and limited cerebral infection and transplacental transmission in mice. In this study, we selected predicted immunogenic domains of each of these proteins and created four different chimeric antigens, with the respective domains incorporated into these chimeras in different orders. Following vaccination, mice were challenged intraperitoneally with 2 × 10^6 N. caninum tachyzoites and were then carefully monitored for clinical symptoms during 4 weeks post-infection. Of the four chimeric antigens, only recNcMIC3-1-R provided complete protection against disease, with 100% survivors, compared to 40-80% survivors in the other groups. Serology did not show any clear differences in total IgG, IgG1 and IgG2a levels between the different treatment groups. Vaccination with all four chimeric variants generated an IL-4-biased cytokine expression, which then shifted to an IFN-γ-dominated response following experimental infection. Sera of recNcMIC3-1-R-vaccinated mice reacted with each individual recombinant antigen, as well as with three distinct bands in Neospora extracts with Mr similar to NcMIC1, NcMIC3 and NcROP2, and exhibited distinct apical labeling in tachyzoites. These results suggest that recNcMIC3-1-R is an interesting chimeric vaccine candidate and should be followed up in subsequent studies in a fetal infection model.

Relevance: 50.00%

Abstract:

BACKGROUND: Short-acting agents for neuromuscular block (NMB) require frequent dosing adjustments to meet individual patients' needs. In this study, we verified a new closed-loop controller for mivacurium dosing in clinical trials. METHODS: Fifteen patients were studied. T1% measured with electromyography was used as the input signal for the model-based controller. After induction of propofol/opiate anaesthesia, stabilization of the baseline electromyography signal was awaited, and a bolus of 0.3 mg kg⁻¹ mivacurium was then administered to facilitate endotracheal intubation. Closed-loop infusion was started thereafter, targeting a neuromuscular block of 90%. Setpoint deviation, the number of manual interventions and surgeons' complaints were recorded. Drug use and its variability between and within patients were evaluated. RESULTS: Median time of closed-loop control for the 11 patients included in the data processing was 135 [89-336] min (median [range]). Four patients had to be excluded because of sensor problems. Mean absolute deviation from the setpoint was 1.8 ± 0.9 T1%. Neither manual interventions nor complaints from the surgeons were recorded. The mean required mivacurium infusion rate was 7.0 ± 2.2 µg kg⁻¹ min⁻¹. Intrapatient variability of mean infusion rates over 30-min intervals showed differences of up to a factor of 1.8 between the highest and lowest requirements in the same patient. CONCLUSIONS: Neuromuscular block can be precisely controlled with mivacurium using our model-based controller. The amount of mivacurium needed to maintain T1% at defined constant levels differed considerably between and within patients. Closed-loop control therefore seems advantageous for automatically maintaining neuromuscular block at constant levels.
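
The study's controller is model-based; purely as an illustration of the closed-loop principle, the sketch below runs a PI-style dosing law against a crude first-order effect surrogate targeting 90% block. The time constant, gain and controller constants are assumptions, not values from the trial.

```python
import numpy as np

# Illustrative only: a simple PI dosing law on a first-order effect surrogate.
dt = 1.0 / 60.0            # time step [min]
tau = 5.0                  # effect-site time constant [min] (assumed)
gain = 12.0                # steady-state block [%] per (microg/kg/min) (assumed)
target = 90.0              # target neuromuscular block, T1 depression [%]

block, integral = 0.0, 0.0
history = []
for step in range(int(120 / dt)):                    # simulate 120 min
    error = target - block
    integral += error * dt
    rate = np.clip(0.05 * error + 0.02 * integral, 0.0, 20.0)   # PI law, clamped
    block += dt / tau * (gain * rate - block)        # first-order effect surrogate
    history.append((step * dt, rate, block))

t, rate, block = history[-1]
print(f'after {t:.0f} min: infusion {rate:.1f} microg/kg/min, block {block:.1f}%')
```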

Relevance: 50.00%

Abstract:

Stochastic model updating must be considered to quantify the uncertainties inherently present in real-world engineering structures. By this means the statistical properties, instead of deterministic values, of structural parameters can be sought, indicating the parameter variability. However, the implementation of stochastic model updating is much more complicated than that of deterministic methods, particularly in terms of theoretical complexity and low computational efficiency. This study attempts to propose a simple and cost-efficient method by decomposing a stochastic updating process into a series of deterministic ones with the aid of response surface models and Monte Carlo simulation. The response surface models are used as surrogates for the original FE models in the interest of programming simplification, fast response computation and easy inverse optimization. Monte Carlo simulation is adopted to generate samples from the assumed or measured probability distributions of the responses. Each sample corresponds to an individual deterministic inverse process predicting deterministic values of the parameters. The parameter means and variances can then be statistically estimated from all the parameter predictions obtained by running all the samples. Meanwhile, the analysis-of-variance approach is employed to evaluate the significance of parameter variability. The proposed method has been demonstrated first on a numerical beam and then on a set of nominally identical steel plates tested in the laboratory. It is found that, compared with existing stochastic model updating methods, the proposed method offers similar accuracy, while its primary merits are its simple implementation and cost efficiency in response computation and inverse optimization.
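
A compact sketch of the decomposition idea under assumed names and a toy two-parameter "FE model": fit a quadratic response surface from a few model runs, draw Monte Carlo samples of the measured responses, solve one deterministic inverse problem per sample on the cheap surrogate, and estimate the parameter means and variances from the resulting predictions. None of the model forms or numbers come from the paper.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(4)

def fe_model(theta):                       # stand-in for the expensive FE run
    k1, k2 = theta
    return np.array([np.sqrt(k1), np.sqrt(k1 + 2.0 * k2)])

def basis(theta):                          # quadratic response-surface terms
    k1, k2 = theta
    return np.array([1.0, k1, k2, k1 * k2, k1 ** 2, k2 ** 2])

# 1) fit the response surface from a small design of FE evaluations
design = rng.uniform(0.5, 2.0, size=(30, 2))
Phi = np.array([basis(t) for t in design])
coeffs, *_ = np.linalg.lstsq(Phi, np.array([fe_model(t) for t in design]), rcond=None)
surrogate = lambda theta: basis(theta) @ coeffs

# 2) Monte Carlo samples of the "measured" responses (assumed distribution)
true_theta = np.array([1.2, 0.8])
samples = fe_model(true_theta) + 0.02 * rng.standard_normal((200, 2))

# 3) one deterministic inverse problem per sample, using the cheap surrogate
estimates = np.array([
    least_squares(lambda th: surrogate(th) - y, x0=[1.0, 1.0],
                  bounds=([0.5, 0.5], [2.0, 2.0])).x
    for y in samples])

print('parameter means    :', estimates.mean(axis=0))
print('parameter std devs :', estimates.std(axis=0))
```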

Relevance: 50.00%

Abstract:

Purpose: A fully three-dimensional (3D), massively parallelizable list-mode ordered-subsets expectation-maximization (LM-OSEM) reconstruction algorithm has been developed for high-resolution PET cameras. System response probabilities are calculated online from a set of parameters derived from Monte Carlo simulations. The shape of the system response for a given line of response (LOR) has been shown to be asymmetrical around the LOR. This work has focused on the development of efficient region-search techniques to sample the system response probabilities, suitable for asymmetric kernel models, including elliptical Gaussian models, that allow for high accuracy and high parallelization efficiency. The novel region-search scheme using variable kernel models is applied in the proposed PET reconstruction algorithm. Methods: A novel region-search technique has been used to sample the probability density function over a small dynamic subset of the field of view that constitutes the region of response (ROR). The ROR is identified around the LOR by searching for any voxel within a dynamically calculated contour. The contour condition is currently defined as a fixed threshold over the posterior probability, and arbitrary kernel models can be applied using a numerical approach. The processing of the LORs is distributed in batches among the available computing devices, and individual LORs are then processed by different processing units. In this way, both multicore and multiple many-core processing units can be exploited efficiently. Tests have been conducted with probability models that take into account the non-collinearity, positron range, and crystal penetration effects, which produced tubes of response with varying elliptical sections whose axes were a function of the crystal's thickness and the angle of incidence of the given LOR. The algorithm treats the probability model as a 3D scalar field defined within a reference system aligned with the ideal LOR. Results: The new technique provides superior image quality in terms of signal-to-noise ratio compared with the histogram-mode method based on precomputed system matrices available for a commercial small-animal scanner. Reconstruction times can be kept low with the use of multicore and many-core architectures, including multiple graphics processing units. Conclusions: A highly parallelizable LM reconstruction method has been proposed, based on Monte Carlo simulations and new parallelization techniques aimed at improving the reconstruction speed and the image signal-to-noise ratio of a given OSEM algorithm. The method has been validated using simulated and real phantoms. A particular advantage of the new method is the possibility of dynamically defining the cut-off threshold over the calculated probabilities, thus allowing direct control of the trade-off between speed and quality during the reconstruction.
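
A toy one-dimensional list-mode MLEM sketch: each detected event carries a short, normalised response row truncated to its region of response, and the standard list-mode update is iterated. The asymmetric elliptical kernels, ordered subsets, sensitivity modelling and GPU batching described above are not reproduced; all sizes and counts are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
n_vox = 32
truth = np.zeros(n_vox)
truth[10:14] = 5.0
truth[22] = 8.0

def response_row(center, sigma=1.5, ror_halfwidth=4):
    """Normalised response probabilities of one event, truncated to its ROR."""
    x = np.arange(n_vox)
    row = np.exp(-0.5 * ((x - center) / sigma) ** 2)
    row[np.abs(x - center) > ror_halfwidth] = 0.0
    return row / row.sum()

# simulate list-mode data: Poisson event counts per emitting voxel,
# then one response row per detected event
counts = rng.poisson(truth * 20)
A = np.array([response_row(j) for j, c in enumerate(counts) for _ in range(c)])

sensitivity = np.ones(n_vox)       # idealised uniform detection sensitivity
img = np.ones(n_vox)
for _ in range(50):                # basic list-mode MLEM iterations
    expected = A @ img                           # forward projection per event
    img *= (A.T @ (1.0 / expected)) / sensitivity

print('reconstructed counts near the two sources:',
      img[10:14].sum().round(1), img[22].round(1))
```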

Relevance: 50.00%

Abstract:

We present an approach for evaluating the efficacy of combination antitumor agent schedules that accounts for order and timing of drug administration. Our model-based approach compares in vivo tumor volume data over a time course and offers a quantitative definition for additivity of drug effects, relative to which synergism and antagonism are interpreted. We begin by fitting data from individual mice receiving at most one drug to a differential equation tumor growth/drug effect model and combine individual parameter estimates to obtain population statistics. Using two null hypotheses: (i) combination therapy is consistent with additivity or (ii) combination therapy is equivalent to treating with the more effective single agent alone, we compute predicted tumor growth trajectories and their distribution for combination treated animals. We illustrate this approach by comparing entire observed and expected tumor volume trajectories for a data set in which HER-2/neu-overexpressing MCF-7 human breast cancer xenografts are treated with a humanized, anti-HER-2 monoclonal antibody (rhuMAb HER-2), doxorubicin, or one of five proposed combination therapy schedules.
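
An illustrative sketch of the additivity null hypothesis under an assumed model form (exponential tumour growth, each drug adding an independent, exponentially decaying kill rate after its dose); the paper's actual growth/effect equations, fitted population parameters and schedules are not reproduced.

```python
import numpy as np
from scipy.integrate import solve_ivp

g = 0.08                                  # intrinsic growth rate [1/day] (assumed)
drugs = {'A': dict(k=0.25, half_life=1.0, dose_day=3.0),
         'B': dict(k=0.15, half_life=2.0, dose_day=5.0)}

def kill_rate(t, d):
    """Exponentially decaying kill rate contributed by one drug after its dose."""
    decay = np.log(2.0) / d['half_life']
    return d['k'] * np.exp(-decay * (t - d['dose_day'])) * (t >= d['dose_day'])

def dVdt(t, V, active):
    # additivity null hypothesis: the single-agent kill rates simply add
    return (g - sum(kill_rate(t, drugs[name]) for name in active)) * V

t_eval = np.linspace(0.0, 21.0, 200)
def trajectory(active):
    sol = solve_ivp(dVdt, (0.0, 21.0), [100.0], args=(active,),
                    t_eval=t_eval, max_step=0.1)
    return sol.y[0]

v_control = trajectory([])
v_combo_additive = trajectory(['A', 'B'])
print('day-21 volume, control            :', round(v_control[-1], 1))
print('day-21 volume, additive A+B model :', round(v_combo_additive[-1], 1))
```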