966 results for Curriculum approaches
Abstract:
Applications in various domains often produce very large and frequently high-dimensional data. Successful algorithms must avoid the curse of dimensionality while remaining computationally efficient. Finding useful patterns in large datasets has attracted considerable interest recently. The primary goal of the paper is to implement an efficient hybrid tree-based clustering method that combines a CF-Tree and a KD-Tree, and to couple the clustering with KNN classification. The implementation must address several concerns: good accuracy, low memory use and low running time. We evaluate time and space efficiency, sensitivity to data input order, and clustering quality through several experiments.
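The cluster-then-classify idea can be sketched with off-the-shelf components: scikit-learn's Birch (a CF-Tree based clusterer) followed by a KD-Tree backed KNN classifier. This is only an illustrative sketch of the combination described above, not the paper's own hybrid CF-Tree/KD-Tree implementation; the synthetic data and parameter values are assumptions.

```python
# Minimal sketch: CF-Tree based clustering (Birch) followed by KD-Tree backed
# KNN classification. Illustrative only; not the paper's hybrid algorithm.
import numpy as np
from sklearn.cluster import Birch
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
# Synthetic high-dimensional data with three well-separated groups (assumed).
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(200, 10))
               for c in (0.0, 3.0, 6.0)])

# Step 1: CF-Tree based clustering compresses the data into subclusters.
clusterer = Birch(threshold=0.8, n_clusters=3)
labels = clusterer.fit_predict(X)

# Step 2: a KD-Tree backed KNN classifier assigns new points to the
# discovered clusters without rescanning the full dataset.
knn = KNeighborsClassifier(n_neighbors=5, algorithm="kd_tree")
knn.fit(X, labels)

new_points = rng.normal(loc=3.0, scale=0.5, size=(5, 10))
print(knn.predict(new_points))
```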
Abstract:
A major challenge in wireless communications is overcoming the deleterious effects of fading, a phenomenon largely responsible for the seemingly inevitable dropped call. Multiple-antenna communication systems, commonly referred to as MIMO systems, employ multiple antennas at both transmitter and receiver, thereby creating a multitude of signalling pathways between them. These multiple pathways give the signal a diversity advantage with which to combat fading. Apart from helping overcome the effects of fading, MIMO systems can also be shown to provide a manyfold increase in the amount of information that can be transmitted from transmitter to receiver. Not surprisingly, MIMO has played, and continues to play, a key role in the advancement of wireless communication. Space-time codes refer to a signalling format in which information about the message is dispersed across both the spatial (or antenna) and time dimensions. Algebraic techniques, drawing on structures such as rings, fields and algebras, have been extensively employed in the construction of optimal space-time codes that enable the potential of MIMO communication to be realized, some of which have found their way into the IEEE wireless communication standards. In this tutorial article, reflecting the authors' interests in this area, we survey some of these techniques.
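The simplest instance of the signalling format described above is the classic Alamouti space-time block code, in which two symbols are spread across two antennas and two time slots. The NumPy sketch below illustrates that textbook example only; it is not one of the algebraic ring/field constructions surveyed in the article, and the channel and symbols are arbitrary.

```python
# Textbook Alamouti 2x2 space-time block code: two symbols dispersed across
# two transmit antennas and two time slots, with simple linear combining at
# a single receive antenna. Illustrative example, not the article's codes.
import numpy as np

def alamouti_encode(s1, s2):
    """Return the 2x2 codeword: rows are time slots, columns are antennas."""
    return np.array([[s1,             s2],
                     [-np.conj(s2), np.conj(s1)]])

def alamouti_decode(y, h):
    """Combine the two received samples y (over two slots) given channel h = [h1, h2].
    Returns estimates of (s1, s2) scaled by |h1|^2 + |h2|^2."""
    h1, h2 = h
    s1_hat = np.conj(h1) * y[0] + h2 * np.conj(y[1])
    s2_hat = np.conj(h2) * y[0] - h1 * np.conj(y[1])
    return s1_hat, s2_hat

# QPSK symbols, a random flat-fading channel, no noise for clarity.
s1, s2 = (1 + 1j) / np.sqrt(2), (1 - 1j) / np.sqrt(2)
h = (np.random.randn(2) + 1j * np.random.randn(2)) / np.sqrt(2)
y = alamouti_encode(s1, s2) @ h
print(alamouti_decode(y, h))   # proportional to (s1, s2)
```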
Abstract:
Results from elasto-plastic numerical simulations of jointed rocks using both the equivalent continuum and discrete continuum approaches are presented and compared with experimental measurements. Initially, triaxial compression tests on different types of rocks with wide variation in uniaxial compressive strength are simulated using both approaches and the results are compared. The applicability and the relative merits and limitations of both approaches for the simulation of jointed rocks are discussed. It is observed that both approaches are reasonably good at predicting the real response. However, the equivalent continuum approach predicted somewhat higher stiffness values at low strains. Considering the modelling effort involved in the discrete continuum approach, it is suggested that for problems with complex geometry a proper equivalent continuum model can be used without compromising much on the accuracy of the results. The numerical analysis of a tunnel in Japan is then taken up using the continuum approach. The predicted deformations compare well with the field measurements and with the predictions from discontinuum analysis.
Abstract:
The titled approaches were effected with various 2-substituted benzoylacetic acid oximes 3 (Beckmann) and 2-substituted malonamic acids 9 (Hofmann), their carboxyl groups being masked as a 2,4,10-trioxaadamantane unit (an orthoacetate). The oxime mesylates have been rearranged with basic Al2O3 in refluxing CHCl3, and the malonamic acids with phenyliodoso acetate and KOH/MeOH. Both routes are characterized by excellent overall yields. Structure confirmation of final products was conducted with X-ray diffraction in selected cases. The final N-benzoyl and N-(methoxycarbonyl) products are alpha-amino acids with both carboxyl and amino protection; hence, they are of great interest in peptide synthesis.
Assessment of seismic hazard and liquefaction potential of Gujarat based on probabilistic approaches
Abstract:
Gujarat is one of the fastest-growing states of India, with high industrial activity coming up in its major cities. It is indispensable to analyse seismic hazard, as the region is considered the most seismically active in the stable continental region of India. The Bhuj earthquake of 2001 caused extensive damage in terms of casualties and economic loss. In the present study, the seismic hazard of Gujarat is evaluated using a probabilistic approach within a logic tree framework that minimizes the uncertainties in hazard assessment. The peak horizontal acceleration (PHA) and spectral acceleration (Sa) values were evaluated for 10 and 2% probabilities of exceedance in 50 years. Two important geotechnical effects of earthquakes, site amplification and liquefaction, are also evaluated, considering site characterization based on site classes. The liquefaction return period for the entire state of Gujarat is evaluated using a performance-based approach. The maps of PHA and PGA values prepared in this study are very useful for seismic hazard mitigation of the region in the future.
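Hazard levels such as "10% and 2% probability of exceedance in 50 years" correspond, under the usual Poisson assumption, to mean return periods of roughly 475 and 2475 years. The short sketch below shows only that standard conversion; it is not the authors' logic tree or performance-based computation.

```python
# Standard Poisson-model conversion between probability of exceedance over an
# exposure time and mean return period: P = 1 - exp(-T / TR)  =>  TR = -T / ln(1 - P).
# This reproduces the familiar ~475-year and ~2475-year hazard levels.
import math

def return_period(prob_exceedance, exposure_years):
    return -exposure_years / math.log(1.0 - prob_exceedance)

for p in (0.10, 0.02):
    print(f"{p:.0%} in 50 years -> return period ~ {return_period(p, 50):.0f} years")
```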
Abstract:
Introduction: Advances in genomics technologies are providing very large amounts of data on genome-wide gene expression profiles, protein molecules and their interactions with other macromolecules and metabolites. Molecular interaction networks provide a useful way to capture this complex data and comprehend it. Networks are beginning to be used in many steps of the modern drug discovery pipeline, with large-scale molecular networks being particularly useful for understanding the molecular basis of disease. Areas covered: The authors discuss network approaches used for drug target discovery and lead identification in the drug discovery pipeline. By reconstructing networks of targets, drugs and drug candidates as well as gene expression profiles under normal and disease conditions, the paper illustrates how it is possible to find relationships between different diseases, find biomarkers, explore drug repurposing and study the emergence of drug resistance. Furthermore, the authors also look at networks which address particularly important aspects such as off-target effects, combination targets, mechanism of drug action and drug safety. Expert opinion: The network approach represents another paradigm shift in drug discovery science. A network approach provides a fresh perspective, placing important proteins in the context of their cellular environments and providing a rational basis for deriving useful strategies in drug design. Besides drug target identification and inferring mechanisms of action, networks enable us to address new ideas that could prove extremely useful for new drug discovery, such as drug repositioning, drug synergy, polypharmacology and personalized medicine.
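As a toy illustration of the network idea, a bipartite drug-target graph can be queried for drugs whose targets overlap a disease gene set, a crude proxy for repurposing candidates. The drug and gene names below are hypothetical placeholders, not real annotations, and the scoring is a deliberately simple assumption rather than any method from the article.

```python
# Toy sketch: build a bipartite drug-target graph and rank drugs by overlap
# between their targets and a disease gene set. Names are hypothetical.
import networkx as nx

drug_targets = {
    "drugA": {"GENE1", "GENE2"},
    "drugB": {"GENE2", "GENE3"},
    "drugC": {"GENE4"},
}
disease_genes = {"GENE2", "GENE3", "GENE5"}

G = nx.Graph()
for drug, targets in drug_targets.items():
    G.add_node(drug, kind="drug")
    for t in targets:
        G.add_node(t, kind="target")
        G.add_edge(drug, t)

# Rank drugs by how many of their targets lie in the disease gene set.
scores = {d: len(ts & disease_genes) for d, ts in drug_targets.items()}
for drug, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(drug, score)
```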
Abstract:
For one-dimensional flexible objects such as ropes, chains and hair, the assumption of constant length is realistic for large-scale 3D motion. Moreover, when the motion or disturbance at one end gradually dies down along the curve defining the one-dimensional flexible object, the motion appears "natural". This paper presents a purely geometric and kinematic approach for deriving more natural, length-preserving transformations of planar and spatial curves. Techniques from variational calculus are used to determine analytical conditions, and it is shown that the velocity at any point on the curve must be along the tangent at that point to preserve length and to yield the feature of diminishing motion. It is shown that for the special case of a straight line, the analytical conditions lead to the classical tractrix curve solution. Since analytical solutions exist for a tractrix curve, the motion of a piecewise linear curve can be solved in closed form and thus applied to the resolution of redundancy in hyper-redundant robots. Simulation results for several planar and spatial curves and various input motions of one end are used to illustrate the features of motion damping and eventual alignment with the perturbation vector.
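A minimal discrete sketch of the tractrix-like behaviour is a "follow-the-leader" update for a piecewise linear chain: after the head of each fixed-length link moves, its tail is placed back on the line joining the old tail to the new head, so link lengths are preserved exactly and the disturbance dies down along the chain. This is only an approximation of the tangent-velocity condition stated above under assumed step sizes, not the authors' closed-form variational solution.

```python
# Discrete follow-the-leader sketch of length-preserving, tractrix-like motion:
# drag the head, then re-place each tail along the line to its (new) leader.
import numpy as np

def drag_chain(joints, head_target, link_len):
    """Move the first joint to head_target and propagate down the chain."""
    joints = joints.copy()
    joints[0] = head_target
    for i in range(1, len(joints)):
        direction = joints[i] - joints[i - 1]          # old tail -> new leader position
        direction /= np.linalg.norm(direction)
        joints[i] = joints[i - 1] + link_len * direction   # link length preserved exactly
    return joints

# A straight 5-link planar chain whose head is pulled sideways in small steps.
chain = np.array([[i, 0.0] for i in range(6)], dtype=float)
for step in range(20):
    chain = drag_chain(chain, chain[0] + np.array([0.0, 0.1]), link_len=1.0)
print(np.round(chain, 2))   # displacement diminishes along the chain
```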
Abstract:
Background: Muscular dystrophies comprise a number of juvenile and adult forms of complex disorders that generally cause weakness or functional defects in skeletal muscles or, in some forms, extensively affect other types of tissue throughout the body. In previous studies, it was observed that muscular dystrophy is accompanied by immune inflammation caused by the invasion of inflammatory cells bearing T lymphocyte markers (CD8+/CD4+). Inflammatory processes play a major part in muscular fibrosis in muscular dystrophy patients. Additionally, a significant decrease in the amounts of two myogenic recovery factors (myogenic differentiation 1 [MyoD] and myogenin) was observed in animal models. The drug glatiramer acetate increases anti-inflammatory cytokines and induces T helper (Th) cells through an as yet unknown mechanism. The recovery activity of MyoD in muscle cells justifies using it alongside this drug. Methods: In this study, a nanolipodendrosome carrier was designed as a drug delivery system. The purpose of the system was to maximize the delivery and efficiency of the two drug factors, MyoD and myogenin, and introduce them as novel therapeutic agents in muscular dystrophy phenotypic mice. The generation of new muscle cells was analyzed in SW1 mice. Then, immune system changes and probable side effects after injecting the nanodrug formulations were investigated. Results: Compared with the nandrolone control drug, the lipodendrimer nanocarrier loaded with the candidate drugs caused a significant increase in muscle mass and a reduction in CD4+/CD8+ inflammation markers, with no significant toxicity observed. The results support the hypothesis that the nanolipodendrimer containing the two candidate drugs will probably be an efficient means of ameliorating muscular degeneration, and warrants further investigation.
Abstract:
Cotton is a widely used raw material for textiles, but drawbacks related to its poor mechanical properties often limit its application as a functional material. The present investigation involved process development for one-step coating of cotton with silver nanoparticles (SNP) synthesized using Azadirachta indica and Citrus limon extracts to develop functional textiles. Addition of starch to the functional textiles led to efficient binding of the nanoparticles to the fabric and to a drastic decrease in the release of silver from the fabricated textiles after ten washing cycles, enhancing their environmental friendliness. Differential scanning calorimetry, scanning electron microscopy, FT-IR analysis and mechanical studies demonstrated efficient binding of the nanoparticles to the fabric through bio-based processes. The functionalized textiles developed by the bio-based methods showed significant antibacterial activity against E. coli and S. aureus (with 99% microbial reduction). The present work offers a simple procedure for coating SNP using bio-based approaches, with promising applications in specialized functions.
Abstract:
The tonic is a fundamental concept in Indian art music. It is the base pitch that an artist chooses in order to construct the melodies during a rāg(a) rendition, and all accompanying instruments are tuned using the tonic pitch. Consequently, tonic identification is a fundamental task for most computational analyses of Indian art music, such as intonation analysis, melodic motif analysis and rāga recognition. In this paper we review existing approaches for tonic identification in Indian art music and evaluate them on six diverse datasets for a thorough comparison and analysis. We study the performance of each method in different contexts, such as the presence or absence of additional metadata, the quality of the audio data, the duration of the audio data, the music tradition (Hindustani/Carnatic) and the gender of the singer (male/female). We show that the approaches that combine multi-pitch analysis with machine learning provide the best performance in most cases (90% identification accuracy on average) and are robust across the aforementioned contexts compared with the approaches based on expert knowledge. In addition, we show that the performance of the latter can be improved when additional metadata is available to further constrain the problem. Finally, we present a detailed error analysis of each method, providing further insights into the advantages and limitations of each.
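A toy sketch of the pitch-histogram flavour of tonic identification (the expert-knowledge style referred to above) is to fold a pitch track into a single octave, histogram it, and read the tonic off the dominant bin. The sketch below uses a synthetic pitch track and an assumed reference frequency; real systems, as discussed above, rely on multi-pitch analysis and machine learning.

```python
# Toy pitch-histogram tonic estimator: fold pitch estimates into one octave
# (in cents relative to a reference), histogram, and pick the dominant bin.
# Synthetic input; illustrative only.
import numpy as np

def estimate_tonic_hz(pitch_track_hz, ref_hz=110.0, n_bins=120):
    """Return a crude tonic estimate, folded into [ref_hz, 2*ref_hz)."""
    pitches = np.asarray([p for p in pitch_track_hz if p > 0])   # drop unvoiced frames
    cents = 1200.0 * np.log2(pitches / ref_hz) % 1200.0          # fold into one octave
    hist, edges = np.histogram(cents, bins=n_bins, range=(0.0, 1200.0))
    peak_cents = edges[np.argmax(hist)]
    return ref_hz * 2 ** (peak_cents / 1200.0)

# Synthetic pitch track hovering around a tonic of ~146.8 Hz and its fifth.
rng = np.random.default_rng(1)
track = np.concatenate([146.8 * 2 ** (rng.normal(0, 0.01, 300)),
                        220.2 * 2 ** (rng.normal(0, 0.01, 100))])
print(round(estimate_tonic_hz(track), 1))   # close to 146.8 Hz
```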