Abstract:
The aim of the study was to determine distribution and depletion patterns of intramyocellular lipids (IMCL) in leg muscles before and after two types of standardized endurance exercise. ¹H-magnetic resonance spectroscopic imaging was performed (1) in the thigh of eight trained cyclists after exercising on an ergometer for 3 h at 52 ± 8% of maximal workload and (2) in the lower leg of eight trained runners after exercising on a treadmill for 3 h at 49 ± 3% of maximal speed. Pre-exercise IMCL contents were reduced post-exercise in 11 of the 13 investigated upper and lower leg muscles (P < 0.015 for all). A strong linear correlation, with a slope of ∼0.5, was found between pre-exercise IMCL content and IMCL depletion. IMCL depletion differed strongly between muscles: absolute and relative IMCL reductions were significantly higher in muscles with predominantly slow fibers than in those with fast fibers. Creatine levels and fiber orientation were stable and unchanged after exercise, while trimethyl-ammonium groups increased; this is presented in the accompanying paper. In conclusion, a systematic comparison of metabolic changes in cross sections of the upper and lower leg was performed. The results imply that pre-exercise IMCL levels determine the degree of IMCL depletion after exercise.
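The reported linear relation, depletion ≈ 0.5 × pre-exercise IMCL, is just an ordinary least-squares slope; the sketch below illustrates the fit with invented IMCL values chosen to follow that slope, not the study's measurements.

```python
# Ordinary least-squares fit of IMCL depletion against pre-exercise IMCL.
# The data points are synthetic, chosen to follow the ~0.5 slope the
# abstract reports; they are NOT the study's measurements.
def ols_slope(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

pre_imcl = [2.0, 4.0, 6.0, 8.0]      # hypothetical pre-exercise IMCL (a.u.)
depletion = [1.0, 2.0, 3.0, 4.0]     # depletion ≈ 0.5 × pre-exercise level

slope = ols_slope(pre_imcl, depletion)
print(slope)  # → 0.5
```

A slope of 0.5 means a muscle starting with twice the IMCL of another is expected to deplete twice as much in absolute terms.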
Abstract:
To compare the 1-year cost-effectiveness of therapeutic assertive community treatment (ACT) with standard care in schizophrenia. ACT was specifically developed for patients with schizophrenia, was delivered by psychosis experts highly trained in the respective psychotherapies, and was embedded in an integrated care system.
Abstract:
Utilizing advanced information technology, Intensive Care Unit (ICU) remote monitoring allows highly trained specialists to oversee a large number of patients at multiple sites on a continuous basis. In the current research, we conducted a time-motion study of registered nurses’ work in an ICU remote monitoring facility. Data were collected on seven nurses through 40 hours of observation. The results showed that nurses’ essential tasks were centered on three themes: monitoring patients, maintaining patients’ health records, and managing technology use. In monitoring patients, nurses spent 52% of the time assimilating information embedded in a clinical information system and 15% on monitoring live vitals. System-generated alerts frequently interrupted nurses in their task performance and redirected them to manage suddenly appearing events. These findings provide insight into nurses’ workflow in a new, technology-driven critical care setting and have important implications for system design, work engineering, and personnel selection and training.
Abstract:
Rhythm is a central characteristic of music and speech, the most important domains of human communication using acoustic signals. Here, we investigated how rhythmical patterns in music are processed in the human brain, and, in addition, evaluated the impact of musical training on rhythm processing. Using fMRI, we found that deviations from a rule-based regular rhythmic structure activated the left planum temporale together with Broca's area and its right-hemispheric homolog across subjects, that is, a network also crucially involved in the processing of harmonic structure in music and the syntactic analysis of language. Comparing the BOLD responses to rhythmic variations between professional jazz drummers and musical laypersons, we found that only highly trained rhythmic experts show additional activity in left-hemispheric supramarginal gyrus, a higher-order region involved in processing of linguistic syntax. This suggests an additional functional recruitment of brain areas usually dedicated to complex linguistic syntax processing for the analysis of rhythmical patterns only in professional jazz drummers, who are especially trained to use rhythmical cues for communication.
Abstract:
Physical activity can modify body composition and bone mineralization. Both variables were compared in competitive female gymnasts and in age- and sex-matched controls who practised recreational gymnastics (n = 12 per group; ages 9 to 14 years). Body composition was assessed by anthropometric methods and by dual-energy X-ray absorptiometry (DXA). Bone mineral content and bone mineral density were measured by DXA in the whole body and the lumbar spine. Calcium intake was estimated by dietary survey.
Abstract:
The study of materials, especially biological ones, by non-destructive means is becoming increasingly important in both scientific and industrial applications. The economic advantages of non-destructive methods are numerous. Many physical procedures are capable of extracting detailed information from the wood surface with little or no prior treatment and minimal intrusion into the material. Among the various methods, optical and acoustic techniques stand out for their great versatility, relative simplicity and low cost. Starting from simple physical principles and direct surface measurement, and through the development of the most suitable statistics-based decision algorithms, this thesis aims to establish simple and essentially minimum-cost technological solutions for determining the species and the surface defects of each wood sample while, as far as possible, leaving its working geometry unaltered. Three lines of analysis were developed. The first optical method uses the properties of the light scattered by the wood surface when it is illuminated by a diffuse laser. This scattering produces a luminous speckle whose statistical properties allow very precise characteristics of both the microscopic and the macroscopic structure of the wood to be extracted. Analysis of the spectral properties of the scattered laser light generates more or less regular patterns related to the anatomical structure, composition, processing and surface texture of the wood under study, revealing characteristics of the material or of the quality of the processes it has undergone. The use of this type of laser also makes it possible to monitor industrial processes in real time and at a distance without interfering with other sensors.
The second optical technique employs the statistical and mathematical study of the properties of digital images of the wood surface obtained with a high-resolution scanner. After the most relevant details of the images are isolated, automatic classification algorithms build databases of the wood species to which the images belong, together with the error margins of those classifications. A fundamental part of the classification tools is based on the precise study of the colour bands of the various woods. Finally, a number of acoustic techniques, such as the analysis of acoustic impact pulses, complement and refine the results obtained with the optical methods described, identifying surface and deep structures in the wood as well as pathologies or deformations, aspects of particular utility when wood is used in structures. The usefulness of these techniques is amply demonstrated in industry, although their application is not yet widespread because of high costs and a lack of standardization of the procedures, which means that one analysis cannot be compared with its theoretical market equivalent. At present, much research effort takes it for granted that differentiating between species is a recognition ability peculiar to human beings, and concentrates the technology on defining physical parameters (moduli of elasticity, electrical or acoustic conductivity, etc.), using very expensive instruments that are in many cases complex to use in the field.
Abstract: The study of materials, especially biological ones, by non-destructive techniques is becoming increasingly important in both scientific and industrial applications.
The economic advantages of non-destructive methods are multiple and clear, given the costs and resources involved. Many physical processes are capable of extracting detailed information from the wood surface with little or no prior treatment and minimal intrusion into the material. Among the various methods, acoustic and optical techniques stand out for their great versatility, relative simplicity and low cost. Starting from simple physical principles and direct surface measurement, and through the development of the most appropriate statistics-based decision algorithms, this thesis aims to establish simple, minimum-cost technological solutions for determining the species and the surface defects of each wood sample; achieving reasonable accuracy without altering the sample's working geometry or properties is the main objective. There are three lines of work. Empirical characterization of wood surfaces by means of iterative autocorrelation of laser speckle patterns: a simple and inexpensive method for the qualitative characterization of wood surfaces is presented. It is based on the iterative autocorrelation of laser speckle patterns produced by diffuse laser illumination of the wood surfaces. The method exploits the high spatial-frequency content of speckle images; a similar approach with raw conventional photographs taken under ordinary light would be very difficult. A few iterations of the algorithm, typically three or four, are needed to visualize the most important periodic features of the surface. The processed patterns help in studying surface parameters, designing new scattering models and classifying the wood species. Fractal-based image enhancement techniques inspired by differential interference contrast microscopy: differential interference contrast microscopy is a very powerful optical technique for microscopic imaging.
Inspired by the physics of this type of microscope, we have developed a series of image-processing algorithms aimed at the magnification, noise reduction, contrast enhancement and tissue analysis of biological samples. These algorithms use fractal convolution schemes that provide fast and accurate results, with performance comparable to the best current image-enhancement algorithms. They can be used as post-processing tools for advanced microscopy or as a means of improving the performance of less expensive visualization instruments. Several examples of the use of these algorithms to visualize microscopic images of raw pine wood samples acquired with a simple desktop scanner are provided. Wood species identification using stress-wave analysis in the audible range: stress-wave analysis is a powerful and flexible technique for studying the mechanical properties of many materials. We present a simple technique for obtaining information about the species of wood samples using audible stress-wave sounds generated by collision with a small pendulum. Stress-wave analysis has been used for flaw detection and quality control for decades, but its use for material identification and classification is less often cited in the literature. Accurate wood species identification is a time-consuming task for highly trained human experts, so the development of cost-effective techniques for automatic wood classification is a desirable goal. Our proposed approach is fully non-invasive and non-destructive, significantly reducing the cost and complexity of the identification and classification process.
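The iterative autocorrelation idea can be sketched in miniature on a 1-D signal: repeatedly autocorrelating a periodic intensity profile preserves and emphasizes its dominant period, which is the mechanism the speckle method exploits in 2-D. This is a schematic illustration of the principle, with a synthetic square-wave "grain" profile, not the thesis's actual image pipeline.

```python
# 1-D sketch of iterative autocorrelation: a periodic surface profile keeps
# its dominant period through repeated (circular) autocorrelation.  The
# square-wave signal stands in for a periodic grain pattern; it is invented.
def circular_autocorr(x):
    n = len(x)
    ac = [sum(x[i] * x[(i + k) % n] for i in range(n)) for k in range(n)]
    peak = ac[0] if ac[0] else 1.0
    return [v / peak for v in ac]          # normalize so lag 0 equals 1

def dominant_period(x):
    ac = circular_autocorr(x)
    lags = range(1, len(x) // 2 + 1)
    return max(lags, key=lambda k: ac[k])  # lag of the strongest repeat

# Zero-mean square wave with period 8 over 64 samples.
signal = ([1.0] * 4 + [-1.0] * 4) * 8

for _ in range(3):                         # "typically three or four" iterations
    print(dominant_period(signal))         # → 8 each time
    signal = circular_autocorr(signal)
```

Each pass turns the signal into its own autocorrelation, so periodic structure survives while uncorrelated detail is progressively averaged out.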
Abstract:
Concept maps are graphical tools that make it possible to represent a student's mental models. Because of this capacity, the concept map can be used as a knowledge-assessment tool. Using this tool in the classroom generates working-memory loads that may relate to the content (intrinsic load) or to the way the resource is used in the classroom (extraneous load). This study investigates how concept maps support the assessment of learning about climate change in the course Ciências da Natureza: Ciência, Cultura e Sociedade, offered at the Universidade de São Paulo to incoming students from several undergraduate programmes (n = 64) during the first semester of 2013. In the classes assessed, the concept map could be used as a preparation tool for the exam (MC-PREP) and was also used as part of the formal assessment of the course (MC-AVAL). The research comprises three studies that investigate: 1. the differences between the profiles of concept maps obtained under the MC-PREP and MC-AVAL conditions; 2. whether the students who produced an MC-PREP produced MC-AVAL maps with a different profile from those who did not; 3. the existence of a correlation between the characteristics of the MC-AVAL and students' declarative knowledge. The concept maps were evaluated on structural and semantic aspects, using analysis methodologies drawn from the literature. The semantic aspects included the use of instructional materials, the nature of the propositions, the presence of errors and adherence to the focal question. In addition, the presence of natural clusters that could be explained by the theoretical categories was checked. The results indicate that the concept maps from the MC-PREP and MC-AVAL conditions are quite distinct from each other, as they served different educational goals.
Study 2 found that the MC-PREP did not strongly influence the construction of the MC-AVAL: the reduction in extraneous load produced by making an MC-PREP was not enough to reveal differences between the groups, since both were highly trained in the concept-mapping technique. Finally, in Study 3, the main factor correlating declarative knowledge with the complexity of the concept maps was the percentage of appropriate propositions. The study concludes that the demands of the mapping task really do shape the product, that making one additional concept map does not change the performance of a group already well trained in the technique, and that the main way to evaluate concept maps should be reading their constituent propositions.
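The key metric of Study 3, the percentage of appropriate propositions, is simple to compute once each proposition (a concept-link-concept triple) has been judged by a rater; the miniature map below is invented for illustration and is not data from the study.

```python
# Score a concept map by the share of its propositions judged appropriate.
# Each proposition is a (concept, linking phrase, concept) triple plus a
# rater's verdict; the example map is made up for this sketch.
def pct_appropriate(propositions):
    judged = [ok for (_, _, _, ok) in propositions]
    return 100.0 * sum(judged) / len(judged)

concept_map = [
    ("CO2", "traps", "heat", True),
    ("deforestation", "increases", "CO2", True),
    ("climate change", "raises", "sea level", True),
    ("CO2", "causes", "earthquakes", False),   # inappropriate proposition
]

print(pct_appropriate(concept_map))  # → 75.0
```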
Abstract:
Changes in modern structural design have created a demand for products which are light but possess high strength. The objective is a reduction in fuel consumption and weight of materials to satisfy both economic and environmental criteria. Cold roll forming has the potential to fulfil this requirement. The bending process is controlled by the shape of the profile machined on the periphery of the rolls. A CNC lathe can machine complicated profiles to a high standard of precision, but the expertise of a numerical control programmer is required. A computer program was developed during this project, using the expert system concept, to calculate tool paths and consequently to expedite the procurement of the machine control tapes whilst removing the need for a skilled programmer. Codifying the expertise of a human and encapsulating that knowledge in computer memory removes the dependency on highly trained people, whose services can be costly, inconsistent and unreliable. A successful cold roll forming operation, where the product is geometrically correct and free from visual defects, is not easy to attain. The geometry of the sheet after travelling through the rolling mill depends on the residual strains generated by the elastic-plastic deformation. Accurate evaluation of the residual strains can provide the basis for predicting the geometry of the section. A study of geometric and material non-linearity, yield criteria, material hardening and stress-strain relationships was undertaken in this research project. The finite element method was chosen to provide a mathematical model of the bending process and, to ensure an efficient manipulation of the large stiffness matrices, the frontal solution was applied. A series of experimental investigations provided data to compare with corresponding values obtained from the theoretical modelling.
A computer simulation, capable of predicting that a design will be satisfactory prior to the manufacture of the rolls, would allow effort to be concentrated on devising an optimum design in which costs are minimised.
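The expert-system approach of turning a roll profile into lathe tool paths can be caricatured as a rule table mapping segment types to G-code move templates (G01 for linear moves, G02/G03 for arcs); the segment representation, coordinates and feed rate below are hypothetical, not taken from the project's program.

```python
# Toy rule-based tool-path generator: each profile segment type maps to a
# G-code template (G01 linear move, G02/G03 circular moves).  The segment
# format and the feed rate are invented for this sketch.
RULES = {
    "line": "G01 X{x:.3f} Z{z:.3f} F{f}",
    "arc_cw": "G02 X{x:.3f} Z{z:.3f} R{r:.3f} F{f}",
    "arc_ccw": "G03 X{x:.3f} Z{z:.3f} R{r:.3f} F{f}",
}

def tool_path(segments, feed=0.2):
    program = ["G90"]                                   # absolute positioning
    for seg in segments:
        params = {k: v for k, v in seg.items() if k != "type"}
        program.append(RULES[seg["type"]].format(f=feed, **params))
    program.append("M30")                               # end of program
    return program

profile = [
    {"type": "line", "x": 10.0, "z": -5.0},
    {"type": "arc_cw", "x": 12.0, "z": -8.0, "r": 3.0},
    {"type": "line", "x": 12.0, "z": -20.0},
]
for block in tool_path(profile):
    print(block)
```

The encoded rules replace the programmer's judgement for routine profiles, which is exactly the dependency the paragraph describes removing.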
Abstract:
Medicine has changed in recent years. Medicare, with all of its rules and regulations, workers' compensation laws, managed care, and the trend toward more and larger group practices have all contributed to the creation of an extremely structured regulatory environment, which in turn has demanded highly trained medical administrative assistants. The researcher noted three primary problems in the identification of competencies for the medical administrative assistant position: a lack of curricula, diverse roles, and a complex environment which has undergone radical change in recent years and will continue to evolve. Therefore, the purposes of the study were to use the DACUM process to develop a relevant list of competencies required by the medical administrative assistant practicing in physicians' offices in South Florida; determine the rank order of importance of each competency using a scale of one to five; cross-validate the DACUM group scores with a second population who did not participate in the DACUM process; and establish a basis for a curriculum framework for an occupational program. The DACUM process of curriculum development was selected because it seemed best suited to the need to develop a list of competencies for an occupation for which no programs existed. A panel of expert medical office administrative staff was selected to attend a 2-day workshop to describe their jobs in great detail. The panel, led by a trained facilitator, listed the major duties and respective tasks of their jobs; brainstorming techniques were used to develop a consensus. Based upon the DACUM workshop, a survey was developed listing the 8 major duties and 71 tasks identified by the panel. The survey was mailed to the DACUM group and to a second, larger population who did not participate in the DACUM. The survey results from the two groups were then compared. The non-DACUM group validated all but 3 of the 71 tasks listed by the DACUM panel. Because those three tasks were rated by the second group as at least "somewhat important" and rated "very important" by the DACUM group, the researcher recommended the inclusion of all 71 tasks in program development for this occupation.
Abstract:
Parasitic diseases have a great impact on human and animal health. The gold standard for the diagnosis of the majority of parasitic infections is still conventional microscopy, which presents important limitations in terms of sensitivity and specificity and commonly requires highly trained technicians. More accurate molecular-based diagnostic tools are needed for the implementation of early detection, effective treatments and massive screenings with high-throughput capacities. In this respect, sensitive and affordable devices could greatly strengthen the sustainable control programmes that exist against parasitic diseases, especially in low-income settings. Proteomics and nanotechnology approaches are valuable tools for sensing pathogens and host alteration signatures within microfluidic detection platforms. These new devices might provide novel solutions to fight parasitic diseases. Newly described specific parasite-derived products with immune-modulatory properties have been postulated as the best candidates for the early and accurate detection of parasitic infections, as well as for the blockage of parasite development. This review provides the most recent methodological and technological advances with great potential for biosensing parasites in their hosts, showing the newest opportunities offered by modern "-omics" and platforms for parasite detection and control.
Abstract:
It is recognized that sedentary behavior (SB) has deleterious effects on numerous health outcomes, and it appears that the physiological mechanisms underlying these harms are distinct from those explaining the benefits of moderate-to-vigorous physical activity (MVPA). Sedentary behavior occupies a large portion of people's lives and is increasing with technological development. A new current of opinion supports the idea that the manner in which SB is accumulated plays an important role. This dissertation presents six research studies conducted under the scope of SB. In the methodological area, the first study highlighted the magnitude of potential errors in estimating SB and its patterns from common alternative methods (accelerometer and heart rate monitor) compared with the ActivPAL, and showed the accelerometer to be a valid method at the group level. Two studies (2 and 5) were performed in older adults (the most sedentary group in the population) to test the associations of SB patterns with abdominal obesity using accelerometry. The findings showed positive graded associations of prolonged sedentary bouts with abdominal obesity and showed that those who interrupted SB more frequently were less likely to present abdominal obesity. Therefore, public health recommendations regarding breaking up SB more often are expected to be relevant. The associations between sedentary patterns and abdominal obesity were independent of MVPA in older adults. However, the low MVPA in this group makes it unclear whether this independent relationship persists in highly active persons. Study 3 innovates by examining the association of SB with body fatness in highly trained athletes, finding that SB predicted total fat mass and trunk fat mass independently of age and weekly training time.
Study 4 also brings novelty to this research field by quantifying the metabolic and energetic cost of the transition from sitting to standing and back down again (a break), showing that its energetic cost is modest (0.32 kcal·min⁻¹). Finally, a successful multicomponent pilot intervention to reduce and break up SB (study 6) revealed an important behavioral resistance to making more sit/stand transitions despite a successful reduction in sitting time (~1.85 h·day⁻¹), which may be relevant for informing future behavioral modification programs. The present work provides observational and experimental evidence on the relation of SB patterns to body composition outcomes and energy regulation that may be relevant for public health interventions.
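The bout-and-break patterning these studies describe reduces to run-length analysis of an epoch-by-epoch posture series; the sketch below shows the idea on a minute-level series, with the bout threshold and the data invented for illustration rather than taken from the dissertation's ActivPAL processing.

```python
from itertools import groupby

# Run-length analysis of a minute-by-minute posture series (1 = sedentary,
# 0 = upright).  The 30-min bout threshold and the example day are invented.
def sedentary_patterns(epochs, long_bout_min=30):
    bouts = [len(list(g)) for sed, g in groupby(epochs) if sed]
    breaks = max(len(bouts) - 1, 0)      # interruptions between sedentary bouts
    long_time = sum(b for b in bouts if b >= long_bout_min)
    return {"bouts": bouts, "breaks": breaks, "min_in_long_bouts": long_time}

day = [1] * 45 + [0] * 10 + [1] * 20 + [0] * 5 + [1] * 40   # 120 one-minute epochs
print(sedentary_patterns(day))
# → {'bouts': [45, 20, 40], 'breaks': 2, 'min_in_long_bouts': 85}
```

Metrics like these (bout lengths, break counts, time in prolonged bouts) are what the accelerometer-based association analyses in studies 2 and 5 operate on.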
Abstract:
This thesis aimed to investigate the way in which distance runners modulate their speed in an effort to understand the key processes and determinants of speed selection when encountering hills in natural outdoor environments. One factor which has limited the expansion of knowledge in this area has been a reliance on the motorized treadmill, which constrains runners to constant speeds and gradients and only linear paths. Conversely, limits in the portability or storage capacity of available technology have restricted field research to brief durations and level courses. Therefore another aim of this thesis was to evaluate the capacity of lightweight, portable technology to measure running speed in outdoor undulating terrain. The first study of this thesis assessed the validity of a non-differential GPS to measure speed, displacement and position during human locomotion. Three healthy participants walked and ran over straight and curved courses for 59 and 34 trials, respectively. A non-differential GPS receiver provided speed data by Doppler shift and by change in GPS position over time, which were compared with actual speeds determined by chronometry. Displacement data from the GPS were compared with a surveyed 100 m section, while static positions were collected for 1 hour and compared with the known geodetic point. GPS speed values on the straight course were found to be closely correlated with actual speeds (Doppler shift: r = 0.9994, p < 0.001; Δ GPS position/time: r = 0.9984, p < 0.001). Actual speed errors were lowest using the Doppler shift method (90.8% of values within ±0.1 m·s⁻¹). Speed was slightly underestimated on a curved path, though still highly correlated with actual speed (Doppler shift: r = 0.9985, p < 0.001; Δ GPS distance/time: r = 0.9973, p < 0.001). Distance measured by GPS was 100.46 ± 0.49 m, while 86.5% of static points were within 1.5 m of the actual geodetic point (mean error: 1.08 ± 0.34 m, range 0.69 to 2.10 m).
Non-differential GPS demonstrated a highly accurate estimation of speed across a wide range of human locomotion velocities using only the raw signal data, with a minimal decrease in accuracy around bends. This high level of resolution was matched by accurate displacement and position data. Coupled with reduced size, cost and ease of use, the use of a non-differential receiver offers a valid alternative to differential GPS in the study of overground locomotion. The second study of this dissertation examined speed regulation during overground running on a hilly course. Following an initial laboratory session to calculate physiological thresholds (VO2 max and ventilatory thresholds), eight experienced long distance runners completed a self-paced time trial over three laps of an outdoor course involving uphill, downhill and level sections. A portable gas analyser, GPS receiver and activity monitor were used to collect physiological, speed and stride frequency data. Participants ran 23% slower on uphills and 13.8% faster on downhills compared with level sections. Speeds on level sections were significantly different for 78.4 ± 7.0 seconds following an uphill and 23.6 ± 2.2 seconds following a downhill. Speed changes were primarily regulated by stride length, which was 20.5% shorter uphill and 16.2% longer downhill, while stride frequency was relatively stable. Oxygen consumption averaged 100.4% of runners' individual ventilatory thresholds on uphills, 78.9% on downhills and 89.3% on level sections. Group level speed was highly predicted using a modified gradient factor (r2 = 0.89). Individuals adopted distinct pacing strategies, both across laps and as a function of gradient. Speed was best predicted using a weighted factor to account for prior and current gradients. Oxygen consumption (VO2) limited runners' speeds only on uphill sections, and was maintained in line with individual ventilatory thresholds.
Running speed showed larger individual variation on downhill sections, while speed on the level was systematically influenced by the preceding gradient. Runners who varied their pace more as a function of gradient showed a more consistent level of oxygen consumption. These results suggest that optimising time on the level sections after hills offers the greatest potential to minimise overall time when running over undulating terrain. The third study of this thesis investigated the effect of implementing an individualised pacing strategy on running performance over an undulating course. Six trained distance runners completed three trials involving four laps (9968 m) of an outdoor course involving uphill, downhill and level sections. The initial trial was self-paced in the absence of any temporal feedback. For the second and third field trials, runners were paced for the first three laps (7476 m) according to two different regimes (Intervention or Control) by matching desired goal times for subsections within each gradient. The fourth lap (2492 m) was completed without pacing. Goals for the Intervention trial were based on findings from study two, using a modified gradient factor and elapsed distance to predict the time for each section. To maintain the same overall time across all paced conditions, times were proportionately adjusted according to split times from the self-paced trial. The alternative pacing strategy (Control) used the original split times from this initial trial. Five of the six runners increased their range of uphill to downhill speeds on the Intervention trial by more than 30%, but this was unsuccessful in achieving a more consistent level of oxygen consumption, with only one runner showing a change of more than 10%. Group level adherence to the Intervention strategy was lowest on downhill sections. Three runners successfully adhered to the Intervention pacing strategy, as gauged by a low root-mean-square error across subsections and gradients.
Of these three, the two who had the largest change in uphill-downhill speeds ran their fastest overall time. This suggests that for some runners the strategy of varying speeds systematically to account for gradients and transitions may benefit race performances on courses involving hills. In summary, a non-differential receiver was found to offer highly accurate measures of speed, distance and position across the range of human locomotion speeds. Self-selected speed was found to be best predicted using a weighted factor to account for prior and current gradients. Oxygen consumption limited runners' speeds only on uphills, speed on the level was systematically influenced by preceding gradients, and there was a much larger individual variation on downhill sections. Individuals were found to adopt distinct but unrelated pacing strategies as a function of durations and gradients, while runners who varied pace more as a function of gradient showed a more consistent level of oxygen consumption. Finally, the implementation of an individualised pacing strategy to account for gradients and transitions greatly increased runners' range of uphill-downhill speeds and was able to improve performance in some runners. The efficiency of various gradient-speed trade-offs and the factors limiting faster downhill speeds will, however, require further investigation to improve the effectiveness of the suggested strategy.
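The arithmetic behind the Intervention pacing (predict a relative time for each section from a gradient factor that weights the current and prior gradients, then rescale so the total matches the runner's self-paced time) can be sketched as follows. The weight and the cost-per-gradient slope are invented stand-ins for the thesis's fitted model, and the course is hypothetical.

```python
# Sketch of the Intervention pacing arithmetic: a weighted gradient factor
# (current + prior gradient) gives a relative time cost per section, and the
# predictions are rescaled so the total equals the self-paced trial total.
# The weight w and slope k are invented, NOT the thesis's fitted values.
def section_cost(grad, prior_grad, w=0.7, k=4.5):
    eff = w * grad + (1 - w) * prior_grad   # carry-over from the preceding gradient
    return 1.0 + k * eff                    # relative time cost (1.0 on the flat)

def goal_times(gradients, self_paced_total):
    costs = []
    prior = 0.0
    for g in gradients:
        costs.append(section_cost(g, prior))
        prior = g
    scale = self_paced_total / sum(costs)   # keep the overall time unchanged
    return [c * scale for c in costs]

course = [0.00, 0.05, 0.00, -0.05, 0.00]    # gradients of 5 equal sections
splits = goal_times(course, self_paced_total=1200.0)
print(round(sum(splits), 6))  # → 1200.0
```

The rescaling step is what lets the strategy redistribute time between uphill and downhill sections without changing the target finishing time.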
Abstract:
McCambridge & Rollnick [1] argue that increased benefits from brief motivational interventions (MIs) for alcohol abuse may be obtained if they addressed patients' concerns more directly, especially in severe dependence and primary care. We agree, but take the idea a step further. Recent research on comorbidity has illustrated the power of simultaneously addressing multiple issues in an integrated manner, especially when these changes have synergistic effects (as typically occurs with psychosis and substance use [2]). Integrated MI for comorbidity can even be used productively in a single-session format [3]. This idea may have wider application. Recent work in remote Indigenous Australian communities has highlighted the benefits of a broad-ranging discussion of key relationships, activities and resources that confer strength, as well as aspects that worry them or cause dissatisfaction [4]. If excessive drinking is present, its impact on other life areas is reviewed, as in standard MI. However, it is considered alongside other highly valued goals. While the approach has demonstrated effects on both alcohol use and mental health [5], its impact is restricted only by the range of goals that are selected...
Abstract:
This paper proposes a highly reliable fault diagnosis approach for low-speed bearings. The proposed approach first extracts wavelet-based fault features that represent diverse symptoms of multiple low-speed bearing defects. The most useful fault features for diagnosis are then selected by utilizing a genetic algorithm (GA)-based kernel discriminative feature analysis cooperating with one-against-all multicategory support vector machines (OAA MCSVMs). Finally, each support vector machine is individually trained with its own feature vector that includes the most discriminative fault features, offering the highest classification performance. In this study, the effectiveness of the proposed GA-based kernel discriminative feature analysis and the classification ability of individually trained OAA MCSVMs are addressed in terms of average classification accuracy. In addition, the proposed GA-based kernel discriminative feature analysis is compared with four other state-of-the-art feature analysis approaches. Experimental results indicate that the proposed approach is superior to other feature analysis methodologies, yielding an average classification accuracy of 98.06% and 94.49% under rotational speeds of 50 revolutions-per-minute (RPM) and 80 RPM, respectively. Furthermore, the individually trained MCSVMs with their own optimal fault features based on the proposed GA-based kernel discriminative feature analysis outperform the standard OAA MCSVMs, showing an average accuracy of 98.66% and 95.01% for bearings under rotational speeds of 50 RPM and 80 RPM, respectively.
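The GA-over-feature-subsets loop the paper describes can be sketched with a deliberately simplified stand-in classifier: individuals are feature bitmasks, and fitness is leave-one-out accuracy. A nearest-centroid classifier replaces the paper's OAA multiclass SVMs and kernel discriminative analysis to keep the sketch self-contained; the data, GA settings and classifier choice are all assumptions of this illustration.

```python
import random

# Sketch of GA-based feature selection: individuals are feature bitmasks and
# fitness is leave-one-out accuracy of a stand-in classifier (nearest centroid
# here, NOT the paper's OAA MCSVMs).  Data and GA settings are invented.
def loo_accuracy(mask, X, y):
    feats = [j for j, bit in enumerate(mask) if bit]
    if not feats:
        return 0.0
    correct = 0
    for i in range(len(X)):
        cents = {}
        for c in set(y):                       # leave-one-out class centroids
            rows = [X[k] for k in range(len(X)) if k != i and y[k] == c]
            cents[c] = [sum(r[j] for r in rows) / len(rows) for j in feats]
        xi = [X[i][j] for j in feats]
        pred = min(cents, key=lambda c: sum((a - b) ** 2 for a, b in zip(xi, cents[c])))
        correct += pred == y[i]
    return correct / len(X)

def ga_select(X, y, n_feats, pop_size=20, gens=10, p_mut=0.1):
    pop = [[1] * n_feats]                      # seed with "use every feature"
    pop += [[random.randint(0, 1) for _ in range(n_feats)] for _ in range(pop_size - 1)]
    for _ in range(gens):
        scored = sorted(pop, key=lambda m: loo_accuracy(m, X, y), reverse=True)
        pop = scored[:2]                       # elitism keeps the best two masks
        while len(pop) < pop_size:
            a, b = random.sample(scored[:10], 2)            # mate the better half
            cut = random.randrange(1, n_feats)              # single-point crossover
            child = [bit ^ (random.random() < p_mut)        # bit-flip mutation
                     for bit in a[:cut] + b[cut:]]
            pop.append(child)
    return max(pop, key=lambda m: loo_accuracy(m, X, y))

random.seed(42)
centers = {"a": (0.0, 0.0), "b": (5.0, 0.0), "c": (0.0, 5.0)}   # 2 informative feats
X, y = [], []
for label, (cx, cy) in centers.items():
    for _ in range(4):
        noise = [random.uniform(-0.5, 0.5) for _ in range(2)]   # 2 nuisance feats
        X.append([cx + random.uniform(-0.1, 0.1),
                  cy + random.uniform(-0.1, 0.1)] + noise)
        y.append(label)

best = ga_select(X, y, n_feats=4)
print(loo_accuracy(best, X, y))  # → 1.0
```

Swapping the stand-in fitness function for cross-validated OAA SVM accuracy would recover the structure of the paper's method; the GA machinery itself is unchanged.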