939 results for Multiple-trait analysis


Relevance: 90.00%

Abstract:

BACKGROUND: Recent literature shows hyperglycemia to be common in patients with trauma and associated with poor outcome in patients with traumatic brain injury and in critically ill patients. The goal of this study was to analyze the impact of admission blood glucose on the outcome of surviving patients with multiple injuries. METHODS: Charts of patients (age >16) admitted to the emergency room of the University Hospital of Berne, Switzerland, between January 1, 2002, and December 31, 2004, with an Injury Severity Score ≥17 and more than one severely injured organ system were reviewed retrospectively. Outcome measures included morbidity and intensive care unit and hospital length of stay. RESULTS: The inclusion criteria were met by 555 patients, of whom 108 (19.5%) died. After multiple regression analysis, admission blood glucose proved to be an independent predictor of posttraumatic morbidity (p < 0.0001) and of intensive care unit and hospital length of stay (p < 0.0001), despite intensified insulin therapy in the intensive care unit. CONCLUSIONS: In this population of patients with multiple injuries, hyperglycemia on admission was strongly associated with increased morbidity, especially infections, and with prolonged intensive care unit and hospital length of stay, independent of injury severity, gender, age, and various biochemical parameters.
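The kind of adjusted analysis described above can be sketched as an ordinary least-squares regression in which admission glucose predicts length of stay after controlling for injury severity and age. Everything below is simulated for illustration; the coefficients, units, and noise levels are assumptions, not the study's data.

```python
import numpy as np

# Sketch: does admission glucose predict length of stay after adjusting for
# injury severity and age?  All values are simulated assumptions.
rng = np.random.default_rng(0)
n = 500
iss     = rng.uniform(17, 50, n)                # Injury Severity Score
age     = rng.uniform(16, 90, n)                # years
glucose = 6 + 0.08 * iss + rng.normal(0, 2, n)  # mmol/L; worse injury -> higher

# Hypothetical "true" model: glucose carries an effect beyond ISS and age.
los = 5 + 0.4 * iss + 1.5 * glucose + 0.05 * age + rng.normal(0, 5, n)

X = np.column_stack([np.ones(n), glucose, iss, age])
beta, *_ = np.linalg.lstsq(X, los, rcond=None)
glucose_effect = beta[1]  # extra days of stay per mmol/L glucose, adjusted
```

Because glucose is itself correlated with injury severity here, the adjusted coefficient is what a multiple regression isolates that a simple correlation would not.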

Relevance: 90.00%

Abstract:

BACKGROUND: Impaired manual dexterity is frequent and disabling in patients with multiple sclerosis (MS). Convenient, quick and validated tests of manual dexterity in MS patients are therefore needed. OBJECTIVE: The aim of this study was to validate the Coin Rotation Task (CRT) for examining manual dexterity in patients with MS. DESIGN: Cross-sectional study. METHODS: 101 outpatients with MS were assessed with the CRT, the Expanded Disability Status Scale (EDSS), the Scale for the Assessment and Rating of Ataxia (SARA) and the Modified Ashworth Scale (MAS); muscle strength and sensory deficits of the hands were also recorded. Concurrent validity and diagnostic accuracy of the CRT were determined by comparison with the Nine Hole Peg Test (9HPT). Construct validity was determined by comparison with a validated dexterity questionnaire. Multiple regression analysis was used to explore correlations of the CRT with the EDSS, SARA, MAS, muscle strength and sensory deficits. RESULTS: The CRT correlated significantly with the 9HPT (r=.73, p<.0001), indicating good concurrent validity. The cut-off values for the CRT relative to the 9HPT were 18.75 seconds for the dominant hand (sensitivity: 81.5%; specificity: 80.0%) and 19.25 seconds for the non-dominant hand (sensitivity: 90.3%; specificity: 81.8%), demonstrating good diagnostic accuracy. Furthermore, the CRT correlated significantly with the dexterity questionnaire (r=-.49, p<.0001), indicating moderate construct validity. Multiple regression analyses revealed that the EDSS was the strongest predictor of impaired dexterity. LIMITATIONS: Mostly relapsing-remitting MS patients with an EDSS up to 7 were examined. CONCLUSIONS: This study validates the CRT as a test that can be used easily and quickly to evaluate manual dexterity in patients with MS.
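The diagnostic-accuracy figures above come from applying a time cutoff and counting agreement with the reference test. A minimal sketch follows: the 18.75 s cutoff is the study's dominant-hand value, but the eight observations are invented for illustration.

```python
# Sensitivity and specificity of a timed-test cutoff against a reference
# classification (e.g., CRT time vs. impairment on the 9HPT).

def sens_spec(times, impaired, cutoff):
    """times: task times in seconds; impaired: reference-test labels
    (True = impaired); times above the cutoff are classified impaired."""
    tp = sum(1 for t, y in zip(times, impaired) if t > cutoff and y)
    fn = sum(1 for t, y in zip(times, impaired) if t <= cutoff and y)
    tn = sum(1 for t, y in zip(times, impaired) if t <= cutoff and not y)
    fp = sum(1 for t, y in zip(times, impaired) if t > cutoff and not y)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical data: in this toy set the cutoff separates the groups cleanly.
times    = [12.0, 25.1, 17.3, 30.2, 14.8, 22.4, 19.9, 16.1]
impaired = [False, True, False, True, False, True, True, False]
sens, spec = sens_spec(times, impaired, cutoff=18.75)
```

In a real validation the cutoff is chosen to trade sensitivity against specificity, which is why the study reports values near, not at, 100%.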

Relevance: 90.00%

Abstract:

The relative influence of race, income, education, and Food Stamp Program participation/nonparticipation on the food and nutrient intake of 102 fecund women ages 18-45 years in a Florida urban clinic population was assessed using multiple regression analysis. Study subgroups were defined by race and Food Stamp Program participation status. Education was found to have the greatest influence on food and nutrient intake. Race was the next most influential factor, followed in order by Food Stamp Program participation and income. The combined effect of the four independent variables explained no more than 19 percent of the variance for any of the food and nutrient intake variables, indicating that a more complex model of influences is needed if variations in food and nutrient intake are to be fully explained.

A socioeconomic questionnaire was administered to investigate other factors of influence. The influence of the mother, frequency and type of restaurant dining, and perceptions of food intake and weight were found to be factors deserving further study.

Dietary data were collected using the 24-hour recall and a food frequency checklist. Descriptive dietary findings indicated that iron and calcium were nutrients whose adequacy was of concern for all study subgroups. White Food Stamp Program participants had the greatest number of mean nutrient intake values falling below the 1980 Recommended Dietary Allowances (RDAs). When Food Stamp Program participants were contrasted with nonparticipants, mean intakes of six nutrients (kilocalories, calcium, iron, vitamin A, thiamin, and riboflavin) were below the 1980 RDA, compared to five mean nutrient intakes (kilocalories, calcium, iron, thiamin, and riboflavin) for the nonparticipants. Use of the Index of Nutritional Quality (INQ), however, revealed that the quality of the diet of Food Stamp Program participants per 1000 kilocalories was adequate with the exception of calcium and iron. Intakes of these nutrients were also not adequate on a 1000-kilocalorie basis for the nonparticipant group. When mean nutrient intakes of the groups were compared using Student's t-test, oleic acid intake was the only significant difference found. Being a nonparticipant in the Food Stamp Program was found to be associated with more frequent consumption of cookies, sweet rolls, doughnuts, and honey. The findings of this study contradict the negative image of the Food Stamp Program participant and emphasize the importance of education.

Relevance: 90.00%

Abstract:

Developing countries are experiencing unprecedented levels of economic growth. As a result, they will be responsible for most of the future growth in energy demand and greenhouse gas (GHG) emissions. Curbing GHG emissions in developing countries has become one of the cornerstones of a future international agreement under the United Nations Framework Convention on Climate Change (UNFCCC). However, setting caps for developing countries' GHG emissions has encountered strong resistance in the current round of negotiations: continued economic growth that allows poverty eradication is still the main priority for most developing countries, and caps are perceived as a constraint on future growth prospects. The development, transfer and use of low-carbon technologies have more positive connotations and are seen as a potential path towards low-carbon development. So far, the success of the UNFCCC process in improving the levels of technology transfer (TT) to developing countries has been limited. This thesis analyses the causes of that limited success and seeks to improve understanding of what constitutes TT in the field of climate change, to establish the factors that enable it in developing countries, and to determine which policies could be implemented to reinforce these factors. Despite the wide recognition of the importance of technology and knowledge transfer to developing countries in the climate change mitigation policy agenda, the issue has not received sufficient attention in academic research. Current definitions of climate change TT barely take into account the perspective of actors involved in actual TT activities, while existing measurements do not bear in mind the diversity of channels through which transfers happen or the outputs and effects that they convey.
Furthermore, the enabling factors for TT in non-BRIC (Brazil, Russia, India, China) developing countries have seldom been investigated, and policy recommendations to improve the level and quality of TT to developing countries have not been adapted to the specific needs of the highly heterogeneous countries commonly grouped under the label "developing countries". This thesis contributes to the climate change TT debate from the perspective of a smaller emerging economy (Chile) and through a quantitative analysis of enabling factors for TT in a large sample of developing countries. Two methodological approaches are used to study climate change TT: comparative case study analysis and quantitative analysis. The comparative case studies analyse TT processes in ten cases based in Chile, all of which share the same economic, technological and policy frameworks, enabling conclusions to be drawn on the enabling factors and obstacles operating in TT processes. The quantitative analysis uses three methodologies – principal component analysis, multiple regression analysis and cluster analysis – to assess the performance of developing countries on a number of enabling factors and the relationship between these factors and indicators of TT, as well as to create groups of developing countries with similar performance. The findings of this thesis are structured as responses to four main research questions: What constitutes technology transfer and how does it happen? Is it possible to measure technology transfer, and what are the main challenges in doing so? Which factors enable climate change technology transfer to developing countries? And how do different developing countries perform on these enabling factors, and how can differentiated policy priorities be defined accordingly?

Relevance: 90.00%

Abstract:

In this paper, multiple regression analysis is used to model the top of descent (TOD) location of user-preferred descent trajectories computed by the flight management system (FMS) on over 1000 commercial flights into Melbourne, Australia. In addition to recording TOD, the cruise altitude, final altitude, cruise Mach, descent speed, wind, and engine type were also identified for use as the independent variables in the regression analysis. Both first-order and second-order models are considered, with cross-validation, hypothesis testing, and additional analysis used to compare models. This identifies the models that should give the smallest errors if used to predict TOD location for new data in the future. A model that is linear in TOD altitude, final altitude, descent speed, and wind gives an estimated standard deviation of 3.9 nmi for TOD location given the trajectory parameters, which means about 80% of predictions would have an error of less than 5 nmi in absolute value. This accuracy is better than that demonstrated by other ground-automation predictions using kinetic models. Furthermore, this approach would enable online learning of the model. Additional data or further knowledge of algorithms is necessary to conclude definitively that no second-order terms are appropriate. Possible applications of the linear model are described, including enabling arriving aircraft to fly optimized descents computed by the FMS even in congested airspace.
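A first-order model of this kind can be sketched as least squares on the trajectory parameters with a simple hold-out check. The data below are synthetic; the predictor ranges, coefficients, and noise level (chosen near the paper's 3.9 nmi standard deviation) are assumptions, not the flight data.

```python
import numpy as np

# First-order linear model for TOD location as a function of trajectory
# parameters, fit by least squares with a hold-out evaluation.
rng = np.random.default_rng(0)
n = 200
cruise_alt = rng.uniform(300, 390, n)   # flight level (proxy for TOD altitude)
final_alt  = rng.uniform(30, 100, n)    # flight level
desc_speed = rng.uniform(260, 310, n)   # kt
wind       = rng.uniform(-50, 50, n)    # kt, tailwind positive

# Hypothetical "true" relationship plus ~4 nmi noise (near the paper's 3.9).
tod = (0.30 * cruise_alt - 0.28 * final_alt + 0.05 * desc_speed
       + 0.08 * wind + rng.normal(0, 4, n))

X = np.column_stack([np.ones(n), cruise_alt, final_alt, desc_speed, wind])
train, test = slice(0, 150), slice(150, None)
beta, *_ = np.linalg.lstsq(X[train], tod[train], rcond=None)
resid = tod[test] - X[test] @ beta
rmse = float(np.sqrt(np.mean(resid ** 2)))  # should sit near the noise sd
```

Because the model is linear, refitting it as new flights arrive is cheap, which is what makes the online-learning use mentioned above plausible.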

Relevance: 90.00%

Abstract:

Many multifactorial biologic effects, particularly in the context of complex human diseases, are still poorly understood. At the same time, the systematic acquisition of multivariate data has become increasingly easy. The use of such data to analyze and model complex phenotypes, however, remains a challenge. Here, a new analytic approach is described, termed coreferentiality, together with an appropriate statistical test. Coreferentiality is the indirect relation of two variables of functional interest with respect to whether they parallel each other in their respective relatedness to multivariate reference data, which can be informative for a complex effect or phenotype. It is shown that the power of coreferentiality testing is comparable to that of multiple regression analysis, sufficient even when reference data are informative only to a relatively small extent of 2.5%, and clearly exceeds the power of simple bivariate correlation testing. Thus, coreferentiality testing uses the increased power of multivariate analysis in order to address a more straightforwardly interpretable bivariate relatedness. Systematic application of this approach could substantially improve the analysis and modeling of complex phenotypes, particularly in the context of human studies, where addressing functional hypotheses by direct experimentation is often difficult.
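One way to read the definition, sketched only as a reading and not as the authors' statistic, is: correlate each focal variable with every column of the multivariate reference data, then test whether the two correlation profiles parallel each other, with significance assessed by permutation. The data and the permutation scheme below are assumptions for illustration.

```python
import numpy as np

# Sketch of a "coreferentiality"-style test: do x and y show parallel
# patterns of correlation with a multivariate reference matrix R?

def corr_profile(v, R):
    # Pearson correlation of v with each reference column
    return np.array([np.corrcoef(v, R[:, j])[0, 1] for j in range(R.shape[1])])

def coreferentiality(x, y, R, n_perm=200, seed=0):
    rng = np.random.default_rng(seed)
    prof_y = corr_profile(y, R)
    stat = np.corrcoef(corr_profile(x, R), prof_y)[0, 1]
    null = np.array([
        np.corrcoef(corr_profile(rng.permutation(x), R), prof_y)[0, 1]
        for _ in range(n_perm)
    ])
    p = (np.sum(np.abs(null) >= abs(stat)) + 1) / (n_perm + 1)
    return stat, p

rng = np.random.default_rng(1)
R = rng.normal(size=(120, 30))          # multivariate reference data
common = R @ rng.normal(size=30)        # x and y both load on the reference
x = common + 2 * rng.normal(size=120)
y = common + 2 * rng.normal(size=120)
stat, p = coreferentiality(x, y, R)
```

The point of the construction is the one the abstract makes: the final quantity is a single bivariate relatedness (easy to interpret), yet it borrows power from all the reference columns.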

Relevance: 90.00%

Abstract:

The aim of this study was to apply multifailure survival methods to analyze time to multiple occurrences of basal cell carcinoma (BCC). Data from 4.5 years of follow-up in the Nambour Skin Cancer Prevention Trial (1992-1996), a randomized controlled trial of skin cancer prevention, were used to assess the influence of sunscreen application on the time to first BCC and the time to subsequent BCCs. Three approaches to analyzing time to ordered multiple events were applied and compared: the Andersen-Gill, Wei-Lin-Weissfeld, and Prentice-Williams-Peterson models. Robust variance estimation was used for all multifailure survival models. Sunscreen treatment was not associated with time to first occurrence of a BCC (hazard ratio = 1.04, 95% confidence interval: 0.79, 1.45). Time to subsequent BCC tumors analyzed with the Andersen-Gill model gave a lower estimated hazard in the daily sunscreen application group, although statistical significance was not reached (hazard ratio = 0.82, 95% confidence interval: 0.59, 1.15). Similarly, both the Wei-Lin-Weissfeld marginal-hazards and the Prentice-Williams-Peterson gap-time models revealed trends toward a lower risk of subsequent BCC tumors in the sunscreen intervention group. These results demonstrate the importance of conducting multiple-event analyses for recurring events, as risk factors for a single event may differ from those found when repeated events are considered.
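The models named above differ mainly in how the recurrent-event data are laid out before a Cox-type fit. A small sketch of two of the layouts follows (counting-process calendar time for Andersen-Gill; order-stratified gap time for Prentice-Williams-Peterson); the subject and event times are hypothetical.

```python
# Restructuring one subject's recurrent events (e.g., repeated BCC
# diagnoses) into the layouts used by two of the models.

def ag_rows(event_times, censor_time):
    """Andersen-Gill: (start, stop, event) rows on the calendar time scale."""
    rows, start = [], 0.0
    for t in event_times:
        rows.append((start, t, 1))
        start = t
    rows.append((start, censor_time, 0))   # final censored interval
    return rows

def pwp_gap_rows(event_times, censor_time):
    """Prentice-Williams-Peterson gap time: (stratum, gap, event) rows;
    the clock resets after each event, stratified by event order."""
    rows, prev = [], 0.0
    for k, t in enumerate(event_times, start=1):
        rows.append((k, t - prev, 1))
        prev = t
    rows.append((len(event_times) + 1, censor_time - prev, 0))
    return rows

# A subject with BCCs at 1.2 and 3.0 years, followed to 4.5 years:
ag  = ag_rows([1.2, 3.0], 4.5)
pwp = pwp_gap_rows([1.2, 3.0], 4.5)
```

Once the data are in these shapes, a standard Cox routine with robust (cluster-by-subject) variance gives the Andersen-Gill fit, and the same routine stratified by event order on the gap times gives the PWP fit.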

Relevance: 90.00%

Abstract:

Background: The objective was to determine whether the pattern of environmental and genetic influences on deviant personality scores differs from that observed for the normative range of personality, comparing results in adolescent and adult female twins. Methods: A sample of 2,796 female adolescent twins ascertained from birth records provided Junior Eysenck Personality Questionnaire data. The average age of the sample was 17.0 years (SD 2.3). Genetic analyses of continuous and extreme personality scores were conducted, and results were compared with those for 3,178 adult female twins. Results: Genetic analyses of continuous traits in adolescent female twins gave findings similar to those in adult female twins, with genetic influences accounting for between 37% and 44% of the variance in Extraversion (Ex), Neuroticism (N), and Social Non-Conformity (SNC), and significant evidence of shared environmental influences (19%) found only for SNC in the adult female twins. Analyses of extreme personality characteristics, defined categorically, in the adolescent data and replicated in the adult female data yielded estimates for high N and high SNC that deviated substantially (p < .05) from those obtained in the continuous trait analyses, and provided suggestive evidence that shared family environment may play a more important role in determining personality deviance than has previously been found when personality is viewed continuously. However, multiple-threshold models that assumed the same genetic and environmental determinants of both normative-range variation and extreme scores gave acceptable fits for each personality dimension. Conclusions: The hypothesis of differences in genetic or environmental factors responsible for N and SNC among female twins with scores in the extreme versus normative ranges was partially supported, but not for Ex.
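Where estimates like "44% genetic variance, 19% shared environment" come from can be illustrated with Falconer's back-of-envelope formulas based on MZ and DZ twin correlations. The study fit formal genetic models rather than these formulas, and the input correlations below are hypothetical, chosen only to land near the reported range.

```python
# Falconer's approximation to the ACE variance decomposition from
# monozygotic (MZ) and dizygotic (DZ) twin correlations.

def falconer(r_mz, r_dz):
    a2 = 2 * (r_mz - r_dz)   # additive genetic variance share (A)
    c2 = 2 * r_dz - r_mz     # shared environment share (C)
    e2 = 1 - r_mz            # unique environment + measurement error (E)
    return a2, c2, e2

# Hypothetical twin correlations: a2 ~ 0.44, c2 ~ 0.19, e2 ~ 0.37
a2, c2, e2 = falconer(r_mz=0.63, r_dz=0.41)
```

Formal model fitting (as in the study) tests whether dropping A or C significantly worsens fit, which is how "significant evidence of shared environmental influences" is established rather than read directly off the correlations.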

Relevance: 90.00%

Abstract:

Univariate linkage analysis is used routinely to localise genes for human complex traits. Often, many traits are analysed, but the significance of linkage for each trait is not corrected for multiple trait testing, which increases the experiment-wise type-I error rate. In addition, univariate analyses do not realise the full power provided by multivariate data sets. Multivariate linkage is the ideal solution, but it is computationally intensive, so genome-wide analysis and evaluation of empirical significance are often prohibitive. We describe two simple methods that efficiently address these limitations by combining P-values from multiple univariate linkage analyses. The first method estimates empirical pointwise and genome-wide significance between one trait and one marker when multiple traits have been tested. It is as robust as an appropriate Bonferroni adjustment, with the advantage that no assumptions are required about the number of independent tests performed. The second method estimates the significance of linkage between multiple traits and one marker and can therefore be used to localise regions that harbour pleiotropic quantitative trait loci (QTL). We show that this method has greater power than individual univariate analyses to detect a pleiotropic QTL across different situations. In addition, when traits are moderately correlated and the QTL influences all traits, it can outperform formal multivariate VC analysis. This approach is computationally feasible for any number of traits and was not affected by the residual correlation between traits. We illustrate the utility of our approach with a genome scan of three asthma traits measured in families with a twin proband.
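Generic versions of the two ideas, correcting the best per-trait P-value for the number of traits tested and combining P-values across traits into one marker-level test, can be sketched by simulation. This is an illustration of the logic under independent traits, not the authors' exact procedures (which also handle correlated traits).

```python
import numpy as np

rng = np.random.default_rng(0)

def minp_adjusted(p_obs, n_traits, n_sim=20000):
    """Empirical significance of observing min P <= p_obs when n_traits
    independent uniform P-values are drawn under the null."""
    sims = rng.uniform(size=(n_sim, n_traits)).min(axis=1)
    return (np.sum(sims <= p_obs) + 1) / (n_sim + 1)

def fisher_combined(pvals, n_sim=20000):
    """Empirical P-value of Fisher's statistic -2*sum(log p) across traits,
    as one generic way to form a multi-trait test at a marker."""
    stat = -2 * np.sum(np.log(pvals))
    null = (-2 * np.log(rng.uniform(size=(n_sim, len(pvals))))).sum(axis=1)
    return (np.sum(null >= stat) + 1) / (n_sim + 1)

p_adj  = minp_adjusted(0.01, n_traits=3)    # close to 1 - 0.99**3
p_comb = fisher_combined([0.01, 0.04, 0.20])
```

Replacing the uniform draws with permutation replicates of the real data is what removes the independence assumption, which is the advantage over Bonferroni the abstract highlights.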

Relevance: 90.00%

Abstract:

Researchers often use 3-way interactions in moderated multiple regression analysis to test the joint effect of 3 independent variables on a dependent variable. However, further probing of significant interaction terms varies considerably and is sometimes error prone. The authors developed a significance test for slope differences in 3-way interactions and illustrate its importance for testing psychological hypotheses. Monte Carlo simulations revealed that sample size, magnitude of the slope difference, and data reliability affected test power. Application of the test to published data yielded detection of some slope differences that were undetected by alternative probing techniques and led to changes of results and conclusions. The authors conclude by discussing the test's applicability for psychological research. Copyright 2006 by the American Psychological Association.
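The test can be sketched as a linear contrast on the OLS coefficients of the full three-way model: the simple slope of x at chosen moderator values is a linear combination of coefficients, so the difference of two simple slopes has a variance given by the coefficient covariance matrix. The data and effect sizes below are simulated assumptions, not the authors' examples.

```python
import numpy as np

# Slope-difference test as a linear contrast on OLS coefficients of
# y ~ x * z * w (generic sketch with simulated data).
rng = np.random.default_rng(0)
n = 400
x, z, w = rng.normal(size=(3, n))
y = 0.5 * x + 0.4 * x * z * w + rng.normal(size=n)  # slope of x varies with z*w

X = np.column_stack([np.ones(n), x, z, w, x*z, x*w, z*w, x*z*w])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
df = n - X.shape[1]
cov = (resid @ resid / df) * np.linalg.inv(X.T @ X)

def slope_contrast(z0, w0):
    # Simple slope of x at (z0, w0) is b1 + b4*z0 + b5*w0 + b7*z0*w0.
    c = np.zeros(8)
    c[1], c[4], c[5], c[7] = 1.0, z0, w0, z0 * w0
    return c

# Difference between the simple slope at (z=+1, w=+1) and at (z=+1, w=-1):
c = slope_contrast(1, 1) - slope_contrast(1, -1)
diff = c @ beta                      # estimated slope difference
t = diff / np.sqrt(c @ cov @ c)      # compare against t with df d.o.f.
```

Probing each pair of moderator settings this way, rather than eyeballing plotted simple slopes, is exactly the error-prone step the paper's test formalises.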

Relevance: 90.00%

Abstract:

Discriminant analysis (also known as discriminant function analysis or multiple discriminant analysis) is a multivariate statistical method of testing the degree to which two or more populations may overlap with each other. It was devised independently by several statisticians, including Fisher, Mahalanobis, and Hotelling. The technique has several possible applications in microbiology. First, in a clinical microbiological setting, if two different infectious diseases were defined by a number of clinical and pathological variables, it may be useful to decide which measurements were the most effective at distinguishing between the two diseases. Second, in an environmental microbiological setting, the technique could be used to study the relationships between different populations, e.g., to what extent do the properties of soils in which the bacterium Azotobacter is found differ from those in which it is absent? Third, the method can be used as a multivariate ‘t’ test, i.e., given a number of related measurements on two groups, the analysis can provide a single test of the hypothesis that the two populations have the same means for all the variables studied. This statnote describes one of the most popular applications of discriminant analysis: identifying the descriptive variables that can distinguish between two populations.
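For the two-group case, the discriminant axis can be computed directly as Sw⁻¹(mean₁ − mean₂); the relative magnitudes of the resulting coefficients then indicate which variables do the discriminating, which is the application the statnote emphasises. The data below are simulated so that only the first variable separates the groups.

```python
import numpy as np

# Fisher's two-group linear discriminant, computed directly.
rng = np.random.default_rng(0)
g1 = rng.normal([0, 0, 0], 1.0, size=(100, 3))
g2 = rng.normal([2, 0, 0], 1.0, size=(100, 3))  # groups differ on var 0 only

m1, m2 = g1.mean(axis=0), g2.mean(axis=0)
Sw = np.cov(g1, rowvar=False) + np.cov(g2, rowvar=False)  # pooled scatter
w = np.linalg.solve(Sw, m1 - m2)     # discriminant axis
w = w / np.linalg.norm(w)

def classify(sample):
    # Assign to whichever group's projected mean is closer on the axis.
    p = sample @ w
    return 1 if abs(p - m1 @ w) < abs(p - m2 @ w) else 2

dominant = int(np.argmax(np.abs(w)))  # variable driving the separation
new_case = classify(np.array([1.8, 0.0, 0.0]))  # near group 2's mean
```

Projecting both groups onto w and comparing the projected means with a t statistic gives the single multivariate test described in the abstract (Hotelling's T² is the formal version).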

Relevance: 90.00%

Abstract:

The known moss flora of Terra Nova National Park, eastern Newfoundland, comprises 210 species. Eighty-two percent of the moss species occurring in Terra Nova are widespread or widespread-sporadic in Newfoundland. Other Newfoundland distributional elements present in the Terra Nova moss flora are the northwestern, southern, southeastern, and disjunct elements, but four of the mosses occurring in Terra Nova appear to belong to a previously unrecognized northeastern element of the Newfoundland flora. The majority (70.9%) of Terra Nova's mosses are of boreal affinity and are widely distributed in the North American coniferous forest belt. An additional 10.5 percent of the Terra Nova mosses are cosmopolitan, while 9.5 percent are temperate and 4.8 percent are arctic-montane species. The remaining 4.3 percent of the mosses are of montane affinity and disjunct between eastern and western North America. In Terra Nova, temperate species at their northern limit are concentrated in balsam fir stands, while arctic-montane species are restricted to exposed cliffs, scree slopes, and coastal exposures. Montane species are largely confined to exposed or freshwater habitats. Inability to tolerate high summer temperatures limits the distributions of both arctic-montane and montane species. In Terra Nova, species of differing phytogeographic affinities co-occur on cliffs and scree slopes. The microhabitat relationships of five selected species from such habitats were evaluated by Discriminant Functions Analysis and Multiple Regression Analysis. The five mosses have distinct and different microhabitats on cliffs and scree slopes in Terra Nova, and the abundance of all but one is associated with variation in at least one microhabitat variable. The micro-distribution of Grimmia torquata, an arctic-montane species at its southern limit, appears to be determined by sensitivity to high summer temperatures.
Both southern mosses at their northern limit (Aulacomnium androgynum, Isothecium myosuroides) appear to be limited by water availability and, possibly, by low winter temperatures. The two species whose distributions extend both north and south of the study area (Encalypta procera, Eurhynchium pulchellum) show no clear relationship with microclimate. Dispersal factors have played a significant role in the development of the Terra Nova moss flora. Compared to the most likely colonizing source (i.e., the rest of the island of Newfoundland), species with small diaspores have colonized the study area to a proportionately much greater extent than have species with large diaspores. Hierarchical log-linear analysis indicates that this is so for all affinity groups present in Terra Nova. The apparent dispersal effects emphasize the comparatively recent glaciation of the area, and may also have been enhanced by anthropogenic influences. The restriction of some species to specific habitats, or to narrowly defined microhabitats, appears to strengthen selection for easily dispersed taxa.

Relevance: 90.00%

Abstract:

The problems faced by scientists in charge of managing Atlantic salmon (Salmo salar) stocks are: i) how to maintain spawning runs consisting of repeat spawners and large multi-sea-winter (MSW) adults in the face of selective homewater and distant commercial fisheries, and ii) how to more accurately predict returns of adults. Using data from scales collected from maiden Atlantic salmon grilse from two locations on the Northern Peninsula of Newfoundland, St. Barbe Bay and Western Arm Brook, their length at smolting was back-calculated. These data were then used to examine whether the St. Barbe commercial fishery is selective for salmon of particular smolt age and/or size. Analysis indicated that the commercial fishery selected larger, but not necessarily older, adults than those escaping to Western Arm Brook over the period of this study, 1978-1987. It was determined that smaller-than-average smolts survived better than above-average-size smolts. Selection for repeat spawners, large MSW salmon, and larger grilse has meant reductions in the proportions of these adults in the spawning runs on Western Arm Brook. This may affect the Western Arm Brook salmon stock by increasing population instability. Sea survival was significantly correlated with selection by the commercial fishery. Characteristics of adults in Western Arm Brook during the period of study (1978-1987) did not help explain yearly variation in sea survival. The characteristics of smolts, however, when subjected to multiple regression analysis, explained 57.2 percent of the yearly variation in sea survival.

Relevance: 90.00%

Abstract:

Purpose – The objective of this exploratory study is to investigate the “flow-through” or relationship between top-line measures of hotel operating performance (occupancy, average daily rate and revenue per available room) and bottom-line measures of profitability (gross operating profit and net operating income), before and during the recent great recession. Design/methodology/approach – This study uses data provided by PKF Hospitality Research for the period from 2007-2009. A total of 714 hotels were analyzed and various top-line and bottom-line profitability changes were computed using both absolute levels and percentages. Multiple regression analysis was used to examine the relationship between top and bottom line measures, and to derive flow-through ratios. Findings – The results show that average daily rate (ADR) and occupancy are significantly and positively related to gross operating profit per available room (GOPPAR) and net operating income per available room (NOIPAR). The evidence indicates that ADR, rather than occupancy, appears to be the stronger predictor and better measure of RevPAR growth and bottom-line profitability. The correlations and explained variances are also higher than those reported in prior research. Flow-through ratios range between 1.83 and 1.91 for NOIPAR, and between 1.55 and 1.65 for GOPPAR, across all chain-scales. Research limitations/implications – Limitations of this study include the limited number of years in the study period, limited number of hotels in a competitive set, and self-selection of hotels by the researchers. Practical implications – While ADR and occupancy work in combination to drive profitability, the authors' study shows that ADR is the stronger predictor of profitability. Hotel managers can use flow-through ratios to make financial forecasts, or use them as inputs in valuation models, to forecast future profitability. 
Originality/value – This paper extends prior research on the relationship between top-line measures and bottom-line profitability and serves to inform lodging owners, operators and asset managers about flow-through ratios, and how these ratios impact hotel profitability.
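As a hedged illustration of the flow-through idea, reading the ratio as percentage change in profit per percentage change in revenue (the study's exact construction may differ, and the numbers below are invented):

```python
# Flow-through as the ratio of the percentage change in a bottom-line
# measure (e.g., GOPPAR) to the percentage change in a top-line measure
# (e.g., RevPAR).  Illustrative values only.

def flow_through(rev_prev, rev_now, profit_prev, profit_now):
    rev_pct = (rev_now - rev_prev) / rev_prev
    profit_pct = (profit_now - profit_prev) / profit_prev
    return profit_pct / rev_pct

# RevPAR falls 10% while GOPPAR falls 16%: flow-through of 1.6, inside the
# 1.55-1.65 GOPPAR range the study reports.
ft = flow_through(100.0, 90.0, 40.0, 33.6)
```

A ratio above 1 reflects operating leverage: fixed costs make the bottom line move proportionally more than the top line, in both directions, which is why such ratios are useful for forecasting profit from a revenue forecast.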

Relevance: 90.00%

Abstract:

Thanks to the advanced technologies and social networks that allow data to be widely shared across the Internet, there has been an explosion of pervasive multimedia data, generating high demand for multimedia services and applications that let people easily access and manage multimedia data in various areas. In response, multimedia big data analysis has become an emerging hot topic in both industry and academia, ranging from basic infrastructure, management, search, and mining to security, privacy, and applications. Within the scope of this dissertation, a multimedia big data analysis framework is proposed for semantic information management and retrieval, with a focus on rare event detection in videos. The proposed framework is able to explore hidden semantic feature groups in multimedia data and to incorporate temporal semantics, especially for video event detection. First, a hierarchical semantic data representation is presented to alleviate the semantic gap issue, and the Hidden Coherent Feature Group (HCFG) analysis method is proposed to capture the correlation between features and separate the original feature set into semantic groups, seamlessly integrating multimedia data in multiple modalities. Next, an Importance Factor based Temporal Multiple Correspondence Analysis (IF-TMCA) approach is presented for effective event detection. Specifically, the HCFG algorithm is integrated with the Hierarchical Information Gain Analysis (HIGA) method to generate the Importance Factor (IF) for producing the initial detection results. The TMCA algorithm is then proposed to efficiently incorporate temporal semantics for re-ranking and improving the final performance. Finally, a sampling-based ensemble learning mechanism is applied to further accommodate imbalanced datasets. In addition to the multimedia semantic representation and class imbalance problems, lack of organization is another critical issue for multimedia big data analysis.
In this framework, an affinity propagation-based summarization method is also proposed to transform the unorganized data into a better structure with clean and well-organized information. The whole framework has been thoroughly evaluated across multiple domains, such as soccer goal event detection and disaster information management.