13 results for data analysis: algorithms and implementation
in Digital Commons at Florida International University
Abstract:
This dissertation established a software-hardware integrated design for a multisite data repository in pediatric epilepsy. A total of 16 institutions formed a consortium for this web-based application. This fully operational web application allows users to upload and retrieve information through a graphical human-computer interface that is remotely accessible to all users of the consortium. A solution based on a Linux platform with MySQL and PHP scripts was selected. Research was conducted to evaluate mechanisms to electronically transfer diverse datasets from different hospitals and to collect the clinical data in concert with the related functional magnetic resonance imaging (fMRI). What was unique in this approach is that all pertinent clinical information about patients was synthesized, with input from clinical experts, into four data entry forms: Clinical, fMRI scoring, Image information, and Neuropsychological. A first contribution of this dissertation was an integrated processing platform, independent of site and scanner, designed to uniformly process the varied fMRI datasets and to generate comparative brain activation patterns. Data collection from the consortium complied with IRB requirements and provides all the safeguards required for security and confidentiality. An fMRI-based software library was used to perform data processing and statistical analysis to obtain the brain activation maps. The Lateralization Index (LI) of healthy control (HC) subjects was evaluated in contrast to that of localization-related epilepsy (LRE) subjects.
Over 110 activation maps were generated, and their respective LIs were computed, yielding the following groups: (a) strong right lateralization (HC = 0%, LRE = 18%), (b) right lateralization (HC = 2%, LRE = 10%), (c) bilateral (HC = 20%, LRE = 15%), (d) left lateralization (HC = 42%, LRE = 26%), and (e) strong left lateralization (HC = 36%, LRE = 31%). Moreover, nonlinear multidimensional decision functions were used to seek an optimal separation between typical and atypical brain activations on the basis of demographics as well as the extent and intensity of these brain activations. The intent was not to seek the highest output measures, given the inherent overlap of the data, but rather to assess which of the many dimensions were critical in the overall assessment of typical and atypical language activations, with the freedom to select any number of dimensions and to impose any degree of complexity in the nonlinearity of the decision space.
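The abstract does not state the LI formula or the band cutoffs used; for orientation, the lateralization index in fMRI language studies is conventionally computed as LI = (L − R) / (L + R), where L and R are activation measures (e.g., suprathreshold voxel counts) in the left and right hemispheres. A minimal Python sketch, with the ±0.2 and ±0.5 band thresholds as purely illustrative assumptions:

```python
def lateralization_index(left_activation: float, right_activation: float) -> float:
    """Conventional LI: +1 means fully left-lateralized, -1 fully right-lateralized."""
    total = left_activation + right_activation
    if total == 0:
        raise ValueError("no activation in either hemisphere")
    return (left_activation - right_activation) / total


def classify_li(li: float) -> str:
    """Map an LI value onto the five bands named in the study.
    The +/-0.2 and +/-0.5 cutoffs are illustrative assumptions,
    not the thresholds used in the dissertation."""
    if li <= -0.5:
        return "strong right lateralization"
    if li <= -0.2:
        return "right lateralization"
    if li < 0.2:
        return "bilateral"
    if li < 0.5:
        return "left lateralization"
    return "strong left lateralization"
```

For example, 800 suprathreshold voxels on the left against 200 on the right gives LI = 0.6, which these cutoffs would label strong left lateralization.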
Abstract:
This study examines the congruency of planning between organizational structure and process through an evaluation and planning model known as the Micro/Macro Dynamic Planning Grid. The model compares day-to-day planning within an organization to planning imposed by organizational administration and accrediting agencies. A survey instrument was developed to assess the micro and macro sociological analysis elements utilized by an organization. The Micro/Macro Dynamic Planning Grid consists of four quadrants. Each quadrant contains characteristics that reflect the interaction between the micro and macro elements of planning, objectives, and goals within an organization. Quadrant 1, Over Macro/Over Micro, contains attributes that reflect a tremendous amount of action and ongoing adjustment, typical of an organization undergoing significant changes in leadership, program, and/or structure. Quadrant 2, Over Macro/Under Micro, reflects planning characteristics found in large, bureaucratic systems that give little regard to the workings of their component parts. Quadrant 3, Under Macro/Over Micro, reflects the uncooperative, uncoordinated organization, one that contains a multiplicity of viewpoints, languages, objectives, and goals. Quadrant 4, Under Macro/Under Micro, represents the worst-case scenario for any organization; its attributes are reactive, chaotic, non-productive, and redundant. There were three phases to the study: development of the initial instrument; pilot testing of the initial instrument and item revision; and administration and assessment of the refined instrument.
The survey instrument was found to be valid and reliable for the purposes and audiences herein described. In order to expand the applicability of the instrument to other organizational settings, the survey was administered to three professional colleges within a university. The first three specific research questions collectively answered, in the affirmative, the basic research question: Can the Micro/Macro Dynamic Planning Grid be applied to an organization through an organizational development tool? The first specific question was: Can an instrument be constructed that applies the Micro/Macro Dynamic Planning Grid? The second: Is the constructed instrument valid and reliable? The third: Does an instrument that applies the Micro/Macro Dynamic Planning Grid assess congruency of micro and macro planning, goals, and objectives within an organization? The fourth specific research question, What are the differences in the responses based on roles and responsibilities within an organization?, involved statistical analysis of the response data and comparisons with the demographic data. (Abstract shortened by UMI.)
Abstract:
The purpose of this study was to explore the relationship between faculty perceptions, selected demographics, implementation of elements of transactional distance theory, and online web-based course completion rates. This theory posits that the high transactional distance of online courses makes it difficult for students to complete these courses successfully; too often this is associated with low completion rates. Faculty members play an indispensable role in course design, whether online or face-to-face. They also influence course delivery format from design through implementation and, ultimately, how students will experience the course. This study used transactional distance theory as the conceptual framework to examine the relationship between teaching and learning strategies used by faculty members to help students complete online courses. Faculty members' sex, number of years teaching online at the college, and their online course completion rates were considered. A researcher-developed survey was used to collect data from 348 faculty members who teach online at two prominent colleges in the southeastern part of the United States. An exploratory factor analysis resulted in six factors related to transactional distance theory. The factors accounted for slightly over 65% of the variance of transactional distance scores as measured by the survey instrument. Results provided support for Moore's (1993) theory of transactional distance. Female faculty members scored higher than men in all the factors of transactional distance theory. Faculty members' number of years teaching online at the college level correlated significantly with all the elements of transactional distance theory. Regression analysis determined that two of the factors, instructor interface and instructor-learner interaction, accounted for 12% of the variance in student online course completion rates.
In conclusion, of the six factors found, the two with the highest percentage scores were instructor interface and instructor-learner interaction. This finding, while in alignment with the literature concerning the dialogue element of transactional distance theory, draws special attention to the importance of instructor interface as a factor. Surprisingly, given the reviewed literature on transactional distance theory, faculty perceptions concerning learner-learner interaction did not constitute an important factor, and no learner-content interaction factor emerged.
Abstract:
This is an empirical study whose purpose was to examine the process of innovation adoption as an adaptive response by a public organization and its subunits existing under varying degrees of environmental uncertainty. Meshing organizational innovation research and contingency theory to form a theoretical framework, an exploratory case study design was undertaken in a large, metropolitan government located in an area with the fourth-highest prevalence rate of HIV/AIDS in the country. A number of environmental and organizational factors were examined for their influence upon decision making in the adoption or non-adoption, as well as the implementation, of AIDS-related policies, practices, and programs. The major findings of the study are as follows. For the county government itself (macro level), no AIDS-specific workplace policies have been adopted. AIDS activities (AIDS education, an AIDS Task Force, an AIDS Coordinator, etc.), adopted county-wide early in the epidemic, have all been abandoned. Worker infection rates, in the aggregate and throughout the epidemic, have been small. As a result, absent co-worker conflict (isolated and negligible), increases in employee health care costs, litigation regarding discrimination, or major impacts on workforce productivity, AIDS has basically become a non-issue at the strategic core of the organization. At the departmental level, policy adoption decisions varied widely. Here the predominant issue is occupational risk, both objective and perceived. As expected, more AIDS-related activities (policies, practices, and programs) were found in departments with workers known to have significant risk of exposure to the AIDS virus (fire rescue, medical examiner, police, etc.). AIDS-specific policies, in the form of OSHA's Bloodborne Pathogens Standard, took effect primarily because they were legislatively mandated. Union participation varied widely, although not necessarily based upon worker risk.
In several departments, the union was a primary factor bringing about adoption decisions. Additional factors identified included the organizational presence of AIDS expertise, the availability of slack resources, and the existence of a policy champion. Other variables, such as subunit size, centralization of decision making, and formalization, were not consistent factors explaining adoption decisions.
Abstract:
The Convention on Biological Diversity (CBD) was created in 1992 to coordinate global governments in protecting biological resources. The CBD has three goals: protection of biodiversity, achievement of sustainable use of biodiversity, and facilitation of equitable sharing of the benefits of biological resources. The goal of protecting biological resources has remained both controversial and difficult to implement, and this study focused on that goal. The research was designed to examine how globally constructed environmental policies are adapted by national governments and then passed down to local levels, where actual implementation takes place. The effectiveness of such policies depends on the extent of actual implementation at local levels. Therefore, compliance was examined at three levels: global, national, and local. The study developed criteria to measure compliance at each of these levels, and both qualitative and quantitative methods were used to analyze compliance and implementation. The study was guided by three questions broadly examining the critical factors that most influence the implementation of biodiversity protection policies at the global, national, and local levels. Findings show that, despite an overall biodiversity deficit of 0.9 hectares per person, global compliance with the CBD goals is currently at 35%. Compliance is lowest at the local level (14%), slightly better at the national level (50%), and much better at the international level (64%). Compliance appears higher at the national and international levels because compliance there consists largely of paperwork and policy formulation. If implementation at local levels continues to produce such low compliance, overall conservation outcomes can only get worse than they are at present. There are numerous weaknesses and capacity challenges that countries have yet to address in their plans.
In order to increase local-level compliance, the study recommends a set of robust policies that build local capacity, incentivize local resource owners, and implement biodiversity protection programs attuned to local needs and aspirations.
Abstract:
Thanks to advanced technologies and social networks that allow data to be widely shared across the Internet, there has been an explosion of pervasive multimedia data, generating high demand for multimedia services and applications that let people easily access and manage multimedia data in various areas. In response to such demands, multimedia big data analysis has become an emerging hot topic in both industry and academia, ranging from basic infrastructure, management, search, and mining to security, privacy, and applications. Within the scope of this dissertation, a multimedia big data analysis framework is proposed for semantic information management and retrieval, with a focus on rare event detection in videos. The proposed framework is able to explore hidden semantic feature groups in multimedia data and to incorporate temporal semantics, especially for video event detection. First, a hierarchical semantic data representation is presented to alleviate the semantic gap issue, and the Hidden Coherent Feature Group (HCFG) analysis method is proposed to capture the correlation between features and separate the original feature set into semantic groups, seamlessly integrating multimedia data in multiple modalities. Next, an Importance Factor based Temporal Multiple Correspondence Analysis (IF-TMCA) approach is presented for effective event detection. Specifically, the HCFG algorithm is integrated with the Hierarchical Information Gain Analysis (HIGA) method to generate the Importance Factor (IF) for producing the initial detection results. Then, the TMCA algorithm is proposed to efficiently incorporate temporal semantics for re-ranking and improving the final performance. Finally, a sampling-based ensemble learning mechanism is applied to further accommodate imbalanced datasets. In addition to the multimedia semantic representation and class imbalance problems, lack of organization is another critical issue for multimedia big data analysis.
In this framework, an affinity propagation-based summarization method is also proposed to transform unorganized data into a clean, well-organized structure. The whole framework has been thoroughly evaluated across multiple domains, such as soccer goal event detection and disaster information management.
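As a hedged illustration of the summarization idea (not the dissertation's implementation), affinity propagation as available in scikit-learn can cluster feature vectors and return exemplar items, which then serve as a compact, organized summary of the collection:

```python
import numpy as np
from sklearn.cluster import AffinityPropagation


def summarize_by_exemplars(features: np.ndarray) -> np.ndarray:
    """Cluster feature vectors with affinity propagation and return the
    indices of the exemplar rows, which act as the condensed summary.
    Affinity propagation chooses the number of clusters on its own."""
    ap = AffinityPropagation(random_state=0)
    ap.fit(features)
    return ap.cluster_centers_indices_


# Toy data: three tight groups of 2-D points; the exemplars summarize them.
rng = np.random.default_rng(0)
points = np.vstack([
    rng.normal(loc=c, scale=0.05, size=(10, 2))
    for c in ([0.0, 0.0], [5.0, 5.0], [0.0, 5.0])
])
exemplars = summarize_by_exemplars(points)
```

Because affinity propagation does not require the number of clusters up front, it suits summarization of collections whose structure is unknown in advance.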
Abstract:
This research presents several components encompassing the objective of data partitioning and replication management in a distributed GIS database. Modern Geographic Information Systems (GIS) databases are often large and complicated, so data partitioning and replication management problems must be addressed in the development of an efficient and scalable solution. Part of the research is to study the patterns of geographical raster data processing and to propose algorithms to improve the availability of such data. These algorithms and approaches target the granularity of geographic data objects as well as data partitioning in geographic databases, in order to achieve high data availability and Quality of Service (QoS) with distributed data delivery and processing. To achieve this goal, a dynamic, real-time approach is proposed for mosaicking digital images of different temporal and spatial characteristics into tiles. This dynamic approach reuses digital images on demand and generates mosaicked tiles only for the required region, according to the user's requirements such as resolution, temporal range, and target bands, to reduce redundancy in storage and to utilize available computing and storage resources more efficiently. Another part of the research pursued methods for efficiently acquiring GIS data from external heterogeneous databases and Web services, as well as end-user GIS data delivery enhancements, automation, and 3D virtual reality presentation. Vast numbers of computing, network, and storage resources on the Internet sit idle or underutilized. The proposed "Crawling Distributed Operating System" (CDOS) approach employs such resources and creates benefits for the hosts that lend their CPU, network, and storage resources to be used in a GIS database context. The results of this dissertation demonstrate effective ways to develop a highly scalable GIS database.
The approach developed in this dissertation resulted in the creation of the TerraFly GIS database, which is used by the US government, researchers, and the general public to facilitate Web access to remotely sensed imagery and GIS vector information.
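A hypothetical sketch of the selection step behind such on-demand mosaicking: choosing the catalog images that can contribute to a requested region, time range, and resolution. The catalog schema, field names, and newest-first preference are illustrative assumptions, not TerraFly's actual design:

```python
def select_images(catalog, bbox, t_range, max_gsd):
    """Return the catalog images usable for an on-demand mosaic of `bbox`:
    each must intersect the requested bounding box (minx, miny, maxx, maxy),
    fall inside the temporal range, and be at least as sharp as the requested
    ground sample distance. Newest imagery is preferred where images overlap."""
    def intersects(a, b):
        return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

    candidates = [
        img for img in catalog
        if intersects(img["bbox"], bbox)
        and t_range[0] <= img["time"] <= t_range[1]
        and img["gsd"] <= max_gsd
    ]
    return sorted(candidates, key=lambda img: img["time"], reverse=True)
```

Generating tiles only for the images this filter returns is what avoids pre-computing and storing mosaics for regions nobody has requested.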
Abstract:
This study investigates middle school teachers' concerns and perspectives during the implementation of an evidence-based curriculum that supports the development of both content knowledge and scientific practices. Two themes emerged from data analysis: consonance and conflict.
Abstract:
Background: Biologists often need to assess whether unfamiliar datasets warrant the time investment required for more detailed exploration. Basing such assessments on brief descriptions provided by data publishers is unwieldy for large datasets that contain insights dependent on specific scientific questions. Alternatively, using complex software systems for a preliminary analysis may itself be deemed too time consuming, especially for unfamiliar data types and formats. This may lead to wasted analysis time and the discarding of potentially useful data. Results: We present an exploration of design opportunities that the Google Maps interface offers to biomedical data visualization. In particular, we focus on synergies between visualization techniques and Google Maps that facilitate the development of biological visualizations that have both low overhead and sufficient expressivity to support the exploration of data at multiple scales. The methods we explore rely on displaying pre-rendered visualizations of biological data in browsers, with sparse yet powerful interactions, by using the Google Maps API. We structure our discussion around five visualizations: a gene co-regulation visualization, a heatmap viewer, a genome browser, a protein interaction network, and a planar visualization of white matter in the brain. Feedback from collaborative work with domain experts suggests that our Google Maps visualizations offer multiple, scale-dependent perspectives and can be particularly helpful for unfamiliar datasets due to their accessibility. We also find that users, particularly those less experienced with computer use, are attracted by the familiarity of the Google Maps API. Our five implementations introduce design elements that can benefit visualization developers. Conclusions: We describe a low-overhead approach that lets biologists access readily analyzed views of unfamiliar scientific datasets.
We rely on pre-computed visualizations prepared by data experts, accompanied by sparse and intuitive interactions, and distributed via the familiar Google Maps framework. Our contributions are an evaluation demonstrating the validity and opportunities of this approach, a set of design guidelines benefiting those wanting to create such visualizations, and five concrete example visualizations.
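The tile pyramid underlying the Google Maps interface is well documented: at zoom level z the rendered image is covered by 2^z × 2^z tiles of 256 × 256 pixels, so a pre-rendered visualization must be cut accordingly. A small sketch of the addressing arithmetic (normalizing input coordinates to [0, 1] is an assumption for illustration):

```python
TILE_SIZE = 256  # Google Maps tiles are 256 x 256 pixels


def tile_for_point(x_norm: float, y_norm: float, zoom: int) -> tuple:
    """Map a point of the pre-rendered image, normalized to [0, 1] on each
    axis, to its (tile_x, tile_y) address at the given zoom level.
    At zoom z the image is split into 2**z by 2**z tiles."""
    n = 2 ** zoom
    tx = min(int(x_norm * n), n - 1)  # clamp so x_norm == 1.0 stays in range
    ty = min(int(y_norm * n), n - 1)
    return tx, ty


def tiles_needed(zoom: int) -> int:
    """Number of tiles to pre-render for one zoom level."""
    return (2 ** zoom) ** 2
```

The quadrupling of tile count per zoom level is why such approaches pre-render only a handful of zoom levels per dataset.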
Abstract:
Personalized recommender systems aim to assist users in retrieving and accessing interesting items by automatically acquiring user preferences from historical data and matching items with those preferences. In the last decade, recommendation services have gained great attention due to the problem of information overload. However, despite recent advances in personalization techniques, several critical issues in modern recommender systems have not been well studied: (1) understanding the accessing patterns of users (i.e., how to effectively model users' accessing behaviors); (2) understanding the relations between users and other objects (i.e., how to comprehensively assess the complex correlations between users and entities in recommender systems); and (3) understanding the interest changes of users (i.e., how to adaptively capture users' preference drift over time). To meet the needs of users in modern recommender systems, it is imperative to address these issues and to apply the solutions to real-world applications. The major goal of this dissertation is to provide integrated recommendation approaches that tackle the challenges of the current generation of recommender systems. In particular, three user-oriented aspects of recommendation techniques were studied: understanding accessing patterns, understanding complex relations, and understanding temporal dynamics. To this end, we made three research contributions. First, we presented various personalized user profiling algorithms to capture the click behaviors of users at both coarse and fine granularities; second, we proposed graph-based recommendation models to describe the complex correlations in a recommender system; third, we studied temporal recommendation approaches to capture the preference changes of users, considering both long-term and short-term user profiles.
In addition, a versatile recommendation framework was proposed, in which these recommendation techniques are seamlessly integrated. Different evaluation criteria were implemented in this framework for evaluating recommendation techniques in real-world recommendation applications. In summary, the frequent changes of user interests and item repositories lead to a series of user-centric challenges that are not well addressed in the current generation of recommender systems. My work proposed reasonable solutions to these challenges and provided insights into how to address them with a simple yet effective recommendation framework.
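As an illustration of the long-term/short-term profile idea (a common technique, not necessarily the dissertation's actual algorithm), click events can be aggregated with an exponential time decay whose half-life controls how quickly old interests fade:

```python
import math
from collections import defaultdict


def build_profile(events, now, half_life_days=30.0):
    """Aggregate (category, timestamp_in_days) click events into a normalized
    preference vector, weighting recent clicks more via exponential decay.
    A small half-life approximates a short-term profile; a large one, a
    long-term profile."""
    decay = math.log(2) / half_life_days
    profile = defaultdict(float)
    for category, t in events:
        profile[category] += math.exp(-decay * (now - t))
    total = sum(profile.values())
    return {c: w / total for c, w in profile.items()}  # weights sum to 1
```

Blending one profile built with a short half-life and another with a long half-life is a simple way to combine short-term and long-term preferences in a single score.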