13 results for Connectivity, Connected Car, Big Data, KPI
in Digital Commons at Florida International University
Abstract:
Thanks to advanced technologies and social networks that allow data to be shared widely across the Internet, there has been an explosion of pervasive multimedia data, generating high demand for multimedia services and applications that let people easily access and manage multimedia data in various areas. In response to such demand, multimedia big data analysis has become an emerging hot topic in both industry and academia, ranging from basic infrastructure, management, search, and mining to security, privacy, and applications. Within the scope of this dissertation, a multimedia big data analysis framework is proposed for semantic information management and retrieval, with a focus on rare event detection in videos. The proposed framework is able to explore hidden semantic feature groups in multimedia data and incorporate temporal semantics, especially for video event detection. First, a hierarchical semantic data representation is presented to alleviate the semantic gap issue, and the Hidden Coherent Feature Group (HCFG) analysis method is proposed to capture the correlation between features and separate the original feature set into semantic groups, seamlessly integrating multimedia data in multiple modalities. Next, an Importance Factor based Temporal Multiple Correspondence Analysis (IF-TMCA) approach is presented for effective event detection. Specifically, the HCFG algorithm is integrated with the Hierarchical Information Gain Analysis (HIGA) method to generate the Importance Factor (IF) for producing the initial detection results. The TMCA algorithm is then proposed to efficiently incorporate temporal semantics for re-ranking and improving the final performance. Finally, a sampling-based ensemble learning mechanism is applied to accommodate imbalanced datasets. In addition to the multimedia semantic representation and class imbalance problems, lack of organization is another critical issue for multimedia big data analysis. In this framework, an affinity propagation-based summarization method is also proposed to transform unorganized data into a better structure with clean and well-organized information. The whole framework has been thoroughly evaluated across multiple domains, such as soccer goal event detection and disaster information management.
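To illustrate the final step, here is a minimal sketch (not the dissertation's implementation) of sampling-based ensemble learning for imbalanced event detection: several classifiers are each trained on a balanced subsample pairing all rare-event examples with an equal-sized random draw of non-events, and their scores are averaged. The function name, the synthetic data, and the choice of logistic regression are all assumptions.

```python
# Sampling-based ensemble for a rare-event class: one classifier per
# balanced subsample of the majority class, scores averaged. Assumes
# non-events outnumber events.
import numpy as np
from sklearn.linear_model import LogisticRegression

def sampled_ensemble_scores(X, y, n_models=10, seed=0):
    rng = np.random.default_rng(seed)
    pos = np.where(y == 1)[0]            # rare event samples
    neg = np.where(y == 0)[0]            # abundant non-event samples
    scores = np.zeros(len(X))
    for _ in range(n_models):
        sub = rng.choice(neg, size=len(pos), replace=False)
        idx = np.concatenate([pos, sub])
        clf = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
        scores += clf.predict_proba(X)[:, 1]
    return scores / n_models             # averaged event likelihood

# Toy usage with synthetic imbalanced data (purely illustrative).
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 4))
y = (rng.random(500) < 0.05).astype(int)
X[y == 1] += 1.5                         # shift the rare class
print(sampled_ensemble_scores(X, y)[:5])
```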
Abstract:
The Three-Layer distributed mediation architecture, designed by the Secure System Architecture laboratory, employs a layered framework of presence, integration, and homogenization mediators. The architecture does not have any central component that may affect system reliability. A distributed search technique was adapted for the system to increase its reliability. An Enhanced Chord-like algorithm (E-Chord) was designed and deployed in the integration layer. E-Chord is a skip-list algorithm based on a Distributed Hash Table (DHT), which is a distributed but structured architecture. The DHT is distributed in the sense that no central unit is required to maintain indexes, and it is structured in the sense that indexes are distributed over the nodes in a systematic manner. Each node maintains three kinds of routing information: a frequency list, a successor/predecessor list, and a finger table. No node in the system maintains all indexes, and each node knows about some other nodes in the system. These nodes, also called composer mediators, are connected in a P2P fashion. A special composer mediator called the global mediator initiates the keyword-based matching decomposition of the request using E-Chord. It generates an Integrated Data Structure Graph (IDSG) on the fly, creates association and dependency relations between nodes in the IDSG, and then generates a Global IDSG (GIDSG). The GIDSG is a plan that guides the global mediator in integrating data. It is also used to stream data from the mediators in the homogenization layer, which are connected to the data sources. The connectors start sending data to the global mediator just after the global mediator creates the GIDSG and just before the global mediator sends the answer to the presence mediator. Using E-Chord and the GIDSG makes the mediation system more scalable than using a central global schema repository, since all the composers in the integration layer are capable of handling and routing requests. Also, the failure of a single composer only minimally affects the entire mediation system.
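For context, the sketch below shows the classic Chord-style finger-table routing that a DHT of this kind relies on: each node keeps a finger table of shortcuts at power-of-two distances and forwards a lookup to the farthest finger preceding the key. E-Chord's frequency list and mediator-specific extensions are omitted, and all node IDs and the ring size are illustrative.

```python
# Chord-style finger-table lookup on an identifier ring of 2**M slots.
M = 8
RING = 2 ** M

def between(x, a, b):
    """True if x lies in the open ring interval (a, b)."""
    return a < x < b if a < b else x > a or x < b

class Node:
    def __init__(self, ident, all_ids):
        self.id = ident
        ids = sorted(all_ids)
        succ = lambda k: next((i for i in ids if i >= k % RING), ids[0])
        # finger[j] = first live node clockwise from id + 2**j
        self.finger = [succ(self.id + 2 ** j) for j in range(M)]

def lookup(nodes, start_id, key):
    """Route from start_id to the node responsible for key."""
    cur = nodes[start_id]
    while not (key == cur.finger[0] or between(key, cur.id, cur.finger[0])):
        hop = cur.id
        for f in reversed(cur.finger):   # farthest finger preceding key
            if between(f, cur.id, key):
                hop = f
                break
        if hop == cur.id:                # no closer finger: stop
            break
        cur = nodes[hop]
    return cur.finger[0]

ids = [1, 8, 14, 21, 32, 38, 42, 48, 51, 56]
nodes = {i: Node(i, ids) for i in ids}
print(lookup(nodes, 8, 54))              # -> 56, the successor of key 54
```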
Abstract:
Variable Speed Limit (VSL) strategies identify and disseminate dynamic speed limits determined to be appropriate based on prevailing traffic, road surface, and weather conditions. This dissertation develops and evaluates a shockwave-based VSL system that uses a heuristic switching-logic controller with specified thresholds of prevailing traffic flow conditions. The system aims to improve operations and mobility at critical bottlenecks. Before traffic breakdown occurs, the proposed VSL's goal is to prevent or postpone breakdown by decreasing the inflow and achieving a uniform distribution of speed and flow. After breakdown occurs, the VSL system aims to dampen traffic congestion by reducing the inflow to the congested area and increasing the bottleneck capacity by deactivating the VSL at the head of the congested area. The shockwave-based VSL system pushes the VSL location upstream as the congested area propagates upstream. In addition to testing the system using infrastructure detector-based data, this dissertation investigates the use of Connected Vehicle trajectory data as input to the shockwave-based VSL system. Since field Connected Vehicle data are not available, as part of this research, Vehicle-to-Infrastructure communication is modeled in microscopic simulation to obtain individual vehicle trajectories. In this system, a wavelet transform is used to analyze aggregated individual vehicle speed data to determine the locations of congestion. The currently recommended calibration procedures for simulation models are generally based on capacity, volume, and system-performance values and do not specifically examine traffic breakdown characteristics. However, since the proposed VSL strategies are countermeasures to the impacts of breakdown conditions, considering breakdown characteristics in the calibration procedure is important for a reliable assessment. Several enhancements were proposed in this study to account for breakdown characteristics at bottleneck locations in the calibration process. In this dissertation, the performance of the shockwave-based VSL is compared to VSL systems with different fixed VSL message sign locations using the calibrated microscopic model. The results show that the shockwave-based VSL outperforms fixed-location VSL systems and can considerably decrease the maximum back of queue and the duration of breakdown while increasing the average speed during breakdown.
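As a rough illustration of the heuristic switching logic, the sketch below posts a reduced limit at stations upstream of the detected back of queue and leaves the base limit in effect at the queue head, so the active VSL location moves upstream as congestion propagates. All thresholds and the station layout are hypothetical, not the dissertation's calibrated values.

```python
# Threshold-based VSL switching over a line of detector stations.
def vsl_limits(speeds_mph, base_limit=70, reduced_limit=50,
               breakdown_speed=40):
    """speeds_mph[i] is the measured speed at detector station i,
    ordered upstream -> downstream; returns one posted limit per station."""
    limits = [base_limit] * len(speeds_mph)
    congested = [i for i, v in enumerate(speeds_mph) if v < breakdown_speed]
    if congested:
        back = min(congested)           # back of queue (moves upstream)
        for i in range(back):           # reduce inflow upstream of it
            limits[i] = reduced_limit
        # stations at and beyond the queue head keep the base limit,
        # i.e. the VSL is deactivated there to recover bottleneck capacity
    return limits

print(vsl_limits([68, 65, 35, 30, 62]))  # -> [50, 50, 70, 70, 70]
```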
Abstract:
This symposium describes a multi-dimensional strategy for examining fidelity of implementation in an authentic school district context. An existing large-district peer mentoring program provides an example. The presentation will address development of a logic model to articulate a theory of change; collaborative creation of a data set aligned with essential concepts and research questions; identification of independent, dependent, and covariate variables; issues related to the use of big data, including conditioning and transformation of data prior to analysis; operationalization of a strategy to capture fidelity-of-implementation data from all stakeholders; and ways in which fidelity indicators might be used.
Abstract:
Because some Web users are able to design a template to visualize information from scratch, while other users need to visualize information automatically by changing some parameters, providing different levels of customization of the information is a desirable goal. Our system allows the automatic generation of visualizations given the semantics of the data, as well as static or pre-specified visualization through an interface language. We address information visualization with the Web in mind, where the presentation of retrieved information is a challenge. We provide a model that narrows the gap between the user's way of expressing queries and database manipulation languages (SQL) without changing the system itself, thus improving the query specification process. We develop a Web interface model that is integrated with the HTML language to create a powerful language that facilitates the construction of Web-based database reports. In contrast to other work, this model offers a new way of exploring databases, focusing on providing Web connectivity to databases with minimal or no result buffering, formatting, or extra programming. We describe how to easily connect the database to the Web. In addition, we offer an enhanced way of viewing and exploring the contents of a database, allowing users to customize their views depending on the contents and the structure of the data. Current database front ends typically display database objects in a flat view, making it difficult for users to grasp the contents and the structure of their results. Our model narrows the gap between databases and the Web. The overall objective of this research is to construct a model that accesses different databases easily across the net and generates SQL, forms, and reports across all platforms without requiring the developer to code a complex application. This increases the speed of development. In addition, using only a Web browser, the end user can retrieve data from databases remotely and make the necessary modifications and manipulations of data using Web-formatted forms and reports, independent of the platform, without having to open different applications or learn to use anything but their Web browser. We introduce a strategic method to generate and construct SQL queries, enabling inexperienced users who are not familiar with SQL to build syntactically and semantically valid SQL queries and to understand the retrieved data. The generated SQL query is validated against the database schema to ensure harmless and efficient SQL execution. (Abstract shortened by UMI.)
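A minimal sketch of this kind of guided, schema-validated query generation follows: the interface accepts only tables, columns, and operators that exist in a known schema, and values are bound as parameters rather than spliced into the SQL text. The schema, table, and column names are hypothetical.

```python
# Build a SELECT from UI choices, validating every identifier against a
# known schema so only harmless, well-formed SQL can be produced.
SCHEMA = {"employees": {"id", "name", "salary", "dept"}}

def build_query(table, columns, filters):
    """filters: list of (column, op, value) chosen from a fixed UI."""
    if table not in SCHEMA:
        raise ValueError(f"unknown table: {table}")
    cols = set(columns) | {c for c, _, _ in filters}
    unknown = cols - SCHEMA[table]
    if unknown:
        raise ValueError(f"unknown columns: {unknown}")
    allowed_ops = {"=", "<", ">", "<=", ">=", "<>"}
    if any(op not in allowed_ops for _, op, _ in filters):
        raise ValueError("operator not permitted")
    # identifiers are safe to interpolate: they were checked above
    where = " AND ".join(f"{c} {op} %s" for c, op, _ in filters)
    sql = f"SELECT {', '.join(columns)} FROM {table}"
    if where:
        sql += f" WHERE {where}"
    return sql, [v for _, _, v in filters]   # values bound as parameters

print(build_query("employees", ["name"], [("salary", ">", 50000)]))
# -> ('SELECT name FROM employees WHERE salary > %s', [50000])
```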
Abstract:
Graph-structured databases are widely prevalent, and the problem of effective search and retrieval from such graphs has been receiving much attention recently. For example, the Web can be naturally viewed as a graph. Likewise, a relational database can be viewed as a graph where tuples are modeled as vertices connected via foreign-key relationships. Keyword search querying has emerged as one of the most effective paradigms for information discovery, especially over HTML documents in the World Wide Web. One of the key advantages of keyword search querying is its simplicity: users do not have to learn a complex query language and can issue queries without any prior knowledge about the structure of the underlying data. The purpose of this dissertation was to develop techniques for user-friendly, high-quality, and efficient searching of graph-structured databases. Several ranked search methods on data graphs have been studied in recent years. Given a top-k keyword search query on a graph and some ranking criteria, a keyword proximity search finds the top-k answers, where each answer is a substructure of the graph containing all query keywords that illustrates the relationships between the keywords present in the graph. We applied keyword proximity search to the web and the page graph of web documents to find top-k answers that satisfy the user's information need and increase user satisfaction. Another effective ranking mechanism applied to data graphs is authority-flow based ranking. Given a top-k keyword search query on a graph, an authority-flow based search finds the top-k answers, where each answer is a node in the graph ranked according to its relevance and importance to the query. We developed techniques that improve authority-flow based search on data graphs by creating a framework to explain and reformulate such searches, taking into consideration user preferences and feedback. We also applied the proposed graph search techniques to information discovery over biological databases. Our algorithms were experimentally evaluated for performance and quality. The quality of our method was compared to that of current approaches using user surveys.
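As a toy illustration of authority-flow ranking, the sketch below injects authority at the nodes matching the query keywords and propagates it along edges until the scores stabilize, in the spirit of personalized PageRank; the dissertation's exact propagation scheme is not given here. The graph, damping factor, and iteration count are assumptions.

```python
# Query-biased authority flow on a data graph: restart mass sits on the
# keyword-matching seed nodes, and authority flows along out-edges.
def authority_flow(adj, seeds, damping=0.85, iters=50):
    """adj: {node: [out-neighbors]}; seeds: nodes matching the query."""
    nodes = list(adj)
    score = {n: (1.0 / len(seeds) if n in seeds else 0.0) for n in nodes}
    base = dict(score)                    # keyword-specific restart mass
    for _ in range(iters):
        nxt = {n: (1 - damping) * base[n] for n in nodes}
        for n, outs in adj.items():
            for m in outs:                # split n's authority among outs
                nxt[m] += damping * score[n] / len(outs)
        score = nxt
    return sorted(score.items(), key=lambda kv: -kv[1])  # ranked answers

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
print(authority_flow(graph, seeds={"a"}))
```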
Abstract:
High efficiency of the power converters placed between renewable energy sources and the utility grid is required to maximize the utilization of these sources. Power quality is another aspect that requires large passive elements (inductors, capacitors) to be placed between these sources and the grid. The main objective is to develop a higher-level, high-frequency-based power converter system (HFPCS) that optimizes the use of hybrid renewable power injected into the power grid. The HFPCS provides high efficiency, reduced size of passive components, higher levels of power density, lower harmonic distortion, higher reliability, and lower cost. The dynamic model for each part of this system is developed, simulated, and tested. The steady-state performance of the grid-connected hybrid power system with battery storage is analyzed. Various types of simulations were performed, and a number of algorithms were developed and tested to verify the effectiveness of the power conversion topologies. A modified hysteresis-control strategy for the rectifier and the battery charging/discharging system was developed and implemented. A voltage-oriented control (VOC) scheme was developed to control the energy injected into the grid. The developed HFPCS was compared experimentally with other currently available power converters. The developed HFPCS was employed inside a microgrid system infrastructure, connecting it to the power grid to verify its power transfer capabilities and grid connectivity. Grid connectivity tests verified the power transfer capabilities of the developed converter in addition to its ability to serve the load in a shared manner. In order to investigate the performance of the developed system, an experimental setup for the HF-based hybrid generation system was constructed. We designed a board containing a digital signal processor chip on which the developed control system was embedded. The board was fabricated and experimentally tested. The system's high-precision requirements were verified. Each component of the system was built and tested separately, and then the whole system was connected and tested. The simulation and experimental results confirm the effectiveness of the developed converter system for grid-connected hybrid renewable energy systems as well as for hybrid electric vehicles and other industrial applications.
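To illustrate the basic hysteresis-control idea named in the abstract (not the modified strategy itself), the sketch below toggles a switch whenever the measured current leaves a tolerance band around its reference, driving a crude R-L branch model. All numeric values are illustrative assumptions.

```python
# Bang-bang (hysteresis) current control around a reference.
def hysteresis_step(i_meas, i_ref, switch_on, band=0.5):
    """Return the new switch state for one control step."""
    if i_meas < i_ref - band:
        return True                  # below band: turn switch on
    if i_meas > i_ref + band:
        return False                 # above band: turn switch off
    return switch_on                 # inside band: hold state

# Crude R-L branch simulation driven by the controller (illustrative values).
R, L, Vdc, dt = 1.0, 5e-3, 100.0, 1e-5
i, on = 0.0, False
for step in range(2000):
    on = hysteresis_step(i, i_ref=10.0, switch_on=on)
    v = Vdc if on else 0.0
    i += dt * (v - R * i) / L        # di/dt = (v - R*i) / L
print(round(i, 2))                   # chops within the band around 10 A
```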
Abstract:
Current technology permits connecting local networks via high-bandwidth telephone lines. Central coordinator nodes may use Intelligent Networks to manage data flow over dialed data lines (e.g., ISDN) and to establish connections between LANs. This dissertation focuses on cost minimization and on establishing operational policies for query distribution over heterogeneous, geographically distributed databases. Based on our study of query distribution strategies, public network tariff policies, and database interface standards, we propose methods for communication cost estimation, strategies for the reduction of bandwidth allocation, and guidelines for central-to-node communication protocols. Our conclusion is that dialed data lines offer a cost-effective alternative for the implementation of distributed database query systems, and that existing commercial software may be adapted to support query processing in heterogeneous distributed database systems.
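A minimal sketch of the communication cost estimation question follows: given a tariff with a per-call setup charge and a per-minute rate, estimate the monthly cost of serving a query workload over dialed data lines and compare it with a flat-rate leased line. All tariff figures are hypothetical.

```python
# Compare dialed-line vs. leased-line cost for a monthly query workload.
def dialed_line_cost(queries_per_month, mb_per_query,
                     kbps=64.0, setup_fee=0.10, per_minute=0.05):
    # MB -> kilobits, then connect time at the line rate, in minutes
    minutes_per_query = (mb_per_query * 8 * 1024) / kbps / 60
    return queries_per_month * (setup_fee + minutes_per_query * per_minute)

def cheaper_option(queries_per_month, mb_per_query, leased_monthly=900.0):
    dialed = dialed_line_cost(queries_per_month, mb_per_query)
    return ("dialed", dialed) if dialed < leased_monthly \
        else ("leased", leased_monthly)

print(cheaper_option(2000, 2.0))   # light workloads favor dialed lines
```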
Abstract:
The purpose of this study was to determine the degree to which the Big-Five personality taxonomy, as represented by the Minnesota Multiphasic Personality Inventory (MMPI), California Psychological Inventory (CPI), and Inwald Personality Inventory (IPI) scales, predicted a variety of police officer job performance criteria. Data were collected archivally for 270 sworn police officers from a large Southeastern municipality. Predictor data consisted of scores on the MMPI, CPI, and IPI scales grouped in terms of the Big-Five factors. The overall score on the Wonderlic was included in order to assess criterion variance accounted for by cognitive ability. Additionally, a psychologist's overall rating of predicted job fit was used to assess the variance accounted for by a psychological interview. Criterion data consisted of supervisory ratings of overall job performance, State Examination scores, police academy grades, and termination. Based on the literature, it was hypothesized that officers who are higher on Extroversion, Conscientiousness, Agreeableness, and Openness to Experience, and lower on Neuroticism (the Big-Five factors), would outperform their peers across a variety of job performance criteria. Additionally, it was hypothesized that police officers who are higher in cognitive ability and masculinity, and lower in mania, would also outperform their counterparts. Results indicated that several of the Big-Five factors, namely Neuroticism, Conscientiousness, Agreeableness, and Openness to Experience, were predictive of multiple job performance criteria. These findings imply that the Big-Five is a useful predictor of police officer job performance. Study limitations and implications for future research are discussed.
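The analysis can be pictured as an incremental-validity regression: fit the criterion on cognitive ability alone, then add the Big-Five factor scores and compare the explained variance. The sketch below uses synthetic data purely for illustration; the archival scores and actual effect sizes are not reproduced here.

```python
# Incremental validity of Big-Five scores over cognitive ability,
# using least-squares R^2 on synthetic stand-in data.
import numpy as np
from numpy.linalg import lstsq

rng = np.random.default_rng(0)
n = 270                                  # matches the sample size cited
cog = rng.normal(size=n)                 # ability score (Wonderlic-style)
big5 = rng.normal(size=(n, 5))           # N, E, O, A, C factor scores
perf = 0.3 * cog + 0.4 * big5[:, 4] - 0.2 * big5[:, 0] + rng.normal(size=n)

def r_squared(X, y):
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid.var() / y.var()

r2_cog = r_squared(cog[:, None], perf)
r2_full = r_squared(np.column_stack([cog[:, None], big5]), perf)
print(f"ability alone R^2={r2_cog:.2f}, +Big-Five R^2={r2_full:.2f}")
```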
Abstract:
Stable isotope analysis has become a standard ecological tool for elucidating feeding relationships of organisms and determining food web structure and connectivity. There remain important questions concerning rates at which stable isotope values are incorporated into tissues (turnover rates) and the change in isotope value between a tissue and a food source (discrimination values). These gaps in our understanding necessitate experimental studies to adequately interpret field data. Tissue turnover rates and discrimination values vary among species and have been investigated in a broad array of taxa. However, little attention has been paid to ectothermic top predators in this regard. We quantified the turnover rates and discrimination values for three tissues (scutes, red blood cells, and plasma) in American alligators (Alligator mississippiensis). Plasma turned over faster than scutes or red blood cells, but turnover rates of all three tissues were very slow in comparison to those in endothermic species. Alligator δ15N discrimination values were surprisingly low in comparison to those of other top predators and varied between experimental and control alligators. The variability of δ15N discrimination values highlights the difficulties in using δ15N to assign absolute and possibly even relative trophic levels in field studies. Our results suggest that interpreting stable isotope data based on parameter estimates from other species can be problematic and that large ectothermic tetrapod tissues may be characterized by unique stable isotope dynamics relative to species occupying lower trophic levels and endothermic tetrapods.
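Turnover rates in diet-switch studies of this kind are commonly estimated with an exponential approach model, delta(t) = delta_eq + (delta_0 - delta_eq) * exp(-lambda * t), whose half-life is ln(2)/lambda; whether this exact model was fitted here is an assumption. The sketch below fits it to fabricated example values, not the alligator study's data.

```python
# Fit an exponential turnover model to hypothetical diet-switch data.
import numpy as np
from scipy.optimize import curve_fit

def turnover(t, d_eq, d_0, lam):
    """delta value at time t after a diet switch."""
    return d_eq + (d_0 - d_eq) * np.exp(-lam * t)

days = np.array([0, 30, 60, 120, 240, 480], dtype=float)
d15n = np.array([6.0, 6.5, 7.0, 7.9, 8.8, 9.3])   # hypothetical values

(d_eq, d_0, lam), _ = curve_fit(turnover, days, d15n, p0=[9.5, 6.0, 0.01])
print(f"half-life = {np.log(2) / lam:.0f} days")   # slow, as in ectotherms
```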
Abstract:
A high proportion of amphibian species are threatened with extinction globally, and habitat loss and degradation are the most frequently implicated causes. Rapid deforestation for the establishment of agricultural production is a primary driver of habitat loss in tropical zones, where amphibian diversity is highest. Land-cover change affects native assemblages, in part, through the reduction of habitat area and the reduction of movement among remnant populations. Decreased gene flow contributes to loss of genetic diversity, which limits the ability of local populations to respond to further environmental changes. The focus of this dissertation is on the degree to which common land uses in Sarapiquí, Costa Rica impede the movement of two common amphibian species. First, I used field experiments, including displacement trials, and a behavioral landscape ecology framework to investigate the resistance of pastures to movement of Oophaga pumilio. Results from experiments demonstrate that pastures do impede movement of O. pumilio relative to forest. Microclimatic effects on movement performance as well as limited perceptual ranges likely contribute to reduced return rates through pastures. Next, I linked local processes to landscape-scale estimates of resistance. I conducted experiments to measure habitat-specific costs to movement for O. pumilio and Craugastor bransfordii, and then used the experimental results to parameterize connectivity models. Model validation indicated highest support for resistance estimates generated from responses to land-use-specific microclimates for both species and to predator encounters for O. pumilio. Finally, I used abundance and experiment-derived resistance estimates to analyze the effects of prevalent land uses on the population genetic structure of the two focal species. While O. pumilio did not exhibit a strong response to landscape heterogeneity and was primarily structured by distances among sites, C. bransfordii genetic variation was explained by resistance estimates from abundance and experiment data. Collectively, this work demonstrates that common land uses can offer different levels of resistance to amphibian movements in Sarapiquí and illustrates the value of investigating local-scale processes to inform interpretation of landscape-scale patterns.
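To make the connectivity-modeling step concrete, the sketch below assigns each land-cover cell a resistance cost and computes a least-cost distance between two sites with Dijkstra's algorithm; this is one common way resistance surfaces are used, and the cover classes and resistance values shown are hypothetical, not the experiment-derived estimates.

```python
# Least-cost distance over a land-cover grid with per-class resistance.
import heapq

RESISTANCE = {"forest": 1, "treed_pasture": 4, "open_pasture": 10}

def least_cost_distance(grid, start, goal):
    """grid: 2D list of land-cover labels; start/goal: (row, col)."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0.0}
    heap = [(0.0, start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float("inf")):
            continue                      # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + RESISTANCE[grid[nr][nc]]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(heap, (nd, (nr, nc)))
    return float("inf")

landscape = [["forest", "open_pasture", "forest"],
             ["forest", "treed_pasture", "forest"]]
# Detours through treed pasture (cost 7) rather than open pasture (cost 11).
print(least_cost_distance(landscape, (0, 0), (0, 2)))
```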
Abstract:
With the exponential growth in the usage of web-based map services, web GIS applications have become more and more popular. Spatial data indexing, search, analysis, and visualization, along with the resource management of such services, are becoming increasingly important for delivering the Quality of Service that users expect. First, spatial indexing is typically time-consuming and is not available to end users. To address this, we introduce TerraFly sksOpen, an open-source Online Indexing and Querying System for Big Geospatial Data. Integrated with the TerraFly Geospatial database [1-9], sksOpen is an efficient indexing and query engine for processing Top-k Spatial Boolean Queries. Further, we provide ergonomic visualization of query results on interactive maps to facilitate the user's data analysis. Second, due to the highly complex and dynamic nature of GIS systems, it is quite challenging for end users to quickly understand and analyze spatial data, and to efficiently share their own data and analysis results with others. Built on the TerraFly Geospatial database, TerraFly GeoCloud is an extra layer running on top of the TerraFly map that can efficiently support many different visualization functions and spatial data analysis models. Furthermore, users can create unique URLs to visualize and share the analysis results. TerraFly GeoCloud also provides the MapQL technology to customize map visualization using SQL-like statements [10]. Third, map systems often serve dynamic web workloads and involve multiple CPU- and I/O-intensive tiers, which makes it challenging to meet the response-time targets of map requests while using resources efficiently. Virtualization facilitates the deployment of web map services and improves their resource utilization through encapsulation and consolidation. Autonomic resource management allows resources to be automatically provisioned to a map service and its internal tiers on demand. v-TerraFly is a set of techniques to predict the demand of map workloads online and optimize resource allocations, considering both response time and data freshness as the QoS target. The proposed v-TerraFly system is prototyped on TerraFly, a production web map service, and evaluated using real TerraFly workloads. The results show that v-TerraFly can predict workload demands 18.91% more accurately and allocate resources efficiently to meet the QoS target, improving QoS by 26.19% and saving resource usage by 20.83% compared to traditional peak-load-based resource allocation.
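For intuition about the query semantics, here is a brute-force sketch of a Top-k Spatial Boolean Query: filter objects by a required keyword set, then return the k nearest to the query point. sksOpen answers such queries with a purpose-built index rather than a scan; the point-of-interest records below are hypothetical.

```python
# Top-k Spatial Boolean Query semantics: boolean keyword filter, then
# nearest-neighbor ranking (brute force, for illustration only).
import math

def top_k_spatial_boolean(objects, q, must_have, k=3):
    """objects: [(x, y, {keywords})]; q: (x, y); must_have: required keywords."""
    matches = [(math.dist(q, (x, y)), (x, y)) for x, y, kw in objects
               if must_have <= kw]        # boolean filter: all keywords present
    return sorted(matches)[:k]            # k nearest matching objects

pois = [(2, 3, {"cafe", "wifi"}), (5, 1, {"cafe"}),
        (1, 1, {"cafe", "wifi", "parking"}), (9, 9, {"wifi"})]
print(top_k_spatial_boolean(pois, q=(0, 0), must_have={"cafe", "wifi"}, k=2))
# -> [(1.414..., (1, 1)), (3.605..., (2, 3))]
```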