960 results for Computer Engineering|Electrical engineering
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
Graduation project presented to Universidade Fernando Pessoa as part of the requirements for the licentiate degree in Nursing
Abstract:
A dissertation submitted in fulfillment of the requirements for the degree of Master in Computer Science and Computer Engineering
Abstract:
The Pierre Auger Cosmic Ray Observatory North site employs a large array of surface detector stations (tanks) to detect the secondary particle showers generated by ultra-high-energy cosmic rays. Because ultra-high-energy cosmic ray events are rare, tank communications must be highly reliable so that no valuable data are lost. The Auger North site employs a peer-to-peer paradigm, the Wireless Architecture for Hard Real-Time Embedded Networks (WAHREN), designed specifically for highly reliable message delivery over fixed networks under hard real-time deadlines. The WAHREN design includes two retransmission protocols, Micro- and Macro-retransmission. To understand how each retransmission protocol increases communication reliability, this analysis evaluated the system with no retransmission (Case-0), with each protocol individually (Micro and Macro), and with both combined. This thesis used a multimodal modeling methodology to show that a performance and reliability analysis of WAHREN was feasible, and it presents the results of that analysis. A multimodal approach was necessary because the underlying processes are governed by different mathematical models. The results of this analysis can serve as a framework for design decisions for the Auger North communication system.
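As a rough intuition for what each protocol buys, the sketch below computes end-to-end delivery probability for the four configurations analyzed (Case-0, Micro, Macro, and both combined), assuming independent hop failures. The model and all parameter values (per-hop success probability, hop count, retry counts) are illustrative assumptions, not the multimodal models or figures from the thesis.

```python
# Minimal analytical sketch of how link-level (micro) and end-to-end (macro)
# retransmission change message delivery reliability. The parameter names and
# values (p_hop, hops, retry counts) are illustrative assumptions, not taken
# from the WAHREN thesis.

def hop_reliability(p_hop: float, micro_retries: int) -> float:
    """Probability a single hop succeeds, given micro (link-level) retries."""
    return 1.0 - (1.0 - p_hop) ** (micro_retries + 1)

def path_reliability(p_hop: float, hops: int, micro_retries: int,
                     macro_retries: int) -> float:
    """Probability a message survives a chain of hops, with macro
    (end-to-end) retries wrapped around the whole path."""
    p_path = hop_reliability(p_hop, micro_retries) ** hops
    return 1.0 - (1.0 - p_path) ** (macro_retries + 1)

for label, micro, macro in [("Case-0", 0, 0), ("Micro", 2, 0),
                            ("Macro", 0, 2), ("Micro+Macro", 2, 2)]:
    r = path_reliability(0.9, hops=5, micro_retries=micro, macro_retries=macro)
    print(f"{label:12s} {r:.6f}")
```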
Abstract:
By providing vehicle-to-vehicle and vehicle-to-infrastructure wireless communications, vehicular ad hoc networks (VANETs), also known as "networks on wheels", can greatly enhance traffic safety, traffic efficiency, and the driving experience in intelligent transportation systems (ITS). However, unique features of VANETs, such as high mobility and the uneven distribution of vehicular nodes, pose critical efficiency and reliability challenges for their implementation. This dissertation is motivated by the great application potential of VANETs in the design of efficient in-network data processing and dissemination. Given the significance of message aggregation, data dissemination, and data collection, this research targets enhancing traffic safety and traffic efficiency, as well as developing novel commercial applications based on VANETs, along four lines: 1) accurate and efficient message aggregation to detect safety-relevant on-road events, 2) reliable data dissemination to notify remote vehicles, 3) efficient and reliable spatial data collection from vehicular sensors, and 4) novel applications that exploit the commercial potential of VANETs. Specifically, to enable cooperative detection of safety-relevant events on the roads, the structure-less message aggregation (SLMA) scheme is proposed to improve communication efficiency and message accuracy. The relative-position-based message dissemination (RPB-MD) scheme is proposed to reliably and efficiently disseminate messages to all intended vehicles in the zone-of-relevance under varying traffic density. Given the large volume of vehicular sensor data available in VANETs, the compressive-sampling-based data collection (CS-DC) scheme is proposed to efficiently collect spatially correlated data at large scale, especially in dense traffic. In addition, building on these solutions to the application-specific issues of data dissemination and data collection, several appealing value-added applications are developed to exploit the commercial potential of VANETs: general purpose automatic survey (GPAS), VANET-based ambient ad dissemination (VAAD), and VANET-based vehicle performance monitoring and analysis (VehicleView). By improving the efficiency and reliability of in-network data processing and dissemination, including message aggregation, data dissemination, and data collection, together with the development of these novel applications, this dissertation helps push VANETs further toward massive deployment.
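As a toy illustration of the relative-position-based forwarding idea behind RPB-MD, the sketch below has a vehicle rebroadcast only if it lies inside the zone-of-relevance and extends coverage beyond the last forwarder. The rule, the 1-D road model, and all values are assumptions for illustration, not the scheme's actual specification.

```python
# Toy relative-position-based rebroadcast decision: only vehicles inside the
# zone-of-relevance that extend coverage beyond the last forwarder repeat the
# message. Illustrative only; not the RPB-MD specification.
from dataclasses import dataclass

@dataclass
class Vehicle:
    vid: int
    position: float  # 1-D road coordinate in meters

def should_rebroadcast(me: Vehicle, source_pos: float,
                       last_forwarder_pos: float, zone_length: float) -> bool:
    in_zone = 0.0 <= (me.position - source_pos) <= zone_length
    extends_coverage = me.position > last_forwarder_pos
    return in_zone and extends_coverage

# A message originating at x=0 with a 500 m zone-of-relevance:
for v in [Vehicle(1, 120.0), Vehicle(2, 80.0), Vehicle(3, 610.0)]:
    print(v.vid, should_rebroadcast(v, source_pos=0.0,
                                    last_forwarder_pos=100.0,
                                    zone_length=500.0))
```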
Abstract:
Virtually every sector of business and industry that uses computing, including financial analysis, search engines, and electronic commerce, incorporates Big Data analysis into its business model. Sophisticated clustering algorithms are popular for deducing the nature of data by assigning labels to unlabeled data. We address two main challenges in Big Data. First, by definition, the volume of Big Data is too large to be loaded into a computer’s memory (the threshold varies with the computer used or available, but there is always a data set too large for any computer). Second, in real-time applications, the velocity of incoming data prevents historical data from being stored and future data from being accessed. We therefore propose our Streaming Kernel Fuzzy c-Means (stKFCM) algorithm, which significantly reduces both computational complexity and space complexity. The proposed stKFCM requires only O(n²) memory, where n is the (predetermined) size of the data subset (or data chunk) processed at each time step, making the algorithm truly scalable (n can be chosen to fit the available memory). Furthermore, only 2n² elements of the full N × N kernel matrix (where N >> n) need to be calculated at each time step, reducing both the time spent producing kernel elements and the complexity of the FCM algorithm. Empirical results show that stKFCM, even with relatively small n, can deliver clustering performance nearly as accurate as kernel fuzzy c-means run on the entire data set, while achieving a significant speedup.
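A minimal sketch of the kernel fuzzy c-means core that stKFCM applies to each data chunk is given below: memberships are updated from kernel-space distances to implicit cluster centers, using only the chunk's n × n kernel matrix. The streaming bookkeeping that carries weighted cluster information between chunks is omitted, and the RBF kernel and parameter values are illustrative assumptions.

```python
# Kernel fuzzy c-means on a single data chunk, the per-time-step building
# block of a streaming scheme like stKFCM. Streaming state is omitted; the
# RBF kernel and parameters are illustrative assumptions.
import numpy as np

def kernel_fcm(K: np.ndarray, c: int, m: float = 2.0, iters: int = 50,
               seed: int = 0) -> np.ndarray:
    """K: n x n kernel matrix; returns a c x n membership matrix U."""
    n = K.shape[0]
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=n).T          # columns sum to 1
    for _ in range(iters):
        W = U ** m                                   # c x n fuzzy weights
        s = W.sum(axis=1, keepdims=True)             # c x 1 normalizers
        # squared kernel-space distance of each point to each implicit center
        cross = (W @ K) / s                          # c x n
        quad = np.einsum('ci,ij,cj->c', W, K, W) / (s[:, 0] ** 2)
        D = np.diag(K)[None, :] - 2.0 * cross + quad[:, None]
        D = np.maximum(D, 1e-12)
        inv = 1.0 / D ** (1.0 / (m - 1.0))
        U = inv / inv.sum(axis=0)                    # standard FCM update
    return U

# Example on a small chunk with an RBF kernel:
X = np.vstack([np.random.randn(20, 2), np.random.randn(20, 2) + 4.0])
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq / 2.0)
print(kernel_fcm(K, c=2).argmax(axis=0))
```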
Abstract:
Mobile sensor networks have unique advantages over static wireless sensor networks: mobility enables sensors to flexibly reconfigure themselves to meet sensing requirements. In this dissertation, an adaptive sampling method for mobile sensor networks is presented. Accounting for sensing resource constraints, computing abilities, and onboard energy limitations, the adaptive sampling method follows a down-sampling scheme that reduces the total number of measurements and lowers sampling cost. Compressive sensing is a recently developed down-sampling method that uses a small number of randomly distributed measurements for signal reconstruction. However, original signals cannot be reconstructed from such condensed measurements in the manner prescribed by the Shannon sampling theorem: measurements must be processed in a sparse domain, and convex optimization methods must be applied to reconstruct the original signals. The restricted isometry property guarantees that signals can be recovered with little information loss. While compressive sensing can effectively lower sampling cost, signal reconstruction remains a great research challenge. Compressive sensing conventionally collects random measurements, whose information content cannot be determined a priori; if each measurement is instead optimized to be the most informative one, reconstruction performance can be much better. Based on these considerations, this dissertation focuses on an adaptive sampling approach that finds the most informative measurements in unknown environments and reconstructs the original signals. With mobile sensors, measurements are collected sequentially, giving the chance to optimize each of them individually. When a mobile sensor is about to collect a new measurement from the surrounding environment, existing information is shared among the networked sensors so that each sensor has a global view of the entire environment. The shared information is analyzed in the Haar wavelet domain, in which most natural signals appear sparse, to infer a model of the environment, and the most informative measurements are determined by optimizing the model parameters. As a result, every measurement collected by the mobile sensor network is the most informative one given the existing information, and near-perfect reconstruction can be expected. In presenting the adaptive sampling method, a series of research issues is addressed, including measurement evaluation and collection, mobile network establishment, data fusion, sensor motion, and signal reconstruction. Two-dimensional scalar fields are reconstructed using the proposed method; both single mobile sensors and mobile sensor networks are deployed in the environment, and their reconstruction performance is compared. In addition, a particular mobile sensor, a quadrotor UAV, is developed so that the adaptive sampling method can be used in three-dimensional scenarios.
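The sketch below shows the non-adaptive baseline the dissertation builds on: a signal sparse in the Haar wavelet domain is measured with random projections and reconstructed by l1 minimization (here via ISTA, iterative soft thresholding). The adaptive, information-driven measurement selection the dissertation develops is not shown; the sizes, sparsity pattern, and regularization weight are assumptions.

```python
# 1-D compressive sensing baseline: random Gaussian measurements of a signal
# sparse in an orthonormal Haar basis, reconstructed with ISTA. Sizes and
# the regularization weight are illustrative assumptions.
import numpy as np

def haar_matrix(n: int) -> np.ndarray:
    """Orthonormal Haar transform matrix; n must be a power of two."""
    if n == 1:
        return np.array([[1.0]])
    h = haar_matrix(n // 2)
    top = np.kron(h, [1.0, 1.0])
    bot = np.kron(np.eye(n // 2), [1.0, -1.0])
    return np.vstack([top, bot]) / np.sqrt(2.0)

n, m_meas = 64, 24
H = haar_matrix(n)                        # rows are Haar basis functions
s_true = np.zeros(n)
s_true[[1, 5, 9]] = [3.0, -2.0, 1.5]      # sparse wavelet coefficients
x_true = H.T @ s_true                     # signal in the sample domain

rng = np.random.default_rng(0)
Phi = rng.standard_normal((m_meas, n)) / np.sqrt(m_meas)
y = Phi @ x_true                          # compressed random measurements

A = Phi @ H.T                             # sensing matrix in wavelet domain
L = np.linalg.norm(A, 2) ** 2             # Lipschitz constant of gradient
lam, s = 0.01, np.zeros(n)
for _ in range(500):                      # ISTA: gradient step + shrinkage
    g = s - (A.T @ (A @ s - y)) / L
    s = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)

print("reconstruction error:", np.linalg.norm(H.T @ s - x_true))
```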
Abstract:
Thanks to advanced technologies and social networks that allow data to be widely shared across the Internet, there is an explosion of pervasive multimedia data, generating high demand for multimedia services and applications that let people easily access and manage multimedia data in various areas. To meet such demands, multimedia big data analysis has become an emerging hot topic in both industry and academia, ranging from basic infrastructure, management, search, and mining to security, privacy, and applications. Within the scope of this dissertation, a multimedia big data analysis framework is proposed for semantic information management and retrieval, with a focus on rare-event detection in videos. The proposed framework explores hidden semantic feature groups in multimedia data and incorporates temporal semantics, especially for video event detection. First, a hierarchical semantic data representation is presented to alleviate the semantic-gap issue, and the Hidden Coherent Feature Group (HCFG) analysis method is proposed to capture the correlation between features and separate the original feature set into semantic groups, seamlessly integrating multimedia data across multiple modalities. Next, an Importance Factor based Temporal Multiple Correspondence Analysis (IF-TMCA) approach is presented for effective event detection. Specifically, the HCFG algorithm is integrated with the Hierarchical Information Gain Analysis (HIGA) method to generate the Importance Factor (IF) used to produce the initial detection results, and the TMCA algorithm is then proposed to efficiently incorporate temporal semantics for re-ranking, improving the final performance. Finally, a sampling-based ensemble learning mechanism is applied to further accommodate imbalanced datasets. Beyond the multimedia semantic representation and class-imbalance problems, lack of organization is another critical issue for multimedia big data analysis; within this framework, an affinity propagation-based summarization method is therefore also proposed to transform unorganized data into a cleaner, better-organized structure. The whole framework has been thoroughly evaluated across multiple domains, such as soccer goal event detection and disaster information management.
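The sampling-based ensemble idea for imbalanced data can be sketched generically as below: each base learner trains on all rare positives plus an equal-sized random sample of negatives, and the learners' scores are averaged. The base classifier, ensemble size, and synthetic data are assumptions for illustration, not the framework's actual components.

```python
# Generic sampling-based ensemble for rare-event (imbalanced) classification:
# balanced undersampling per base learner, averaged scores. Illustrative
# choices throughout; not the dissertation's specific mechanism.
import numpy as np
from sklearn.linear_model import LogisticRegression

def balanced_ensemble_scores(X, y, X_test, n_models=11, seed=0):
    rng = np.random.default_rng(seed)
    pos = np.where(y == 1)[0]                 # rare-event examples
    neg = np.where(y == 0)[0]
    scores = np.zeros(len(X_test))
    for _ in range(n_models):
        sub = rng.choice(neg, size=len(pos), replace=False)
        idx = np.concatenate([pos, sub])      # balanced training subset
        clf = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
        scores += clf.predict_proba(X_test)[:, 1]
    return scores / n_models                  # averaged rare-event scores

# Usage on synthetic imbalanced data (40 positives vs. 2000 negatives):
X = np.vstack([np.random.randn(40, 10) + 2.0, np.random.randn(2000, 10)])
y = np.concatenate([np.ones(40, dtype=int), np.zeros(2000, dtype=int)])
print(balanced_ensemble_scores(X, y, X[:5]).round(3))
```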
Abstract:
Web 2.0 owes its enormous success to the possibility of dynamic user interaction, no longer limited to participating in collaborative elements such as forums but extending to sharing and adding content to the Web. Two clear examples of this paradigm are YouTube and Flickr. The former hosts most of the videos found on the Internet, and the latter has created the largest community of photographers on the network. Both services work in a similar way: the user contributes content together with information associated with it. Since these are international communities, the information added by users is written in many languages, so searching for multimedia resources on these sites depends on the language of the query. In this article, we present Babxel, a multilingual multimedia information retrieval system, born as a Computer Engineering senior thesis project and conceived as an extension and improvement of FlickrBabel. Babxel leverages automatic multilingual translation to generate more search results related to the user's query, results that are obtained from the platforms mentioned above.
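The core idea, translation-based query expansion, can be sketched as below. The `translate` and `search_service` functions are hypothetical stubs standing in for a machine-translation backend and the Flickr/YouTube search APIs; only the expand-merge-deduplicate flow is the point.

```python
# Sketch of multilingual query expansion of the kind Babxel performs:
# translate the query into several languages, search with every variant,
# and merge the results. Both helpers below are hypothetical stubs.
def translate(query: str, target_lang: str) -> str:
    # hypothetical MT backend; stubbed as a tagged string for the demo
    return f"{query} [{target_lang}]"

def search_service(query: str) -> list[str]:
    # hypothetical media search (Flickr/YouTube stand-in)
    return [f"resource matching '{query}'"]

def multilingual_search(query: str, langs=("en", "es", "fr", "de")) -> list[str]:
    results, seen = [], set()
    for lang in langs:
        for item in search_service(translate(query, lang)):
            if item not in seen:              # de-duplicate across languages
                seen.add(item)
                results.append(item)
    return results

print(multilingual_search("sunset over the sea"))
```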
Abstract:
Errata, as of Apr. 1, 1950 (7 l., 2 diagrs.) inserted.
Abstract:
An introductory course in probability and statistics for third- and fourth-year electrical engineering students is described. The course is centered around several computer-based projects designed to achieve two objectives. First, the projects illustrate the course topics and provide hands-on experience for the students. The second, equally important objective is to convey the relevance and usefulness of probability and statistics for practical problems that undergraduate students can appreciate. The benefit of this course is to motivate electrical engineering students to excel in the study of probability concepts, instead of viewing the subject as one more course requirement toward graduation. The authors co-teach the course, and MATLAB is used for most of the computer-based projects.
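A project of the kind the abstract describes might look like the sketch below (in Python rather than MATLAB, with an experiment chosen purely for illustration, not taken from the course): a Monte Carlo estimate of a probability checked against its closed form.

```python
# Illustrative computer-based probability project: simulate an experiment
# and compare the empirical estimate against the theoretical value. The
# experiment (sum of two dice) is an assumption, not one of the authors'.
import numpy as np

rng = np.random.default_rng(1)
rolls = rng.integers(1, 7, size=(100_000, 2)).sum(axis=1)
estimate = np.mean(rolls == 7)            # empirical P(sum = 7)
exact = 6 / 36                            # theoretical value
print(f"simulated: {estimate:.4f}  exact: {exact:.4f}")
```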
Abstract:
The finite element method is now well established among engineers as an extremely useful tool for analysing problems with complicated boundary conditions. One aim of this thesis has been to produce a set of computer algorithms capable of efficiently analysing complex three-dimensional structures. The set has been designed for versatility: provisions such as using only those parts of the system relevant to a given analysis, and the facility to extend the system by adding new elements, are incorporated. The element types programmed include prismatic members, rectangular plates, triangular plates, and curved plates. The 'in and out of plane' stiffness matrices for a curved plate element are derived using the finite element technique, and the performance of this element is compared with two other theoretical solutions as well as with a set of independent experimental observations. Additional experimental work was then carried out by the author to further evaluate the acceptability of this element. Finally, the analyses of two large civil engineering structures, the shell of an electrical precipitator and a concrete bridge, are presented to investigate the performance of the algorithms. Comparisons are made between the proposed system and another program in terms of computer time, core-store requirements, and accuracy of the analysis.
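The assembly step at the heart of any such finite element system can be sketched with the simplest possible element, a 1-D bar: element stiffness matrices are scattered into a global matrix through an element-to-node map. The thesis's prismatic members and plate elements follow the same pattern with larger element matrices; this is a generic illustration, not code from the thesis.

```python
# Minimal finite element assembly: scatter-add 1-D bar element stiffness
# matrices into a global stiffness matrix. EA is axial rigidity; values
# are illustrative assumptions.
import numpy as np

def assemble_global_stiffness(n_nodes, elements, EA=1.0):
    K = np.zeros((n_nodes, n_nodes))
    for i, j, length in elements:            # element connects nodes i and j
        k = (EA / length) * np.array([[1.0, -1.0], [-1.0, 1.0]])
        for a, p in enumerate((i, j)):
            for b, q in enumerate((i, j)):
                K[p, q] += k[a, b]           # scatter-add into global matrix
    return K

# Three collinear bar elements joining four nodes:
K = assemble_global_stiffness(4, [(0, 1, 1.0), (1, 2, 1.0), (2, 3, 1.0)])
print(K)
```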
Abstract:
Computational Intelligence (CI) comprises four main areas: Evolutionary Computation (genetic algorithms and genetic programming), Swarm Intelligence, Fuzzy Systems, and Neural Networks. This article shows how CI techniques reach beyond the strict limits of the Artificial Intelligence field and can help solve real problems in distinct engineering areas: Mechanical Engineering, Computer Science, and Electrical Engineering.