893 results for Computer Engineering
Abstract:
Much of the bridge stock on major transport links in North America and Europe was constructed in the 1950s and 1960s and has since deteriorated or is carrying loads far in excess of the original design loads. Structural Health Monitoring (SHM) systems can provide valuable information on bridge capacity, but the application of such systems is currently limited by access and system cost. This paper investigates the development of a low-cost portable SHM system using commercially available cameras and computer vision techniques. A series of laboratory tests was carried out to assess the accuracy of displacement measurements obtained with contactless methods. The results from each test were validated against established measurement methods, such as linear variable differential transformers (LVDTs). A video image of each test was processed using two different digital image correlation programs, and the displacements obtained compared accurately with the validation measurements. The calculated displacements agree within 4% of the LVDT measurements in most cases, confirming the suitability of fully camera-based SHM systems.
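The core of digital image correlation is locating a tracked speckle pattern in each new frame by maximizing a correlation score. The sketch below shows the idea in one dimension with a synthetic intensity profile; the pattern, values and integer-pixel search are illustrative assumptions, not the programs or data used in the paper (real DIC tools work on 2-D subsets with sub-pixel interpolation).

```python
# Digital image correlation in one dimension: recover the shift of a target
# pattern between a reference frame and a deformed frame. Pure-Python sketch;
# real DIC software tracks 2-D subsets and interpolates to sub-pixel accuracy.

def normalized_cross_correlation(template, signal, offset):
    """Correlation coefficient of `template` against `signal` at `offset`."""
    window = signal[offset:offset + len(template)]
    mt = sum(template) / len(template)
    mw = sum(window) / len(window)
    num = sum((t - mt) * (w - mw) for t, w in zip(template, window))
    den = (sum((t - mt) ** 2 for t in template) *
           sum((w - mw) ** 2 for w in window)) ** 0.5
    return num / den if den else 0.0

def estimate_shift(template, signal):
    """Return the offset that maximizes the correlation coefficient."""
    offsets = range(len(signal) - len(template) + 1)
    return max(offsets, key=lambda o: normalized_cross_correlation(template, signal, o))

# Synthetic example: a bright speckle pattern that has moved 3 pixels.
reference = [0, 0, 1, 5, 9, 5, 1, 0, 0, 0, 0, 0]
template = reference[1:8]          # speckle subset tracked between frames
deformed = [0, 0, 0, 0, 0, 1, 5, 9, 5, 1, 0, 0]
shift_pixels = estimate_shift(template, deformed) - 1  # template started at index 1
print(shift_pixels)  # -> 3 pixels of displacement between frames
```

Multiplying the recovered pixel shift by a calibrated pixel-to-millimetre scale would then give the physical displacement compared against the LVDT reading.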
Abstract:
This keynote presentation will report some of our research work and experience on the development and applications of relevant methods, models, systems and simulation techniques in support of different types and various levels of decision making for business, management and engineering. In particular, the following topics will be covered:
Modelling, multi-agent-based simulation and analysis of the allocation management of carbon dioxide emission permits in China (Nanfeng Liu & Shuliang Li)
Agent-based simulation of the dynamic evolution of enterprise carbon assets (Yin Zeng & Shuliang Li)
A framework & system for extracting and representing project knowledge contexts using topic models and dynamic knowledge maps: a big data perspective (Jin Xu, Zheng Li, Shuliang Li & Yanyan Zhang)
Open innovation: intelligent model, social media & complex adaptive system simulation (Shuliang Li & Jim Zheng Li)
A framework, model and software prototype for modelling and simulation of deshopping behaviour and how companies respond (Shawkat Rahman & Shuliang Li)
Integrating multiple agents, simulation, knowledge bases and fuzzy logic for international marketing decision making (Shuliang Li & Jim Zheng Li)
A Web-based hybrid intelligent system for combined conventional, digital, mobile, social media and mobile marketing strategy formulation (Shuliang Li & Jim Zheng Li)
A hybrid intelligent model for Web & social media dynamics, and evolutionary and adaptive branding (Shuliang Li)
A hybrid paradigm for modelling, simulation and analysis of brand virality in social media (Shuliang Li & Jim Zheng Li)
Network configuration management: attack paradigms and architectures for computer network survivability (Tero Karvinen & Shuliang Li)
Abstract:
This paper aims to crystallize recent research performed at the University of Worcester to investigate the feasibility of using the commercial game engine ‘Unreal Tournament 2004’ (UT2004) to produce ‘Educational Immersive Environments’ (EIEs) suitable for education and training. Our research has been supported by the UK Higher Education Academy. We discuss both practical and theoretical aspects of EIEs. The practical aspects include the production of EIEs to support high school physics education, the education of architects, and the learning of literacy by primary school children. This research is based on the development of our novel instructional medium, ‘UnrealPowerPoint’. Our fundamental guiding principles are that, first, pedagogy must inform technology, and second, that both teachers and pupils should be empowered to produce educational materials. Our work is informed by current educational theories such as constructivism, experiential learning and socio-cultural approaches as well as elements of instructional design and game principles.
Abstract:
Background. Tremendous advances in biomaterials science and nanotechnology, together with thorough research on stem cells, have recently promoted an intriguing development of regenerative medicine/tissue engineering. Nanotechnology is a wide interdisciplinary field that involves manipulating different materials at the nanometer level to create constructs that mimic the nanoscale-based architecture of native tissues. Aim. The purpose of this article is to highlight significant new knowledge on this matter. Emerging acquisitions. To widen the range of scaffold materials, resort has been made either to materials generated by recombinant DNA technology, such as a collagen-like protein, or to the incorporation of bioactive molecules, such as RGD (arginine-glycine-aspartic acid), into synthetic products. Both the bottom-up and the top-down fabrication approaches may be used to obtain, respectively, supramolecular architectures or micro-/nanostructures for incorporation within a preexisting complex scaffold construct. Computer-aided design/manufacturing (CAD/CAM) scaffold techniques make it possible to achieve patient-tailored organs. Stem cells, because of their peculiar properties - the ability to proliferate, self-renew and differentiate into specific cell lineages under appropriate conditions - represent an attractive source for tissue engineering/regenerative medicine applications. Future research activities. New developments in the tissue engineering of different organs will depend on further progress in both the science of nanoscale-based materials and the knowledge of stem cell biology. Moreover, in vivo tissue engineering appears to be the logical next step of the current research.
Abstract:
This work proposes a contribution to The Social Engineering Framework for the risk assessment and mitigation of different attack vectors, by means of attack-tree analysis. Additionally, it presents a compilation of statistics on attacks carried out against companies in different industries related to information security, focused on social engineering attacks and the consequences that organizations face. The statistics are accompanied by descriptions of real examples and their consequences.
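Attack-tree analysis of the kind the abstract describes can be sketched as a recursive evaluation over AND/OR nodes. The tree below, its labels and its leaf probabilities are hypothetical illustrations, not figures from The Social Engineering Framework or the statistics in the work.

```python
# Illustrative evaluation of a social-engineering attack tree: OR nodes take
# the most likely child (the attacker picks the best alternative), AND nodes
# require every child step to succeed. Tree shape, labels and probabilities
# are made-up examples for the sketch, assuming independent sub-steps.

def success_probability(node):
    kind = node.get("kind")
    if kind == "leaf":
        return node["p"]
    child_ps = [success_probability(c) for c in node["children"]]
    if kind == "or":          # attacker chooses the most promising branch
        return max(child_ps)
    if kind == "and":         # every sub-step must succeed
        p = 1.0
        for cp in child_ps:
            p *= cp
        return p
    raise ValueError(f"unknown node kind: {kind}")

phishing_tree = {
    "kind": "or", "label": "obtain credentials",
    "children": [
        {"kind": "leaf", "label": "mass phishing email", "p": 0.30},
        {"kind": "and", "label": "spear phishing",
         "children": [
             {"kind": "leaf", "label": "profile target on social media", "p": 0.90},
             {"kind": "leaf", "label": "target opens tailored email", "p": 0.60},
         ]},
    ],
}

print(success_probability(phishing_tree))  # ~0.54: spear phishing dominates
```

The same traversal can carry cost or impact instead of probability, which is how attack trees support mitigation decisions: hardening the cheapest or most likely path first.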
Abstract:
This article describes the design and implementation of a computer-aided tool called Relational Algebra Translator (RAT) for the teaching of relational algebra in database courses. A problem arose when introducing the relational algebra topic in the course EIF 211 Design and Implementation of Databases, which belongs to the Engineering in Information Systems programme of the National University of Costa Rica: students attending this course lacked deep mathematical knowledge, which led to a learning problem in a subject that is important for understanding how database searches and queries work. RAT was created to enhance this teaching-learning process. The article introduces the architectural and design principles required for its implementation, such as the language symbol table, the grammar rules and the basic algorithms that RAT uses to translate from relational algebra to the SQL language. The tool has been used for one period and has proved effective in the teaching-learning process. This urged the investigators to publish it on the web site www.slinfo.una.ac.cr so that it can be used in other university courses.
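The kind of translation RAT performs can be sketched for two operators, selection (σ) and projection (π). The expression encoding, function names and subquery aliasing below are assumptions made for the sketch; RAT's actual grammar, symbol table and operator coverage are richer.

```python
# Sketch of a relational-algebra-to-SQL translation for two operators,
# selection (sigma -> WHERE) and projection (pi -> column list), in the spirit
# of what RAT does. Handles nested expressions of these two operators only.

def to_sql(expr):
    """Translate a nested relational-algebra expression into a SQL string."""
    op = expr["op"]
    if op == "relation":                       # base table
        return expr["name"]
    inner = to_sql(expr["arg"])
    if expr["arg"]["op"] != "relation":        # parenthesize derived tables
        inner = f"({inner}) AS t"
    if op == "select":                         # sigma: filter rows
        return f"SELECT * FROM {inner} WHERE {expr['condition']}"
    if op == "project":                        # pi: keep listed columns
        return f"SELECT {', '.join(expr['columns'])} FROM {inner}"
    raise ValueError(f"unsupported operator: {op}")

# pi_{name, grade}(sigma_{grade >= 70}(Students))
algebra = {
    "op": "project", "columns": ["name", "grade"],
    "arg": {"op": "select", "condition": "grade >= 70",
            "arg": {"op": "relation", "name": "Students"}},
}
print(to_sql(algebra))
# SELECT name, grade FROM (SELECT * FROM Students WHERE grade >= 70) AS t
```

Seeing the generated SQL next to the algebra expression is precisely the pedagogical bridge the tool provides for students without a formal mathematics background.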
Abstract:
When designing a new passenger ship or naval vessel or modifying an existing design, how do we ensure that the proposed design is safe from an evacuation point of view? In the wake of major maritime disasters such as the Herald of Free Enterprise and the Estonia and in light of the growth in the numbers of high density, high-speed ferries and large capacity cruise ships, issues concerned with the evacuation of passengers and crew at sea are receiving renewed interest. In the maritime industry, ship evacuation models are now recognised by IMO through the publication of the Interim Guidelines for Evacuation Analysis of New and Existing Passenger Ships including Ro-Ro. This approach offers the promise to quickly and efficiently bring evacuation considerations into the design phase, while the ship is "on the drawing board" as well as reviewing and optimising the evacuation provision of the existing fleet. Other applications of this technology include the optimisation of operating procedures for civil and naval vessels such as determining the optimal location of a feature such as a casino, organising major passenger movement events such as boarding/disembarkation or restaurant/theatre changes, determining lean manning requirements, location and number of damage control parties, etc. This paper describes the development of the maritimeEXODUS evacuation model which is fully compliant with IMO requirements and briefly presents an example application to a large passenger ferry.
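Evacuation models of the maritimeEXODUS family are agent-based: individual passengers move through a discretized geometry under local rules, and the simulation reports the total egress time. The toy model below illustrates only that principle; the corridor geometry, unit speeds and conflict rule are assumptions for the sketch, not the real model's behavioural sub-models.

```python
# Toy agent-based egress model in the spirit of evacuation simulators such as
# maritimeEXODUS: agents on a 1-D corridor walk toward the exit one cell per
# time step and may not pass through each other. Geometry, speeds and the
# congestion rule are illustrative assumptions, not the real model.

def evacuate(corridor_length, start_positions):
    """Return the number of steps until every agent has passed the exit cell."""
    positions = sorted(start_positions)
    steps = 0
    while positions:
        # Move the agent nearest the exit first so followers can close up.
        for i in range(len(positions) - 1, -1, -1):
            ahead = positions[i + 1] if i + 1 < len(positions) else None
            if ahead is None or positions[i] + 1 < ahead:
                positions[i] += 1          # cell ahead is free
        positions = [p for p in positions if p < corridor_length]  # exited
        steps += 1
    return steps

# Three passengers at cells 0, 1 and 2 of a 10-cell corridor.
print(evacuate(10, [2, 1, 0]))  # -> 10 steps of total evacuation time
```

Running such a model against the geometry "on the drawing board" is what lets designers compare layout alternatives (exit placement, corridor width) before the ship is built.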
Abstract:
Abstract not available
Abstract:
The model-driven approach for service evolution in clouds focuses mainly on exploiting the advantages of reusable evolution patterns to solve evolution problems. In this process, MDA models drive the transformation of evolution patterns into pattern aspects. Weaving these aspects into the service-based process at runtime, using an aspect-oriented extended BPEL engine, provides the dynamic feature of the evolution.
Abstract:
Nanocrystalline samples of Ba1-xCaxF2 prepared by high-energy milling show an unusually high F-ion conductivity, which exhibits a maximum in magnitude and a minimum in activation energy at x = 0.5. Here, we report an X-ray absorption spectroscopy (XAS) study at the Ca and Sr K edges and the Ba L-3 edge and a molecular dynamics (MD) simulation study of the pure and mixed fluorides. The XAS measurements on the pure binary fluorides CaF2, SrF2 and BaF2 show that high-energy ball-milling produces very little amorphous material, in contrast to the results for ball-milled oxides. XAS measurements of Ba1-xCaxF2 reveal that for 0 < x < 1 there is considerable disorder in the local environments of the cations, which is highest for x = 0.5. Hence the maximum in the conductivity corresponds to the composition with the maximum level of local disorder. The MD calculations also show a highly disordered structure, consistent with the XAS results and similarly showing maximum disorder at x = 0.5.
Abstract:
Work presented at PAEE/ALE'2016, 8th International Symposium on Project Approaches in Engineering Education (PAEE) and 14th Active Learning in Engineering Education Workshop (ALE).
Abstract:
Part 6: Engineering and Implementation of Collaborative Networks
Mining and Verification of Temporal Events with Applications in Computer Micro-Architecture Research
Abstract:
Computer simulation programs are essential tools for scientists and engineers to understand a particular system of interest. As expected, the complexity of the software increases with the depth of the model used. In addition to the exigent demands of software engineering, verification of simulation programs is especially challenging because the models represented are complex and ridden with unknowns that will be discovered by developers in an iterative process. To manage such complexity, advanced verification techniques for continually matching the intended model to the implemented model are necessary. Therefore, the main goal of this research work is to design a useful verification and validation framework that is able to identify model representation errors and is applicable to generic simulators. The framework that was developed and implemented consists of two parts. The first part is First-Order Logic Constraint Specification Language (FOLCSL) that enables users to specify the invariants of a model under consideration. From the first-order logic specification, the FOLCSL translator automatically synthesizes a verification program that reads the event trace generated by a simulator and signals whether all invariants are respected. The second part consists of mining the temporal flow of events using a newly developed representation called State Flow Temporal Analysis Graph (SFTAG). While the first part seeks an assurance of implementation correctness by checking that the model invariants hold, the second part derives an extended model of the implementation and hence enables a deeper understanding of what was implemented. The main application studied in this work is the validation of the timing behavior of micro-architecture simulators. The study includes SFTAGs generated for a wide set of benchmark programs and their analysis using several artificial intelligence algorithms. 
This work improves the computer architecture research and verification processes as shown by the case studies and experiments that have been conducted.
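The first part of the framework can be pictured concretely: the FOLCSL translator emits a program that scans a simulator's event trace and reports whether each invariant holds. The trace format and the example invariant below ("every instruction that commits was fetched earlier") are assumptions made for illustration, not actual FOLCSL output.

```python
# Sketch of the kind of verification program the FOLCSL translator could
# synthesize: read an event trace and check a first-order invariant over it.
# Invariant checked here (an assumed example): for all commit events, an
# earlier fetch event exists for the same instruction id.

def check_invariant(trace):
    """trace is a list of (cycle, event, instr_id) tuples in time order."""
    fetched = set()
    for cycle, event, instr_id in trace:
        if event == "fetch":
            fetched.add(instr_id)
        elif event == "commit" and instr_id not in fetched:
            return False, f"instruction {instr_id} committed at cycle {cycle} without a fetch"
    return True, "all invariants respected"

trace = [
    (1, "fetch", "i1"),
    (2, "fetch", "i2"),
    (4, "commit", "i1"),
    (5, "commit", "i2"),
]
ok, message = check_invariant(trace)
print(ok, message)                          # True, all invariants respected

bad_trace = trace + [(6, "commit", "i3")]   # i3 was never fetched
print(check_invariant(bad_trace)[0])        # False
```

A violation signalled this way points to a model representation error: the implemented simulator has diverged from the intended micro-architectural model.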
Abstract:
Physiological signals, which are controlled by the autonomic nervous system (ANS), could be used to detect the affective state of computer users and therefore find applications in medicine and engineering. The Pupil Diameter (PD) seems to provide a strong indication of the affective state, as found by previous research, but it has not been investigated fully yet. In this study, new approaches based on monitoring and processing the PD signal for off-line and on-line affective assessment (“relaxation” vs. “stress”) are proposed. Wavelet denoising and Kalman filtering methods are first used to remove abrupt changes in the raw PD signal. Then three features (PDmean, PDmax and PDWalsh) are extracted from the preprocessed PD signal for the affective state classification. In order to select more relevant and reliable physiological data for further analysis, two types of data selection methods are applied, based on the paired t-test and subject self-evaluation, respectively. In addition, five different kinds of classifiers are implemented on the selected data, achieving average accuracies up to 86.43% and 87.20%, respectively. Finally, the receiver operating characteristic (ROC) curve is utilized to investigate the discriminating potential of each individual feature by evaluating the area under the ROC curve, which reaches values above 0.90. For the on-line affective assessment, a hard threshold is implemented first in order to remove the eye blinks from the PD signal, and then a moving average window is utilized to obtain the representative value PDr for every one-second time interval of PD. There are three main steps in the on-line affective assessment algorithm: preparation, feature-based decision voting and affective determination.
The final results show that the accuracies are 72.30% and 73.55% for the data subsets, which were respectively chosen using two types of data selection methods (paired t-test and subject self-evaluation). In order to further analyze the efficiency of affective recognition through the PD signal, the Galvanic Skin Response (GSR) was also monitored and processed. The highest affective assessment classification rate obtained from GSR processing is only 63.57% (based on the off-line processing algorithm). The overall results confirm that the PD signal should be considered as one of the most powerful physiological signals to involve in future automated real-time affective recognition systems, especially for detecting the “relaxation” vs. “stress” states.
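The on-line preprocessing pipeline described above (blink removal by hard threshold, then a moving-average window producing PDr) can be sketched in a few lines. The threshold value, window length and sample values below are illustrative assumptions, not parameters from the study.

```python
# Sketch of the on-line pupil-diameter (PD) preprocessing described above:
# a hard threshold discards blink samples (PD collapses during a blink), then
# a moving-average over non-overlapping windows yields one representative
# value PDr per interval. Threshold, window length and data are assumed.

def remove_blinks(pd_samples, threshold=2.0):
    """Drop samples below `threshold` (mm), treating them as blink artifacts."""
    return [s for s in pd_samples if s >= threshold]

def representative_values(pd_samples, window=4):
    """One PDr (window mean) per non-overlapping window of samples."""
    return [sum(pd_samples[i:i + window]) / len(pd_samples[i:i + window])
            for i in range(0, len(pd_samples), window)]

# Simulated PD samples in mm; the 0.4 values are a blink.
raw = [3.1, 3.2, 0.4, 0.4, 3.3, 3.2, 3.4, 3.3, 3.5, 3.4]
clean = remove_blinks(raw)
print(clean)                         # blink samples removed
print(representative_values(clean))  # one PDr per window
```

Each PDr value would then feed the feature extraction and decision-voting steps of the on-line algorithm.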
Abstract:
Support Vector Machines (SVMs) are widely used classifiers for detecting physiological patterns in Human-Computer Interaction (HCI). Their success is due to their versatility, robustness and the wide availability of free dedicated toolboxes. Frequently in the literature, insufficient details about the SVM implementation and/or parameter selection are reported, making it impossible to reproduce study analyses and results. In order to perform an optimized classification and report a proper description of the results, it is necessary to have a comprehensive critical overview of the application of SVM. The aim of this paper is to provide a review of the usage of SVM in the determination of brain and muscle patterns for HCI, focusing on electroencephalography (EEG) and electromyography (EMG) techniques. In particular, an overview of the basic principles of SVM theory is outlined, together with a description of several relevant literature implementations. Furthermore, details concerning the reviewed papers are listed in tables, and statistics on SVM use in the literature are presented. The suitability of SVM for HCI is discussed and critical comparisons with other classifiers are reported.
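The reproducibility point above is worth making concrete: a classification result is only repeatable when the kernel, regularization and training schedule are all stated. The from-scratch linear SVM below, with every hyperparameter explicit, is a minimal sketch of that practice; the toy 2-D data, learning rate and epoch count are assumptions, and a real HCI study would use a dedicated toolbox on EEG/EMG features.

```python
# Minimal linear SVM trained by sub-gradient descent on the hinge loss, with
# every hyperparameter stated explicitly (kernel: linear; C, learning rate,
# epochs below), illustrating the level of detail a paper should report.
# Toy 2-D data; real studies would use a validated toolbox on EEG/EMG features.

def train_linear_svm(points, labels, c=1.0, lr=0.01, epochs=200):
    """labels in {-1, +1}; returns weight vector w and bias b."""
    w = [0.0] * len(points[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(points, labels):
            margin = y * (sum(wi * xi for wi, xi in zip(w, x)) + b)
            if margin < 1:        # point inside margin: hinge loss is active
                w = [wi - lr * (wi - c * y * xi) for wi, xi in zip(w, x)]
                b += lr * c * y
            else:                 # only the L2 regularizer acts
                w = [wi - lr * wi for wi in w]
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1

# Linearly separable toy data: class +1 upper-right, class -1 lower-left.
points = [(2.0, 2.5), (3.0, 3.0), (2.5, 3.5), (-2.0, -2.5), (-3.0, -3.0), (-2.5, -1.5)]
labels = [1, 1, 1, -1, -1, -1]
w, b = train_linear_svm(points, labels)
print([predict(w, b, x) for x in points])  # should recover all training labels
```

Reporting exactly this tuple of choices (kernel, C, optimizer, iterations, data split) is what the review identifies as frequently missing from published HCI studies.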