895 results for Other Computer Engineering
Abstract:
Much of the bridge stock on major transport links in North America and Europe was constructed in the 1950s and 1960s and has since deteriorated or is carrying loads far in excess of the original design loads. Structural Health Monitoring (SHM) systems can provide valuable information on bridge capacity, but the application of such systems is currently limited by access and bridge type. This paper investigates the use of computer vision systems for SHM. A series of field tests was carried out to test the accuracy of displacement measurements obtained using contactless methods. A video image of each test was processed using a modified version of the optical flow tracking method to track displacement. These results were validated against an established measurement method using linear variable differential transformers (LVDTs). The displacements calculated by the algorithm agree to within 2% of the LVDT measurements; a number of post-processing methods were then applied in an attempt to reduce this error.
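A minimal sketch of the contactless displacement-tracking idea, using OpenCV's pyramidal Lucas-Kanade optical flow; the paper's specific modification of the method is not detailed above, and the video filename and pixel-to-millimetre calibration constant below are illustrative assumptions:

```python
# Track features across video frames and report displacement of their mean
# position relative to frame 0, converted from pixels to millimetres.
import cv2
import numpy as np

MM_PER_PIXEL = 0.25          # assumed calibration from a target of known size
cap = cv2.VideoCapture("bridge_test.avi")  # hypothetical field-test recording

ok, frame = cap.read()
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
# Detect strong corner features on (or near) the measurement target
p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=50, qualityLevel=0.3,
                             minDistance=7)
origin = p0.reshape(-1, 2).mean(axis=0)    # reference position, frame 0

displacements_mm = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    p1, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, p0, None,
                                             winSize=(21, 21), maxLevel=3)
    good = p1[status.flatten() == 1].reshape(-1, 2)
    # Mean feature position relative to frame 0, in millimetres
    displacements_mm.append((good.mean(axis=0) - origin) * MM_PER_PIXEL)
    prev_gray, p0 = gray, good.reshape(-1, 1, 2)
```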
Abstract:
This article describes the design and implementation of a computer-aided tool called Relational Algebra Translator (RAT), created for the teaching of relational algebra in database courses. A problem arose when introducing the relational algebra topic in the course EIF 211 Design and Implementation of Databases, part of the Information Systems Engineering programme of the National University of Costa Rica: students attending the course lacked deep mathematical knowledge, which led to a learning problem in a subject that is essential for understanding what database searches and queries do. RAT was developed to enhance this teaching-learning process. The article introduces the architectural and design principles required for its implementation, such as the language symbol table, the grammatical rules and the basic algorithms that RAT uses to translate from relational algebra to the SQL language. The tool has been used for one term and has proved effective in the teaching-learning process, which encouraged the investigators to publish it on the web site www.slinfo.una.ac.cr so that it can be used in other university courses.
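RAT's actual symbol table and grammar are not reproduced above; the toy sketch below only illustrates the core translation idea for two operators, selection (sigma) and projection (pi), written in a simple prefix form that is an assumption of this example:

```python
# Toy relational-algebra-to-SQL translator for expressions such as
# "pi[name,age](sigma[age>30](employee))". Nested brackets inside
# predicates are not handled; this is illustration, not RAT itself.
import re

def ra_to_sql(expr: str) -> str:
    expr = expr.strip()
    m = re.fullmatch(r"pi\[(.+?)\]\((.+)\)", expr)
    if m:  # projection: keep only the listed attributes
        cols, inner = m.groups()
        return f"SELECT {cols} FROM {as_table(inner)}"
    m = re.fullmatch(r"sigma\[(.+?)\]\((.+)\)", expr)
    if m:  # selection: filter rows by the predicate
        pred, inner = m.groups()
        return f"SELECT * FROM {as_table(inner)} WHERE {pred}"
    return expr  # base case: a plain relation name

def as_table(inner: str) -> str:
    sql = ra_to_sql(inner)
    return sql if sql == inner.strip() else f"({sql}) AS t"

print(ra_to_sql("pi[name,age](sigma[age>30](employee))"))
# SELECT name,age FROM (SELECT * FROM employee WHERE age>30) AS t
```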
Abstract:
When designing a new passenger ship or naval vessel, or modifying an existing design, how do we ensure that the proposed design is safe from an evacuation point of view? In the wake of major maritime disasters such as the Herald of Free Enterprise and the Estonia, and in light of the growth in the numbers of high-density, high-speed ferries and large-capacity cruise ships, issues concerned with the evacuation of passengers and crew at sea are receiving renewed interest. In the maritime industry, ship evacuation models are now recognised by the IMO through the publication of the Interim Guidelines for Evacuation Analysis of New and Existing Passenger Ships including Ro-Ro. This approach promises to bring evacuation considerations into the design phase quickly and efficiently, while the ship is "on the drawing board", and to support the review and optimisation of the evacuation provision of the existing fleet. Other applications of this technology include the optimisation of operating procedures for civil and naval vessels, such as determining the optimal location of a feature such as a casino, organising major passenger movement events such as boarding/disembarkation or restaurant/theatre changes, and determining lean manning requirements and the location and number of damage control parties. This paper describes the development of the maritimeEXODUS evacuation model, which is fully compliant with IMO requirements, and briefly presents an example application to a large passenger ferry.
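maritimeEXODUS itself is a large proprietary model whose rule set is not reproduced above; the toy sketch below only illustrates the general class of agent-based egress simulations to which such models belong (the deck geometry, exit location and agent count are arbitrary assumptions):

```python
# Agents on a grid step greedily toward the exit, one cell per time step;
# the loop reports total evacuation time. Real egress models add congestion,
# fire/smoke, crew procedures and much richer behaviour.
import random

W, H = 20, 10                 # deck area, in cells (assumed geometry)
EXIT = (0, 5)                 # exit location (assumed)
agents = [(random.randrange(W), random.randrange(H)) for _ in range(40)]

t = 0
while agents:
    t += 1
    moved = []
    for x, y in agents:
        # Move one cell along the axis with the larger remaining distance
        dx, dy = EXIT[0] - x, EXIT[1] - y
        if abs(dx) >= abs(dy):
            x += (dx > 0) - (dx < 0)
        else:
            y += (dy > 0) - (dy < 0)
        if (x, y) != EXIT:    # an agent at the exit has evacuated
            moved.append((x, y))
    agents = moved

print(f"all agents evacuated after {t} time steps")
```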
Reservoir system analysis, conservation: Hydrologic Engineering Center computer program 23-J2-L253.
Abstract:
At head of cover title: Generalized computer program.
Abstract:
Support Vector Machines (SVMs) are widely used classifiers for detecting physiological patterns in Human-Computer Interaction (HCI). Their success is due to their versatility, their robustness and the wide availability of free dedicated toolboxes. Frequently in the literature, insufficient details about the SVM implementation and/or parameter selection are reported, making it impossible to reproduce study analyses and results. In order to perform an optimized classification and report a proper description of the results, it is necessary to have a comprehensive critical overview of the application of SVM. The aim of this paper is to provide a review of the usage of SVM in the determination of brain and muscle patterns for HCI, focusing on electroencephalography (EEG) and electromyography (EMG) techniques. In particular, an overview of the basic principles of SVM theory is outlined, together with a description of several relevant literature implementations. Furthermore, details concerning the reviewed papers are listed in tables, and statistics on SVM use in the literature are presented. The suitability of SVM for HCI is discussed, and critical comparisons with other classifiers are reported.
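A minimal sketch of the reproducible SVM workflow the review advocates: cross-validated selection of the RBF kernel parameters, with the chosen values reported explicitly. The feature matrix and labels below are random placeholders standing in for a real EEG/EMG data set:

```python
# Cross-validated RBF-SVM parameter selection with scikit-learn; printing
# the selected C and gamma is what makes the analysis reproducible.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))          # placeholder features (e.g. band power)
y = rng.integers(0, 2, size=200)        # placeholder class labels

pipe = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
grid = {"svc__C": [0.1, 1, 10, 100], "svc__gamma": [1e-3, 1e-2, 1e-1, 1]}
search = GridSearchCV(pipe, grid, cv=5).fit(X, y)

print("best parameters:", search.best_params_)
print("cross-validated accuracy:", round(search.best_score_, 3))
```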
Abstract:
Research in human-computer interaction (HCI) covers both technological and human behavioural concerns. As a consequence, the contributions made in HCI research tend to be oriented towards either engineering or the social sciences. In HCI the purpose of practical research contributions is to reveal unknown insights about human behaviour and its relationship to technology. Practical research methods normally used in HCI include formal experiments, field experiments, field studies, interviews, focus groups, surveys, usability tests, case studies, diary studies, ethnography, contextual inquiry, experience sampling, and automated data collection. In this paper, we report on our experience using focus groups, surveys and interviews as evaluation methods, and on how we adapted these methods to develop artefacts: either interface designs or information and technological systems. Four projects exemplify the application of the different methods to gather information about users' wants, habits, practices, concerns and preferences. The goal was to build an understanding of the attitudes and satisfaction of the people who might interact with a technological artefact or information system. At the same time, we intended the design of the information systems and technological applications to promote resilience in organisations (a set of routines that allows recovery from obstacles) and positive user experiences. Organisations can also be viewed here within a systems approach, which means that system perturbations, even failures, can be characterised and improved upon. The term resilience has been applied to everything from real estate to the economy, sports, events, business and psychology. In this study, we highlight that resilience is also made up of a number of different skills and abilities (self-awareness, creating meaning from other experiences, self-efficacy, optimism, and building strong relationships) that are foundational ingredients which people should use in the process of enhancing an organisation's resilience. Resilience enhances knowledge of the resources available to people confronting existing problems.
Abstract:
The computer-controlled screwdriver is a modern technique for performing automatic screwing/unscrewing operations. The main focus is to study the integration of the computer-controlled screwdriver into robotic manufacturing in the ROS environment. This thesis describes a concept for an automatic screwing mechanism composed of Universal Robots arms, in which one arm of the robot inserts cables and the other screws the cables onto the control panel switchgear box. So far this task has been carried out by human operators and is fairly complex to perform, owing to the multiple cables and connections involved; for this reason an automatic cabling and screwing process would be highly valued in the automotive/automation industries. A study was carried out to analyse the difficulties currently faced, and a controller-based algorithm was developed to replace manual human effort using Universal Robots arms, allowing the robot arms to insert the cables and screw them onto the control panel switchgear box. Experiments were conducted to evaluate the insertion and screwing strategy, and the results show that cables can be inserted and screwed onto the control panel switchgear box precisely.
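The thesis' actual ROS-based controller is not reproduced above; the sketch below is only a hedged illustration of the coordinated insert-then-screw sequence as a finite-state machine, with print statements standing in for the real Universal Robots motion and screwdriver driver calls:

```python
# Two-arm insert-then-screw sequence as a simple state machine: arm 1 picks
# and inserts each cable, arm 2 fastens it, until all terminals are done.
from enum import Enum, auto

class State(Enum):
    PICK_CABLE = auto()
    INSERT = auto()
    SCREW = auto()
    DONE = auto()

def run_sequence(connections):
    state, queue = State.PICK_CABLE, list(connections)
    while state is not State.DONE:
        if state is State.PICK_CABLE:
            target = queue.pop(0)
            print(f"arm 1: picking cable for terminal {target}")
            state = State.INSERT
        elif state is State.INSERT:
            print(f"arm 1: inserting cable at terminal {target}")
            state = State.SCREW
        elif state is State.SCREW:
            print(f"arm 2: screwing terminal {target}")
            state = State.PICK_CABLE if queue else State.DONE

run_sequence(["X1", "X2", "X3"])   # hypothetical terminal names
```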
Abstract:
Vision systems are powerful tools that play an increasingly important role in modern industry, detecting errors and maintaining product standards. With the growing availability of affordable industrial cameras, computer vision algorithms have been increasingly applied to the monitoring of industrial manufacturing processes. Until a few years ago, industrial computer vision applications relied only on ad hoc algorithms designed for the specific object and acquisition setup being monitored, with a strong focus on co-designing the acquisition and processing pipeline. Deep learning has overcome these limits, providing greater flexibility and faster reconfiguration. In this work, the process to be inspected is the formation of packs of vials entering a freeze-dryer, a common scenario in pharmaceutical active-ingredient packaging lines. To ensure that the machine produces proper packs, a vision system is installed at the entrance of the freeze-dryer to detect any anomalies, with execution times compatible with the production specifications. Further constraints come from the sterility and safety standards required in pharmaceutical manufacturing. This work presents an overview of the production line, with particular focus on the vision system designed, and of all the trials conducted to obtain the final performance. Transfer learning, which alleviates the requirement for a large amount of training data, combined with data-augmentation methods based on the generation of synthetic images, was used to increase performance while reducing the cost of data acquisition and annotation. The proposed vision algorithm is composed of two main subtasks, designed for vial counting and discrepancy detection respectively. The first was trained on more than 23k vials (about 300 images) and tested on 5k more (about 75 images), whereas 60 training images and 52 testing images were used for the second.
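A minimal transfer-learning sketch in the spirit described above: an ImageNet-pretrained backbone is frozen, only a new classification head is trained, and augmentation enlarges the small data set. The architecture choice (ResNet-18), the two-class head and the augmentation parameters are assumptions, not the paper's actual configuration:

```python
# Freeze a pretrained backbone, retrain only the final layer, and define
# simple augmentations standing in for the synthetic-image generation.
import torch
import torch.nn as nn
from torchvision import models, transforms

model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
for p in model.parameters():          # freeze the pretrained backbone
    p.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 2)   # e.g. pack OK / anomaly

augment = transforms.Compose([        # cheap stand-in for synthetic images
    transforms.RandomRotation(10),
    transforms.ColorJitter(brightness=0.2, contrast=0.2),
    transforms.RandomResizedCrop(224, scale=(0.8, 1.0)),
    transforms.ToTensor(),
])

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
# A training loop over an augmented DataLoader would then call:
#   loss = loss_fn(model(images), labels); loss.backward(); optimizer.step()
```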
Abstract:
Ecosystem engineering is increasingly recognized as a relevant ecological driver of diversity and community composition. Although engineering impacts on the biota can vary from negative to positive, and from trivial to enormous, patterns and causes of variation in the magnitude of engineering effects across ecosystems and engineer types remain largely unknown. To elucidate these patterns, we conducted a meta-analysis of 122 studies that explored the effects of animal ecosystem engineers on the species richness of other organisms in the community. The analysis revealed that the overall effect of ecosystem engineers on diversity is positive and corresponds to a 25% increase in species richness, indicating that ecosystem engineering is globally a facilitative process. Engineering effects were stronger in the tropics than at higher latitudes, likely because new or modified habitats provided by engineers in the tropics may help minimize competition and predation pressures on resident species. Within aquatic environments, engineering impacts were stronger in marine ecosystems (rocky shores) than in streams. In terrestrial ecosystems, engineers displayed stronger positive effects in arid environments (e.g. deserts). Ecosystem engineers that create new habitats or microhabitats had stronger effects than those that modify habitats or cause bioturbation. Invertebrate engineers and those with lower engineering persistence (<1 year) affected species richness more than vertebrate engineers and those whose effects persisted for >1 year. Invertebrate species richness was particularly responsive to engineering impacts. This study is the first attempt to build an integrative framework of engineering effects on species diversity; it highlights the importance of considering latitude, habitat, engineering functional group, taxon and the persistence of their effects in future theoretical and empirical studies.
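The abstract does not specify the meta-analytic model used; the sketch below shows the standard computation behind a statement like "a 25% increase in species richness": per-study log response ratios pooled by inverse-variance weighting. The numbers are made up for illustration:

```python
# Log response ratio (lnRR) per study, its sampling variance, and an
# inverse-variance-weighted pooled effect back-transformed to a % change.
import numpy as np

# Per-study summaries: mean richness, SD and n with/without the engineer
m_e = np.array([12.0, 30.0, 8.0])   # means, engineered plots
sd_e = np.array([3.0, 6.0, 2.0]); n_e = np.array([10, 12, 8])
m_c = np.array([10.0, 22.0, 7.0])   # means, control plots
sd_c = np.array([2.5, 5.0, 2.0]); n_c = np.array([10, 12, 8])

lnrr = np.log(m_e / m_c)                                     # effect size
var = sd_e**2 / (n_e * m_e**2) + sd_c**2 / (n_c * m_c**2)    # its variance

w = 1.0 / var                                 # inverse-variance weights
pooled = np.sum(w * lnrr) / np.sum(w)
print(f"pooled effect: {(np.exp(pooled) - 1) * 100:.1f}% change in richness")
```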
Abstract:
This paper proposes an architecture for machining process and production monitoring to be applied in machine tools with open computer numerical control (CNC). A brief description of the advantages of using open CNC for machining process and production monitoring is presented, with an emphasis on the CNC architecture using a personal computer (PC)-based human-machine interface. The proposed architecture uses the CNC data and sensors to gather information about the machining process and production. It allows the development of different levels of monitoring systems with minimum investment, minimum need for sensor installation, and low intrusiveness to the process. Successful examples of the utilization of this architecture in a laboratory environment are briefly described. In conclusion, it is shown that a wide range of monitoring solutions can be implemented in production processes using the proposed architecture.
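A hedged sketch of the monitoring idea: internal CNC variables (spindle load, feed rate) are polled through the PC-based HMI and checked against thresholds, with no extra sensors installed. The read_variable() adapter and the threshold below are hypothetical; a real system would bind them to the open CNC vendor's data-access API:

```python
# Poll CNC-internal variables each control cycle and flag threshold
# violations; random values stand in for the real data-access calls.
import random
import time

def read_variable(name: str) -> float:
    """Hypothetical stand-in for an open-CNC data-access call."""
    return {"spindle_load": random.uniform(20, 95),
            "feed_rate": random.uniform(80, 120)}[name]

SPINDLE_LOAD_LIMIT = 85.0     # assumed threshold, % of rated load

for _ in range(10):           # one sample per monitoring cycle
    load = read_variable("spindle_load")
    feed = read_variable("feed_rate")
    if load > SPINDLE_LOAD_LIMIT:
        print(f"warning: spindle load {load:.0f}% exceeds limit "
              f"(feed {feed:.0f} mm/min)")
    time.sleep(0.1)
```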
Abstract:
Currently, the acoustic and nanoindentation techniques are two of the most widely used techniques for measuring the elastic modulus of materials. In this article the fundamental principles and limitations of both techniques are presented and discussed. Recent advances in the nanoindentation technique are also reviewed. An experimental study on ceramic, metallic, composite and single-crystal samples was also carried out. The results show that the ultrasonic technique is capable of providing results in agreement with those reported in the literature. However, the ultrasonic technique does not allow the elastic modulus of some small samples and single crystals to be measured. On the other hand, the nanoindentation technique estimates elastic modulus values in reasonable agreement with those measured by acoustic methods, particularly in amorphous materials, while in some polycrystalline materials some deviation from the expected values was obtained.
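The acoustic technique rests on the standard isotropic-elasticity relations between the longitudinal and shear wave speeds, the density, and the elastic constants; the sketch below applies them with aluminium-like numbers, which are illustrative and not data from the article:

```python
# Young's modulus E, shear modulus G and Poisson's ratio nu from measured
# longitudinal (v_l) and shear (v_s) wave speeds and density rho, using the
# standard isotropic relations E = rho*v_s^2*(3*v_l^2 - 4*v_s^2)/(v_l^2 - v_s^2).
def elastic_constants(rho, v_l, v_s):
    G = rho * v_s**2                                              # Pa
    E = rho * v_s**2 * (3*v_l**2 - 4*v_s**2) / (v_l**2 - v_s**2)  # Pa
    nu = (v_l**2 - 2*v_s**2) / (2*(v_l**2 - v_s**2))
    return E, G, nu

# Illustrative aluminium-like values: rho in kg/m^3, speeds in m/s
E, G, nu = elastic_constants(rho=2700.0, v_l=6320.0, v_s=3130.0)
print(f"E = {E/1e9:.0f} GPa, G = {G/1e9:.0f} GPa, nu = {nu:.2f}")
# -> roughly E = 71 GPa, G = 26 GPa, nu = 0.34
```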
Abstract:
As previously shown, higher levels of NOTCH1 and increased NF-kappa B signaling are a distinctive feature of the more primitive umbilical cord blood (UCB) CD34+ hematopoietic stem cells (HSCs), as compared to bone marrow (BM). Differences between BM and UCB cell composition also account for this finding. The CD133 marker defines a more primitive cell subset among CD34+ HSCs, with a proposed hemangioblast potential. To further evaluate the molecular basis of the more primitive characteristics of UCB and CD133+ HSCs, immunomagnetically purified human CD34+ and CD133+ cells from BM and UCB were used in gene expression microarray studies. UCB CD34+ cells contained a significantly higher proportion of CD133+ cells than BM (70% and 40%, respectively). Cluster analysis showed that BM CD133+ cells grouped with the UCB cells (CD133+ and CD34+) rather than with BM CD34+ cells. Compared with CD34+ cells, CD133+ cells had a higher expression of many transcription factors (TFs). Promoter analysis of all these TF genes revealed a significantly higher frequency (than expected by chance) of NF-kappa B-binding sites (BS), including potentially novel NF-kappa B targets such as RUNX1, GATA3, and USF1. Selected transcripts of TFs related to primitive hematopoiesis and self-renewal, such as RUNX1, GATA3, USF1, TAL1, HOXA9, HOXB4, NOTCH1, RELB, and NFKB2, were evaluated by real-time PCR and were all significantly positively correlated. Taken together, our data indicate the existence of an interconnected transcriptional network characterized by higher levels of NOTCH1, NF-kappa B, and other important TFs in the more primitive HSC subsets.
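A sketch of the kind of promoter analysis described: count occurrences of a binding-site motif in a promoter set and test whether the frequency is higher than expected by chance. The simplified consensus and the sequences below are illustrative, not the study's actual motif model or data:

```python
# Count a simplified NF-kappa B-like motif (GGGRNNYYCC) in promoter
# sequences and compare the hit count with a binomial chance expectation.
import re
from scipy.stats import binomtest

MOTIF = re.compile(r"GGG[AG].{2}[CT][CT]CC")   # simplified 10-bp consensus
promoters = ["ATGGGACTTCCCAT" * 3, "GCGCGC" * 7, "TTGGGAATTTCCCA" * 3]

hits = sum(len(MOTIF.findall(seq)) for seq in promoters)
positions = sum(len(seq) - 9 for seq in promoters)   # 10-bp windows scanned

# Chance probability of the motif in a random 10-mer (uniform bases):
# five fixed bases (1/4 each), three 2-letter degenerate positions (1/2
# each), two unconstrained positions.
p_chance = (0.25**5) * (0.5**3)
print(binomtest(hits, positions, p_chance))   # enrichment over chance
```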
Abstract:
Shallow subsurface layers of gold nanoclusters were formed in polymethylmethacrylate (PMMA) polymer by very low energy (49 eV) gold ion implantation. The ion implantation process was modeled by computer simulation, which accurately predicted the layer depth and width. Transmission electron microscopy (TEM) was used to image the buried layer and individual nanoclusters; the layer width was ~6-8 nm and the cluster diameter was ~5-6 nm. Surface plasmon resonance (SPR) absorption effects were observed by UV-visible spectroscopy. The TEM and SPR results were related to prior measurements of the electrical conductivity of Au-doped PMMA, and excellent consistency was found with a model of electrical conductivity in which either, at low implantation dose, the individual nanoclusters are separated and do not physically touch each other, or, at higher implantation dose, the nanoclusters touch each other to form a random resistor network (percolation model). © 2009 American Vacuum Society. [DOI: 10.1116/1.3231449]
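A toy sketch of the percolation picture invoked above: clusters dropped at random on a grid sit isolated at low coverage, while above a critical coverage they touch and form a spanning (conducting) network. Grid size and occupation probabilities are illustrative choices, not the paper's dose values:

```python
# Site percolation on a square lattice: occupy sites with probability
# proportional to dose, then test for a left-to-right spanning cluster.
import random

def spans(grid, n):
    """True if occupied sites connect the left edge to the right edge."""
    seen = {(r, 0) for r in range(n) if grid[r][0]}
    stack = list(seen)
    while stack:
        r, c = stack.pop()
        if c == n - 1:
            return True
        for rr, cc in ((r+1, c), (r-1, c), (r, c+1), (r, c-1)):
            if 0 <= rr < n and 0 <= cc < n and grid[rr][cc] \
                    and (rr, cc) not in seen:
                seen.add((rr, cc))
                stack.append((rr, cc))
    return False

n = 50
for dose in (0.3, 0.5, 0.7):   # occupation probability ~ implantation dose
    grid = [[random.random() < dose for _ in range(n)] for _ in range(n)]
    print(f"coverage {dose:.1f}: spanning cluster = {spans(grid, n)}")
# The square-lattice site-percolation threshold is ~0.59, so 0.3 stays
# isolated and 0.7 typically percolates.
```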
Abstract:
Thanks to recent advances in molecular biology, allied to an ever-increasing amount of experimental data, the functional state of thousands of genes can now be extracted simultaneously using methods such as cDNA microarrays and RNA-Seq. Particularly important related investigations are the modeling and identification of gene regulatory networks from expression data sets. Such knowledge is fundamental for many applications, such as disease treatment, therapeutic intervention strategies and drug design, as well as for planning new high-throughput experiments. Methods have been developed for gene network modeling and identification from expression profiles. However, an important open problem is how to validate such approaches and their results. This work presents an objective approach for the validation of gene network modeling and identification which comprises three main aspects: (1) Artificial Gene Network (AGN) model generation through theoretical models of complex networks, which are used to simulate temporal expression data; (2) a computational method for gene network identification from the simulated data, founded on a feature selection approach in which a target gene is fixed and the expression profile is observed for all other genes in order to identify a relevant subset of predictors; and (3) validation of the identified AGN-based network through comparison with the original network. The proposed framework allows several types of AGNs to be generated and used to simulate temporal expression data. The results of the network identification method can then be compared with the original network in order to estimate its properties and accuracy. Some of the most important theoretical models of complex networks have been assessed: the uniformly random Erdos-Renyi (ER), the small-world Watts-Strogatz (WS), the scale-free Barabasi-Albert (BA), and geographical networks (GG). The experimental results indicate that the inference method was sensitive to variation in the average degree k, with its network recovery rate decreasing as k increased. Signal size was important for the accuracy of the network identification, and the method presented very good results with small expression profiles. However, the adopted inference method was not able to recognize distinct structures of interaction among genes, presenting similar behavior when applied to different network topologies. In summary, the proposed framework, though simple, was adequate for the validation of the inferred networks, identifying some properties of the evaluated method, and it can be extended to other inference methods.
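A sketch of the validation loop described: (1) generate an artificial gene network, (2) simulate expression data from it, (3) infer a network from the data, (4) compare inferred and original edges. The ER model, the threshold-Boolean dynamics and the correlation-based inference used here are simplifications standing in for the paper's actual generators and feature-selection method:

```python
# Generate an ER gene network, simulate noisy Boolean expression dynamics,
# infer edges from lagged correlations, and score edge recovery.
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)
n_genes, T = 20, 50
g = nx.erdos_renyi_graph(n_genes, 0.15, seed=1, directed=True)   # step 1
A = nx.to_numpy_array(g)            # A[i, j] = 1 means gene i regulates j

x = rng.integers(0, 2, n_genes).astype(float)                    # step 2
profile = [x]
for _ in range(T - 1):
    x = ((A.T @ x) > 0.5 * A.sum(axis=0).clip(min=1)).astype(float)
    x = np.where(rng.random(n_genes) < 0.05, 1 - x, x)   # expression noise
    profile.append(x)
X = np.array(profile)                                    # T x n_genes

# step 3: naive inference, edge i->j if corr(x_i(t), x_j(t+1)) is strong
C = np.zeros((n_genes, n_genes))
for i in range(n_genes):
    for j in range(n_genes):
        c = np.corrcoef(X[:-1, i], X[1:, j])[0, 1]
        C[i, j] = 0.0 if np.isnan(c) else abs(c)
np.fill_diagonal(C, 0.0)
inferred = C > 0.5

tp = np.logical_and(inferred, A > 0).sum()               # step 4: recovery
print(f"recovered {tp} of {int(A.sum())} true edges, "
      f"{int(inferred.sum()) - tp} false positives")
```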
Abstract:
An implementation of a computational tool to generate new summaries from new source texts, by means of the connectionist approach (artificial neural networks), is presented. Among the contributions that this work intends to bring to natural language processing research, the use of a more biologically plausible connectionist architecture and training procedure for automatic summarization is emphasized. This choice relies on the expectation that it may bring an increase in computational efficiency when compared to the so-called biologically implausible algorithms.