888 results for Supervisory Control and Data Acquisition (SCADA) Topology
Abstract:
Infrastructure spatial data, such as the orientation, location, boundaries and areas of in-place structures, play an important role in many civil infrastructure development and rehabilitation applications, such as defect detection, site planning and on-site safety assistance. A number of modern optical-based spatial data acquisition techniques can be used to acquire these data. These techniques are based on stereo vision, optics, time of flight, etc., and have distinct characteristics, benefits and limitations. The main purpose of this paper is to compare these optical-based spatial data acquisition techniques against civil infrastructure application requirements. To achieve this goal, the benefits and limitations of each technique were identified. The techniques were then compared according to application requirements such as spatial accuracy, automation of acquisition and portability of devices. This comparison highlights the unique characteristics of each technique so that practitioners can select an appropriate technique for their own applications.
Abstract:
To resolve the incompatibility between ARCNET networks and Ethernet, and to address the shortcomings of current ARCNET network equipment monitoring and management systems, an ARCNET data acquisition and transmission system based on an embedded TCP/IP protocol is proposed. The principle and structure of the data acquisition system are analysed, a hardware design scheme for the system is given, and the software architecture for data acquisition and transmission and the embedded TCP/IP protocol stack are established. Tests of the system's real-time behaviour, reliability and application performance show that the system is convenient to use and stable, offers good real-time performance and reliability, and that its overall performance surpasses that of existing ARCNET data acquisition systems.
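A minimal sketch of the kind of acquisition-to-TCP bridging described above, assuming a generic sensor read being packaged and forwarded over a TCP socket; the host address, port and frame layout are invented for illustration and do not reproduce the paper's hardware or protocol stack:

```python
import socket
import struct
import time

def acquire_sample(channel: int) -> int:
    """Placeholder for a field-level read; returns a dummy 16-bit value."""
    return (channel * 1000 + int(time.time())) & 0xFFFF

def send_frame(sock: socket.socket, channel: int, value: int) -> None:
    # Hypothetical frame: 1-byte channel id, 2-byte value, 4-byte timestamp (big-endian).
    sock.sendall(struct.pack(">BHI", channel, value, int(time.time())))

if __name__ == "__main__":
    # 192.0.2.10:5020 is a placeholder gateway address, not part of the described system.
    with socket.create_connection(("192.0.2.10", 5020), timeout=5) as sock:
        for ch in range(8):
            send_frame(sock, ch, acquire_sample(ch))
```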
Abstract:
A model is presented that deals with problems of motor control, motor learning, and sensorimotor integration. The equations of motion for a limb are parameterized and used in conjunction with a quantized, multi-dimensional memory organized by state variables. Descriptions of desired trajectories are translated into motor commands which will replicate the specified motions. The initial specification of a movement is free of information regarding the mechanics of the effector system. Learning occurs without the use of error correction when practice data are collected and analyzed.
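As a loose, hedged illustration of a quantized, multi-dimensional memory organized by state variables (not the paper's actual formulation; the state variables, bin counts and ranges below are invented), a motor command table indexed by discretized state might look like this:

```python
import numpy as np

class StateIndexedMemory:
    """Toy quantized, state-indexed motor memory; illustrative only."""

    def __init__(self, bins_per_dim=(16, 16), ranges=((-1.0, 1.0), (-2.0, 2.0))):
        self.bins = bins_per_dim
        self.ranges = ranges
        self.table = np.zeros(bins_per_dim)   # stored motor command per state cell
        self.counts = np.zeros(bins_per_dim)  # number of practice samples per cell

    def _index(self, state):
        # Map each continuous state variable onto its quantization bin.
        idx = []
        for x, n, (lo, hi) in zip(state, self.bins, self.ranges):
            i = int((x - lo) / (hi - lo) * n)
            idx.append(min(max(i, 0), n - 1))
        return tuple(idx)

    def record(self, state, command):
        """Store practice data as a running mean of commands seen in this cell."""
        i = self._index(state)
        self.counts[i] += 1
        self.table[i] += (command - self.table[i]) / self.counts[i]

    def recall(self, state):
        """Look up the motor command associated with the quantized state."""
        return self.table[self._index(state)]

mem = StateIndexedMemory()
mem.record(state=(0.3, -0.5), command=1.2)  # e.g. (position, velocity) -> torque
print(mem.recall((0.31, -0.48)))
```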
Abstract:
In this paper we examine a number of admission control and scheduling protocols for high-performance web servers based on a 2-phase policy for serving HTTP requests. The first "registration" phase involves establishing the TCP connection for the HTTP request and parsing/interpreting its arguments, whereas the second "service" phase involves the service/transmission of data in response to the HTTP request. By introducing a delay between these two phases, we show that the performance of a web server can potentially be improved through the adoption of a number of scheduling policies that optimize the utilization of various system components (e.g. memory cache and I/O). In addition to its promise of improving the performance of a single web server, the delineation between the registration and service phases of an HTTP request may be useful for load balancing purposes on clusters of web servers. We are investigating the use of such a mechanism as part of the Commonwealth testbed being developed at Boston University.
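A minimal sketch of the 2-phase idea, assuming a registration queue and a scheduler that delays and groups requests by resource before service; the batching policy, function names and timing below are illustrative and are not the paper's protocols:

```python
import queue
import threading
import time
from collections import defaultdict

registration_q = queue.Queue()

def register(conn_id: int, path: str) -> None:
    """Phase 1: connection established, request parsed, then deferred for scheduling."""
    registration_q.put((conn_id, path))

def scheduler(batch_window: float = 0.05) -> None:
    """Phase 2: drain registered requests after a delay, grouping by path so identical
    resources are served back-to-back (a cache-friendly policy, used purely as an example)."""
    while True:
        time.sleep(batch_window)
        pending = defaultdict(list)
        while not registration_q.empty():
            conn_id, path = registration_q.get_nowait()
            pending[path].append(conn_id)
        for path, conns in pending.items():
            for conn_id in conns:
                serve(conn_id, path)

def serve(conn_id: int, path: str) -> None:
    print(f"serving {path} on connection {conn_id}")

threading.Thread(target=scheduler, daemon=True).start()
for i, p in enumerate(["/a.html", "/b.html", "/a.html"]):
    register(i, p)
time.sleep(0.2)
```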
Abstract:
This paper describes the development of neural model-based control strategies for the optimisation of an industrial aluminium substrate disk grinding process. The grindstone removal rate varies considerably over the life of a stone and is a highly nonlinear function of process variables. Using historical grindstone performance data, a NARX-based neural network model is developed. This model is then used to implement a direct inverse controller and an internal model controller based on the process settings and previous removal rates. Preliminary plant investigations show that thickness defects can be reduced by 50% or more compared to other schemes employed. (c) 2004 Elsevier Ltd. All rights reserved.
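A hedged sketch of the NARX regressor structure (the next removal rate predicted from lagged outputs and lagged process inputs); the lag orders, synthetic data and the linear least-squares fit standing in for the neural network are assumptions for illustration only:

```python
import numpy as np

def narx_regressors(u, y, nu=2, ny=2):
    """Build NARX regressors: y[k] is predicted from y[k-1..k-ny] and u[k-1..k-nu]."""
    start = max(nu, ny)
    rows, targets = [], []
    for k in range(start, len(y)):
        rows.append(np.concatenate([y[k - ny:k][::-1], u[k - nu:k][::-1]]))
        targets.append(y[k])
    return np.array(rows), np.array(targets)

# Synthetic data standing in for process settings (u) and removal rate (y).
rng = np.random.default_rng(0)
u = rng.uniform(0.5, 1.5, 200)
y = np.zeros(200)
for k in range(2, 200):
    y[k] = 0.6 * y[k - 1] - 0.1 * y[k - 2] + 0.8 * u[k - 1] + 0.05 * rng.standard_normal()

X, t = narx_regressors(u, y)
# A linear least-squares fit stands in for the neural network mapping in this sketch.
theta, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], t, rcond=None)
print("fitted parameters:", np.round(theta, 3))
```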
Abstract:
Background. The assembly of the tree of life has seen significant progress in recent years, but algae and protists have been largely overlooked in this effort. Many groups of algae and protists have ancient roots, and it is unclear how much data will be required to resolve their phylogenetic relationships for incorporation in the tree of life. The red algae, a group of primary photosynthetic eukaryotes more than a billion years old, provide the earliest fossil evidence for eukaryotic multicellularity and sexual reproduction. Despite this evolutionary significance, their phylogenetic relationships are understudied. This study aims to infer a comprehensive red algal tree of life at the family level from a supermatrix containing data mined from GenBank. We aim to locate remaining regions of low support in the topology, evaluate their causes and estimate the amount of data required to resolve them. Results. Phylogenetic analysis of a supermatrix of 14 loci and 98 red algal families yielded the most complete red algal tree of life to date. Visualization of statistical support showed the presence of five poorly supported regions. Causes for low support were identified with statistics about the age of the region, data availability and node density, showing that poor support has different origins in different parts of the tree. Parametric simulation experiments yielded optimistic estimates of how much data will be needed to resolve the poorly supported regions (ca. 10^3 to ca. 10^4 nucleotides for the different regions). Nonparametric simulations gave a markedly more pessimistic picture, with some regions requiring more than 2.8 × 10^5 nucleotides or not achieving the desired level of support at all. The discrepancies between parametric and nonparametric simulations are discussed in light of our dataset and known attributes of both approaches. Conclusions. Our study takes the red algae one step closer to meaningful inclusion in the tree of life. In addition to the recovery of stable relationships, the recognition of five regions in need of further study is a significant outcome of this work. Based on our analyses of current availability and future requirements of data, we make clear recommendations for forthcoming research.
Design, recruitment, logistics, and data management of the GEHA (Genetics of Healthy Ageing) project
Abstract:
In 2004, the integrated European project GEHA (Genetics of Healthy Ageing) was initiated with the aim of identifying genes involved in healthy ageing and longevity. The first step in the project was the recruitment of more than 2500 pairs of siblings aged 90 years or more together with one younger control person from 15 areas in 11 European countries through a coordinated and standardised effort. A biological sample, preferably a blood sample, was collected from each participant, and basic physical and cognitive measures were obtained together with information about health, life style, and family composition. From 2004 to 2008 a total of 2535 families comprising 5319 nonagenarian siblings were identified and included in the project. In addition, 2548 younger control persons aged 50-75 years were recruited. A total of 2249 complete trios with blood samples from at least two old siblings and the younger control were formed and are available for genetic analyses (e.g. linkage studies and genome-wide association studies). Mortality follow-up improves the possibility of identifying families with the most extreme longevity phenotypes. With a mean follow-up time of 3.7 years the number of families with all participating siblings aged 95 years or more has increased by a factor of 5 to 750 families compared to when interviews were conducted. Thus, the GEHA project represents a unique source in the search for genes related to healthy ageing and longevity.
Outperformance in exchange-traded fund pricing deviations: Generalized control of data snooping bias
Abstract:
An investigation into exchange-traded fund (ETF) outperformance during the period 2008-2012 is undertaken utilizing a data set of 288 U.S. traded securities. ETFs are tested for net asset value (NAV) premium, underlying index and market benchmark outperformance, with Sharpe, Treynor, and Sortino ratios employed as risk-adjusted performance measures. A key contribution is the application of an innovative generalized stepdown procedure in controlling for data snooping bias. We find that a large proportion of optimized replication and debt asset class ETFs display risk-adjusted premiums, with energy and precious metals focused funds outperforming the S&P 500 market benchmark.
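A minimal sketch of the three risk-adjusted measures named above, computed on synthetic return series; the data, risk-free rate and frequency conventions are invented, and the paper's generalized stepdown procedure is not reproduced:

```python
import numpy as np

def sharpe(returns, rf=0.0):
    excess = returns - rf
    return excess.mean() / excess.std(ddof=1)

def treynor(returns, market, rf=0.0):
    beta = np.cov(returns, market, ddof=1)[0, 1] / np.var(market, ddof=1)
    return (returns.mean() - rf) / beta

def sortino(returns, rf=0.0):
    excess = returns - rf
    downside = excess[excess < 0]  # penalize only downside deviation
    return excess.mean() / downside.std(ddof=1)

rng = np.random.default_rng(1)
market = rng.normal(0.0004, 0.01, 1250)                  # synthetic daily market returns
fund = 0.9 * market + rng.normal(0.0001, 0.004, 1250)    # synthetic ETF returns
print(sharpe(fund), treynor(fund, market), sortino(fund))
```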
Abstract:
Aim: to evaluate the effects of a 12-week combined aerobic-resistance exercise therapy on fatigue and isokinetic muscle strength, glycemic control and health-related quality of life (HRQoL) in moderately affected type 2 diabetes (T2DM) patients. Methods: a randomized controlled trial design was employed. Forty-three T2DM patients were assigned to an exercise group (n = 22), performing 3 weekly sessions of 60 minutes of combined aerobic-resistance exercise for 12 weeks, or a no-exercise control group (n = 21). Both groups were evaluated at baseline and after 12 weeks of exercise therapy for: 1) muscle strength and fatigue by isokinetic dynamometry; 2) plasma glycated hemoglobin A1C (HbA1C); and 3) HRQoL utilizing the SF-36 questionnaire. Results: the exercise therapy led to improvements in muscle fatigue in knee extensors (-55%) and increased muscle strength in knee flexors and extensors (+15 to +30%), while HbA1C decreased (-18%). In addition, the exercising patients showed sizeable improvements in HRQoL: physical function (+53%), vitality (+21%) and mental health (+40%). Conclusion: 12 weeks of combined aerobic-resistance exercise was highly effective in improving muscle strength and fatigue, glycemic control and several aspects of HRQoL in T2DM patients. These data encourage the use of aerobic and resistance exercise in the good clinical care of T2DM.
Abstract:
Adhesive bonding is nowadays a serious candidate to replace methods such as fastening or riveting because of its attractive mechanical properties. As a result, adhesives are being increasingly used in industries such as the automotive, aerospace and construction industries. It is therefore highly important to predict the strength of bonded joints to assess the feasibility of joining during the fabrication of components (e.g. due to complex geometries) or for repair purposes. This work studies the tensile behaviour of adhesive joints between aluminium adherends considering different values of adherend thickness (h) using the double-cantilever beam (DCB) test. The experimental work consists of the definition of the tensile fracture toughness (GIC) for the different joint configurations. A conventional fracture characterization method was used, together with a J-integral approach that takes into account the plasticity effects occurring in the adhesive layer. An optical measurement method is used for the evaluation of the crack tip opening and adherend rotation at the crack tip during the test, supported by a Matlab® sub-routine for the automated extraction of these quantities. As an output of this work, a comparative evaluation between bonded systems with different values of adherend thickness is carried out, and complete fracture data are provided in tension for the subsequent strength prediction of joints with identical conditions.
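For context, a hedged sketch of the simple-beam-theory baseline for the DCB strain energy release rate; this is a conventional textbook expression, not the paper's J-integral/optical data reduction, and the geometry and load values are invented:

```python
def g_i_beam_theory(P, a, b, h, E):
    """Simple-beam-theory strain energy release rate for a DCB specimen:
    G_I = 12 * P**2 * a**2 / (b**2 * E * h**3)
    P: load [N], a: crack length [m], b: width [m], h: adherend thickness [m], E: modulus [Pa]."""
    return 12.0 * P**2 * a**2 / (b**2 * E * h**3)

# Illustrative values only (aluminium adherends, arbitrary geometry), result in J/m^2.
print(g_i_beam_theory(P=300.0, a=0.06, b=0.025, h=0.003, E=70e9))
```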
Abstract:
Rigorous organization and quality control (QC) are necessary to facilitate successful genome-wide association meta-analyses (GWAMAs) of statistics aggregated across multiple genome-wide association studies. This protocol provides guidelines for (i) organizational aspects of GWAMAs, and for (ii) QC at the study file level, the meta-level across studies and the meta-analysis output level. Real-world examples highlight issues experienced and solutions developed by the GIANT Consortium that has conducted meta-analyses including data from 125 studies comprising more than 330,000 individuals. We provide a general protocol for conducting GWAMAs and carrying out QC to minimize errors and to guarantee maximum use of the data. We also include details for the use of a powerful and flexible software package called EasyQC. Precise timings will be greatly influenced by consortium size. For consortia of comparable size to the GIANT Consortium, this protocol takes a minimum of about 10 months to complete.
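As a hedged illustration of the kind of study-file-level sanity check such a protocol applies (this is not EasyQC, and the column names are placeholders invented for the example):

```python
import pandas as pd

def basic_qc(df: pd.DataFrame) -> pd.DataFrame:
    """Drop rows failing elementary sanity checks on a study's association results.
    Column names (EAF, BETA, SE, P) are placeholders, not a specific consortium format."""
    ok = (
        df["EAF"].between(0.0, 1.0)
        & df["P"].between(0.0, 1.0)
        & (df["SE"] > 0)
        & df["BETA"].notna()
    )
    return df[ok]

study = pd.DataFrame({
    "SNP": ["rs1", "rs2", "rs3"],
    "EAF": [0.12, 1.40, 0.35],     # rs2 has an impossible allele frequency
    "BETA": [0.02, 0.10, None],    # rs3 is missing an effect estimate
    "SE": [0.01, 0.02, 0.01],
    "P": [0.04, 0.50, 0.70],
})
print(basic_qc(study))
```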
Abstract:
The purpose of this work was to realize a high-speed digital data transfer system for the RPC muon chambers in the CMS experiment at CERN's new LHC accelerator. This large-scale system took many years and many stages of prototyping to develop, and required the participation of tens of people. The system interfaces to the Frontend Boards (FEB) at the 200,000-channel detector and to the trigger and readout electronics in the control room of the experiment. The distance between these two is about 80 metres, and the speed required for the optical links was pushing the limits of available technology when the project was started. Here, as in many other aspects of the design, it was assumed that the features of readily available commercial components would develop in the course of the design work, just as they did. By choosing a high speed it was possible to multiplex the data from some of the chambers into the same fibres to reduce the number of links needed. Further reduction was achieved by employing zero suppression and data compression, and a total of only 660 optical links were needed. Another requirement, which conflicted somewhat with choosing the components as late as possible, was that the design needed to be radiation tolerant to an ionizing dose of 100 Gy and to have a moderate tolerance to Single Event Effects (SEEs). This required some radiation test campaigns, and eventually led to ASICs being chosen for some of the critical parts. The system was made to be as reconfigurable as possible. The reconfiguration needs to be done from a distance, as the electronics is not accessible except during some short and rare service breaks once the accelerator starts running. Therefore reconfigurable logic is extensively used, and the firmware development for the FPGAs constituted a sizable part of the work. Some special techniques were needed there too, to achieve the required radiation tolerance. The system has been demonstrated to work in several laboratory and beam tests, and we are now waiting to see it in action when the LHC starts running in the autumn of 2008.
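A hedged sketch of the zero-suppression idea mentioned above, in which only non-zero channels are encoded as (index, value) pairs; the packing format is invented for illustration and is not the CMS link protocol:

```python
import struct

def zero_suppress(channels):
    """Encode only non-zero channels as (index, value) pairs; a toy stand-in for the
    zero suppression used to reduce link bandwidth (format invented for illustration)."""
    payload = b"".join(
        struct.pack(">HB", idx, val) for idx, val in enumerate(channels) if val != 0
    )
    # 2-byte header holds the number of encoded channels.
    return struct.pack(">H", len(payload) // 3) + payload

readout = [0] * 96
readout[7], readout[42] = 3, 1          # two hit strips in an otherwise empty frame
packet = zero_suppress(readout)
print(len(packet), "bytes instead of", len(readout))
```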
Abstract:
The full version of this thesis is available for individual consultation only at the Music Library of the Université de Montréal (http://www.bib.umontreal.ca/MU).
Abstract:
Oil-based formulated conidia sprayed on steel plates and conidia powder (control) of Beauveria bassiana isolate IMI 386243 were stored at temperatures from 10 to 40 degrees C in desiccators over saturated salt solutions providing relative humidities from 32 to 88%, or in hermetic storage at 40 degrees C and moisture contents in equilibrium with 33 or 77% relative humidity. The negative semi-logarithmic relation (P < 0.005) between conidia longevity (at 40 degrees C) and equilibrium relative humidity did not differ (P > 0.25) between formulated conidia and conidia powder. Despite this, certain saturated salts provided consistently greater longevity (NaCl) and others consistently shorter longevity (KCl) for formulated conidia compared to conidia powder. These results, analysis of previous data, and comparison with hermetic storage indicate that storage of conidia over saturated salt solutions provides inconsistent responses to the environment and so may be problematic for bio-pesticide research. In hermetic storage, oil formulation was not deleterious to longevity and, in the more moist environment, enhanced survival periods. (c) 2005 Elsevier Inc. All rights reserved.