876 results for Computer software - Development
Abstract:
This work discusses how formal methods can be applied to model checking in aspect-oriented software development (AOSD), also known as aspect-oriented programming (AOP). It briefly introduces aspect-oriented software development and uses compiler theory to analyze the application of AOP-related tools. It explains the difficulties and common problems encountered when testing aspect-oriented code, describes how known formal methods can be used to analyze and characterize these problems and to perform model checking that locates the points where the code fails, and discusses how to make code produced by aspect-oriented software development more robust, stable, and reliable.
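A minimal sketch of the model-checking idea referred to above, assuming a toy transition system for a hypothetical woven aspect; the states, transitions, and safety property are invented for illustration and are not taken from the work:

```python
# Explicit-state model checking (illustrative only): exhaustively explore a small
# state space and report a counterexample path that violates a safety property.
from collections import deque

# States of a hypothetical "logging aspect woven around a resource" example.
TRANSITIONS = {
    "idle":           ["acquiring"],
    "acquiring":      ["logging_before"],        # advice runs before the join point
    "logging_before": ["holding"],
    "holding":        ["logging_after", "error"],# a faulty path skips the release
    "logging_after":  ["released"],
    "released":       ["idle"],
    "error":          ["idle"],                  # bug: returns to idle while still holding
}

def violates_safety(state):
    # Assumed safety property: the resource must never be dropped via "error".
    return state == "error"

def check(initial="idle"):
    """Breadth-first search over the state space; return a counterexample path."""
    queue = deque([[initial]])
    seen = {initial}
    while queue:
        path = queue.popleft()
        if violates_safety(path[-1]):
            return path  # counterexample trace for the developer to inspect
        for nxt in TRANSITIONS.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

if __name__ == "__main__":
    print("counterexample:", check())
```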
Abstract:
ACM SIGGRAPH; ACM SIGCHI
Abstract:
With the rapid development of control, computer, and communication technologies, fieldbus technology is gradually replacing distributed control systems. PROFIBUS is one of the most widely used fieldbus technologies: as of August 2008, more than 25 million nodes had been installed and its market share exceeded 20%. This thesis first studies the international fieldbus standard IEC 61158 Type 3 (the PROFIBUS standard) in depth, analyzing the layered structure of PROFIBUS, the functions of each layer, and the communication relationships between layers. It then introduces the PROFIBUS PA function block model, including block composition, classification, detailed parameters, and the rules for mapping the function block model onto modular DP slave devices. Using the uC/OS-II embedded real-time operating system as the software platform, and a Renesas M16C/62P series MCU together with the FBC0409 fieldbus communication controller independently developed by the Shenyang Institute of Automation, Chinese Academy of Sciences, as the hardware platform, a communication protocol stack supporting PROFIBUS DP V0 and DP V1 was developed; the corresponding function block programs were written according to the PA application profile (V3.01) and verified. The thesis presents the main data and function definitions used in implementing PROFIBUS DP V0 and DP V1, describes the creation, initialization, and processing of the tasks in each layer, and provides the related software flowcharts. It also presents the parameter definitions and read/write functions for the physical block, function blocks, transducer blocks, and device management.
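The following is only an illustrative sketch of the function-block idea described above; the actual implementation in the thesis is a C protocol stack on uC/OS-II, and the block name, scaling rule, and status codes below are assumptions rather than the PA profile definition:

```python
# Toy Analog Input-style function block: map a transducer value to OUT plus a
# simplified status byte, mimicking the block/parameter structure of PA devices.
from dataclasses import dataclass

GOOD, BAD = 0x80, 0x00  # simplified quality codes (assumed)

@dataclass
class AnalogInputBlock:
    pv_scale_low: float    # transducer range low
    pv_scale_high: float   # transducer range high
    out_scale_low: float   # engineering-unit range low
    out_scale_high: float  # engineering-unit range high

    def execute(self, raw_value: float):
        """Map a transducer value to OUT plus a status byte (cyclic data)."""
        span = self.pv_scale_high - self.pv_scale_low
        if span <= 0 or not (self.pv_scale_low <= raw_value <= self.pv_scale_high):
            return 0.0, BAD
        ratio = (raw_value - self.pv_scale_low) / span
        out = self.out_scale_low + ratio * (self.out_scale_high - self.out_scale_low)
        return out, GOOD

# Example: a transmitter block scaled 4..20 mA -> 0..10 bar (assumed values)
block = AnalogInputBlock(4.0, 20.0, 0.0, 10.0)
print(block.execute(12.0))   # -> (5.0, 128), i.e. GOOD status
```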
Abstract:
To achieve rapid reconfiguration of reconfigurable assembly systems, a hybrid control architecture based on a multi-agent system is proposed, building on an analysis of traditional control structures and combining the advantages of hierarchical and distributed control. This architecture features low complexity, strong robustness, easy maintenance, extensibility, and reusability, and can meet the requirements of reconfigurable systems. In compliance with the specifications of the Foundation for Intelligent Physical Agents, the agents at each layer of the assembly system were implemented as examples on a Java-based agent development platform.
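As a rough illustration of the hybrid hierarchical/distributed idea (not the FIPA-compliant implementation developed in the paper), the sketch below shows a coordinator agent assigning a task using bids gathered from station agents; all class names and the toy cost model are invented:

```python
# A coordinator hands tasks down the hierarchy while station agents compete for
# them with bids, combining hierarchical and distributed decision making.
import queue

class Agent:
    def __init__(self, name):
        self.name = name
        self.inbox = queue.Queue()

    def send(self, other, msg):
        other.inbox.put((self.name, msg))

class StationAgent(Agent):
    def __init__(self, name, capability):
        super().__init__(name)
        self.capability = capability

    def bid(self, task):
        # Contract-net-style bid: lower cost wins (toy cost model).
        return abs(hash((task, self.capability))) % 10

class CoordinatorAgent(Agent):
    def assign(self, task, stations):
        # Hierarchical decision based on distributed bids from the stations.
        bids = {s.name: s.bid(task) for s in stations if s.capability == task}
        winner = min(bids, key=bids.get)
        for s in stations:
            if s.name == winner:
                self.send(s, f"execute:{task}")
        return winner, bids

stations = [StationAgent("S1", "insert"), StationAgent("S2", "insert"),
            StationAgent("S3", "screw")]
coord = CoordinatorAgent("C0")
print(coord.assign("insert", stations))
```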
Abstract:
Based on the motion characteristics of the small autonomous and remotely operated underwater vehicle SARV, a control system for paying out and retrieving optical fiber micro-cable was developed. The design uses embedded QNX software development technology, making the system stable and reliable. System identification was used to obtain an equivalent mathematical model of the controlled plant. A single-neuron adaptive PID controller adjusts the control parameters online, achieving constant-tension control of the fiber pay-out and retrieval while the SARV moves through the water and meeting the design requirements of the fiber handling device.
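A generic single-neuron adaptive PID sketch is shown below to illustrate the online self-tuning mentioned above; the gains, learning rates, and first-order plant are assumptions, not the SARV tension-control design:

```python
# Single-neuron adaptive PID: the three neuron inputs play P/I/D-like roles and the
# weights are adapted online by a supervised Hebbian rule.
class SingleNeuronPID:
    def __init__(self, K=0.5, eta=(0.4, 0.4, 0.4)):
        self.K = K                # neuron proportional gain
        self.eta = eta            # learning rates for the three weights
        self.w = [0.3, 0.3, 0.3]  # initial weights
        self.e1 = self.e2 = 0.0   # e(k-1), e(k-2)
        self.u = 0.0

    def step(self, e):
        x = [e - self.e1, e, e - 2 * self.e1 + self.e2]
        # Supervised Hebbian learning of the weights.
        for i in range(3):
            self.w[i] += self.eta[i] * e * self.u * x[i]
        s = sum(abs(wi) for wi in self.w) or 1.0
        self.u += self.K * sum((self.w[i] / s) * x[i] for i in range(3))
        self.e2, self.e1 = self.e1, e
        return self.u

# Toy first-order "tension" plant: y(k+1) = 0.9*y(k) + 0.1*u(k)  (assumed)
pid, y, setpoint = SingleNeuronPID(), 0.0, 1.0
for k in range(200):
    u = pid.step(setpoint - y)
    y = 0.9 * y + 0.1 * u
print(round(y, 3))  # should settle near the 1.0 setpoint
```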
Abstract:
Chaotic behavior in the thruster motor system of a deep-sea robot directly affects the robot's stability, reliability, and safety. Adaptive control is used to suppress this chaotic behavior, and the feasibility and effectiveness of the method are demonstrated. A chaos controller that is easy to implement in engineering practice is designed and constructed for chaos control of the deep-sea robot's thruster motor system. Simulation experiments show that, under the adaptive controller, the thruster motor system quickly leaves the chaotic state and enters a sustained stable state, with a clear control effect. The approach provides a control strategy and a suppression plan for chaotic behavior that may arise in deep-sea robot thruster motor systems, supports the development of embedded chaos-control software, and helps ensure that deep-sea robots operate stably, reliably, and safely; it therefore has practical value.
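As a hedged illustration of adaptive chaos suppression (not the controller designed in the paper), the sketch below applies a generic adaptive feedback law, u_i = -k_i * x_i with k_i' = g * x_i^2, to a simplified dimensionless PMSM-type motor model; the parameter values, gains, and initial conditions are assumed for demonstration only:

```python
# Generic adaptive feedback stabilization of a simplified PMSM-type model.
import numpy as np

sigma, gamma, g, dt = 5.46, 20.0, 2.0, 1e-3   # assumed model parameters and gain rate
x = np.array([0.1, 0.5, 1.0])                 # [omega, i_q, i_d], arbitrary initial state
k = np.zeros(3)                               # adaptive feedback gains, start at zero

def motor(x, u):
    w, iq, idd = x
    return np.array([sigma * (iq - w) + u[0],
                     -iq - w * idd + gamma * w + u[1],
                     -idd + w * iq + u[2]])

control_on = False
for step in range(200_000):
    if step == 100_000:
        control_on = True                     # switch the adaptive controller on halfway
    u = -k * x if control_on else np.zeros(3)
    x = x + dt * motor(x, u)                  # simple Euler integration
    if control_on:
        k = k + dt * g * x**2                 # adaptive gain update
print("final state:", np.round(x, 4), "gains:", np.round(k, 2))
# With the controller active, the state is driven out of the irregular regime
# toward the equilibrium at the origin.
```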
Abstract:
Seismic While Drilling (SWD) is a new wellbore seismic technique. It uses the vibrations produced by the drill-bit while drilling as a downhole seismic energy source. The continuous signals generated by the drill-bit are recorded by a pilot sensor attached to the top of the drill-string. Seismic receivers positioned at or near the earth's surface record both the direct waves and the waves reflected from geologic formations. The pilot signal is cross-correlated with the receiver signals to compute the travel-times of the arrivals (direct and reflected) and to attenuate incoherent noise. No downhole instrumentation is required to obtain the data, and the recording does not interfere with the drilling process. These characteristics offer a method by which borehole seismic data can be acquired, processed, and interpreted while drilling. As a measurement-while-drilling technique, SWD provides real-time seismic data for use at the well site. This can aid the engineer or driller by indicating the position of the drill-bit and providing a look at reflecting horizons not yet encountered by the bit. Furthermore, the ease with which surface receivers can be deployed makes multi-offset VSP economically feasible. This paper first studies the drill-bit wavefield theoretically, including the interaction between the drill-bit and the formation below it; modern signal-processing techniques are applied to the seismic data, and the body-wave radiation pattern of a working roller-cone drill-bit is characterized by theoretical modeling. A systematic analysis of the drill-bit wave is then carried out, the time-distance equation of seismic wave travel is established, and the SWD process is simulated with adaptive computer modeling. To promote the technique, trial SWD modeling was performed during drilling; the paper sketches the trial procedure, the instruments involved and their functions, and the results obtained. Subsurface conditions ahead of the drill-bit can be predicted: the drill-string velocity is obtained by autocorrelation of the pilot signal, drill-string multiples in the pilot signal are removed by reference deconvolution, and the cross-correlation process enhances the signal-to-noise power ratio. Finally, SWD provides real-time seismic data at the well site for well-trajectory control and for locating and preserving reservoirs in exploratory wells. Interval velocity is computed from the travel-times, the results of the interval-velocity determination reflect the pore pressure in the subsurface units ahead of the drill-bit, and the presence of fractures in the subsurface formation is detected from shear waves.
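A small sketch of the core cross-correlation step described above, using synthetic signals; the sampling rate, delay, attenuation, and noise levels are invented, and this is not the paper's processing code:

```python
# Cross-correlate a synthetic "pilot" drill-bit signal with a delayed, noisy
# "receiver" trace and pick the lag of the correlation maximum as the travel time.
import numpy as np

rng = np.random.default_rng(0)
fs = 1000.0                          # sampling rate, Hz (assumed)
n = 4000
pilot = rng.standard_normal(n)       # continuous, noise-like drill-bit signature

true_delay = 0.250                   # travel time to the receiver, s (assumed)
shift = int(true_delay * fs)
receiver = np.zeros(n)
receiver[shift:] = 0.5 * pilot[:n - shift]       # delayed, attenuated arrival
receiver += 0.3 * rng.standard_normal(n)         # incoherent noise

xcorr = np.correlate(receiver, pilot, mode="full")
lags = np.arange(-n + 1, n)
est_delay = lags[np.argmax(xcorr)] / fs
print(f"estimated travel time: {est_delay:.3f} s (true {true_delay:.3f} s)")
```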
Abstract:
The modeling of petroleum flow paths (petroleum charging) and the details of the corresponding software development are presented in this paper, covering the principle of petroleum charging, the quantitative method, and practical modeling in two oil fields. The modeling of petroleum flow paths is based on the results of basin modeling, following the principle that petroleum migrates along the shortest path from source to trap, Petroleum System Dynamics (Prof. Wu Chonglong, 1998), the concept of Petroleum Migration and Dynamic Accumulation (Zhou Donyan, Li Honhui, 2002), etc. The simulation combines all basin parameters and considers the flow potential and the non-uniformity of the source and porous layers. It extends basin modeling but does not belong to it. It is a powerful simulation tool for petroleum systems: it can quantitatively express every kind of geological element of a petroleum basin and can dynamically reproduce the geological processes with 3D graphics. As a result, the simulated petroleum flow itself exhibits the main-path phenomenon, without invoking special theories such as deflection flow in fractures (Tian Kaiming, 1989, 1994; Zhang Fawang, Hou Xingwei, 1998) or flow potential (England, 1987). The contour map of petroleum flow quantity clearly shows where the divide (dividing slot) lies, which convergence regions form the main flow paths of petroleum, and where the favorable petroleum plays are. Given sufficient information on the structural diagram, far-sighted traps can be identified and evaluated in terms of entrapment extent, spill point, area, oil-column thickness, etc. By making full use of the basin modeling results with this new tool, the critical moment and scheme of petroleum generation and expulsion can be shown clearly. It is a powerful analysis tool for geologists.
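As an illustration of the shortest-path principle mentioned above (not the software presented in the paper), the sketch below finds a lowest-cost migration path on a small, made-up cost grid with Dijkstra's algorithm:

```python
# Dijkstra's algorithm over a 4-connected grid: the cost of entering a cell stands
# in for a flow-potential-derived resistance, and the result is the cheapest path
# from a "source" cell to a "trap" cell.
import heapq

COST = [
    [1, 1, 5, 9, 1],
    [2, 8, 5, 1, 1],
    [1, 1, 1, 1, 6],
    [9, 9, 2, 8, 1],
]
ROWS, COLS = len(COST), len(COST[0])

def cheapest_path(source, trap):
    """Return (total_cost, path) of the lowest-cost route from source to trap."""
    pq = [(0, source, [source])]
    best = {source: 0}
    while pq:
        cost, cell, path = heapq.heappop(pq)
        if cell == trap:
            return cost, path
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < ROWS and 0 <= nc < COLS:
                ncost = cost + COST[nr][nc]
                if ncost < best.get((nr, nc), float("inf")):
                    best[(nr, nc)] = ncost
                    heapq.heappush(pq, (ncost, (nr, nc), path + [(nr, nc)]))
    return float("inf"), []

print(cheapest_path(source=(3, 0), trap=(0, 4)))
```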
Abstract:
Study of 3D visualization technology for engineering geology and its application to engineering is a cross-disciplinary subject spanning geosciences, computer science, software, and information technology. As an important part of the secondary theme of the National Basic Research Program of China (973 Program), "Study of Multi-Scale Structure and Occurrence Environment of Complicated Geological Engineering Mass" (No. 2002CB412701), the dissertation addresses key problems of 3D geological modeling, the integrated application of multi-format geological data, effective modeling methods for complex, approximately layered geological masses, and applications of 3D virtual reality information management technology. The main research findings are listed below. An integrated application method for multi-format geological data is proposed, which solves the integrated use of drill holes, engineering geology plan drawings, sectional drawings, cutting drawings, and exploratory trench sketches; its application can provide as much fundamental data as possible for 3D geological modeling. A 3D surface construction method combining Laplace interpolation points with original points is proposed, so that the deformation of the 3D model and the crossing error between the upper and lower surfaces of the model caused by lack of data when constructing a laminated stratum can be eliminated. A 3D modeling method for approximately layered geological masses is proposed, which solves the problems of general modeling methods based on sections or on points and faces when constructing terrain and concordant strata. The 3D geological model of the VII dam site of the Xiangjiaba hydropower station has been constructed, and the applications of the 3D geological model to the auto-plotting of sectional drawings and the conversion of numerical analysis models are also discussed. A 3D virtual reality information integration platform is developed, whose most important characteristic is that it simultaneously provides 3D virtual reality fly-through and multi-format data management; the platform can therefore load different 3D models to satisfy different engineering demands. The relics of the Aigong Cave of the Longyou Stone Caves are recovered, and the reinforcement plans of caves 1# and 2# in Phoenix Hill are also presented; this intuitive presentation provides decision makers and designers with a very good working environment. The basic framework and specific functions of a 3D geological information system are proposed. The main research findings of the dissertation have been successfully applied to several important engineering projects, such as the Xiangjiaba hydropower station, a military airport, and the Longyou Stone Caves.
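A hedged sketch of the Laplace-interpolation idea referred to above (not the dissertation's algorithm): known elevation points are held fixed and unknown grid nodes are relaxed toward the mean of their neighbours, i.e. the discrete Laplace equation is solved by Jacobi iteration. All elevations below are invented:

```python
# Fill an elevation grid from a few fixed control points by Jacobi relaxation of
# the discrete Laplace equation (each free node tends to the mean of its neighbours).
import numpy as np

z = np.full((20, 20), np.nan)
known = {(0, 0): 100.0, (0, 19): 120.0, (19, 0): 90.0, (19, 19): 110.0, (10, 10): 130.0}
for (r, c), v in known.items():
    z[r, c] = v

grid = np.where(np.isnan(z), np.nanmean(z), z)   # initial guess
fixed = ~np.isnan(z)

for _ in range(2000):                            # Jacobi relaxation sweeps
    padded = np.pad(grid, 1, mode="edge")
    neighbours = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                  padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
    grid = np.where(fixed, grid, neighbours)     # keep known points fixed

print(np.round(grid[::5, ::5], 1))               # coarse view of the interpolated surface
```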
Abstract:
Most beef cattle producers do not adequately track the income and expenses of their activity. Because the profitability of beef cattle production has been squeezed by falling product prices and rising production costs, this task has become crucial for those who want to stay in business. To meet this demand, Embrapa Gado de Corte developed Controlpec 1.0 in an Excel environment, a simple tool that producers can use easily. Based on the farm's cash flow, it generates reports that consolidate the expenses, income, and economic margins of the activity according to a chart of accounts defined by the user. The results are presented for each month of the year and for the year as a whole.
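A minimal sketch of the kind of consolidation the tool performs (Controlpec 1.0 itself is an Excel workbook; the chart of accounts and cash-flow entries below are invented):

```python
# Consolidate farm cash-flow entries into monthly income, expense and margin figures
# according to a user-defined chart of accounts.
from collections import defaultdict

CHART_OF_ACCOUNTS = {"cattle sales": "income", "feed": "expense", "labour": "expense"}

ENTRIES = [  # (month, account, amount)
    (1, "cattle sales", 35000.0), (1, "feed", 8000.0), (1, "labour", 4500.0),
    (2, "cattle sales", 12000.0), (2, "feed", 7000.0),
]

monthly = defaultdict(lambda: {"income": 0.0, "expense": 0.0})
for month, account, amount in ENTRIES:
    monthly[month][CHART_OF_ACCOUNTS[account]] += amount

for month in sorted(monthly):
    m = monthly[month]
    print(f"month {month:02d}: income {m['income']:.2f}  "
          f"expense {m['expense']:.2f}  margin {m['income'] - m['expense']:.2f}")
print("year margin:", sum(m["income"] - m["expense"] for m in monthly.values()))
```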
Abstract:
The 1989 AI Lab Winter Olympics will take a slightly different twist from previous Olympiads. Although there will still be a dozen or so athletic competitions, the annual talent show finale will now be a display not of human talent, but of robot talent. Spurred on by the question, "Why aren't there more robots running around the AI Lab?", Olympic Robot Building is an attempt to teach everyone how to build a robot and get them started. Robot kits will be given out the last week of classes before the Christmas break and teams have until the Robot Talent Show, January 27th, to build a machine that intelligently connects perception to action. There is no constraint on what can be built; participants are free to pick their own problems and solution implementations. As Olympic Robot Building is purposefully a talent show, there is no particular obstacle course to be traversed or specific feat to be demonstrated. The hope is that this format will promote creativity, freedom and imagination. This manual provides a guide to overcoming all the practical problems in building things. What follows are tutorials on the components supplied in the kits: a microprocessor circuit "brain", a variety of sensors and motors, a mechanical building block system, a complete software development environment, some example robots and a few tips on debugging and prototyping. Parts given out in the kits can be used, ignored or supplemented, as the kits are designed primarily to overcome the inertia of getting started. If all goes well, then come February, there should be all kinds of new members running around the AI Lab!
Abstract:
Timing data is infrequently reported in aphasiological literature and time taken is only a minor factor, where it is considered at all, in existing aphasia assessments. This is not surprising because reaction times are difficult to obtain manually, but it is a pity, because speed data should be indispensable in assessing the severity of language processing disorders and in evaluating the effects of treatment. This paper argues that reporting accuracy data without discussing speed of performance gives an incomplete and potentially misleading picture of any cognitive function. Moreover, in deciding how to treat, when to continue treatment and when to cease therapy, clinicians should have regard to both parameters: Speed and accuracy of performance. Crerar, Ellis and Dean (1996) reported a study in which the written sentence comprehension of 14 long-term agrammatic subjects was assessed and treated using a computer-based microworld. Some statistically significant and durable treatment effects were obtained after a short amount of focused therapy. Only accuracy data were reported in that (already long) paper, and interestingly, although it has been a widely read study, neither referees nor subsequent readers seemed to miss "the other side of the coin": How these participants compared with controls for their speed of processing and what effect treatment had on speed. This paper considers both aspects of the data and presents a tentative way of combining treatment effects on both accuracy and speed of performance in a single indicator. Looking at rehabilitation this way gives us a rather different perspective on which individuals benefited most from the intervention. It also demonstrates that while some subjects are capable of utilising metalinguistic skills to achieve normal accuracy scores even many years post-stroke, there is little prospect of reducing the time taken to within the normal range. Without considering speed of processing, the extent of this residual functional impairment can be overlooked.
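As a hedged illustration only (the paper proposes its own combined indicator, which is not reproduced here), the snippet below computes a generic inverse-efficiency-style score, mean reaction time divided by proportion correct, to show how a single measure can reflect both speed and accuracy; all data values are invented:

```python
# Inverse-efficiency-style score: lower is better, penalising both slow and
# inaccurate performance, so accuracy gains without speed gains stand out.
def inverse_efficiency(mean_rt_seconds: float, proportion_correct: float) -> float:
    if proportion_correct <= 0:
        return float("inf")
    return mean_rt_seconds / proportion_correct

# Two hypothetical participants with identical accuracy but very different speed.
pre  = {"A": (9.0, 0.70), "B": (4.0, 0.70)}   # (mean RT in s, proportion correct)
post = {"A": (8.5, 0.95), "B": (2.5, 0.95)}

for p in pre:
    change = inverse_efficiency(*pre[p]) - inverse_efficiency(*post[p])
    print(f"participant {p}: efficiency gain {change:.2f}")
```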
Abstract:
The appropriation of digital artefacts involves their use, which has changed, evolved or developed beyond their original design. Thus, to understand appropriation, we must understand use. We define use as the active, purposive exploitation of the affordances offered by the technology and, from this perspective, appropriation emerges as a natural consequence of this enactive use. Enaction tells us that perception is an active process: it is something we do, not something that happens to us. On this reading, use becomes the active exploitation of the affordances offered to us by the artefact, system or service. In turn, we define appropriation as engagement with these actively disclosed affordances, disclosed as a consequence not merely of seeing but of seeing as. We present a small case study that highlights instances of perception as an actively engaged skill. We conclude that appropriation is a simple consequence of enactive perception.
Abstract:
Web threats are becoming a major issue for both governments and companies. Web threats in general increased by as much as 600% during the last year (WebSense, 2013). This is a significant issue, since many major businesses seem to provide these services. Denial of Service (DoS) attacks are one of the most significant web threats, and their aim is generally to exhaust the resources of the target machine (Mirkovic & Reiher, 2004). Distributed Denial of Service (DDoS) attacks are typically executed from many sources and can result in large traffic flows. During the last year, 11% of DDoS attacks exceeded 60 Gbps (Prolexic, 2013a). DDoS attacks are usually launched from large botnets, which are networks of remotely controlled computers. There is an increasing effort by governments and companies to shut down botnets (Dittrich, 2012), which has led attackers to look for alternative DDoS attack methods. One of the techniques to which attackers are returning is the DDoS amplification attack. Amplification attacks use intermediate devices, called amplifiers, to amplify the attacker's traffic. This work outlines an evaluation tool and evaluates an amplification attack based on the Trivial File Transfer Protocol (TFTP). This attack could have an amplification factor of approximately 60, which rates highly alongside other researched amplification attacks. This could be a substantial issue globally, given that the protocol is used by approximately 599,600 publicly open TFTP servers. Mitigation methods for this threat have also been considered and a variety of countermeasures are proposed. The effects of this attack on both the amplifier and the target were analysed using the proposed metrics. While it has been reported that the breaching of TFTP would be possible (Schultz, 2013), this paper provides a complete methodology for the setup of the attack and its verification.
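A small sketch of the amplification-factor metric used to rate such attacks (no attack code; the packet sizes below are invented placeholders, not measurements from the paper):

```python
# Bandwidth amplification factor: bytes reflected by the amplifier toward the
# victim divided by bytes sent by the attacker.
def amplification_factor(request_bytes: int, response_bytes: int) -> float:
    return response_bytes / request_bytes

# Hypothetical example: a small spoofed read request that triggers repeated data
# and error retransmissions from a misconfigured TFTP server.
request = 20                        # bytes in the spoofed request (assumed)
responses = [516] * 2 + [60] * 3    # assumed retransmitted blocks and error packets
print(round(amplification_factor(request, sum(responses)), 1))
```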
Abstract:
It is anticipated that constrained devices in the Internet of Things (IoT) will often operate in groups to achieve collective monitoring or management tasks. For sensitive and mission-critical sensing tasks, securing multicast applications is therefore highly desirable. To secure group communications, several group key management protocols have been introduced. However, the majority of the proposed solutions are not adapted to the IoT and its strong processing, storage, and energy constraints. In this context, we introduce a novel decentralized and batch-based group key management protocol to secure multicast communications. Our protocol is simple; it reduces the rekeying overhead triggered by membership changes in dynamic and mobile groups and guarantees both backward and forward secrecy. To assess our protocol, we conduct a detailed analysis of its communication and storage costs. This analysis is validated through simulation to highlight the energy gains. The results show that our protocol outperforms its peers with respect to keying overhead and the mobility of members.
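As a hedged illustration of why batching reduces rekeying overhead (this is not the proposed protocol), the toy simulation below compares key-update message counts for per-event versus batch rekeying on an invented membership trace:

```python
# Compare rekeying message counts: rekey on every join/leave versus once per batch.
import random

random.seed(1)
group_size = 50
events = [random.choice(["join", "leave"]) for _ in range(200)]   # membership changes
BATCH = 10                                                        # rekey interval

def per_event_cost(events, size):
    msgs = 0
    for e in events:
        size += 1 if e == "join" else -1
        msgs += size            # one key-update message per current member per event
    return msgs

def batch_cost(events, size, batch):
    msgs = 0
    for i, e in enumerate(events, 1):
        size += 1 if e == "join" else -1
        if i % batch == 0:      # a single rekey covers the whole batch of changes
            msgs += size
    return msgs

print("per-event rekeying messages:", per_event_cost(events, group_size))
print("batch rekeying messages:   ", batch_cost(events, group_size, BATCH))
```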