915 results for Computer software -- Development -- Congresses


Relevância:

100.00%

Publicador:

Resumo:

Chaotic behaviour arising in the propulsion motor system of a deep-sea robot directly affects the robot's stability, reliability and safety. Adaptive control is applied to suppress this chaotic behaviour, and the feasibility and effectiveness of the method are demonstrated. A chaos controller that is easy to implement in engineering practice is designed and constructed for chaos control of the deep-sea robot's propulsion motor system. Simulation experiments show that, under the adaptive controller, the propulsion motor system quickly leaves the chaotic state and settles into sustained stable operation, with a pronounced control effect. This work provides a control strategy and contingency scheme for chaotic behaviour that may arise in deep-sea robot propulsion motor systems, supports the development of embedded chaos-control software, and helps ensure the stable, reliable and safe operation of deep-sea robots, giving it practical value.
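A minimal sketch of the adaptive chaos-control idea. The paper's motor model and controller are not given here, so the Lorenz system stands in as a generic chaotic system, with full-state adaptive feedback u = -k·v and gain law k' = γ‖v‖²; all names and parameter values are illustrative, not the paper's:

```python
import numpy as np

def lorenz(v, sigma=10.0, r=28.0, b=8.0/3.0):
    # Classic Lorenz system, used here only as a stand-in chaotic system.
    x, y, z = v
    return np.array([sigma*(y - x), r*x - y - x*z, x*y - b*z])

def simulate(T=30.0, dt=1e-3, gamma=1.0, control_on=5.0):
    """Adaptive feedback u = -k*v with gain law k' = gamma*||v||^2.
    Once k exceeds (sigma + r)/2, a quadratic Lyapunov argument shows the
    origin is globally attracting, so the trajectory leaves the chaotic
    attractor and settles at rest; k grows automatically until that happens."""
    v = np.array([1.0, 1.0, 1.0])
    k = 0.0
    for step in range(int(T/dt)):
        t = step*dt
        u = -k*v if t >= control_on else np.zeros(3)
        if t >= control_on:
            k += gamma*float(v @ v)*dt   # adaptive gain update
        v = v + (lorenz(v) + u)*dt       # explicit Euler step
    return v, k
```

After the controller switches on at t = 5, the state decays toward the origin and the gain k stops growing, mirroring the abstract's "quickly leaves the chaotic state and settles into sustained stable operation".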

Relevância:

100.00%

Publicador:

Resumo:

Seismic While Drilling (SWD) is a new wellbore seismic technique. It uses the vibrations produced by a drill-bit while drilling as a downhole seismic energy source. The continuous signal generated by the drill-bit is recorded by a pilot sensor attached to the top of the drill-string. Seismic receivers positioned near the earth's surface record both the direct waves and the waves reflected from geologic formations. The pilot signal is cross-correlated with the receiver signals to compute the travel times of the arrivals (direct and reflected) and to attenuate incoherent noise. No downhole instrumentation is required to acquire the data, and recording does not interfere with the drilling process. These characteristics provide a method by which borehole seismic data can be acquired, processed, and interpreted while drilling. As a measure-while-drilling technique, SWD provides real-time seismic data for use at the well site. This can aid the engineer or driller by indicating the position of the drill-bit and by providing a look at reflecting horizons the drill-bit has yet to encounter. Furthermore, the ease with which surface receivers can be deployed makes multi-offset VSP economically feasible. This paper first studies the drill-bit wavefield theoretically, including the mode of interaction between the drill-bit and the formation below it; modern signal-processing techniques are applied to the seismic data, and the seismic body-wave radiation pattern of a working roller-cone drill-bit is characterized by theoretical modeling. A systematic analysis of the drill-bit wave is then carried out: the time-distance equation of seismic wave travel is established, the SWD process is simulated with computer software, and adaptive modeling of SWD is performed. To help spread this technique, trial SWD modeling was carried out during actual drilling.
The paper sketches out the procedure for the trial SWD modeling during drilling, the instruments involved and their functions, and the results of the trial. Subsurface conditions ahead of the drill-bit can be predicted. The drill-string velocity was obtained by autocorrelation of the pilot signal, the drill-string multiples in the pilot signal were removed by reference deconvolution, and the cross-correlation process enhanced the signal-to-noise power ratio and helped discriminate lithologies. Finally, SWD provides real-time seismic data for use at the well site, supporting well-trajectory control and helping exploratory wells find and preserve reservoirs. Interval velocities were computed from the travel times; the results of the interval-velocity determination reflect the pore pressure in the subsurface units ahead of the drill-bit, and the presence of fractures in subsurface formations was detected with shear waves.
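The pilot/receiver cross-correlation step can be sketched as follows. The signals, sampling rate and delay below are synthetic stand-ins, not field data:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 1000.0                        # sampling rate in Hz (illustrative)
pilot = rng.standard_normal(2000)  # continuous drill-bit (pilot) signal

true_delay = 137                   # travel time, in samples, to a receiver
receiver = np.zeros(3000)
receiver[true_delay:true_delay + len(pilot)] += pilot
receiver += 0.5*rng.standard_normal(len(receiver))   # incoherent noise

# Slide the pilot along the receiver record; the correlation peak marks
# the arrival, and the averaging suppresses the incoherent noise.
xcorr = np.correlate(receiver, pilot, mode="valid")
est_delay = int(np.argmax(xcorr))
travel_time = est_delay / fs       # seconds
```

Because the drill-bit source is continuous rather than impulsive, it is this correlation step that collapses the long pilot record into an impulsive-looking arrival from which travel times can be read.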

Relevância:

100.00%

Publicador:

Resumo:

The modeling of petroleum flow paths (petroleum charging) and the details of the corresponding software development are presented in this paper, covering the principle of petroleum charging, a quantitative method, and practical modeling in two oil fields. The modeling of petroleum flow paths builds on the results of basin modeling, on the principle that petroleum migrates along the shortest path from source to trap, on Petroleum System Dynamics (Prof. Wu Chonglong, 1998), on the concept of Petroleum Migration and Dynamic Accumulation (Zhou Donyan, Li Honhui, 2002), etc. The simulation combines all basin parameters and takes into account the flow potential and the non-uniformity of the source and porous layers. It extends basin modeling but does not belong to it. It is a powerful simulation tool for petroleum systems: it can express quantitatively every kind of geological element of a petroleum basin and can dynamically reconstruct geological processes with 3D graphics. As a result, we find that petroleum flow exhibits the phenomenon of main pathways, without invoking special theories such as deflection flow in fractures (Tian Kaiming, 1989, 1994; Zhang Fawang, Hou Xingwei, 1998) or flow potential (England, 1987). The contour map of petroleum flow quantity shows clearly where the divides (dividing slots) lie, which convergence regions form the main flow paths of petroleum, and where the favorable petroleum plays are. Promising traps can be identified and evaluated when sufficient structural information is available, including the entrapment extent, spill point, area, oil-column thickness, etc. Making full use of the results of basin modeling, this new tool shows clearly the critical moment and scheme of petroleum generation and expulsion. It is a powerful analysis tool for geologists.
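The shortest-path migration principle can be illustrated with a standard Dijkstra search over a grid of hypothetical migration "resistance" values; the actual simulator's cost field and data structures are not described in the abstract, so everything here is a generic stand-in:

```python
import heapq

def shortest_migration_path(cost, src, trap):
    """Dijkstra over a grid of migration-resistance values (a hypothetical
    stand-in for the flow-potential field): returns the minimal cumulative
    resistance of a path from the source cell to the trap cell."""
    rows, cols = len(cost), len(cost[0])
    dist = {src: cost[src[0]][src[1]]}
    pq = [(dist[src], src)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == trap:
            return d
        if d > dist.get((r, c), float("inf")):
            continue                     # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return float("inf")
```

On a resistance field, the low-cost corridors found this way are exactly the "main pathways" the abstract describes: most cells never lie on a minimal path.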

Relevância:

100.00%

Publicador:

Resumo:

The study of 3D visualization technology for engineering geology and its application to engineering is a cross-disciplinary subject involving the geosciences, computer science, software and information technology. As an important part of the secondary theme of the National Basic Research Program of China (973 Program), "Study of Multi-Scale Structure and Occurrence Environment of Complicated Geological Engineering Mass" (No. 2002CB412701), the dissertation studies key problems of 3D geological modeling: the integrated application of multi-format geological data, effective modeling methods for complex, approximately layered geological masses, and applications of 3D virtual-reality information management technology. The main research findings are as follows. An integrated application method for multi-format geological data is proposed, which solves the integrated use of drill holes, engineering geology plan drawings, sectional drawings, cutting drawings and exploratory trench sketches; its application can provide as much fundamental data as possible for 3D geological modeling. A 3D surface construction method combining Laplace-interpolated points with the original points is proposed, eliminating the deformation of the 3D model and the crossing error between the upper and lower surfaces of the model that result from lack of data when constructing a laminated stratum. A 3D modeling method for approximately layered geological masses is proposed, which solves the problems that general modeling methods based on sections, or on points and faces, meet when constructing terrain and concordant strata. The 3D geological model of dam site VII of the Xiangjiaba hydropower station has been constructed.
The applications of the 3D geological model to the auto-plotting of sectional drawings and to the conversion of numerical analysis models are also discussed. A 3D virtual-reality integrated information platform is developed; its most important characteristic is that it is a software platform offering both 3D virtual-reality fly-through and multi-format data management. The platform can therefore load different 3D models to satisfy different engineering demands. The relics of Aigong Cave of the Longyou Stone Caves are digitally restored, and the reinforcement plans of caves 1# and 2# in Phoenix Hill are also presented; this intuitive presentation gives decision makers and designers a very good working environment. The basic framework and specific functions of a 3D geological information system are proposed. The main research findings of the dissertation have been successfully applied to several important engineering projects, including the Xiangjiaba hydropower station, a military airport and the Longyou Stone Caves.
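The Laplace-interpolation idea for filling a surface between sparse data points can be sketched as Jacobi relaxation of the discrete Laplace equation, with known (borehole/section) nodes held fixed. This is a generic sketch of the technique, not the dissertation's exact scheme; here the grid boundary is assumed known:

```python
import numpy as np

def laplace_fill(z0, known_mask, iters=3000):
    """Fill unknown surface elevations by relaxing each unknown node toward
    the average of its four neighbours (discrete Laplace equation), while
    nodes marked in known_mask keep their measured values."""
    z = z0.astype(float).copy()
    for _ in range(iters):
        avg = 0.25*(np.roll(z, 1, 0) + np.roll(z, -1, 0)
                    + np.roll(z, 1, 1) + np.roll(z, -1, 1))
        z = np.where(known_mask, z, avg)   # known nodes stay fixed
    return z
```

Because the interpolated surface is harmonic between the fixed points, it has no spurious oscillations, which is the property that prevents the upper and lower model surfaces from crossing where data are sparse.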

Relevância:

100.00%

Publicador:

Resumo:

Most beef cattle producers do not adequately track the income and expenses of their operation. As the profitability of beef cattle ranching has been squeezed by falling product prices and rising production costs, this task has become crucial for those who want to stay in business. To meet this demand, Embrapa Gado de Corte developed Controlpec 1.0 in the Excel environment, a simple tool that producers can use easily. Based on the farm's financial transactions, it generates reports that consolidate the expenses, income and economic margins of the operation, according to a chart of accounts defined by the user. The results are presented for each month of the year and for the year as a whole.
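The kind of consolidation Controlpec performs can be sketched in a few lines; the record fields ("month", "amount") are illustrative, not the tool's actual layout:

```python
from collections import defaultdict

def monthly_summary(transactions):
    """Consolidate cash-flow records into monthly income, expenses and
    margin, in the spirit of Controlpec's reports. Positive amounts are
    income, negative amounts are expenses (hypothetical convention)."""
    summary = defaultdict(lambda: {"income": 0.0, "expense": 0.0})
    for t in transactions:
        key = t["month"]                      # e.g. "2024-01"
        if t["amount"] >= 0:
            summary[key]["income"] += t["amount"]
        else:
            summary[key]["expense"] += -t["amount"]
    for m in summary.values():
        m["margin"] = m["income"] - m["expense"]
    return dict(summary)
```

A chart-of-accounts dimension could be added by keying on (month, account) instead of month alone.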

Relevância:

100.00%

Publicador:

Resumo:

The 1989 AI Lab Winter Olympics will take a slightly different twist from previous Olympiads. Although there will still be a dozen or so athletic competitions, the annual talent show finale will now be a display not of human talent, but of robot talent. Spurred on by the question, "Why aren't there more robots running around the AI Lab?", Olympic Robot Building is an attempt to teach everyone how to build a robot and get them started. Robot kits will be given out the last week of classes before the Christmas break and teams have until the Robot Talent Show, January 27th, to build a machine that intelligently connects perception to action. There is no constraint on what can be built; participants are free to pick their own problems and solution implementations. As Olympic Robot Building is purposefully a talent show, there is no particular obstacle course to be traversed or specific feat to be demonstrated. The hope is that this format will promote creativity, freedom and imagination. This manual provides a guide to overcoming all the practical problems in building things. What follows are tutorials on the components supplied in the kits: a microprocessor circuit "brain", a variety of sensors and motors, a mechanical building block system, a complete software development environment, some example robots and a few tips on debugging and prototyping. Parts given out in the kits can be used, ignored or supplemented, as the kits are designed primarily to overcome the inertia of getting started. If all goes well, then come February, there should be all kinds of new members running around the AI Lab!

Relevância:

100.00%

Publicador:

Resumo:

Timing data is infrequently reported in aphasiological literature and time taken is only a minor factor, where it is considered at all, in existing aphasia assessments. This is not surprising because reaction times are difficult to obtain manually, but it is a pity, because speed data should be indispensable in assessing the severity of language processing disorders and in evaluating the effects of treatment. This paper argues that reporting accuracy data without discussing speed of performance gives an incomplete and potentially misleading picture of any cognitive function. Moreover, in deciding how to treat, when to continue treatment and when to cease therapy, clinicians should have regard to both parameters: Speed and accuracy of performance. Crerar, Ellis and Dean (1996) reported a study in which the written sentence comprehension of 14 long-term agrammatic subjects was assessed and treated using a computer-based microworld. Some statistically significant and durable treatment effects were obtained after a short amount of focused therapy. Only accuracy data were reported in that (already long) paper, and interestingly, although it has been a widely read study, neither referees nor subsequent readers seemed to miss "the other side of the coin": How these participants compared with controls for their speed of processing and what effect treatment had on speed. This paper considers both aspects of the data and presents a tentative way of combining treatment effects on both accuracy and speed of performance in a single indicator. Looking at rehabilitation this way gives us a rather different perspective on which individuals benefited most from the intervention. It also demonstrates that while some subjects are capable of utilising metalinguistic skills to achieve normal accuracy scores even many years post-stroke, there is little prospect of reducing the time taken to within the normal range. 
Without considering speed of processing, the extent of this residual functional impairment can be overlooked.
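One standard way to fold speed and accuracy into a single indicator is the inverse efficiency score (mean correct reaction time divided by proportion correct); the paper's own tentative composite may well differ, so this is offered only as an example of the idea:

```python
def inverse_efficiency(mean_rt_ms, accuracy):
    """Inverse efficiency score: mean correct reaction time (ms) divided by
    proportion correct.  Lower is better; a subject who is accurate but
    slow, or fast but inaccurate, both score worse than one who is both."""
    if not 0 < accuracy <= 1:
        raise ValueError("accuracy must be in (0, 1]")
    return mean_rt_ms / accuracy
```

For example, a subject scoring 80% correct at 800 ms has the same inverse efficiency as one scoring 100% correct at 1000 ms, which captures the paper's point that normal accuracy achieved via slow metalinguistic strategies still reflects residual impairment.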

Relevância:

100.00%

Publicador:

Resumo:

The appropriation of digital artefacts involves their use, which has changed, evolved or developed beyond their original design. Thus, to understand appropriation, we must understand use. We define use as the active, purposive exploitation of the affordances offered by the technology, and from this perspective appropriation emerges as a natural consequence of this enactive use. Enaction tells us that perception is an active process: it is something we do, not something that happens to us. On this reading, use becomes the active exploitation of the affordances offered to us by the artefact, system or service. In turn, we define appropriation as engagement with these actively disclosed affordances, disclosed as a consequence not just of seeing but of seeing as. We present a small case study that highlights instances of perception as an actively engaged skill. We conclude that appropriation is a simple consequence of enactive perception.

Relevância:

100.00%

Publicador:

Resumo:

Web threats are becoming a major issue for both governments and companies. Generally, web threats increased as much as 600% during last year (WebSense, 2013). This appears to be a significant issue, since many major businesses seem to provide these services. Denial of Service (DoS) attacks are one of the most significant web threats, and generally their aim is to waste the resources of the target machine (Mirkovic & Reiher, 2004). Distributed Denial of Service (DDoS) attacks are typically executed from many sources and can result in large traffic flows. During last year 11% of DDoS attacks were over 60 Gbps (Prolexic, 2013a). DDoS attacks are usually performed from large botnets, which are networks of remotely controlled computers. There is an increasing effort by governments and companies to shut down the botnets (Dittrich, 2012), which has led the attackers to look for alternative DDoS attack methods. One technique to which attackers are returning is the DDoS amplification attack. Amplification attacks use intermediate devices called amplifiers to amplify the attacker's traffic. This work outlines an evaluation tool and evaluates an amplification attack based on the Trivial File Transfer Protocol (TFTP). This attack could have an amplification factor of approximately 60, which rates highly alongside other researched amplification attacks. This could be a substantial issue globally, given that the protocol is used by approximately 599,600 publicly open TFTP servers. Mitigation methods for this threat have also been considered and a variety of countermeasures are proposed. The effects of the attack on both amplifier and target were analysed using the proposed metrics. While it has previously been reported that abuse of TFTP would be possible (Schultz, 2013), this paper provides a complete methodology for the setup of the attack, and its verification.
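The amplification factor the paper rates can be expressed as response bytes delivered to the victim per request byte sent to the amplifier; the byte counts below are illustrative stand-ins, not the paper's measurements:

```python
def amplification_factor(request_bytes, response_bytes):
    """Bandwidth amplification factor: bytes the amplifier sends toward the
    (spoofed) victim per byte the attacker sends to the amplifier."""
    return response_bytes / request_bytes

# Illustrative only: a small TFTP read request that elicits roughly
# sixty times its size in data-block traffic toward the victim.
example = amplification_factor(20, 1200)   # 60.0
```

A factor of about 60 means each megabit per second the attacker can source becomes roughly sixty megabits per second at the target, which is why a pool of ~599,600 open servers is a serious concern.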

Relevância:

100.00%

Publicador:

Resumo:

It is anticipated that constrained devices in the Internet of Things (IoT) will often operate in groups to achieve collective monitoring or management tasks. For sensitive and mission-critical sensing tasks, securing multicast applications is therefore highly desirable. To secure group communications, several group key management protocols have been introduced. However, the majority of the proposed solutions are not adapted to the IoT and its strong processing, storage, and energy constraints. In this context, we introduce a novel decentralized and batch-based group key management protocol to secure multicast communications. Our protocol is simple; it reduces the rekeying overhead triggered by membership changes in dynamic and mobile groups and guarantees both backward and forward secrecy. To assess our protocol, we conduct a detailed analysis of its communication and storage costs. This analysis is validated through simulation to highlight the energy gains. The obtained results show that our protocol outperforms its peers with respect to keying overhead and support for member mobility.
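The saving from batch rekeying can be illustrated with a deliberately simple cost model: a flat group in which every membership change forces a fresh group key to be delivered to every member. The protocol's actual decentralized scheme is more elaborate; this only shows why batching membership changes cuts the rekeying overhead:

```python
def rekey_messages(group_size, events, batch=False):
    """Hypothetical flat-group cost model: each membership change forces a
    new group key sent to each member (preserving backward/forward secrecy).
    Batching a window of 'events' changes collapses them into one round."""
    if batch:
        return group_size            # one key-distribution round in total
    return group_size * events       # one round per membership change
```

With 100 members and 5 membership changes in a window, immediate rekeying costs 500 key-delivery messages versus 100 for a single batched round.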

Relevância:

100.00%

Publicador:

Resumo:

This document describes two sets of benchmark problem instances for the one-dimensional bin packing problem. The problem instances are supplied as compressed (zipped) SQLite database files.
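As an illustration of the problem these instances target, here is the classic first-fit-decreasing heuristic for one-dimensional bin packing (reading the supplied SQLite files is not shown, since their schema is not described here):

```python
def first_fit_decreasing(items, capacity):
    """First-fit-decreasing heuristic: place each item, largest first, into
    the first bin with enough remaining capacity, opening a new bin only
    when none fits.  Returns the list of bins (lists of item sizes)."""
    bins = []
    for item in sorted(items, reverse=True):
        for b in bins:
            if sum(b) + item <= capacity:
                b.append(item)
                break
        else:
            bins.append([item])
    return bins
```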

Relevância:

100.00%

Publicador:

Resumo:

This document describes two sets of benchmark problem instances for the job shop scheduling problem. Each set of instances is supplied as a compressed (zipped) archive containing a single CSV file for each problem instance, using the format described in http://rollproject.org/jssp/jsspGen.pdf
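As an illustration of the problem these instances encode, here is a minimal makespan computation for a naive job-by-job dispatch; the CSV layout itself is defined in the linked PDF and not assumed here:

```python
def dispatch_makespan(jobs, num_machines):
    """Makespan of a naive schedule for a job shop instance: each job is a
    list of (machine, duration) operations that must run in order, and we
    dispatch jobs one after another in list order.  This is a feasibility
    baseline, not a solver -- interleaving jobs usually does much better."""
    machine_free = [0.0]*num_machines   # time each machine becomes free
    job_free = [0.0]*len(jobs)          # time each job's last op finished
    for j, ops in enumerate(jobs):
        for machine, dur in ops:
            start = max(machine_free[machine], job_free[j])
            machine_free[machine] = start + dur
            job_free[j] = start + dur
    return max(job_free)
```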

Relevância:

100.00%

Publicador:

Resumo:

This document describes a large set of benchmark problem instances for the Rich Vehicle Routing Problem. All files are supplied as a single compressed (zipped) archive containing the instances in XML format, an object-oriented model supplied in XSD format, documentation, and an XML parser written in Java to ease use.

Relevância:

100.00%

Publicador:

Resumo:

Jasimuddin, Sajjad, 'Exploring knowledge transfer mechanisms: The case of a UK-based group within a high-tech global corporation', International Journal of Information Management (2007) 27(4) pp.294-300 RAE2008

Relevância:

100.00%

Publicador:

Resumo:

A mathematical model to simulate the population dynamics and productivity of macroalgae is described. The model calculates the biomass variation of a population divided into size classes. Biomass variation in each class is estimated from the mass balance of carbon fixation, carbon release and demographic processes such as mortality and frond breakage. The transitions between the different classes are calculated in biomass and density units as a function of algal growth. Growth is computed from biomass variations using an allometric relationship between weight and length. Gross and net primary productivity is calculated from biomass production and losses over the period of simulation. The model allows the simulation of different harvesting strategies for commercially important species: the cutting size and harvesting period may be changed in order to optimise the calculated yields. The model was applied to the agarophyte Gelidium sesquipedale (Clem.) Born. et Thur. This species was chosen because of its economic importance as the main raw material for the agar industry. Net primary productivity calculated with the model and estimated from biomass variations over a yearly period gave similar results. The results obtained suggest that biomass dynamics and productivity are more sensitive to the light extinction coefficient than to the model's initial biomass conditions. Model results also suggest that biomass losses due to respiration and exudation are comparable to those resulting from mortality and frond breakage. During winter, a significant part of the simulated population has a negative net productivity. The model results demonstrate the importance of accounting for the seasonal variability of the parameters in the productivity-light relationships. The model was implemented following an object-oriented programming approach, a methodology that allows fast adaptation of the model to other species without major software development.
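The size-class bookkeeping described above can be sketched as a single time-step update; the parameter names and values are illustrative stand-ins, not the model's calibrated rates:

```python
def step(biomass, growth_rate, mortality, transfer_frac):
    """One time-step of a size-class biomass model (generic sketch):
    each class gains net production, loses mortality, and passes a fraction
    of its biomass to the next class as fronds grow into the larger size.
    The largest class retains everything (nowhere further to grow)."""
    n = len(biomass)
    new = [0.0]*n
    for i in range(n):
        b = biomass[i]*(1 + growth_rate - mortality)    # net balance
        if i < n - 1:
            new[i] += b*(1 - transfer_frac)             # stays in class
            new[i + 1] += b*transfer_frac               # graduates upward
        else:
            new[i] += b
    return new
```

In the full model the transfer fraction would itself come from the allometric weight-length relationship, and harvesting would remove biomass above the cutting size.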