962 results for internet computing
Abstract:
This paper analyzes the main components and basic characteristics of Internet time delay and presents the results and conclusions of network delay measurement experiments. It surveys typical existing teleoperation control methods and discusses their applicability to Internet-based robot control. It then gives an information flow diagram for an Internet-based robot teleoperation system, and designs an Internet-based control system and a time-delay compensation method for a mobile robot.
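A generic illustration of one common compensation idea (a hypothetical sketch in Python; the abstract does not specify the paper's method): dead-reckon the last received robot state forward by the measured one-way delay, so the operator's display approximates the robot's present state rather than a stale snapshot.

    import math

    def predict_state(last_state, delay_s):
        """Dead-reckon a unicycle-model robot state forward by delay_s seconds.

        last_state: (x, y, heading, v, omega) as last received over the network.
        """
        x, y, heading, v, omega = last_state
        x += v * math.cos(heading) * delay_s
        y += v * math.sin(heading) * delay_s
        heading += omega * delay_s
        return (x, y, heading, v, omega)

    # delay_s would come from timestamped packets, e.g. half a measured RTT.
    now_estimate = predict_state((0.0, 0.0, 0.0, 0.2, 0.1), delay_s=0.35)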
Abstract:
This paper presents an experimental platform for an Internet-based robot teleoperation system, focusing on the analysis and resolution of the data communication problems encountered in the platform's design and implementation.
Abstract:
This paper briefly describes the Lotus Notes software and its database features, and introduces approaches and methods for building Notes databases and providing full-text retrieval in Internet/Intranet environments.
Abstract:
With major advances in seismic source theory, computing technology, and survey instruments, we can model and reconstruct earthquake rupture processes more realistically, and thereby gain a clearer understanding of earthquake source properties and the laws of tectonic activity. This thesis presents the following research in this domain. Based on the generalized ray method, expressions are obtained for the displacement on the surface of a half-space due to an arbitrarily oriented shear and tensile dislocation. Kinematically, fault-normal motion is equivalent to tensile faulting, and there is some evidence that such motion occurs in many earthquakes. Expressions for the static displacements on the surface of a layered half-space due to a static point moment tensor source are derived using the generalized reflection and transmission coefficient matrix method. The validity and precision of the new method are demonstrated by the consistency of our results with the analytical solution given by Okada's code for the same point source and homogeneous half-space model. The vertical ground displacement computed from the moment tensor solution of the Lancang-Gengma earthquake differs considerably from that computed from its double-couple component alone. The effect of a soft layer at the top of the homogeneous half-space on a shallow normal-faulting earthquake is also analyzed. Our results show that more seismic information can be obtained by using a seismic moment tensor source and a layered half-space model. The rupture process of the 1999 Chi-Chi, Taiwan, earthquake is investigated using co-seismic GPS surface-displacement observations and far-field P-wave records. Based on tectonic analysis and the distribution of aftershocks, we introduce three bending fault segments into our model. Both elastic half-space models and layered-earth models are used to invert for the distribution of co-seismic slip along the Chi-Chi earthquake rupture. The results indicate that a pure shear-slip model cannot fit the horizontal and vertical co-seismic displacements simultaneously unless fault-normal motion (a tensile component) is added to the inversion. The Chi-Chi earthquake rupture process was then obtained by joint inversion of the seismograms and GPS observations. The fault-normal motions determined by the inversion concentrate on the shallow northern bending fault from Fengyuan to Shuangji, where the surface ruptures are more complex and flexural-slip folding structures are better developed than in other portions of the rupture zone. To understand the perturbation of surface displacements caused by complex near-surface structures, we carried out a numerical test, synthesizing and inverting the surface displacements for a pop-up structure composed of a main thrust and a back thrust. Our result indicates that the pop-up structure, the typical shallow complex rupture in the northern bending fault zone from Fengyuan to Shuangji, is modeled better by a thrust fault with an added negative tensile component than by a simple thrust fault. We interpret the negative tensile distributions concentrated on the shallow northern bending fault from Fengyuan to Shuangji as the combined effect of complexities in the properties and geometry of the rupture. The rupture process also reveals greater spatial and temporal complexity from Fengyuan to Shuangji.
From three-component teleseismic records, the S-wave velocity structure beneath 59 teleseismic stations in Taiwan is obtained using the transfer function method and simulated annealing (SA) techniques. The integrated result, a 3-D crustal structure of Taiwan, reveals that the thickest part of the crust is located beneath the western Central Range. This conclusion is consistent with the result from the Bouguer gravity anomaly. The orogenic evolution of Taiwan is young: the developing root of the Central Range is not in static (isostatic) balance, and the crust of Taiwan remains in a state of dynamic equilibrium. The rupture process of the 24 February 2003 Jiashi, Xinjiang, earthquake was estimated with a finite fault model using far-field broadband P-wave records from CDSN and IRIS. The results indicate that the earthquake occurred on a north-dipping thrust fault with some left-lateral strike-slip. The focal mechanism of this earthquake differs from those of the earthquakes of 1997 and 1998, but is similar to that of the 1996 Artux, Xinjiang, earthquake. We interpret the thrust faulting as the result of the Tarim Basin pushing northward and the Tianshan orogeny. Finally, a future research subject is briefly outlined: building an Internet-based real-time distributed system for determining the rupture process of large earthquakes.
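For reference, the standard point-source representation (as in Aki and Richards) makes the shear/tensile distinction in the abstract above explicit. For a dislocation with slip vector $u$ on a fault of area $A$ with unit normal $n$, in an isotropic medium with Lamé parameters $\lambda$ and $\mu$:

$$M_{ij} = A\left[\lambda\,(u_k n_k)\,\delta_{ij} + \mu\,(u_i n_j + u_j n_i)\right],$$

so pure shear slip ($u \cdot n = 0$) gives a double couple, while pure tensile opening ($u = \Delta u\, n$) gives, in fault-aligned coordinates, $M = A\,\Delta u\,\operatorname{diag}(\lambda,\ \lambda,\ \lambda + 2\mu)$.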
Abstract:
Since 1991, Embrapa Monitoramento por Satélite has adapted to this new technological scenario and has been upgrading, whenever possible, its physical and human resources to this reality. Precisely because it has been following and applying this technology since its beginning, the Unit has a history of events and of development of infrastructure, computer languages, and methodologies that reflects exactly the speed with which the Internet has been developing and the breadth of applications it has been offering. This history is presented in this work in order to recover and record the experience of the Center's team in this specific area of human knowledge, assessing the transformations and adaptations of technologies and languages and their importance as an instrument of direct communication with society.
Abstract:
Combining numerical techniques with ideas from symbolic computation and with methods incorporating knowledge of science and mathematics leads to a new category of intelligent computational tools for scientists and engineers. These tools autonomously prepare simulation experiments from high-level specifications of physical models. For computationally intensive experiments, they automatically design special-purpose numerical engines optimized to perform the necessary computations. They actively monitor numerical and physical experiments. They interpret experimental data and formulate numerical results in qualitative terms. They enable their human users to control computational experiments in terms of high-level behavioral descriptions.
Abstract:
The future of the software industry is today being shaped in the courtroom. Most discussions of intellectual property to date, however, have been framed as debates about how the existing law --- promulgated long before the computer revolution --- should be applied to software. This memo is a transcript of a panel discussion on what forms of legal protection should apply to software to best serve both the industry and society in general. After addressing that question, we can consider what laws would bring this about.
Abstract:
We describe the key role played by partial evaluation in the Supercomputer Toolkit, a parallel computing system for scientific applications that effectively exploits the vast amount of parallelism exposed by partial evaluation. The Supercomputer Toolkit parallel processor and its associated partial evaluation-based compiler have been used extensively by scientists at M.I.T., and have made possible recent results in astrophysics showing that the motion of the planets in our solar system is chaotically unstable.
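A minimal sketch of the underlying idea in Python (purely illustrative; not the Toolkit's actual compiler): specializing a loop-based evaluator to known inputs leaves only straight-line arithmetic, the form whose data-flow parallelism a scheduling compiler can exploit.

    def poly_eval(coeffs, x):
        """General Horner-rule evaluator: loops over coefficients at run time."""
        acc = 0.0
        for c in coeffs:
            acc = acc * x + c
        return acc

    def specialize(coeffs):
        """Partially evaluate poly_eval with respect to known coeffs:
        emit straight-line code with the loop and coefficient lookups gone."""
        body = "0.0"
        for c in coeffs:
            body = f"({body} * x + {c!r})"
        code = f"def poly_x(x):\n    return {body}\n"
        namespace = {}
        exec(code, namespace)          # compile the residual program
        return namespace["poly_x"]

    p = specialize([2.0, -3.0, 0.5])   # 2x^2 - 3x + 0.5
    assert p(1.0) == poly_eval([2.0, -3.0, 0.5], 1.0)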
Abstract:
We present techniques for computing upper and lower bounds on the likelihoods of partial instantiations of variables in sigmoid and noisy-OR networks. The bounds determine confidence intervals for the desired likelihoods and become useful when the size of the network (or clique size) precludes exact computations. We illustrate the tightness of the obtained bounds by numerical experiments.
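As a minimal illustration of interval bounds in this setting (a crude monotonicity argument in Python, not the paper's variational technique): in a noisy-OR conditional distribution, P(X = 1) only grows as more parents become active, so clamping all unobserved parents off or on brackets the conditional likelihood.

    def noisy_or_p1(p, leak, active):
        """P(X=1) under noisy-OR: P(X=0) = (1-leak) * prod(1 - p[i]) over active parents."""
        p0 = 1.0 - leak
        for i in active:
            p0 *= 1.0 - p[i]
        return 1.0 - p0

    def bounds_p1(p, leak, observed_on, unobserved):
        """Interval for P(X=1) when some parents are unobserved."""
        lower = noisy_or_p1(p, leak, observed_on)               # unobserved all off
        upper = noisy_or_p1(p, leak, observed_on | unobserved)  # unobserved all on
        return lower, upper

    p = {0: 0.8, 1: 0.6, 2: 0.3}
    lo, hi = bounds_p1(p, leak=0.05, observed_on={0}, unobserved={1, 2})
    print(f"P(X=1) is in [{lo:.3f}, {hi:.3f}]")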
Abstract:
Location is a primary cue in many context-aware computing systems, and is often represented as a global coordinate, room number, or Euclidean distance from various landmarks. A user's concept of location, however, is often defined in terms of regions in which common activities occur. We show how to partition a space into such regions based on patterns of observed user location and motion. These regions, which we call activity zones, represent regions of similar user activity, and can be used to trigger application actions, retrieve information based on previous context, and present information to users. We suggest that context-aware applications can benefit from a location representation learned from observing users. We describe an implementation of our system and present two example applications whose behavior is controlled by users' entry, exit, and presence in the zones.
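A hypothetical sketch of the zone-learning idea (the paper's actual algorithm may differ): accumulate dwell time on a grid of cells from the observed location samples, keep high-dwell cells, and merge adjacent cells into zones by flood fill.

    from collections import defaultdict

    def activity_zones(samples, cell=1.0, min_dwell=5):
        """samples: iterable of (x, y) observations at a fixed sampling rate."""
        dwell = defaultdict(int)
        for x, y in samples:
            dwell[(int(x // cell), int(y // cell))] += 1

        hot = {c for c, t in dwell.items() if t >= min_dwell}
        zones, seen = [], set()
        for start in hot:
            if start in seen:
                continue
            zone, stack = [], [start]      # flood-fill one 4-connected region
            while stack:
                c = stack.pop()
                if c in seen or c not in hot:
                    continue
                seen.add(c)
                zone.append(c)
                i, j = c
                stack += [(i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)]
            zones.append(zone)
        return zones

    # Entry/exit events can then be raised whenever a user's current cell
    # changes zone membership.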
Abstract:
The dream of pervasive computing is slowly becoming a reality. A number of projects around the world are constantly contributing ideas and solutions that are bound to change the way we interact with our environments and with one another. An essential component of the future is a software infrastructure that is capable of supporting interactions on scales ranging from a single physical space to intercontinental collaborations. Such infrastructure must help applications adapt to very diverse environments and must protect people's privacy and respect their personal preferences. In this paper we indicate a number of limitations present in the software infrastructures proposed so far (including our previous work). We then describe the framework for building an infrastructure that satisfies the abovementioned criteria. This framework hinges on the concepts of delegation, arbitration and high-level service discovery. Components of our own implementation of such an infrastructure are presented.
Abstract:
I have invented "Internet Fish," a novel class of resource-discovery tools designed to help users extract useful information from the Internet. Internet Fish (IFish) are semi-autonomous, persistent information brokers; users deploy individual IFish to gather and refine information related to a particular topic. An IFish will initiate research, continue to discover new sources of information, and keep tabs on new developments in that topic. As part of the information-gathering process the user interacts with his IFish to find out what it has learned, answer questions it has posed, and make suggestions for guidance. Internet Fish differ from other Internet resource-discovery systems in that they are persistent, personal, and dynamic. As part of the information-gathering process IFish conduct extended, long-term conversations with users as they explore. They incorporate deep structural knowledge of the organization and services of the net, and are also capable of on-the-fly reconfiguration, modification, and expansion. Human users may dynamically change an IFish in response to changes in the environment, or an IFish may initiate such changes itself. An IFish maintains internal state, including models of its own structure, behavior, information environment, and user; these models permit it to perform meta-level reasoning about its own structure. To facilitate rapid assembly of particular IFish I have created the Internet Fish Construction Kit. This system provides enabling technology for the entire class of Internet Fish tools; it facilitates both the creation of new IFish and the addition of new capabilities to existing ones. The Construction Kit includes a collection of encapsulated heuristic knowledge modules that may be combined in mix-and-match fashion to create a particular IFish; interfaces to new services written with the Construction Kit may be immediately added to "live" IFish. Using the Construction Kit I have created a demonstration IFish specialized for finding World-Wide Web documents related to a given group of documents. This "Finder" IFish includes heuristics that describe how to interact with the Web in general, explain how to take advantage of various public indexes and classification schemes, and provide a method for discovering similarity relationships among documents.
Abstract:
This thesis examines a complete design framework for a real-time, autonomous system with specialized VLSI hardware for computing 3-D camera motion. In the proposed architecture, the first step is to determine point correspondences between two images. Two processors, a CCD array edge detector and a mixed analog/digital binary block correlator, are proposed for this task. The report is divided into three parts. Part I covers the algorithmic analysis; part II describes the design and test of a 32×32 CCD edge detector fabricated through MOSIS; and part III compares the design of the mixed analog/digital correlator to a fully digital implementation.
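For intuition, a software sketch of the binary block correlation step (illustrative only; the thesis realizes it in mixed analog/digital VLSI): slide a binary edge-map block across a search window and pick the offset with the most agreeing pixels.

    def block_correlate(block, window):
        """block, window: 2-D lists of 0/1 edge pixels; returns the best (dy, dx)."""
        bh, bw = len(block), len(block[0])
        wh, ww = len(window), len(window[0])
        best, best_off = -1, (0, 0)
        for dy in range(wh - bh + 1):
            for dx in range(ww - bw + 1):
                score = sum(
                    1
                    for i in range(bh)
                    for j in range(bw)
                    if block[i][j] == window[dy + i][dx + j]   # XNOR agreement
                )
                if score > best:
                    best, best_off = score, (dy, dx)
        return best_off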
Abstract:
Studying chaotic behavior in nonlinear systems requires numerous computations in order to simulate the behavior of such systems. The Standard Map Machine was designed and implemented as a special-purpose computer for performing these intensive computations with high speed and high precision. Its impressive performance is due to its simple architecture specialized to the numerical computations required of nonlinear systems. This report discusses the design and implementation of the Standard Map Machine and its use in the study of nonlinear mappings; in particular, the study of the standard map.
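The computational kernel such a machine iterates is the standard (Chirikov) map; a minimal software sketch (parameter values are illustrative):

    import math

    # Standard map on the torus:
    #   p_{n+1}     = p_n + K * sin(theta_n)   (mod 2*pi)
    #   theta_{n+1} = theta_n + p_{n+1}        (mod 2*pi)

    def standard_map_orbit(theta, p, K=0.97, steps=100_000):
        """Iterate one orbit of the standard map from (theta, p)."""
        two_pi = 2.0 * math.pi
        orbit = []
        for _ in range(steps):
            p = (p + K * math.sin(theta)) % two_pi
            theta = (theta + p) % two_pi
            orbit.append((theta, p))
        return orbit

    # Near K ~ 0.97 the last invariant circle breaks up and orbits begin to
    # diffuse chaotically in p.
    orbit = standard_map_orbit(theta=2.0, p=1.0)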