923 results for Electronic digital computers


Relevance: 80.00%

Abstract:

Water saturation is the main petrophysical property for evaluating hydrocarbon reservoirs, since the analysis of its values determines whether a newly drilled well is classified as a producer or a dry well. For clean formations, water saturation is commonly calculated from Archie's equation, which requires the resistivity of the virgin (uninvaded) zone, obtained from a deep-resistivity log, and the rock porosity, obtained from porosity logs. Archie's equation also requires the formation-water resistivity, which normally must be determined locally and corrected to the formation depth, together with the adoption of suitable values for the Archie coefficients. One of the most traditional well-logging methods for calculating water saturation is the Hingle method, which is particularly useful when the formation-water resistivity is unknown. The Hingle method casts Archie's equation in a linear form in terms of the resistivity and porosity logs and represents it graphically, as the water line through the data points on the Hingle plot, corresponding to unit water saturation; the formation-water resistivity is then obtained from the slope of the water line. Despite the technological development of logging tools and digital computers, the geophysicist is still obliged to interpret charts and crossplots by eye, and is therefore subject to errors arising from visual acuity.
The work presented in this dissertation aims to mitigate this kind of error and to produce a first approximation of water saturation in real logging time, using a suitable artificial neural network architecture, the angular competitive network, which is capable of locating the water line by identifying angular patterns in the porosity and resistivity log data plotted on the Hingle plot. The methodology is evaluated on synthetic data, which fully satisfy Archie's equation, and on real data.
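The Archie and Hingle computations described above can be sketched numerically. The snippet below is a minimal illustration of the underlying arithmetic, not the dissertation's angular competitive network: it assumes Archie coefficients a = 1 and m = n = 2, fits the water line through the origin of the Hingle plot by least squares, and recovers Rw from its slope. All function names are my own.

```python
def archie_sw(rt, phi, rw, a=1.0, m=2.0, n=2.0):
    """Water saturation from Archie's equation: Sw^n = a*Rw / (phi^m * Rt)."""
    return (a * rw / (phi ** m * rt)) ** (1.0 / n)

def hingle_rw(water_points, a=1.0, m=2.0):
    """Estimate formation-water resistivity Rw from Hingle-plot water-line points.

    On a Hingle plot the vertical axis is Rt^(-1/m) and the horizontal axis is
    porosity.  Points with unit water saturation fall on a straight line through
    the origin with slope s = (a*Rw)^(-1/m), so Rw = 1 / (a * s^m).
    water_points: iterable of (rt, phi) pairs assumed fully water-bearing.
    """
    # Least-squares slope of a line constrained through the origin.
    sxy = sum(phi * rt ** (-1.0 / m) for rt, phi in water_points)
    sxx = sum(phi * phi for _, phi in water_points)
    slope = sxy / sxx
    return 1.0 / (a * slope ** m)
```

For exact synthetic points generated from Archie's equation the fitted slope recovers Rw exactly; the dissertation's contribution is locating the water line automatically among noisy real log data, which this simple fit does not attempt.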

Relevance: 80.00%

Abstract:

Introduction: Organizations are increasingly adopting electronic/digital spaces (Internet/intranet/extranet) as a way to manage information and knowledge efficiently in the organizational environment. The management of informational inputs and intellectual assets ranges from the strategic level to the operational level, and the results demonstrate the strength of socializing organizational strategies. Objective: To reflect on the role of information architecture in the development of electronic/digital spaces in organizational environments. Methodology: An analytical study supported by the specialized literature, based on three aspects of information architecture emphasized by Morville and Rosenfeld (2006), namely context, content and users, together with Choo's (2006) model of information seeking and use, which also highlights three aspects: situational dimensions, cognitive needs and emotional reactions. Results: In the Web environment, organizations maintain a large number of sites for brands/products that mostly lack a shared organizational structure or navigation; when one department needs to contact another, it must do so offline. Conclusion: Information architecture has become essential for developing management information systems that make data and information easy to find and access, and it helps in building distinct hierarchies to structure the distribution of content, promoting the quality and effectiveness of management systems.

Relevance: 80.00%

Abstract:

Throughout the twentieth century statistical methods have increasingly become part of experimental research. In particular, statistics has made quantification processes meaningful in the soft sciences, which had traditionally relied on activities such as collecting and describing diversity rather than taming variation. The thesis explores this change in relation to agriculture and biology, focusing on analysis of variance and experimental design, the statistical methods developed by the mathematician and geneticist Ronald Aylmer Fisher during the 1920s. The role that Fisher’s methods acquired as tools of scientific research, side by side with the laboratory equipment and the field practices adopted by research workers, is here investigated bottom-up, beginning with the computing instruments and the information technologies that were the tools of the trade for statisticians. Four case studies show from several perspectives the interaction of statistics, computing and information technologies, giving on the one hand an overview of the main tools – mechanical calculators, statistical tables, punched and index cards, standardised forms, digital computers – adopted in the period, and on the other pointing out how these tools complemented each other and were instrumental for the development and dissemination of analysis of variance and experimental design. The period considered is the half-century from the early 1920s to the late 1960s, the institutions investigated are Rothamsted Experimental Station and the Galton Laboratory, and the statisticians examined are Ronald Fisher and Frank Yates.

Relevance: 80.00%

Abstract:

Human activity is attracting a lot of research in several fields, including wireless sensors, positioning technologies and techniques, embedded computing, remote sensing and energy management, among others. There are a number of applications where the results of these investigations can be applied, including ambient intelligence to support human activity, particularly for elderly and disabled people. Ambient intelligence is a new paradigm for information and communication technologies in which the electronic/digital environment is aware of people's presence and their needs, becoming an active, adaptive and responsive environment.

Relevance: 80.00%

Abstract:

Issued also as thesis, University of Illinois.

Relevance: 80.00%

Abstract:

Supported by: Office of Naval Research under contract N00014-67-A-0305-0007.

Relevance: 80.00%

Abstract:

"NSF MCS 77-22830."

Relevance: 80.00%

Abstract:

"Aeronautical Research Laboratory, Contract No. AF 33(616)-2797, Project 7060."

Relevance: 80.00%

Abstract:

Mode of access: Internet.

Relevance: 80.00%

Abstract:

With the increasing use of digital computers for data acquisition and digital process control, frequency-domain transducers have become very attractive due to their virtually digital output. Essentially they are electrically maintained oscillators in which the sensor is the controlling resonator. They are designed to make the frequency a function of the physical parameter being measured. Because of their high quality factor, mechanical resonators give very good frequency stability and are widely used as sensors. For this work, symmetrical mechanical resonators such as the tuning fork were considered to be the most promising. These are dynamically clamped and can be designed to have extensive regions where no vibration occurs. This enables the resonators to be robustly mounted in a way convenient for various applications. Designs for the measurement of fluid density and tension have been produced. The resonator for fluid density measurement is based on a thin gap (trapping a lamina of fluid) between its two members, which vibrate in antiphase. An analysis of the interaction between this resonator and the fluid lamina has been carried out. In gases, narrow gaps are needed for good sensitivity, and fused quartz, because of its low density and very low temperature coefficient, is ideally suited as the material. In liquids, adequate sensitivity is achieved even with a wide lamina gap. Practical designs of such transducers have been evolved. The accuracy for liquid measurements is better than 1%. For gases it was found that, in air, a change of atmospheric pressure of 0.3% could be detected. In constructing a tension transducer using a mechanical sensor such as a wire or a beam, major difficulties are encountered in making an efficient clamping arrangement for the sensor. The use of dynamically clamped beams has been found to overcome this problem, and this is the basis of the transducer investigated.
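The density measurement can be illustrated with a simple added-mass model. This model is an assumption of mine for illustration, not the thesis's interaction analysis: the trapped fluid lamina adds an effective mass proportional to density, shifting the resonant frequency to f = f0/sqrt(1 + k*rho), where f0 is the unloaded frequency and k is a calibration constant.

```python
def fluid_density(f, f0, k):
    """Fluid density from the measured resonant frequency.

    Assumed added-mass model: f = f0 / sqrt(1 + k * rho), with f0 the
    unloaded (empty-gap) frequency and k a calibration constant.
    Inverting for density gives rho = ((f0/f)^2 - 1) / k.
    """
    return ((f0 / f) ** 2 - 1.0) / k

def calibrate_k(f_ref, f0, rho_ref):
    """Calibration constant from one reference fluid of known density."""
    return ((f0 / f_ref) ** 2 - 1.0) / rho_ref
```

In this model a denser fluid lowers the frequency more, which is why narrow gaps (a larger trapped-mass fraction, hence larger k) are needed for adequate sensitivity in low-density gases.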

Relevance: 80.00%

Abstract:

The present dissertation is concerned with the determination of the magnetic field distribution in magnetic electron lenses by means of the finite element method. In the differential form of this method, a Poisson-type equation is solved numerically over a finite boundary. Previous ways of adapting this procedure to digital computers had restricted its use to machines of extremely large core size. It is shown that by reformulating the boundary conditions, a considerable reduction in core store can be achieved for a given accuracy of field distribution. The magnetic field distribution of a lens may also be calculated by the integral form of the finite element method. This eliminates the boundary problems mentioned, but introduces other difficulties. After a careful analysis of both methods, it proved possible to combine the advantages of each in a new approach to the problem, which may be called the 'differential-integral' finite element method. The application of this method to the determination of the magnetic field distribution of some new types of magnetic lenses is described. In the course of the work, considerable re-programming of standard programs was necessary in order to reduce the core store requirements to a minimum.
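As a much simpler stand-in for the machinery described above, the sketch below solves a one-dimensional Poisson problem by Jacobi iteration on a finite-difference grid. It illustrates only the general idea of iteratively solving a Poisson-type equation over a bounded region with fixed boundary values, not the thesis's finite element or 'differential-integral' formulation, and all names are hypothetical.

```python
def solve_poisson_1d(f, u_left, u_right, n, iters=5000):
    """Jacobi iteration for the 1-D Poisson equation u'' = -f(x) on (0, 1),
    with Dirichlet boundary values u(0) = u_left and u(1) = u_right.

    n interior grid points; returns the full grid including boundaries.
    """
    h = 1.0 / (n + 1)
    u = [0.0] * (n + 2)
    u[0], u[-1] = u_left, u_right
    for _ in range(iters):
        new = u[:]
        for i in range(1, n + 1):
            # Discrete Laplacian (u[i-1] - 2u[i] + u[i+1]) / h^2 = -f(x_i),
            # rearranged for u[i].
            new[i] = 0.5 * (u[i - 1] + u[i + 1] + h * h * f(i * h))
        u = new
    return u
```

A sense of the core-store concern: the iteration needs only the solution vector (and one working copy), whereas assembling and storing a full system matrix grows quadratically with the number of unknowns.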

Relevance: 80.00%

Abstract:

Scientific and technological advances around the world and their application in different areas of knowledge have given rise to a new electronic era. Computers and telecommunications have created a new scenario for libraries and information centers, leading to changes in the use of their collections and to growth in their services. Libraries and information centers must face these changes because users demand fast and efficient access to information. CD-ROM is a new technology that has been making its way into libraries very effectively. CD-ROM is a sophisticated means of providing access to information in any field: law, medicine, education, agriculture, finance, engineering, etc.

Relevance: 40.00%

Abstract:

Public key cryptography, and with it the ability to compute digital signatures, has made it possible for electronic commerce to flourish. It is thus unsurprising that the proposed Australian NECS will also use digital signatures in its system so as to provide a fully automated process, from the creation of an electronic land title instrument to the digital signing and electronic lodgment of these instruments. This necessitates an analysis of the fraud risks raised by the use of digital signatures, because a compromise of the integrity of digital signatures would lead to a compromise of the Torrens system itself. This article shows that digital signatures may in fact offer greater security against fraud than handwritten signatures; but to achieve this, digital signatures require an infrastructure in which each component is properly implemented and managed.
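The sign-then-verify cycle the article relies on can be sketched with a toy RSA example. This is a textbook illustration only (tiny primes, no padding, emphatically insecure, and not the NECS implementation): the private key produces the signature, and anyone holding the public key can check it.

```python
# Toy RSA signature scheme with textbook-sized primes.  Illustration only:
# real systems use >= 2048-bit moduli, padding schemes such as PSS, and
# vetted cryptographic libraries rather than hand-rolled arithmetic.
import hashlib

P, Q = 61, 53                      # toy primes (never this small in practice)
N = P * Q                          # public modulus, 3233
E = 17                             # public exponent
D = pow(E, -1, (P - 1) * (Q - 1))  # private exponent (Python 3.8+ modular inverse)

def sign(message: bytes) -> int:
    """Hash the message, then apply the private exponent to the digest."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % N
    return pow(digest, D, N)

def verify(message: bytes, signature: int) -> bool:
    """Recompute the digest and compare it with signature**E mod N."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % N
    return pow(signature, E, N) == digest
```

The fraud analysis in the article turns on exactly the components this sketch glosses over: if the private key D leaks, or the software computing sign() is subverted, a forged instrument verifies just as cleanly as a genuine one.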

Relevance: 40.00%

Abstract:

This paper examines three functions of music technology in the study of music: first as a tool, second as an instrument and, lastly, as a medium for thinking. As our societies become increasingly embroiled in digital media for representation and communication, our philosophies of music education need to adapt to integrate these developments while maintaining the essence of music. The foundation of music technology in the 1990s is the digital representation of sound. It is this fundamental shift to a new medium for representing sound that carries with it the challenge to address digital technology and its multiple effects on music creation and presentation. In this paper I suggest that music institutions should take a broad and integrated approach to the place of music technology in their courses, based on an understanding of the digital representation of sound and the three functions it can serve. Educators should reconsider digital technologies such as synthesizers and computers as musical instruments and cognitive amplifiers, not simply as efficient tools.