923 results for Electronic digital computers.


Relevance:

90.00%

Publisher:

Abstract:

This thesis discusses the need for nondestructive testing and highlights some of the limitations of present-day techniques. Special attention has been given to ultrasonic examination techniques and the problems encountered when they are applied to thick welded plates. Some suggestions are given using signal processing methods. Chapter 2 treats the need for nondestructive testing in the light of economy and safety. A short review of present-day techniques in nondestructive testing is also given. The special problems of using ultrasonic techniques on welded structures are discussed in Chapter 3, with some examples of elastic wave propagation in welded steel. The limitations in applying sophisticated signal processing techniques to ultrasonic NDT are mainly found in the transducers generating or receiving the ultrasound. Chapter 4 deals with the different transducers used. One of the difficulties with ultrasonic testing is the interpretation of the signals encountered. Similar problems are found in SONAR/RADAR techniques, and Chapter 5 draws some analogies between SONAR/RADAR and ultrasonic nondestructive testing. This chapter also includes a discussion of some of the techniques used in signal processing in general. A signal processing technique found especially useful is cross-correlation detection, which is treated in Chapter 6. Electronic digital computers have made signal processing techniques easier to implement; Chapter 7 discusses the use of digital computers in ultrasonic NDT. Experimental equipment used to test cross-correlation detection of ultrasonic signals is described in Chapter 8. Chapter 9 summarises the conclusions drawn during this investigation.
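
As an illustration of the cross-correlation detection technique the abstract refers to, here is a minimal Python sketch of recovering the arrival time of a known ultrasonic pulse buried in noise; the pulse shape, sampling rate, delay, and noise level are assumptions chosen for demonstration, not values from the thesis.

```python
import numpy as np

# Reference pulse: a 2 MHz tone burst with a Gaussian envelope (assumed shape).
fs = 10e6                                   # sampling rate: 10 MHz (assumed)
t = np.arange(0, 20e-6, 1 / fs)             # 20 microsecond record
pulse = (np.exp(-((t - 2e-6) / 0.5e-6) ** 2)
         * np.sin(2 * np.pi * 2e6 * t))

# Received trace: the pulse delayed by 8 microseconds, buried in noise.
delay = int(8e-6 * fs)
rng = np.random.default_rng(0)
received = np.roll(pulse, delay) + 0.5 * rng.standard_normal(t.size)

# Cross-correlate the trace with the reference pulse; the lag of the
# correlation peak estimates the echo arrival time.
corr = np.correlate(received, pulse, mode="full")
lags = np.arange(-t.size + 1, t.size)
print(f"estimated delay: {lags[np.argmax(corr)] / fs * 1e6:.2f} us")  # ~8 us
```

The correlation peak stands out even when the echo is hard to see in the raw trace, which is what makes the technique attractive for noisy ultrasonic NDT signals.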

Relevance:

80.00%

Publisher:

Abstract:

Modular arithmetic has often been regarded as something of a mathematical curiosity, at least by those unfamiliar with its importance to both abstract algebra and number theory, and with its numerous applications. However, with the ubiquity of fast digital computers, and the need for reliable digital security systems such as RSA, this important branch of mathematics is now considered essential knowledge for many professionals. Indeed, computer arithmetic itself is, ipso facto, modular. This chapter describes how the modern graphical spreadsheet may be used to clearly illustrate the basics of modular arithmetic and to solve certain classes of problems. Students may then gain structural insight, and the foundations are laid for applications to areas such as hashing, random number generation, and public-key cryptography.
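
As a pointer to the kind of computation the chapter builds toward, a toy RSA round trip reduces entirely to modular exponentiation. This is a sketch with tiny assumed primes, not an example from the chapter; real RSA moduli are hundreds of digits long.

```python
# Toy RSA round trip, for illustration only.
p, q = 61, 53
n = p * q                  # modulus: 3233
phi = (p - 1) * (q - 1)    # Euler's totient: 3120
e = 17                     # public exponent, coprime to phi
d = pow(e, -1, phi)        # private exponent: modular inverse (Python 3.8+)

message = 65
cipher = pow(message, e, n)    # encryption: m^e mod n  -> 2790
plain = pow(cipher, d, n)      # decryption: c^d mod n  -> 65
assert plain == message
```

The three-argument form of `pow` is exactly the fast modular exponentiation that makes such systems practical on digital computers.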

Relevance:

80.00%

Publisher:

Abstract:

In this paper, we first give a numerical procedure for the solution of second-order non-linear ordinary differential equations of the type y'' = f(x, y, y') with given initial conditions. The method is based on a geometrical interpretation of the equation, which suggests a simple geometrical construction of the integral curve. We then translate this geometrical method into a numerical procedure adaptable to desk calculators and digital computers. We study the efficacy of the method with the help of an illustrative example with a known exact solution, and we also compare it with the Runge-Kutta method. We then apply the method to a physical problem, namely, the study of the temperature distribution in a semi-infinite homogeneous solid medium with a temperature-dependent conductivity coefficient.
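
For reference, the Runge-Kutta baseline mentioned above is straightforward to reproduce: rewrite y'' = f(x, y, y') as the first-order system (y, y')' = (y', f) and apply the classical fourth-order scheme. A minimal sketch, with the test equation y'' = -y (exact solution sin x) assumed for illustration rather than taken from the paper:

```python
import numpy as np

def rk4_second_order(f, x0, y0, yp0, h, steps):
    """Integrate y'' = f(x, y, y') as the system (y, y')' = (y', f)."""
    x, y, yp = x0, y0, yp0
    for _ in range(steps):
        k1y, k1p = yp,           f(x,       y,           yp)
        k2y, k2p = yp + h/2*k1p, f(x + h/2, y + h/2*k1y, yp + h/2*k1p)
        k3y, k3p = yp + h/2*k2p, f(x + h/2, y + h/2*k2y, yp + h/2*k2p)
        k4y, k4p = yp + h*k3p,   f(x + h,   y + h*k3y,   yp + h*k3p)
        y  += h/6 * (k1y + 2*k2y + 2*k3y + k4y)
        yp += h/6 * (k1p + 2*k2p + 2*k3p + k4p)
        x  += h
    return x, y

# Test equation y'' = -y with y(0) = 0, y'(0) = 1; exact solution sin(x).
x_end, y_end = rk4_second_order(lambda x, y, yp: -y, 0.0, 0.0, 1.0, 0.1, 10)
print(y_end, np.sin(x_end))   # close agreement with the exact value
```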

Relevance:

80.00%

Publisher:

Abstract:

The advent of large and fast digital computers, and the development of numerical techniques suited to them, have made it possible to revisit the analysis of important fundamental and practical problems and phenomena of engineering which have long remained intractable. The understanding of the load transfer between pin and plate is one such problem. In spite of continuous attack on these problems for over half a century, classical solutions have remained limited in their approach and in their value to the understanding of the phenomena and the generation of design data. On the other hand, the finite element methods that have grown alongside the recent development of computers have been helpful in analysing specific problems and answering specific questions, but they are yet to be harnessed to provide, with economy, a clearer understanding of the phenomena of partial separation and contact, friction and slip, and fretting and fatigue in pin joints. Against this background, it is useful to explore the application of classical differential equation methods, with the aid of computer power, to open up this very important area. In this paper we describe some of the recent and current work at the Indian Institute of Science in this direction.

Relevance:

80.00%

Publisher:

Abstract:

Modern sample surveys started to spread after statisticians at the U.S. Bureau of the Census in the 1940s had developed a sampling design for the Current Population Survey (CPS). A significant factor was also that digital computers became available to statisticians. In the early 1950s, the theory was documented in textbooks on survey sampling. This thesis is about the development of statistical inference for sample surveys. The idea of statistical inference was first enunciated by the French scientist P. S. Laplace. In 1781, he published a plan for a partial investigation in which he determined the sample size needed to reach the desired accuracy in estimation. The plan was based on Laplace's Principle of Inverse Probability and on his derivation of the Central Limit Theorem, both published in a memoir in 1774 which is one of the origins of statistical inference. Laplace's inference model was based on Bernoulli trials and binomial probabilities. He assumed that populations were changing constantly, which he depicted by assuming a priori distributions for the parameters. Laplace's inference model dominated statistical thinking for a century. Sample selection in Laplace's investigations was purposive. At the 1894 meeting of the International Statistical Institute, the Norwegian Anders Kiaer presented the idea of the Representative Method for drawing samples, the idea being that the sample should be a miniature of the population; it is still prevailing. The virtues of random sampling were known, but practical problems of sample selection and data collection hindered its use. Arthur Bowley realised the potential of Kiaer's method and, in the beginning of the 20th century, carried out several surveys in the UK. He also developed a theory of statistical inference for finite populations, based on Laplace's inference model. R. A. Fisher's contributions in the 1920s constitute a watershed in statistical science: he revolutionised the theory of statistics and introduced a new statistical inference model which is still the prevailing paradigm. Its essential ideas are to draw samples repeatedly from the same population and to assume that population parameters are constants. Fisher's theory did not include a priori probabilities. Jerzy Neyman adopted Fisher's inference model and applied it to finite populations, with the difference that Neyman's inference model does not include any assumptions about the distributions of the study variables. Applying Fisher's fiducial argument, he developed the theory of confidence intervals. Neyman's last contribution to survey sampling presented a theory for double sampling, which gave statisticians at the U.S. Census Bureau the central idea for developing the complex survey design of the CPS. An important criterion was to have a method whose data collection costs were acceptable and which provided approximately equal interviewer workloads, besides sufficient accuracy in estimation.

Relevance:

80.00%

Publisher:

Abstract:

This work reflects on the various forms of relationship between subjects and technical objects, with emphasis on the use of digital computers and, in particular, the software known as intelligent agents. It analyses space and its present qualitative changes, starting from the concept of space as human production, examining how the transformations under way in the environment affect our subjectivities and, reciprocally, how we affect our environments. It discusses the possibilities of survival of the naked human being in these new spaces without being duly updated with the latest technological novelties, such as sensory and motor prostheses. It takes up the discussion of thought that uses space as a constituent element of thought itself, and reflects on the abstract space par excellence: virtual worlds. It discusses the human pattern of appropriation of artefacts and its effects on subjectivity, and whether the pattern of appropriation of material technical objects is maintained in the forms of appropriation of intangible technical objects (software). Finally, it reflects on the possibility of the complete autonomisation of intelligent agents and their institution, ipso facto, as agents: the so-called Artificial Intelligence.

Relevance:

80.00%

Publisher:

Abstract:

We introduce an in vitro diagnostic magnetic biosensing platform for immunoassay and nucleic acid detection. The platform has key characteristics for a point-of-use (POU) diagnostic: portability, low-power consumption, low cost, and multiplexing capability. As a demonstration of capabilities, we use this platform for the room temperature, amplification-free detection of a 31 bp DNA oligomer and interferon-gamma (a protein relevant for tuberculosis diagnosis). Reliable assay measurements down to 100 pM for the DNA and 1 pM for the protein are demonstrated. We introduce a novel "magnetic freezing" technique for baseline measurement elimination and to enable spatial multiplexing. We have created a general protocol for adapting integrated circuit (IC) sensors to any of hundreds of commercially available immunoassay kits and custom designed DNA sequences.

We also introduce a method for immunotherapy treatment of malignant gliomas. We utilize leukocytes internalized with immunostimulatory nanoparticle-oligonucleotide conjugates to localize and retain immune cells near the tumor site. As a proof-of-principle, we develop a novel cell imaging and incubation chamber for in vitro magnetic motility experiments. We use the apparatus to demonstrate the controlled movement of magnetically loaded THP-1 leukocytes.

Finally, we introduce an IC transmitter and power amplifier (PA) that utilizes electronic digital infrastructure, sensors, and actuators to self-heal and adapt to process, dynamic, and environmental variation. Traditional IC design has achieved incredible degrees of reliability by ensuring that billions of transistors on a single IC die are all simultaneously functional. Maintaining this reliability becomes increasingly difficult as transistor sizes shrink. Self-healing can mitigate these variations.

Relevance:

80.00%

Publisher:

Abstract:

A study is made of the accuracy of electronic digital computer calculations of ground displacement and response spectra from strong-motion earthquake accelerograms. This involves an investigation of methods of the preparatory reduction of accelerograms into a form useful for the digital computation and of the accuracy of subsequent digital calculations. Various checks are made for both the ground displacement and response spectra results, and it is concluded that the main errors are those involved in digitizing the original record. Differences resulting from various investigators digitizing the same experimental record may become as large as 100% of the maximum computed ground displacements. The spread of the results of ground displacement calculations is greater than that of the response spectra calculations. Standardized methods of adjustment and calculation are recommended, to minimize such errors.

Studies are made of the spread of response spectral values about their mean. The distribution is investigated experimentally by Monte Carlo techniques using an electric analog system with white noise excitation, and histograms are presented indicating the dependence of the distribution on the damping and period of the structure. Approximate distributions are obtained analytically by confirming and extending existing results with accurate digital computer calculations. A comparison of the experimental and analytical approaches indicates good agreement for low damping values where the approximations are valid. A family of distribution curves to be used in conjunction with existing average spectra is presented. The combination of analog and digital computations used with Monte Carlo techniques is a promising approach to the statistical problems of earthquake engineering.

Methods of analysis of very small earthquake ground motion records obtained simultaneously at different sites are discussed. The advantages of Fourier spectrum analysis for certain types of studies, and methods of calculation of Fourier spectra, are presented. The digitizing and analysis of several earthquake records are described, and checks are made of the dependence of the results on digitizing procedure, earthquake duration, and integration step length. Possible dangers of a direct ratio comparison of Fourier spectrum curves are pointed out, and the necessity of some type of smoothing procedure before comparison is established. A standard method of analysis for the study of comparative ground motion at different sites is recommended.
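
For orientation, a minimal sketch of how a displacement response spectrum of the kind discussed above is commonly computed from a digitized accelerogram: step a damped single-degree-of-freedom oscillator through the record for each natural period and keep the peak response. The white-noise excitation, damping ratio, and period grid below are assumptions for illustration, not data from the study.

```python
import numpy as np

def displacement_spectrum(accel, dt, periods, zeta=0.05):
    """Peak relative displacement of damped SDOF oscillators, one per
    natural period, driven at the base by the accelerogram `accel`."""
    spectrum = []
    for T in periods:
        wn = 2.0 * np.pi / T
        u_prev, u, peak = 0.0, 0.0, 0.0
        for ag in accel:   # u'' + 2*zeta*wn*u' + wn^2*u = -ag
            u_next = (2.0 * u - u_prev + dt * dt *
                      (-ag - 2.0 * zeta * wn * (u - u_prev) / dt
                       - wn * wn * u))
            peak = max(peak, abs(u_next))
            u_prev, u = u, u_next
        spectrum.append(peak)
    return np.array(spectrum)

# Synthetic white-noise "accelerogram" (assumed), 20 s at 100 samples/s.
rng = np.random.default_rng(0)
accel = rng.standard_normal(2000)
periods = np.linspace(0.1, 3.0, 30)   # dt = 0.01 < T_min/pi keeps the
Sd = displacement_spectrum(accel, 0.01, periods)   # scheme stable
# Pseudo-acceleration spectrum, if wanted: Sa = (2*np.pi/periods)**2 * Sd
```

The explicit central-difference stepping here is only one of several standard integration schemes; the abstract's point about digitizing error applies regardless of which is used.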

Relevance:

80.00%

Publisher:

Abstract:

Texture is an attribute still little used in the automatic recognition of natural scenes in remote sensing, since it arises from the visual sensation caused by the tonal variations within a given region of the image, which makes it difficult to quantify. Mathematical morphology, through operations such as erosion, dilation, and opening, allows an image to be decomposed into fundamental elements, the texture primitives. Texture primitives occur at various sizes, and a set of primitives of similar size can be associated with a given texture class. The texture classification process quantifies the texture primitives, extracts the distributions of their sizes, and separates the different distributions by means of a Gaussian maximum-likelihood classifier. The final result is a thematic image in which each theme represents one of the textures present in the original image.
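
A minimal sketch of the underlying idea is a morphological granulometry: measure how much image intensity is removed by openings with increasingly large structuring elements, which yields the size distribution of the texture primitives. The synthetic image and element sizes are assumptions; the abstract's full pipeline also includes the maximum-likelihood classification step, omitted here.

```python
import numpy as np
from scipy.ndimage import grey_opening

def granulometry(image, sizes):
    """Morphological size distribution: intensity removed by openings
    with square structuring elements of increasing size."""
    base = image.sum()
    return np.array([base - grey_opening(image, size=(s, s)).sum()
                     for s in sizes])

# Synthetic random texture (assumed stand-in for a remote sensing band).
rng = np.random.default_rng(0)
img = rng.random((128, 128))
curve = granulometry(img, sizes=[1, 3, 5, 7, 9])
print(np.diff(curve))  # large jumps mark scales rich in primitives
```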

Relevance:

80.00%

Publisher:

Abstract:

Body image is the mental picture of our bodies built in our minds, and the degree of dissatisfaction with it is often associated with risk factors identified by anthropometric measures. The purpose of this descriptive study was to evaluate risk factors, assessed through morphological and functional variables, associated with self-image perception in middle-aged walkers of the south zone of the city of Natal. One hundred and thirty volunteers were evaluated in four groups according to gender and age group. The measurements used were: the self-image perception questionnaire proposed by Stunkard, with nine numbered silhouettes for each gender; a scale equipped with a stadiometer for body mass (kg) and stature (m), from which the body mass index (kg/m²) was calculated and classified according to the norms of the National Institutes of Health (2000); systolic and diastolic blood pressure, taken with an electronic digital device (DIGITRONIC); and a metal anthropometric tape for the waist-to-hip ratio (WHR). One-way analysis of variance (ANOVA), Tukey's post hoc test, and Spearman's correlation for nonparametric data were used, adopting p ≤ 0.05 for rejection of the null hypothesis. The body mass index indicated high risk in all groups, and all groups expressed the desire to reduce their silhouettes. In the male groups, body weight was lower in the older age group than in the younger one, while in the female groups the reverse occurred. Self-image perception was associated with the waist-to-hip ratio classification in women aged 50 to 59 years and with the body mass index classification in all groups. No significant associations were found between systolic or diastolic blood pressure classification and self-image perception. This thesis is interdisciplinary, and its contents have application in the fields of Physical Education, Medicine, Physiotherapy, and Nursing.
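
Since the study leans on the body mass index, here is a minimal sketch of the calculation and the standard adult classification bands; the cutoffs are the widely used WHO/NIH categories, assumed rather than quoted from the thesis.

```python
def bmi(weight_kg: float, stature_m: float) -> float:
    """Body mass index: weight divided by the square of stature (kg/m^2)."""
    return weight_kg / stature_m ** 2

def classify_bmi(value: float) -> str:
    """Standard adult BMI bands (assumed WHO/NIH cutoffs)."""
    if value < 18.5:
        return "underweight"
    if value < 25.0:
        return "normal"
    if value < 30.0:
        return "overweight"
    return "obese"

print(classify_bmi(bmi(82.0, 1.70)))  # 28.4 kg/m^2 -> "overweight"
```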

Relevance:

80.00%

Publisher:

Abstract:

The development of non-linear controllers gained ground, both in theory and in practical applications, once the arrival of digital computers made it possible to implement these methodologies. In comparison with the more widely used linear controllers, non-linear controllers have the advantage of not requiring linearisation of the system to determine the control parameters, which permits more efficient control, especially when the system is highly non-linear. An additional advantage is cost reduction, since obtaining efficient control with linear controllers requires more refined sensors and actuators than a non-linear controller does. Among the non-linear control theories, sliding mode control stands out as a method offering greater robustness to uncertainties. It has already been confirmed that adopting compensation in the region of the residual error further improves the performance of these controllers. This work therefore describes the development of a non-linear controller that combines a sliding mode control strategy with a fuzzy compensation technique. Several fuzzy compensation strategies were implemented in search of the one providing the greatest efficiency for a system with a high degree of non-linearity and uncertainty. An electrohydraulic actuator was used as the test case, and the results point to two compensation configurations that permit a greater reduction of the residual error.
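
A minimal sketch of the family of controllers described: a sliding mode law whose hard switching term is tapered inside a boundary layer, with a simple triangular taper standing in for the fuzzy inference step of the compensation. The plant model, gains, and boundary-layer width are assumptions, not the thesis's electrohydraulic actuator.

```python
import numpy as np

lam, k, phi = 5.0, 8.0, 0.05   # sliding slope, switching gain, boundary layer

def control(e, e_dot):
    s = e_dot + lam * e                   # sliding surface s = de/dt + lam*e
    # Inside |s| < phi, taper the switching term instead of hard sign(s);
    # this smooth zone is where the fuzzy compensation would act.
    return -k * np.clip(s / phi, -1.0, 1.0)

# Crude assumed error dynamics: e'' = -e' + u + disturbance.
dt, e, e_dot = 0.001, 1.0, 0.0
for i in range(5000):
    u = control(e, e_dot)
    e_ddot = -e_dot + u + 0.5 * np.sin(0.01 * i)   # assumed dynamics
    e_dot += e_ddot * dt
    e += e_dot * dt
print(f"residual error after 5 s: {e:.4f}")   # small, bounded residual
```

The boundary layer trades a small residual error for the elimination of chattering; the compensation strategies the abstract compares aim to shrink that residual further.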

Relevance:

80.00%

Publisher:

Abstract:

This work presents an algorithm for the security control of electric power systems using control actions such as generation reallocation, determined by sensitivity analysis (linearised model) and optimisation by neural networks. The model is developed taking the dynamic network aspects into account. The preventive control methodology is based on sensitivity analysis of the security margin with respect to the mechanical power of the system's synchronous machines. The power reallocation in each machine is determined using neural networks of the Hopfield type. These networks are dedicated electric circuits that simulate the constraint set and the objective function of an optimisation problem. The advantage of using these networks is the higher speed in obtaining solutions compared with conventional optimisation algorithms, owing to the high convergence rate of the process and the ease of parallelising the method. The objectives are therefore to formulate these networks and to investigate their implementation on digital computers for determining the generation reallocation. To illustrate the proposed methodology, an application to a multi-machine system is presented.
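
A minimal sketch of the optimisation idea on a digital computer: Hopfield-type networks settle by descending an energy function, which for a quadratic cost with box constraints can be emulated by projected gradient descent (the clipping below stands in for the network's saturating amplifiers). The cost matrix, linear term, and limits are assumptions standing in for the generation-reallocation objective and machine limits.

```python
import numpy as np

# Energy E(x) = 0.5*x'Qx - b'x, minimised subject to box limits on x.
Q = np.array([[2.0, 0.5],
              [0.5, 1.0]])      # assumed positive definite -> convergence
b = np.array([1.0, 0.8])
lo, hi = 0.0, 1.0               # assumed per-machine reallocation limits

x = np.full(2, 0.5)             # initial network state
eta = 0.05                      # integration step
for _ in range(500):
    grad = Q @ x - b                      # dE/dx
    x = np.clip(x - eta * grad, lo, hi)   # descend, project onto the box
print(x, 0.5 * x @ Q @ x - b @ x)         # settled state and its energy
```

Each state component updates independently given the shared gradient, which is the property that makes the analog-circuit version of these networks fast and easy to parallelise.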

Relevance:

80.00%

Publisher:

Abstract:

Graduate Program in Agronomy (Energy in Agriculture) - FCA