908 results for Pascal (Computer program language)


Relevance:

100.00%

Publisher:

Abstract:

Metallic glasses are of interest because of their mechanical properties: they can be ductile as well as brittle. This is true of Pd77.5Cu6Si16.5, a ternary glassy alloy. The most stable metallic glasses are alloys of noble or transition metals. A general formula is postulated as T70-80G30-20, where T stands for one or several 3d transition elements and G for the metalloid glass formers. Another general formula is A3B to A5B, where B is a metalloid. A computer method utilising the MIGAP computer program of Kaufman is used to calculate the miscibility gap over a range of temperatures. The precipitation of a secondary crystalline phase is postulated around 1500 K. This could produce a dispersed-phase composite with interesting high-temperature strength properties.
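MIGAP itself is not shown in the abstract. As a rough indication of what a miscibility-gap calculation involves, the sketch below traces the spinodal of a simple regular-solution model, for which the gap closes at T_c = Omega/(2R); the interaction parameter is hypothetical, not a fitted Pd-Cu-Si value.

```python
# Spinodal of a regular-solution model: a minimal sketch of a
# miscibility-gap calculation (not the MIGAP program itself).
# G_mix(x) = Omega*x*(1-x) + R*T*(x*ln x + (1-x)*ln(1-x));
# the spinodal is where d2G/dx2 = 0, i.e. R*T = 2*Omega*x*(1-x).
import math

R = 8.314          # gas constant, J/(mol K)
OMEGA = 30_000.0   # hypothetical interaction parameter, J/mol

def spinodal(T):
    """Return the two spinodal compositions at T, or None above T_c."""
    disc = 1.0 - 2.0 * R * T / OMEGA
    if disc < 0.0:                 # gap closed: T above T_c = Omega/(2R)
        return None
    half = math.sqrt(disc) / 2.0
    return 0.5 - half, 0.5 + half

for T in (1000, 1400, 1800):
    print(T, spinodal(T))
```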

Relevance:

100.00%

Publisher:

Abstract:

A user-friendly interactive computer program, CIRDIC, is developed which calculates the molar ellipticity and molar circular dichroic absorption coefficients from the CD spectrum. In combination with a LOTUS 1-2-3 spreadsheet, it gives plots of these parameters versus wavelength. The code is implemented in Microsoft FORTRAN 77 and runs on any IBM-compatible PC under MS-DOS.
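CIRDIC's source is not given here; the sketch below applies the standard CD conversions the abstract refers to -- molar ellipticity [theta] = 100*theta_obs/(c*l), with theta_obs in degrees, c in mol/L and l in cm, and Delta-epsilon = [theta]/3298 -- in Python rather than FORTRAN 77. The sample reading is invented.

```python
# Standard CD-spectroscopy conversions (a sketch of what CIRDIC
# computes; the original program is FORTRAN 77, this is Python).
def molar_ellipticity(theta_deg, conc_mol_l, path_cm):
    """[theta] in deg*cm^2/dmol from the observed ellipticity."""
    return 100.0 * theta_deg / (conc_mol_l * path_cm)

def delta_epsilon(mol_ellip):
    """Molar circular dichroic absorption coefficient (M^-1 cm^-1)."""
    return mol_ellip / 3298.2

# One point of a spectrum: hypothetical reading at 222 nm.
me = molar_ellipticity(theta_deg=-0.033, conc_mol_l=1.0e-4, path_cm=1.0)
print(f"[theta] = {me:.0f} deg cm^2/dmol, d_eps = {delta_epsilon(me):.2f}")
```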

Relevance:

100.00%

Publisher:

Abstract:

This work is a case study of applying nonparametric statistical methods to corpus data. We show how to use ideas from permutation testing to answer linguistic questions related to morphological productivity and type richness. In particular, we study the use of the suffixes -ity and -ness in the 17th-century part of the Corpus of Early English Correspondence within the framework of historical sociolinguistics. Our hypothesis is that the productivity of -ity, as measured by type counts, is significantly low in letters written by women. To test such hypotheses, and to facilitate exploratory data analysis, we take the approach of computing accumulation curves for types and hapax legomena. We have developed an open source computer program which uses Monte Carlo sampling to compute the upper and lower bounds of these curves for one or more levels of statistical significance. By comparing the type accumulation from women’s letters with the bounds, we are able to confirm our hypothesis.
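The paper's open-source program is not quoted in the abstract; the sketch below shows the core Monte Carlo idea it describes: shuffle the token sequence many times, accumulate type counts along each shuffle, and take pointwise quantiles as significance bounds for the accumulation curve. The toy corpus is random, not the Corpus of Early English Correspondence.

```python
# Monte Carlo bounds for a type accumulation curve: a minimal sketch
# of the permutation-testing idea described above (not the paper's code).
import random

def type_accumulation(tokens):
    """Number of distinct types seen after each token."""
    seen, curve = set(), []
    for t in tokens:
        seen.add(t)
        curve.append(len(seen))
    return curve

def mc_bounds(tokens, runs=1000, alpha=0.05):
    """Pointwise lower/upper bounds at significance level alpha."""
    curves, toks = [], list(tokens)
    for _ in range(runs):
        random.shuffle(toks)
        curves.append(type_accumulation(toks))
    k = int(runs * alpha / 2)
    lo, hi = [], []
    for point in zip(*curves):            # one position along the curve
        s = sorted(point)
        lo.append(s[k])
        hi.append(s[-k - 1])
    return lo, hi

# Hypothetical corpus: 200 suffix tokens drawn from a few types.
corpus = [random.choice("abcdefgh") for _ in range(200)]
lo, hi = mc_bounds(corpus)
print(lo[:10], hi[:10])   # an observed curve below lo is significantly low
```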

Relevance:

100.00%

Publisher:

Abstract:

Electrical Impedance Tomography (EIT) is a computerized medical imaging technique which reconstructs the electrical impedance image of a domain under test from the boundary voltage-current data measured by EIT electronic instrumentation, using an image reconstruction algorithm. Being a computed tomography technique, EIT injects a constant current into the patient's body through surface electrodes surrounding the domain to be imaged (Omega) and calculates the spatial distribution of electrical conductivity or resistivity of the closed conducting domain from the potentials developed at the domain boundary (partial derivative Omega).

Practical phantoms are required to study, test and calibrate a medical EIT system before it is applied to patients for diagnostic imaging: they generate the boundary data used to assess the instrumentation and inverse solvers in EIT. Proper assessment of the inverse solver of a 2D EIT system requires a truly 2D practical phantom, but practical phantoms are assemblies of objects with 3D geometries, so developing a practical 2D phantom is a great challenge, and the boundary data generated from practical phantoms with 3D geometry prove inappropriate for assessing a 2D inverse solver. Furthermore, the boundary data errors contributed by the instrumentation are difficult to separate from the errors introduced by the 3D phantoms. Hence, error-free boundary data are essential for assessing the inverse solver in 2D EIT.

In this direction, a MATLAB-based Virtual Phantom for 2D EIT (MatVP2DEIT) is developed to generate accurate boundary data for assessing 2D-EIT inverse solvers and image reconstruction accuracy. MatVP2DEIT is a MATLAB-based computer program which simulates a phantom in the computer and outputs the boundary potential data for a given combination of phantom parameters supplied as inputs. Phantom diameter, inhomogeneity geometry (shape, size and position), number of inhomogeneities, applied current magnitude, background resistivity and inhomogeneity resistivity are all phantom variables provided as input parameters to MatVP2DEIT for simulating different phantom configurations. A constant current injection is simulated at the phantom boundary with different current injection protocols, and the boundary potential data are calculated. Boundary data sets are generated for different phantom configurations obtained from different combinations of the phantom variables, and the resistivity images are reconstructed using EIDORS. Boundary data of virtual phantoms containing inhomogeneities with complex geometries are also generated for different current injection patterns, and the resulting resistivity imaging is studied, as is the effect of the regularization method on image reconstruction. Resistivity images are evaluated through the resistivity and contrast parameters estimated from the elemental resistivity profiles of the reconstructed phantom domain. Results show that MatVP2DEIT generates boundary data for different types of single or multiple objects that are accurate enough to reconstruct the resistivity images in EIDORS.

Spatial resolution studies show that resistivity imaging conducted with boundary data generated by MatVP2DEIT with 2048 elements can reconstruct two circular inhomogeneities placed a minimum distance (boundary to boundary) of 2 mm apart. It is also observed that, in MatVP2DEIT with 2048 elements, the boundary data generated for a phantom with a circular inhomogeneity whose diameter is less than 7% of that of the phantom domain can produce resistivity images in EIDORS with a 1968-element mesh. Results also show that MatVP2DEIT accurately generates boundary data for neighbouring, opposite-reference and trigonometric current patterns, making it well suited to resistivity reconstruction studies, and that MatVP2DEIT-generated data are suitable for studying the effect of different regularization methods on the reconstruction process. By comparing the reconstructed image with the original geometry built in MatVP2DEIT, both the resistivity imaging procedure and the inverse-solver performance can be studied more easily. Using MatVP2DEIT with modified domains, the cross-sectional anatomy of a number of body parts can be simulated on a PC and the impedance image reconstruction of human anatomy can be studied.
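MatVP2DEIT itself is a MATLAB program and is not reproduced here. As an indication of the kind of forward data involved, the sketch below evaluates the closed-form boundary potential of a homogeneous unit disk driven by a point source/sink pair on its boundary, u(theta) = (I/(pi*sigma)) * ln(|z - z_sink| / |z - z_source|) up to an additive constant, a standard sanity check for 2D EIT forward solvers. The electrode count, current and conductivity are hypothetical.

```python
# Boundary potentials of a homogeneous unit disk with point drive
# electrodes: a standard analytic check for 2D EIT forward data
# (a sketch, not MatVP2DEIT).
import cmath, math

def boundary_potential(theta, th_src, th_snk, I=1.0e-3, sigma=1.0):
    """u(theta) on the unit circle, up to an additive constant."""
    z, zs, zk = (cmath.exp(1j * t) for t in (theta, th_src, th_snk))
    return (I / (math.pi * sigma)) * math.log(abs(z - zk) / abs(z - zs))

# Neighbouring drive between electrodes 0 and 1 of a 16-electrode ring;
# "measure" at the remaining electrode angles.
n = 16
angles = [2 * math.pi * k / n for k in range(n)]
data = [boundary_potential(a, angles[0], angles[1]) for a in angles[2:]]
print([f"{v:.4e}" for v in data])
```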

Relevance:

100.00%

Publisher:

Abstract:

The 0.2% experimental accuracy of the 1968 Beers and Hughes measurement of the annihilation lifetime of ortho-positronium motivates the attempt to compute the first-order quantum electrodynamic corrections to this lifetime. The theoretical problems arising in this computation are here studied in detail up to the point of preparing the necessary computer programs and using them to carry out some of the less demanding steps -- but the computation has not yet been completed. Analytic evaluation of the contributing Feynman diagrams is superior to numerical evaluation, and for this problem it can be carried out with the aid of the Reduce algebra manipulation computer program.
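For scale, the zeroth-order (Ore-Powell) rate that these corrections modify is Gamma_0 = 2*(pi^2 - 9)*m*alpha^6/(9*pi) in natural units; the sketch below simply evaluates this known formula, giving a lifetime near 139 ns against the measured value of roughly 142 ns, which is why an O(alpha) correction matters at 0.2% accuracy.

```python
# Lowest-order (Ore-Powell) ortho-positronium decay rate,
# Gamma_0 = 2*(pi**2 - 9)*m*alpha**6 / (9*pi), which the first-order
# QED correction discussed above multiplies. Evaluated in SI units.
import math

alpha = 1 / 137.035999                          # fine-structure constant
mc2_over_hbar = 0.51099895e6 / 6.582119569e-16  # m*c^2/hbar in 1/s

gamma0 = 2 * (math.pi**2 - 9) * alpha**6 * mc2_over_hbar / (9 * math.pi)
print(f"Gamma_0 = {gamma0:.4e} 1/s, lifetime = {1e9 / gamma0:.1f} ns")
```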

The relation of the positronium decay rate to the electron-positron annihilation-in-flight amplitude is derived in detail, and it is shown that at threshold annihilation-in-flight, Coulomb divergences appear while infrared divergences vanish. The threshold Coulomb divergences in the amplitude cancel against like divergences in the modulating continuum wave function.

Using the lowest order diagrams of electron-positron annihilation into three photons as a test case, various pitfalls of computer algebraic manipulation are discussed along with ways of avoiding them. The computer manipulation of artificial polynomial expressions is preferable to the direct treatment of rational expressions, even though redundant variables may have to be introduced.

Special properties of the contributing Feynman diagrams are discussed, including the need to restore gauge invariance to the sum of the virtual photon-photon scattering box diagrams by means of a finite subtraction.

A systematic approach to the Feynman-Brown method of decomposition of single-loop diagram integrals with spin-related tensor numerators is developed in detail. This approach allows the Feynman-Brown method to be straightforwardly programmed in the Reduce algebra manipulation language.

The fundamental integrals needed in the wake of the application of the Feynman-Brown decomposition are exhibited, and the methods which were used to evaluate them -- primarily dispersion techniques -- are briefly discussed.

Finally, it is pointed out that while the techniques discussed have permitted the computation of a fair number of the simpler integrals and diagrams contributing to the first order correction of the ortho-positronium annihilation rate, further progress with the more complicated diagrams and with the evaluation of traces is heavily contingent on obtaining access to adequate computer time and core capacity.

Relevance:

100.00%

Publisher:

Abstract:

Background: The European mink (Mustela lutreola, L. 1761) is a critically endangered mustelid which inhabits several main river drainages in Europe. Here, we assess the genetic variation of existing populations of this species, including new sampling sites and additional molecular markers (newly developed microsatellite loci specific to the European mink) compared to previous studies. Probabilistic analyses were used to examine genetic structure within and between existing populations, and to infer phylogeographic processes and past demography. Results: According to both mitochondrial and nuclear microsatellite markers, Northeastern (Russia, Estonia and Belarus) and Southeastern (Romania) European populations showed the highest intraspecific diversity. In contrast, Western European (France and Spain) populations were the least polymorphic, featuring a unique mitochondrial DNA haplotype. The high differentiation values detected between Eastern and Western European populations could be the result of genetic drift in the latter due to population isolation and reduction. Genetic differences among populations were further supported by Bayesian clustering: two main groups were confirmed (Eastern vs. Western Europe), each splitting into two subgroups at a more local scale (Northeastern vs. Southeastern Europe; France vs. Spain). Conclusions: The genetic data and analyses support a historical scenario of stable European mink populations, unaffected by Quaternary climate oscillations in the Late Pleistocene, with subsequent expansion events following river connections in both Northeastern and Southeastern European populations. This suggests an eastern refuge during glacial maxima (as already proposed for boreal and continental species). In contrast, Western Europe was colonised more recently, following either natural expansions or putative human introductions. The low levels of genetic diversity observed within each studied population suggest recent bottleneck events and stress the urgent need for conservation measures to counteract the demographic decline experienced by the European mink.

Relevance:

100.00%

Publisher:

Abstract:

A computer program was developed for the identification of the teleost fish eggs that may be found in the pelagic zone of the Black Sea. The program identifies eggs of 70 species, using up to 28 descriptive characters, and may be adapted for use outside of the Black Sea.
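No listing of the program accompanies the abstract. The sketch below shows an obvious data structure for such a key: a table of character states per species, filtered by the characters observed on an egg. The species are real Black Sea fishes, but the toy character table stands in for the actual 70-species, 28-character key.

```python
# Character-based identification: a minimal sketch of filtering a
# species table by observed egg characters (placeholder data, not
# the actual Black Sea key).
SPECIES = {
    "Engraulis encrasicolus":  {"shape": "ellipsoidal", "oil_globule": 0},
    "Trachurus mediterraneus": {"shape": "spherical",   "oil_globule": 1},
    "Merlangius merlangus":    {"shape": "spherical",   "oil_globule": 0},
}

def identify(observed):
    """Return species consistent with every observed character."""
    return [name for name, chars in SPECIES.items()
            if all(chars.get(k) == v for k, v in observed.items())]

print(identify({"shape": "spherical", "oil_globule": 0}))
# -> ['Merlangius merlangus']
```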

Relevance:

100.00%

Publisher:

Abstract:

Nuclear reactor designs were classified into four generations (Gen) by the United States Department of Energy (DOE) when the DOE introduced the concept of Generation IV (Gen IV) reactors. Gen IV reactors are a set of mostly theoretical nuclear reactor designs currently under research. Among the Gen IV designs are the ADS (Accelerator Driven Systems), which are subcritical systems stabilized by stationary external neutron sources. These external neutron sources are normally generated by the collision of high-energy protons with the nuclei of heavy metals present in the reactor core, a phenomenon known in the literature as spallation; the protons are accelerated in a particle accelerator fed with part of the energy generated by the reactor. The criticality of a system sustained by fission chain reactions depends on the balance between neutron production by fission and removal by leakage through the boundaries and by absorption. A system is subcritical when removal by leakage and absorption exceeds production by fission, and it therefore tends to shut down. However, any subcritical system can be stabilized by placing stationary neutron sources inside it. The central goal of this work is to determine the intensities of the uniform, isotropic neutron sources that must be inserted into all fuel regions of the system so that it stabilizes while generating a prescribed electric power distribution. To this end, a computer application was developed in the Java language that estimates the intensities of the stationary neutron sources which must be inserted in each fuel region to stabilize the subcritical system with a given user-defined power distribution. The mathematical model adopted was the one-dimensional monoenergetic neutron transport equation in the discrete ordinates (SN) formulation, and the conventional fine-mesh diamond difference (DD) method was used to numerically solve the physical and adjoint SN problems. Numerical results for two typical model problems are presented to illustrate the accuracy and efficiency of the proposed methodology.
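The thesis application is written in Java and is not reproduced in the abstract. The sketch below illustrates the numerical kernel it names: a one-group discrete ordinates (SN) source iteration in slab geometry with the fine-mesh diamond difference scheme, a uniform isotropic fixed source and vacuum boundaries. The cross sections, slab width and quadrature order are hypothetical.

```python
# One-group SN transport in a slab with the diamond difference (DD)
# fine-mesh scheme and source iteration: a minimal sketch of the
# numerical method named above (not the thesis code, which is Java).
import numpy as np
from numpy.polynomial.legendre import leggauss

width, nx = 10.0, 200              # slab width (cm), number of cells
sig_t, sig_s, Q = 1.0, 0.7, 1.0    # total, scattering, fixed source
N = 8                              # S8 angular quadrature
h = width / nx
mu, w = leggauss(N)                # Gauss-Legendre nodes/weights on [-1, 1]

phi = np.zeros(nx)
for _ in range(500):                          # source iteration
    S = 0.5 * (sig_s * phi + Q)               # isotropic emission density
    phi_new = np.zeros(nx)
    for m in range(N):
        psi_in = 0.0                          # vacuum boundary
        cells = range(nx) if mu[m] > 0 else range(nx - 1, -1, -1)
        a = abs(mu[m]) / h
        for i in cells:                       # DD sweep along direction mu_m
            psi_out = (S[i] + (a - sig_t / 2) * psi_in) / (a + sig_t / 2)
            phi_new[i] += w[m] * 0.5 * (psi_in + psi_out)
            psi_in = psi_out
    if np.max(np.abs(phi_new - phi)) < 1e-8 * np.max(phi_new):
        phi = phi_new
        break
    phi = phi_new

print("midplane scalar flux:", phi[nx // 2])  # ~ Q/(sig_t - sig_s) deep inside
```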

Relevance:

100.00%

Publisher:

Abstract:

This paper investigates a method of automatic pronunciation scoring for use in computer-assisted language learning (CALL) systems. The method utilizes a likelihood-based 'Goodness of Pronunciation' (GOP) measure which is extended to include individual thresholds for each phone based on both averaged native confidence scores and on rejection statistics provided by human judges. Further improvements are obtained by incorporating models of the subject's native language and by augmenting the recognition networks to include expected pronunciation errors. The various GOP measures are assessed using a specially recorded database of non-native speakers which has been annotated to mark phone-level pronunciation errors. Since pronunciation assessment is highly subjective, a set of four performance measures has been designed, each of them measuring different aspects of how well computer-derived phone-level scores agree with human scores. These performance measures are used to cross-validate the reference annotations and to assess the basic GOP algorithm and its refinements. The experimental results suggest that a likelihood-based pronunciation scoring metric can achieve usable performance, especially after applying the various enhancements.
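The abstract gives no code; below is a minimal sketch of the likelihood-ratio form of a GOP score: a duration-normalised log-likelihood ratio between the forced alignment of the expected phone and the best unconstrained phone, judged against per-phone thresholds. The log-likelihoods and thresholds are stand-in numbers, not output from a real recogniser.

```python
# Goodness of Pronunciation (GOP) scoring: a sketch of the
# likelihood-ratio measure described above, with per-phone thresholds.
def gop(forced_loglik, best_loop_loglik, n_frames):
    """Duration-normalised log-likelihood ratio for one phone."""
    return abs(forced_loglik - best_loop_loglik) / n_frames

# Hypothetical per-phone rejection thresholds (per the paper, tuned
# from native confidence scores and human rejection statistics).
THRESHOLDS = {"ae": 2.0, "r": 2.5, "th": 1.5}

def assess(phone, forced_ll, loop_ll, n_frames):
    score = gop(forced_ll, loop_ll, n_frames)
    return phone, score, "reject" if score > THRESHOLDS[phone] else "accept"

print(assess("th", forced_ll=-310.0, loop_ll=-285.0, n_frames=12))
# score = 25/12 ~ 2.08 > 1.5 -> ('th', 2.08..., 'reject')
```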

Relevance:

100.00%

Publisher:

Abstract:

A computer program, QtUCP, has been developed based on several well-established algorithms using GCC 4.0 and Qt 4.0 (Open Source Edition) under Debian GNU/Linux 4.0r0. It can determine the unit-cell parameters from an electron diffraction tilt series obtained from both double-tilt and rotation-tilt holders. In this approach, two or more primitive cells of the reciprocal lattice are determined from experimental data, while the measurement errors of the tilt angles are checked and minimized. Subsequently, the derived primitive cells are converted into the reduced form and then transformed into the reduced direct primitive cell. Finally, all the patterns are indexed and least-squares refinement is employed to obtain the optimized lattice parameters. Two examples, one experimental and one simulated, are given to show the application of the program.
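The abstract does not reproduce QtUCP's algorithms. The sketch below shows just the step that is pure linear algebra: passing from a primitive reciprocal basis to the direct cell parameters via A = (B*^T)^{-1}, in the crystallographic convention a.a* = 1 (no 2*pi). The example reciprocal basis is invented, not from a tilt series.

```python
# From a primitive reciprocal basis to direct cell parameters:
# the linear-algebra core of unit-cell determination (a sketch).
import numpy as np

def direct_cell(b_star):
    """Rows of b_star are a*, b*, c* (convention a.a* = 1).
    Returns (a, b, c, alpha, beta, gamma) in the input units/degrees."""
    A = np.linalg.inv(np.asarray(b_star).T)   # rows of A are a, b, c
    lengths = np.linalg.norm(A, axis=1)
    def angle(u, v):
        return np.degrees(np.arccos(np.dot(u, v) /
                                    (np.linalg.norm(u) * np.linalg.norm(v))))
    a, b, c = A
    return (*lengths, angle(b, c), angle(a, c), angle(a, b))

# Hypothetical orthorhombic reciprocal basis (1/Angstrom):
bs = [[0.20, 0, 0], [0, 0.125, 0], [0, 0, 0.10]]
print(direct_cell(bs))   # -> (5.0, 8.0, 10.0, 90.0, 90.0, 90.0)
```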

Relevance:

100.00%

Publisher:

Abstract:

The effect of Tb3+ on Ca2+ speciation in human plasma was studied by means of the computer program MINTEQA2. When Tb3+ ions are not added to the system, Ca2+ is mostly present as free Ca2+ (74.7%), with the remainder distributed among Ca2+ complexes such as [CaHCO3]+ (7.9%), [Ca(Lac)]+ (6.4%), CaHPO4 (1.3%), [CaHistidinateThreoninateH3]3+ (2.4%), [CaCitrateHistidinateH2] (2.3%) and CaCO3 (1.1%). Tb3+ can compete with Ca2+ for inorganic as well as biological ligands. Increasing the concentration of Tb3+ in the system increases the content of free Ca2+ and decreases the contents of the Ca2+ complexes.

Relevance:

100.00%

Publisher:

Abstract:

Objects move, collide, flow, bend, heat up, cool down, stretch, compress and boil. These and other things that cause changes in objects over time are intuitively characterized as processes. To understand common sense physical reasoning, and to build programs that interact with the physical world as well as people do, we must understand qualitative reasoning about processes: when they will occur, their effects, and when they will stop. Qualitative Process theory defines a simple notion of physical process that appears useful as a language in which to write dynamical theories. Reasoning about processes also motivates a new qualitative representation for quantity in terms of inequalities, called the quantity space. This report describes the basic concepts of Qualitative Process theory and several different kinds of reasoning that can be performed with them, and discusses its impact on other issues in common sense reasoning about the physical world, such as causal reasoning and measurement interpretation. Several extended examples illustrate the utility of the theory, including figuring out that a boiler can blow up, that an oscillator with friction will eventually stop, and how to say that you can pull with a string but not push with it. The report also describes GIZMO, an implemented computer program which uses Qualitative Process theory to make predictions and interpret simple measurements. The representations and algorithms used in GIZMO are described in detail and illustrated with several examples.
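GIZMO itself is described at length in the report; the fragment below sketches only the quantity-space idea named above, representing a quantity space as a set of inequalities closed under transitivity. The quantity names echo the boiling example but are otherwise illustrative, not GIZMO's actual representation.

```python
# A quantity space as a partial order of inequalities, with transitive
# closure deriving new orderings: a minimal sketch (not GIZMO's code).
from itertools import product

def closure(less_than):
    """Transitive closure of a set of pairs (a, b) meaning a < b."""
    rel = set(less_than)
    changed = True
    while changed:
        changed = False
        for (a, b), (c, d) in product(list(rel), repeat=2):
            if b == c and (a, d) not in rel:
                rel.add((a, d))
                changed = True
    return rel

# Boiling-flavoured example: the water's temperature lies between its
# initial temperature and the boiling point.
facts = {("T_initial", "T_water"), ("T_water", "T_boil")}
rel = closure(facts)
print(("T_initial", "T_boil") in rel)   # True: derived by transitivity
```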

Relevance:

100.00%

Publisher:

Abstract:

An investigation is made into the problem of constructing a model of the appearance, to an optical input device, of scenes consisting of plane-faced geometric solids. The goal is to study algorithms which find the real straight edges in the scenes, taking into account smooth variations in intensity over faces of the solids, blurring of edges, and noise. A general mathematical analysis is made of optimal methods for identifying the edge lines in figures, given a raster of intensities covering the entire field of view. In addition, a suboptimal statistical decision procedure, based on the model, is given for identifying a line within a narrow band of the field of view, given an array of intensities from within the band. A computer program has been written and extensively tested which implements this procedure and extracts lines from real scenes. Other programs were written which judge the completeness of extracted sets of lines, and propose and test for additional lines which had escaped initial detection. The performance of these programs is discussed in relation to the theory derived from the model, and with regard to their use of global information in detecting and proposing lines.
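The thesis programs are not reproduced here. The sketch below conveys the shape of a statistical decision for a line within a narrow band: under an additive Gaussian noise model, choosing the step-edge offset by maximum likelihood reduces to minimising a residual sum of squares, and the edge is accepted only if it beats the no-edge model by a margin. The intensity profile and acceptance gain are synthetic.

```python
# A likelihood-based decision for an edge within a narrow band:
# under i.i.d. Gaussian noise the most likely step-edge offset
# minimises the residual sum of squares (RSS); accept it only if it
# clearly beats the no-edge (constant-intensity) model. A toy sketch.
def rss_step(profile, k):
    """RSS for a step between index k-1 and k, with per-side means."""
    rss = 0.0
    for part in (profile[:k], profile[k:]):
        m = sum(part) / len(part)
        rss += sum((v - m) ** 2 for v in part)
    return rss

def detect_edge(profile, gain=2.0):
    m = sum(profile) / len(profile)
    rss0 = sum((v - m) ** 2 for v in profile)       # no-edge model
    k_best = min(range(1, len(profile)), key=lambda k: rss_step(profile, k))
    return k_best if rss0 > gain * rss_step(profile, k_best) else None

# Synthetic band: dark face, bright face, mild noise.
band = [10, 11, 9, 10, 30, 31, 29, 30]
print(detect_edge(band))   # -> 4
```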

Relevance:

100.00%

Publisher:

Abstract:

This thesis investigates the problem of controlling or directing the reasoning and actions of a computer program. The basic approach explored is to view reasoning as a species of action, so that a program might apply its reasoning powers to the task of deciding what inferences to make as well as deciding what other actions to take. A design for the architecture of reasoning programs is proposed. This architecture involves self-consciousness, intentional actions, deliberate adaptations, and a form of decision-making based on dialectical argumentation. A program based on this architecture inspects itself, describes aspects of itself, and uses this self-reference and these self-descriptions in making decisions and taking actions. The program's mental life includes awareness of its own concepts, beliefs, desires, intentions, inferences, actions, and skills. All of these are represented by self-descriptions in a single sort of language, so that the program has access to all of these aspects of itself, and can reason about them in the same terms.

Relevance:

100.00%

Publisher:

Abstract:

A computer program, named ADEPT (A Distinctly Empirical Prover of Theorems), has been written which proves theorems taken from the abstract theory of groups. Its operation is basically heuristic, incorporating many of the techniques of the human mathematician in a "natural" way. This program has proved almost 100 theorems, as well as serving as a vehicle for testing and evaluating special-purpose heuristics. A detailed description of the program is supplemented by accounts of its performance on a number of theorems, thus providing many insights into the particular problems inherent in the design of a procedure capable of proving a variety of theorems from this domain. Suggestions have been formulated for further efforts along these lines, and comparisons with related work previously reported in the literature have been made.