943 results for Arithmetic.


Relevance:

10.00%

Publisher:

Abstract:

We have developed a compiler for the lexically-scoped dialect of LISP known as SCHEME. The compiler knows relatively little about specific data manipulation primitives such as arithmetic operators, but concentrates on general issues of environment and control. Rather than having specialized knowledge about a large variety of control and environment constructs, the compiler handles only a small basis set which reflects the semantics of lambda-calculus. All of the traditional imperative constructs, such as sequencing, assignment, looping, GOTO, as well as many standard LISP constructs such as AND, OR, and COND, are expressed in macros in terms of the applicative basis set. A small number of optimization techniques, coupled with the treatment of function calls as GOTO statements, serve to produce code as good as that produced by more traditional compilers. The macro approach enables speedy implementation of new constructs as desired without sacrificing efficiency in the generated code. A fair amount of analysis is devoted to determining whether environments may be stack-allocated or must be heap-allocated. Heap-allocated environments are necessary in general because SCHEME (unlike Algol 60 and Algol 68, for example) allows procedures with free lexically scoped variables to be returned as the values of other procedures; the Algol stack-allocation environment strategy does not suffice. The methods used here indicate that a heap-allocating generalization of the "display" technique leads to an efficient implementation of such "upward funargs". Moreover, compile-time optimization and analysis can eliminate many "funargs" entirely, and so far fewer environment structures need be allocated at run time than might be expected. A subset of SCHEME (rather than triples, for example) serves as the representation intermediate between the optimized SCHEME code and the final output code; code is expressed in this subset in the so-called continuation-passing style. As a subset of SCHEME, it enjoys the same theoretical properties; one could even apply the same optimizer used on the input code to the intermediate code. However, the subset is so chosen that all temporary quantities are made manifest as variables, and no control stack is needed to evaluate it. As a result, this apparently applicative representation admits an imperative interpretation which permits easy transcription to final imperative machine code. These qualities suggest that an applicative language like SCHEME is a better candidate for an UNCOL than the more imperative candidates proposed to date.
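As a rough illustration of the continuation-passing style used for the intermediate representation, the following sketch (written in Python purely for illustration; the compiler itself works on a subset of SCHEME) shows how naming every temporary quantity and passing control through an explicit continuation lets an applicative expression be read imperatively:

```python
# A minimal illustration of continuation-passing style (CPS), the kind of
# intermediate representation discussed above. Hypothetical example, not
# taken from the SCHEME compiler itself.

# Direct style: temporaries and control flow are implicit.
def hypot_direct(a, b):
    return (a * a + b * b) ** 0.5

# CPS: every intermediate value is named, and "returning" is a call to the
# continuation k; function calls behave like GOTOs with arguments, so no
# control stack is needed to evaluate the form.
def hypot_cps(a, b, k):
    t1 = a * a
    t2 = b * b
    t3 = t1 + t2
    return k(t3 ** 0.5)

if __name__ == "__main__":
    print(hypot_direct(3, 4))            # 5.0
    hypot_cps(3, 4, lambda r: print(r))  # 5.0
```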

Relevance:

10.00%

Publisher:

Abstract:

The digital divide has been, at least until very recently, a major theme in policy as well as interdisciplinary academic circles across the world, as well as at a collective global level, as attested by the World Summit on the Information Society. Numerous research papers and volumes have attempted to conceptualise the digital divide and to offer reasoned prescriptive and normative responses. What has been lacking in many of these studies, it is submitted, is a rigorous negotiation of moral and political philosophy, the result being a failure to situate the digital divide - or rather, more widely, information imbalances - in a holistic understanding of social structures of power and wealth. In practice, prescriptive offerings have been little more than philanthropic in tendency, whether private or corporate philanthropy. Instead, a theory of distributive justice is required, one that recovers the tradition of emancipatory, democratic struggle. This much has been said before. What is new here, however, is that the paper suggests a specific formula, the Rawls-Tawney theorem, as a solution at the level of analytical moral-political philosophy. Building on the work of John Rawls and R. H. Tawney, this avoids both the Charybdis of Marxism and the Scylla of liberalism. It delineates some of the details of the meaning of social justice in the information age. Promulgating a conception of isonomia, which while egalitarian eschews arithmetic equality (the equality of misery), the paper hopes to contribute to the emerging ideal of communicative justice in the media-saturated, post-industrial epoch.

Relevance:

10.00%

Publisher:

Abstract:

The digital divide continues to challenge political and academic circles worldwide. A range of policy solutions is briefly evaluated, from laissez-faire on the right to “arithmetic” egalitarianism on the left. The article recasts the digital divide as a problem for the social distribution of presumptively important information (e.g., electoral data, news, science) within postindustrial society. Endorsing in general terms the left-liberal approach of differential or “geometric” egalitarianism, it seeks to invest this with greater precision, and therefore utility, by means of a possibly original synthesis of the ideas of John Rawls and R. H. Tawney. It is argued that, once certain categories of information are accorded the status of “primary goods,” their distribution must then comply with principles of justice as articulated by those major 20th-century exponents of ethical social democracy. The resultant Rawls-Tawney theorem, if valid, might augment the portfolio of options for interventionist information policy in the 21st century.

Relevance:

10.00%

Publisher:

Abstract:

Mavron, Vassili; McDonough, T.P.; Schrikhande, M.S. (2003) 'Quasi-symmetric designs with good blocks and intersection number one', Designs, Codes and Cryptography 28(2), pp. 147-162. RAE2008

Relevance:

10.00%

Publisher:

Abstract:

In work that involves mathematical rigor, there are numerous benefits to adopting a representation of models and arguments that can be supplied to a formal reasoning or verification system: reusability, automatic evaluation of examples, and verification of consistency and correctness. However, accessibility has not been a priority in the design of formal verification tools that can provide these benefits. In earlier work [Lap09a], we attempt to address this broad problem by proposing several specific design criteria organized around the notion of a natural context: the sphere of awareness a working human user maintains of the relevant constructs, arguments, experiences, and background materials necessary to accomplish the task at hand. This work expands one aspect of the earlier work by considering more extensively an essential capability for any formal reasoning system whose design is oriented around simulating the natural context: native support for a collection of mathematical relations that deal with common constructs in arithmetic and set theory. We provide a formal definition for a context of relations that can be used to both validate and assist formal reasoning activities. We provide a proof that any algorithm that implements this formal structure faithfully will necessarily converge. Finally, we consider the efficiency of an implementation of this formal structure that leverages modular implementations of well-known data structures: balanced search trees and transitive closures of hypergraphs.
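A minimal sketch of the idea of a context of relations used to validate inferences, under the assumption (ours, not the paper's) that facts about an ordering relation are stored and kept transitively closed; the class and method names below are hypothetical:

```python
# A rough sketch (not the authors' implementation) of a tiny "context of
# relations": known facts about a partial order are stored, their transitive
# closure is maintained, and queries are validated against the closure.

class OrderContext:
    def __init__(self):
        self.le = set()  # set of (x, y) pairs meaning x <= y

    def assert_le(self, x, y):
        """Record the fact x <= y and re-close the relation transitively."""
        self.le.add((x, y))
        changed = True
        while changed:  # naive closure; real systems use incremental structures
            changed = False
            pairs = list(self.le)
            for a, b in pairs:
                for c, d in pairs:
                    if b == c and (a, d) not in self.le:
                        self.le.add((a, d))
                        changed = True

    def check_le(self, x, y):
        """Validate a claimed inequality against the closed context."""
        return x == y or (x, y) in self.le

ctx = OrderContext()
ctx.assert_le("a", "b")
ctx.assert_le("b", "c")
print(ctx.check_le("a", "c"))  # True: follows by transitivity
print(ctx.check_le("c", "a"))  # False: not derivable from the context
```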

Relevance:

10.00%

Publisher:

Abstract:

Lidar is an optical remote sensing instrument that can measure atmospheric parameters. A Raman lidar instrument (UCLID) was established at University College Cork to contribute to the European lidar network, EARLINET. System performance tests were carried out to ensure strict data quality assurance for submission to the EARLINET database. Procedures include: overlap correction, telecover test, Rayleigh test and zero bin test. Raman backscatter coefficients, extinction coefficients and lidar ratio were measured from April 2010 to May 2011 and February 2012 to June 2012. Statistical analysis of the profiles over these periods provided new information about the typical atmospheric scenarios over Southern Ireland in terms of aerosol load in the lower troposphere, the planetary boundary layer (PBL) height, aerosol optical depth (AOD) at 532 nm and lidar ratio values. The arithmetic average of the PBL height was found to be 608 ± 138 m with a median of 615 m, while the average AOD at 532 nm was 0.119 ± 0.023 for clean marine air masses and 0.170 ± 0.036 for polluted air masses. The lidar ratio showed a seasonal dependence, with lower values found in winter and autumn (20 ± 5 sr) and higher values during spring and winter (30 ± 12 sr). Volcanic particles from the eruption of the volcano Eyjafjallajökull in Iceland were detected between 21 April and 7 May 2010. The backscatter coefficient of the ash layer varied between 2.5 Mm-1 sr-1 and 3.5 Mm-1 sr-1, and the AOD at 532 nm was estimated to be between 0.090 and 0.215. Several aerosol loads due to Saharan dust particles were detected in spring 2011 and 2012. Lidar ratios of the dust layers were determined to be between 45 and 77 sr, and the AOD at 532 nm during the dust events ranged between 0.84 and 0.494.
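For illustration only, summary statistics of the kind quoted above (arithmetic mean ± standard deviation and median of the PBL height) can be computed from a series of retrievals as in the following sketch; the values are made-up placeholders, not measurement data:

```python
# Illustrative only: computing arithmetic mean, standard deviation and median
# from a series of retrieved PBL heights. The list below is hypothetical.
import statistics

pbl_heights_m = [450, 520, 615, 700, 640, 580, 750]  # made-up retrievals

mean = statistics.mean(pbl_heights_m)     # arithmetic average
stdev = statistics.stdev(pbl_heights_m)   # sample standard deviation
median = statistics.median(pbl_heights_m)

print(f"PBL height: {mean:.0f} +/- {stdev:.0f} m (median {median:.0f} m)")
```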

Relevance:

10.00%

Publisher:

Abstract:

Great demand for power-optimized devices shows promising economic potential and has drawn considerable attention in both industry and research. Due to continuously shrinking CMOS processes, not only dynamic power but also static power has emerged as a major concern in power reduction. Beyond power optimization, average-case power estimation is significant for power budget allocation, but it is also challenging in terms of time and effort. In this thesis, we introduce a methodology to support modular quantitative analysis for estimating the average power of circuits, based on two concepts named Random Bag Preserving and Linear Compositionality. It shortens simulation time while sustaining high accuracy, thereby increasing the feasibility of power estimation for large systems. For power saving, we first take advantage of the low-power characteristics of adiabatic logic and asynchronous logic to achieve ultra-low dynamic and static power. We propose two memory cells that can run in adiabatic and non-adiabatic modes. About 90% of dynamic power can be saved in adiabatic mode when compared to other up-to-date designs, and about 90% of leakage power is saved. Second, a novel logic family, named Asynchronous Charge Sharing Logic (ACSL), is introduced, in which the realization of completion detection is simplified considerably. Beyond the power reduction, ACSL brings another promising feature for average power estimation, data independence, which makes power estimation effortless and meaningful for modular quantitative average-case analysis. Finally, a new asynchronous Arithmetic Logic Unit (ALU) is presented, with a ripple-carry adder implemented using the logically reversible/bidirectional characteristic, exhibiting ultra-low power dissipation at a sub-threshold operating point. The proposed adder is able to operate multi-functionally.
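As a purely logical sketch of the ripple-carry adder structure mentioned above (it says nothing about the reversible, adiabatic or asynchronous circuit techniques of the thesis), consider:

```python
# A minimal bit-level model of a ripple-carry adder. Illustrative only; it
# captures the logic, not the low-power circuit implementation.

def full_adder(a, b, cin):
    """One full-adder cell: returns (sum_bit, carry_out)."""
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def ripple_carry_add(x, y, width=8):
    """Add two unsigned integers by rippling the carry through `width` cells."""
    carry, result = 0, 0
    for i in range(width):
        a, b = (x >> i) & 1, (y >> i) & 1
        s, carry = full_adder(a, b, carry)
        result |= s << i
    return result, carry  # final carry-out signals unsigned overflow

print(ripple_carry_add(100, 27))   # (127, 0)
print(ripple_carry_add(200, 100))  # (44, 1): wraps modulo 256 with carry-out 1
```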

Relevance:

10.00%

Publisher:

Abstract:

Gemstone Team Cognitive Training

Relevance:

10.00%

Publisher:

Abstract:

We prove that the first complex homology of the Johnson subgroup of the Torelli group Tg is a non-trivial, unipotent Tg-module for all g ≥ 4 and give an explicit presentation of it as a Sym H1(Tg, C)-module when g ≥ 6. We do this by proving that, for a finitely generated group G satisfying an assumption close to formality, the triviality of the restricted characteristic variety implies that the first homology of its Johnson kernel is a nilpotent module over the corresponding Laurent polynomial ring, isomorphic to the infinitesimal Alexander invariant of the associated graded Lie algebra of G. In this setup, we also obtain a precise nilpotence test. © European Mathematical Society 2014.

Relevance:

10.00%

Publisher:

Abstract:

Pattern generalization is considered one of the prominent routes for introducing students to algebra. However, not all generalizations are algebraic. In the use of pattern generalization as a route to algebra, we, teachers and educators, thus have to remain vigilant in order not to confound algebraic generalizations with other forms of dealing with the general. But how to distinguish between algebraic and non-algebraic generalizations? On epistemological and semiotic grounds, in this article I suggest a characterization of algebraic generalizations. This characterization helps to bring about a typology of algebraic and arithmetic generalizations. The typology is illustrated with classroom examples.

Relevance:

10.00%

Publisher:

Abstract:

Evolutionary conflicts among social hymenopteran nestmates are theoretically likely to arise over the production of males and the sex ratio. Analysis of these conflicts has become an important focus of research into the role of kin selection in shaping social traits of hymenopteran colonies. We employ microsatellite analysis of nestmates of one social hymenopteran, the primitively eusocial and monogynous bumblebee Bombus hypnorum, to evaluate these conflicts. In our 14 study colonies, B. hypnorum queens mated between one and six times (arithmetic mean 2.5). One male generally predominated, fathering most of the offspring, thus the effective number of matings was substantially lower (1–3.13; harmonic mean 1.26). In addition, microsatellite analysis allowed the detection of alien workers, those who could not have been the offspring of the queen, in approximately half the colonies. Alien workers within the same colony were probably sisters. Polyandry and alien workers resulted in high variation among colonies in their sociogenetic organization. Genetic data were consistent with the view that all males (n = 233 examined) were produced by a colony’s queen. Male parentage was therefore independent of the sociogenetic organization of the colony, suggesting that the queen, and not the workers, was in control of the laying of male-destined eggs. The population-wide sex ratio (fresh weight investment ratio) was weakly female biased. No evidence for colony-level adaptive sex ratio biasing could be detected.
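For illustration, the distinction drawn above between the arithmetic mean of observed mating numbers and the (lower) harmonic mean of effective mating numbers can be seen in a small sketch; the lists below are hypothetical, not the study's data:

```python
# Illustrative only: arithmetic mean of mating counts versus harmonic mean of
# effective mating numbers. Values are made up, not from the B. hypnorum study.
from statistics import mean, harmonic_mean

observed_matings = [1, 2, 2, 3, 4, 6]                 # matings per queen (made up)
effective_matings = [1.0, 1.1, 1.2, 1.3, 1.5, 3.1]    # skew-corrected (made up)

print(f"arithmetic mean of matings: {mean(observed_matings):.2f}")
print(f"harmonic mean of effective matings: {harmonic_mean(effective_matings):.2f}")
```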

Relevance:

10.00%

Publisher:

Abstract:

We investigate the group-valued functor G(D) = D*/F*D', where D is a division algebra with center F and D' the commutator subgroup of D*. We show that G has the most important functorial properties of the reduced Whitehead group SK1. We then establish a fundamental connection between this group, its residue version, and the relative value group when D is a Henselian division algebra. The structure of G(D) turns out to carry significant information about the arithmetic of D. Along these lines, we employ G(D) to compute the group SK1(D). As an application, we obtain theorems of reduced K-theory which previously required heavy machinery, as simple examples of our method.
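For reference, the functor under study and the reduced Whitehead group it is compared with can be written as follows (standard definitions assumed here, not quoted from the paper):

```latex
% Standard definitions (assumed): the group-valued functor studied above and
% the reduced Whitehead group, where D' = [D^*, D^*] is the commutator
% subgroup and Nrd_{D/F} is the reduced norm of the division algebra D over F.
\[
  G(D) = D^{*}/F^{*}D', \qquad
  SK_1(D) = \ker\bigl(\mathrm{Nrd}_{D/F}\colon D^{*}\to F^{*}\bigr)/D'.
\]
```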

Relevance:

10.00%

Publisher:

Abstract:

A novel application-specific instruction set processor (ASIP) for use in the construction of modern signal processing systems is presented. This is a flexible device that can be used in the construction of array processor systems for the real-time implementation of functions such as singular-value decomposition (SVD) and QR decomposition (QRD), as well as other important matrix computations. It uses a coordinate rotation digital computer (CORDIC) module to perform arithmetic operations and several approaches are adopted to achieve high performance including pipelining of the micro-rotations, the use of parallel instructions and a dual-bus architecture. In addition, a novel method for scale factor correction is presented which only needs to be applied once at the end of the computation. This also reduces computation time and enhances performance. Methods are described which allow this processor to be used in reduced dimension (i.e., folded) array processor structures that allow tradeoffs between hardware and performance. The net result is a flexible matrix computational processing element (PE) whose functionality can be changed under program control for use in a wider range of scenarios than previous work. Details are presented of the results of a design study, which considers the application of this decomposition PE architecture in a combined SVD/QRD system and demonstrates that a combination of high performance and efficient silicon implementation is achievable. © 2005 IEEE.
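The role of the CORDIC module can be sketched as follows (an illustrative software model, not the paper's hardware design); note that the scale-factor correction is applied once, after the final micro-rotation, in the spirit of the correction scheme described above:

```python
# A minimal software model of the CORDIC iteration (rotation mode), the kind
# of arithmetic module referred to above. Illustrative only; the paper's
# processing element pipelines the micro-rotations in hardware.
import math

def cordic_sin_cos(theta, iterations=32):
    """Approximate (cos(theta), sin(theta)) for |theta| <= pi/2 using
    shift-and-add micro-rotations."""
    # Elementary rotation angles atan(2^-i) and the accumulated gain correction K.
    angles = [math.atan(2.0 ** -i) for i in range(iterations)]
    K = 1.0
    for i in range(iterations):
        K *= 1.0 / math.sqrt(1.0 + 2.0 ** (-2 * i))

    x, y, z = 1.0, 0.0, theta
    for i in range(iterations):
        d = 1.0 if z >= 0 else -1.0                  # micro-rotation direction
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * angles[i]
    return x * K, y * K                              # single scale-factor correction at the end

c, s = cordic_sin_cos(math.pi / 6)
print(round(c, 6), round(s, 6))                      # ~0.866025 0.5
```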

Relevance:

10.00%

Publisher:

Abstract:

An application-specific programmable processor (ASIP) suitable for the real-time implementation of matrix computations such as Singular Value and QR Decomposition is presented. The processor incorporates facilities for the issue of parallel instructions and a dual-bus architecture that are designed to achieve high performance. Internally, it uses a CORDIC module to perform arithmetic operations, with pipelining of the internal recursive loop exploited to multiplex the two independent micro-rotations onto a single piece of hardware. The net result is a flexible processing element whose functionality can be changed under program control, which combines high performance with efficient silicon implementation. This is illustrated through the results of a detailed silicon design study and the application of the techniques to a combined SVD/QRD system.

Relevance:

10.00%

Publisher:

Abstract:

Scientific computation has unavoidable approximations built into its very fabric. One important source of error that is difficult to detect and control is round-off error propagation, which originates from the use of finite-precision arithmetic. We propose that there is a need to perform regular numerical 'health checks' on scientific codes in order to detect the cancerous effect of round-off error propagation. This is particularly important in scientific codes that are built on legacy software. We advocate the use of the CADNA library as a suitable numerical screening tool. We present a case study to illustrate the practical use of CADNA in scientific codes that are of interest to the Computer Physics Communications readership. In doing so we hope to stimulate a greater awareness of round-off error propagation and present a practical means by which it can be analyzed and managed.
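The kind of round-off error propagation at issue can be illustrated with a small example of catastrophic cancellation (this is not the CADNA library itself, only a sketch of the effect such a screening tool is meant to detect):

```python
# Illustrative only: round-off error propagation through catastrophic
# cancellation. Evaluate f(x) = (1 - cos(x)) / x**2 near x = 0; the true
# limit is 0.5, but the naive formula subtracts two nearly equal numbers.
import math

def f_naive(x):
    return (1.0 - math.cos(x)) / (x * x)

def f_stable(x):
    # Algebraically equivalent rewrite using 1 - cos(x) = 2*sin(x/2)**2,
    # which avoids the cancellation.
    s = math.sin(x / 2.0)
    return 2.0 * s * s / (x * x)

for x in (1e-4, 1e-6, 1e-8):
    print(f"x={x:.0e}  naive={f_naive(x):.15f}  stable={f_stable(x):.15f}")
```

Run as written, the naive evaluation drifts away from 0.5 and collapses to 0.0 at x = 1e-8, while the rewritten form stays accurate; a tool like CADNA is intended to flag exactly this loss of significant digits automatically.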