942 results for Cotes numbers
Abstract:
In this paper, a two-dimensional (2-D) numerical investigation of flow past four square cylinders in an in-line square configuration is performed using the lattice Boltzmann method. The gap spacing g=s/d is set at 1, 3 and 6, and the Reynolds number ranges from Re=60 to 175. We observed four distinct wake patterns: (i) a steady wake pattern (Re=60 and g=1); (ii) a stable shielding wake pattern (80≤Re≤175 and g=1); (iii) a wiggling shielding wake pattern (60≤Re≤175 and g=3); (iv) a vortex shedding wake pattern (60≤Re≤175 and g=6). At g=1, the Reynolds number is observed to have a strong effect on the wake patterns. It is also found that at g=1 the secondary cylinder interaction frequency contributes significantly to the drag and lift coefficient signals. At g=6, the primary vortex shedding frequency dominates the flow and the role of the secondary cylinder interaction frequency almost vanishes. The jet between the gaps is observed to strongly influence the wake interaction for different combinations of gap spacing and Reynolds number. To fully document the wake transformations, detailed vorticity contour visualizations, power spectra of the lift coefficient signal and time-signal analyses of the drag and lift coefficients are also presented in this paper.
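As an illustration of the spectral analysis mentioned above, the sketch below estimates the dominant (vortex-shedding) frequency from the power spectrum of a lift coefficient time series. The signal, frequencies and sampling step are synthetic values chosen for illustration, not data from the study.

```python
import numpy as np

# Synthetic lift-coefficient signal: a primary shedding component plus a
# weaker secondary interaction component (frequencies are made up).
dt = 0.01                                   # non-dimensional sampling step
t = np.arange(0.0, 200.0, dt)
f_shed, f_secondary = 0.16, 0.05
cl = np.sin(2 * np.pi * f_shed * t) + 0.3 * np.sin(2 * np.pi * f_secondary * t)

# Power spectrum of the (mean-removed) signal; the peak gives the dominant frequency.
freqs = np.fft.rfftfreq(len(cl), d=dt)
power = np.abs(np.fft.rfft(cl - cl.mean())) ** 2
print("dominant frequency:", freqs[np.argmax(power)])
```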
Abstract:
Quantum-inspired models have recently attracted increasing attention in Information Retrieval. An intriguing characteristic of the mathematical framework of quantum theory is the presence of complex numbers. However, it is unclear what such numbers could or would actually represent or mean in Information Retrieval. The goal of this paper is to discuss the role of complex numbers within the context of Information Retrieval. First, we introduce how complex numbers are used in quantum probability theory. Then, we examine van Rijsbergen’s proposal of evoking complex-valued representations of information objects. We empirically show that such a representation is unlikely to be effective in practice, calling into question its usefulness in Information Retrieval. We then explore alternative proposals which may be more successful at realising the power of complex numbers.
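For readers unfamiliar with the quantum probability machinery referred to above, the following minimal sketch (with arbitrary amplitudes) shows where complex numbers enter: probabilities are squared magnitudes of complex amplitudes, and the relative phase produces an interference term that a purely real-valued model cannot reproduce.

```python
import numpy as np

# Two complex amplitudes for reaching the same outcome via two paths/events.
# The magnitudes and phase below are arbitrary illustrative values.
a1 = 0.6 * np.exp(1j * 0.0)
a2 = 0.8 * np.exp(1j * np.pi / 3)

p_separate = abs(a1) ** 2 + abs(a2) ** 2   # classical-style sum of probabilities
p_combined = abs(a1 + a2) ** 2             # quantum rule: add amplitudes, then square

print(p_separate, p_combined)
# The difference is the interference term 2*Re(a1 * conj(a2)), which depends on phase.
print(p_combined - p_separate, 2 * (a1 * np.conj(a2)).real)
```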
Abstract:
We introduce a function Z(k) which measures the number of distinct ways in which a number can be expressed as a sum of Fibonacci numbers. Using a binary table and other devices, we explore the values that Z(k) can take and reveal a surprising relationship between the values of Z(k) and the Fibonacci numbers from which they were derived. The article shows how standard spreadsheet functionality makes it possible to reveal quite striking patterns in the data.
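A minimal sketch of such a counting function is given below. It assumes Z(k) counts representations of k as sums of distinct Fibonacci numbers, with the repeated 1 of the sequence counted only once; this is the usual convention for Fibonacci representations and may differ in detail from the article's definition.

```python
def z(k):
    """Count the ways of writing k as a sum of distinct Fibonacci numbers,
    taking the available terms to be 1, 2, 3, 5, 8, ... (assumption)."""
    fibs = [1, 2]
    while fibs[-1] + fibs[-2] <= k:
        fibs.append(fibs[-1] + fibs[-2])

    def count(remaining, idx):
        if remaining == 0:
            return 1
        if remaining < 0 or idx < 0:
            return 0
        # Either include fibs[idx] in the sum or skip it.
        return count(remaining - fibs[idx], idx - 1) + count(remaining, idx - 1)

    return count(k, len(fibs) - 1)

# e.g. 10 = 8 + 2 = 5 + 3 + 2, so z(10) should be 2
print([z(k) for k in range(1, 13)])
```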
Abstract:
The television quiz program Letters and Numbers, broadcast on the SBS network, has recently become quite popular in Australia. This paper considers an implementation in Excel 2010 and its potential as a vehicle to showcase a range of mathematical and computing concepts and principles.
Abstract:
The television quiz program Letters and Numbers, broadcast on the SBS network, has recently become quite popular in Australia. This paper explores the potential of this game to illustrate and engage student interest in a range of fundamental concepts of computer science and mathematics. The Numbers Game in particular has a rich mathematical structure whose analysis and solution involve concepts of counting and problem size, discrete (tree) structures, language theory, recurrences, computational complexity, and even advanced memory management. This paper presents an analysis of these games and their teaching applications, together with some initial results from their use in student assignments.
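By way of illustration of the search structure involved, the sketch below solves a Numbers Game instance by exhaustive recursion over pairs of intermediate values. The encoded rules (positive integer intermediates, exact division, each number used at most once) are the commonly quoted ones and are an assumption here; this is not the implementation discussed in the paper.

```python
def solve_numbers(numbers, target):
    """Return one arithmetic expression over the given numbers that equals
    the target, or None if no combination with +, -, *, / reaches it."""
    def search(items):
        # items is a list of (value, expression-string) pairs still available
        for i in range(len(items)):
            for j in range(len(items)):
                if i == j:
                    continue
                (a, ea), (b, eb) = items[i], items[j]
                rest = [items[k] for k in range(len(items)) if k not in (i, j)]
                candidates = [(a + b, f"({ea}+{eb})"), (a * b, f"({ea}*{eb})")]
                if a > b:
                    candidates.append((a - b, f"({ea}-{eb})"))
                if b != 0 and a % b == 0:
                    candidates.append((a // b, f"({ea}/{eb})"))
                for value, expr in candidates:
                    if value == target:
                        return expr
                    found = search(rest + [(value, expr)])
                    if found:
                        return found
        return None

    if target in numbers:
        return str(target)
    return search([(n, str(n)) for n in numbers])

# Example instance: reach 253 from the tiles 75, 50, 2, 3, 8, 7
print(solve_numbers([75, 50, 2, 3, 8, 7], 253))
```

The brute-force tree grows rapidly with the number of tiles, which is exactly the kind of problem-size and tree-structure discussion the abstract points to; the six-tile game stays tractable because the search stops at the first expression that hits the target.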
Abstract:
This is an update of an earlier paper, and is written for Excel 2007. A series of Excel 2007 models is described. The more advanced versions allow the equation f(x)=0 to be solved by examining changes of sign in the function values. The function is graphed, and a change of sign is easily detected by a change of colour. Relevant features of Excel 2007 used are Names, Scatter Chart and Conditional Formatting. Several sample Excel 2007 models are available for download, and the paper is intended to be used as a lesson plan for students having some familiarity with derivatives. For comparison and reference purposes, the paper also presents a brief outline of several common equation-solving strategies as an Appendix.
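A rough code equivalent of the change-of-sign idea is sketched below (Python rather than Excel). Tabulating the function and flagging sign changes mirrors what the spreadsheet does with conditional formatting; the bisection refinement step is an added illustration, not a feature of the spreadsheet models.

```python
import numpy as np

def sign_change_roots(f, a, b, steps=200):
    """Tabulate f over [a, b], flag each sub-interval where f changes sign,
    then refine each bracket by bisection."""
    xs = np.linspace(a, b, steps + 1)
    roots = []
    for x0, x1 in zip(xs[:-1], xs[1:]):
        f0, f1 = f(x0), f(x1)
        if f0 == 0.0:
            roots.append(x0)
        elif f0 * f1 < 0:              # sign change: a root lies in (x0, x1)
            lo, hi = x0, x1
            for _ in range(60):        # bisection refinement
                mid = 0.5 * (lo + hi)
                if f(lo) * f(mid) <= 0:
                    hi = mid
                else:
                    lo = mid
            roots.append(0.5 * (lo + hi))
    return roots

# Example: the real root of f(x) = x**3 - 2*x - 5 on [0, 3]
print(sign_change_roots(lambda x: x**3 - 2*x - 5, 0.0, 3.0))
```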
Abstract:
Ever since Cox et al. published their paper, "A Secure, Robust Watermark for Multimedia", in 1996 [6], there has been tremendous progress in multimedia watermarking. The same pattern re-emerged when Agrawal and Kiernan published their work "Watermarking Relational Databases" in 2001 [1]. However, little attention has been given to primitive data collections, with only a handful of research works known to the authors [11, 10]. This is primarily due to the absence of an attribute that differentiates marked items from unmarked items during the insertion and detection processes. This paper presents a distribution-independent watermarking model that is secure against secondary watermarking in addition to conventional attacks such as data addition, deletion and distortion. The low false positives and high capacity provide additional strength to the scheme. These claims are backed by experimental results provided in the paper.
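As a generic illustration of the problem described above (and emphatically not the scheme proposed in the paper), the sketch below uses a keyed hash of each value's stable high-order bits both to select items and to define the mark bit carried in the least significant bit, so detection needs only the secret key rather than a distinguishing attribute. The key and parameters are hypothetical.

```python
import hmac, hashlib

SECRET_KEY = b"owner-secret"     # hypothetical key
MARK_FRACTION = 8                # roughly 1 item in 8 carries a mark bit (assumed)

def key_bits(value):
    """Keyed decision for a value: (is it selected?, which bit should its LSB carry?)."""
    d = hmac.new(SECRET_KEY, str(value >> 1).encode(), hashlib.sha256).digest()
    return d[0] % MARK_FRACTION == 0, d[1] & 1

def embed(items):
    out = []
    for x in items:
        sel, bit = key_bits(x)
        out.append((x & ~1) | bit if sel else x)   # overwrite LSB on selected items
    return out

def detect(items):
    hits = total = 0
    for x in items:
        sel, bit = key_bits(x)
        if sel:
            total += 1
            hits += (x & 1) == bit
    return total and hits / total   # close to 1.0 => watermark present, ~0.5 => absent

data = list(range(1000, 1200))
print(detect(embed(data)), detect(data))
```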
Abstract:
This paper examines how ideas and practices of accounting come together in turning the abstract concept of climate change into a new non-financial performance measure in a large energy company in the UK. It develops the notion of ‘governmental management’ to explain how the firm’s carbon dioxide emissions were transformed into a new organisational object that could be made quantifiable, measurable and ultimately manageable because of the modern power of accounting in tying disciplinary subjectivities and objectivities together whilst operating simultaneously at the level of the individual and the organisation. Examining these interrelations highlights the constitutive nature of accounting in creating not just new categories for accounting’s attention, but in turn new organisational knowledge and knowledge experts in the making up of accounting for climate change. Significantly, it appears these new knowledge experts are no longer accountants, which may help explain accounting’s evolution into ever more spheres of influence as we increasingly choose to manage our world ‘by the numbers’.
Abstract:
Researchers spend an average of 38 working days preparing an NHMRC Project Grant proposal, but with success rates of just 15%, over 500 years of researcher time went into failed applications in 2014. This time would likely have been better spent on actual research. Many applications are non-competitive and could possibly be culled early, saving time for both researchers and funding agencies. Our analysis of the major health and medical research funding scheme in Australia estimated that 61% of applications were never likely to be funded...
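A back-of-the-envelope reconstruction of the 500-year figure is sketched below. Only the 38 days per proposal and the 15% success rate come from the abstract; the 2014 application count and the working days per researcher-year are assumptions chosen for illustration.

```python
# Rough check of the "500 researcher-years" claim under stated assumptions.
days_per_proposal = 38        # from the abstract
success_rate = 0.15           # from the abstract
n_apps = 3600                 # assumed number of Project Grant proposals in 2014
working_days_per_year = 230   # assumed working days in a researcher-year

failed = n_apps * (1 - success_rate)
wasted_years = failed * days_per_proposal / working_days_per_year
print(round(wasted_years))    # roughly 500 researcher-years on unsuccessful proposals
```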
Abstract:
We show that the algebraic intersection number of Scott and Swarup for splittings of free groups coincides with the geometric intersection number for the sphere complex of the connected sum of copies of S² × S¹. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
Petrifilm® (6410) was used directly on lamb carcasses to enumerate coliforms. Ten sites on 30 carcasses were sampled at each of four separate meat processing establishments (works). Coliform counts obtained by this technique were statistically analysed using analysis of variance (ANOVA) to select the optimum sampling sites on the carcass and to assess contamination of the carcass by gut flora at a particular establishment. There was large variation between sites and between works. In general, works 3 and 4 produced cleaner carcasses than works 2, which in turn was cleaner than works 1. Works 1, 2 and 4 used conventional dressing techniques and works 3 used the inverted dressing method; therefore, the coliform counts found at works 3 and 4 are achievable regardless of dressing technique. Coliform bacteria were most concentrated around the posterior pelvic rim and less prevalent at the carcass extremities. The posterior pelvic rim (sites 3 and 4) had higher (P < 0.05) coliform counts than the exterior ventral flank area (sites 5, 6, 7 and 8), which in turn had higher (P < 0.05) counts than the proximal hind and proximal fore limbs (sites 1, 2, 9 and 10) across all works. For in-line routine testing, it is recommended that the majority of carcasses sampled should give coliform counts of <50 cfu/20 cm² at sites 4 and 8. Reprinted with permission from the Journal of Food Protection. Copyright held by the International Association of Food Protection, Des Moines, Iowa, USA. Author affiliations: J.A. Guthrie & K.J. Dunlop, International Food Institute of Queensland, Department of Primary Industries, Rockhampton; G.A. Saunders, Veterinary Public Health Division, Livestock and Meat Authority of Queensland, Emerald.
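For readers unfamiliar with the analysis used, the sketch below runs a one-way ANOVA across site groups on synthetic log-transformed counts, mirroring the kind of site comparison reported above. The data and group means are invented for illustration and are not the study's measurements.

```python
import numpy as np
from scipy import stats

# Synthetic log10 coliform counts for three groups of carcass sites (assumed means).
rng = np.random.default_rng(0)
site_means = {"pelvic_rim": 2.2, "ventral_flank": 1.6, "limbs": 1.0}
samples = {site: rng.normal(mean, 0.4, size=30) for site, mean in site_means.items()}

# One-way ANOVA: a small p-value indicates the site means differ.
f_stat, p_value = stats.f_oneway(*samples.values())
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```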
Abstract:
A fast iterative scheme based on the Newton method is described for finding the reciprocal of a finite-segment p-adic number (Hensel code). The rate of generation of the reciprocal digits per step can be made quadratic or of higher order by a proper choice of the starting value and the iterating function. The extension of this method to find the inverse transform of the Hensel code of a rational polynomial over a finite field is also indicated.
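A minimal sketch of the quadratically convergent Newton iteration for such a reciprocal is given below, working modulo p^k as a stand-in for a finite-segment Hensel code. The brute-force starting value is chosen for clarity, not efficiency; it is not the starting-value strategy analysed in the paper.

```python
def hensel_reciprocal(a, p, k):
    """Newton iteration x -> x*(2 - a*x) for the inverse of a modulo p**k.
    Each step roughly doubles the number of correct p-adic digits
    (quadratic convergence). Requires gcd(a, p) = 1."""
    # Starting value: inverse of a modulo p, found by brute force for clarity.
    x = next(t for t in range(1, p) if (a * t) % p == 1)
    prec = 1
    while prec < k:
        prec = min(2 * prec, k)
        x = x * (2 - a * x) % p ** prec   # lift the inverse to higher precision
    return x

# Example: reciprocal of 3 modulo 5**6; the check product should print 1.
inv = hensel_reciprocal(3, 5, 6)
print(inv, (3 * inv) % 5**6)
```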
Abstract:
The aim of this dissertation is to provide conceptual tools for the social scientist for clarifying, evaluating and comparing explanations of social phenomena based on formal mathematical models. The focus is on relatively simple theoretical models and simulations, not statistical models. These studies apply a theory of explanation according to which explanation is about tracing objective relations of dependence, knowledge of which enables answers to contrastive why- and how-questions. This theory is developed further by delineating criteria for evaluating competing explanations and by applying the theory to social scientific modelling practices and to the key concepts of equilibrium and mechanism. The dissertation comprises an introductory essay and six published original research articles. The main theses about model-based explanations in the social sciences argued for in the articles are the following. 1) The concept of explanatory power, often used to argue for the superiority of one explanation over another, encompasses five dimensions which are partially independent and involve some systematic trade-offs. 2) Not all equilibrium explanations causally explain the attainment of the end equilibrium state from the multiple possible initial states. Instead, they often constitutively explain the macro property of the system in terms of the micro properties of the parts (together with their organization). 3) There is an important ambivalence in the concept of mechanism used in many model-based explanations, and this difference corresponds to a difference between two alternative research heuristics. 4) Whether unrealistic assumptions in a model (such as a rational choice model) are detrimental to an explanation provided by the model depends on whether the representation of the explanatory dependency in the model is itself dependent on the particular unrealistic assumptions. Thus evaluating whether a literally false assumption in a model is problematic requires specifying exactly what is supposed to be explained and by what. 5) The question of whether an explanatory relationship depends on particular false assumptions can be explored through the process of derivational robustness analysis, and the importance of robustness analysis accounts for some of the puzzling features of the tradition of model-building in economics. 6) The fact that economists have been relatively reluctant to use true agent-based simulations to formulate explanations can partly be explained by the specific ideal of scientific understanding implicit in the practice of orthodox economics.
Abstract:
In the present investigation, various kinds of textures, namely unidirectional, 8-ground and random, were produced on the die surfaces. The roughness of the textures was varied using different grits of emery paper or polishing powders. Pins made of Al-4Mg alloy were then slid against steel plates for various numbers of cycles, namely 1, 2, 6, 10 and 20, under both dry and lubricated conditions using an inclined pin-on-plate sliding tester. The morphologies of the worn surfaces of the pins and the formation of a transfer layer on the counter surfaces were observed using a scanning electron microscope. Surface roughness parameters of the plates were measured using an optical profilometer. It was observed that the coefficient of friction and the formation of the transfer layer during the first few cycles depend on the die surface texture under both dry and lubricated conditions. It was also observed that under lubricated conditions the coefficient of friction decreases with the number of cycles for all kinds of textures. Under dry conditions, however, it decreases for unidirectional and 8-ground surfaces, while for random surfaces it increases with the number of cycles.