926 results for Elementary Methods In Number Theory
Abstract:
Intuition is an important and under-researched concept in information systems. Prior exploratory research has shown that there is potential to characterize the use of intuition in academic information systems research. This paper extends that research to all of the available issues of two leading IS journals, with the aim of reaching an approximation of theoretical saturation. Specifically, the entire text of MISQ and ISR was reviewed for the years 1990 through 2009 using searchable PDF versions of these publications. All references to intuition were coded on a basis consistent with Grounded Theory, interpreted as a gestalt, and represented as a mind-map. In the period 1990-2009, 681 instances of the use of "intuition" and related terms were found in the articles reviewed, representing a greater range of codes than in prior research. In addition, codes were assigned to all issues of MIS Quarterly from the commencement of publication to the end of the 2012 publication year to support the conjecture that coding saturation has been approximated. The most prominent use of the term "intuition" was coded as "Intuition as Authority", in which intuition was used to validate a statement, research objective, or finding; this represented approximately 34 per cent of the codes assigned. In research articles where mathematical analysis was presented, researchers not infrequently commented on the degree to which a mathematical formulation was "intuitive"; this was the second most common coding, representing approximately 16 per cent of the codes. Possibly the most impactful use of the term "intuition" was "Intuition as Outcome", representing approximately 7 per cent of all coding, which characterized research results as adding to the intuitive understanding of a research topic or phenomenon. This research aims to contribute to a greater theoretical understanding of the use of intuition in academic IS research publications. It offers potential benefits to practitioners by providing insight into the use of intuition in IS management, for example by emphasizing the emerging importance of "intuitive technology". Research directions include the creation of reflective and/or formative constructs for intuition in information systems research and the expansion of this novel research method to additional IS academic publications and topics.
Abstract:
Let M -> B, N -> B be fibrations and f_1, f_2: M -> N be a pair of fibre-preserving maps. Using normal bordism techniques we define an invariant which is an obstruction to deforming the pair f_1, f_2 over B to a coincidence-free pair of maps. In the special case where the two fibrations are the same and one of the maps is the identity, a weak version of our omega-invariant turns out to equal Dold's fixed point index of fibre-preserving maps. The concepts of Reidemeister classes and Nielsen coincidence classes over B are developed. As an illustration we compute, e.g., the minimal number of coincidence components for all homotopy classes of maps between S^1-bundles over S^1, as well as their Nielsen and Reidemeister numbers.
Abstract:
A number of recent works have introduced statistical methods for detecting genetic loci that affect phenotypic variability, which we refer to as variability-controlling quantitative trait loci (vQTL). These are genetic variants whose allelic state predicts how much phenotype values will vary about their expected means. Such loci are of great potential interest in both human and non-human genetic studies, one reason being that a detected vQTL could represent a previously undetected interaction with other genes or environmental factors. The simultaneous publication of these new methods in different journals has in many cases precluded the opportunity for comparison. We survey some of these methods, the respective trade-offs they imply, and the connections between them. The methods fall into three main groups: classical non-parametric, fully parametric, and semi-parametric two-stage approximations. Choosing between alternatives involves balancing the need for robustness, flexibility, and speed. For each method, we identify important assumptions and limitations, including those of practical importance, such as their scope for including covariates and random effects. We show in simulations that both parametric methods and their semi-parametric approximations can give elevated false-positive rates when they ignore mean-variance relationships intrinsic to the data-generation process. We conclude that the choice of method depends on the trait distribution, the need to include non-genetic covariates, and the population size and structure, coupled with a critical evaluation of how these fit with the assumptions of the statistical model.
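As a concrete illustration of the classical non-parametric group mentioned above, here is a minimal sketch (not from the paper; the simulated data, sample size, and effect sizes are assumptions) that screens a single locus for a variance effect with the Brown-Forsythe variant of Levene's test:

```python
# A minimal sketch of the classical non-parametric route to vQTL detection:
# test whether phenotype *spread* differs across genotype groups at a locus
# using the Brown-Forsythe (median-centred Levene) test. All data simulated.
import numpy as np
from scipy.stats import levene

rng = np.random.default_rng(0)

n = 600
genotype = rng.integers(0, 3, size=n)          # 0/1/2 copies of the minor allele
# Simulated vQTL: the allele shifts the residual variance, not the mean.
sd = np.where(genotype == 0, 1.0, np.where(genotype == 1, 1.3, 1.7))
phenotype = rng.normal(loc=10.0, scale=sd)

groups = [phenotype[genotype == g] for g in (0, 1, 2)]
stat, p = levene(*groups, center="median")     # Brown-Forsythe variant
print(f"Brown-Forsythe statistic = {stat:.2f}, p = {p:.2e}")
```

Because the test compares spread rather than means, a locus like this one, which shifts only the residual variance, can be flagged even though the group means coincide.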
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
The paper presents a new methodology to model material failure, in two-dimensional reinforced concrete members, using the Continuum Strong Discontinuity Approach (CSDA). Mixture theory is used as the methodological approach to model reinforced concrete as a composite material, constituted by a plain concrete matrix reinforced with two embedded orthogonal long-fiber bundles (rebars). Matrix failure is modeled on the basis of a continuum damage model, equipped with strain softening, whereas the rebar effects are modeled by means of phenomenological constitutive models devised to reproduce the axial non-linear behavior, as well as the bond-slip and dowel effects. The proposed methodology extends the fundamental ingredients of the standard Strong Discontinuity Approach, and of the embedded-discontinuity finite element formulations for homogeneous materials, to matrix/fiber composites such as reinforced concrete. The specific aspects of material failure modeling for those composites are also addressed. A number of available experimental tests are reproduced in order to illustrate the feasibility of the proposed methodology. (c) 2007 Elsevier B.V. All rights reserved.
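To make the matrix ingredient concrete, the following is a minimal 1D sketch of an isotropic damage law with exponential strain softening, the kind of continuum damage model the CSDA builds on; the modulus, threshold, and softening parameters are illustrative assumptions, not the paper's calibrated formulation:

```python
# A minimal 1D sketch of an isotropic continuum damage law with exponential
# strain softening (illustrative stand-in, not the paper's model).
import numpy as np

E, eps0, H = 30e9, 1e-4, 2000.0  # Young's modulus [Pa], threshold strain, softening parameter

def damaged_stress(eps_history):
    """Return stress for a monotonically applied strain history (no unloading)."""
    kappa = eps0                      # internal variable: largest strain seen so far
    stresses = []
    for eps in eps_history:
        kappa = max(kappa, abs(eps))
        if kappa <= eps0:
            d = 0.0                   # elastic regime, no damage
        else:
            # exponential softening: damage d grows from 0 toward 1
            d = 1.0 - (eps0 / kappa) * np.exp(-H * (kappa - eps0))
        stresses.append((1.0 - d) * E * eps)   # effective-stress concept
    return np.array(stresses)

eps = np.linspace(0.0, 8e-4, 50)
sigma = damaged_stress(eps)
print(f"peak stress ~ {sigma.max()/1e6:.1f} MPa at strain {eps[sigma.argmax()]:.1e}")
```

Past the threshold eps0, the stress decays instead of growing with strain, which is the softening branch that drives strain localization and hence the strong discontinuity kinematics.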
Abstract:
In the limit of small values of the aspect-ratio parameter (or wave steepness), which measures the amplitude of a surface wave in units of its wavelength, a model equation is derived from the Euler system in infinite depth (deep water) without the potential-flow assumption. The resulting equation is shown to sustain periodic waves which, on the one hand, tend to the proper linear limit at small amplitudes and, on the other, possess a threshold amplitude at which wave-crest peaking is achieved. An explicit expression for the crest angle at wave breaking is found in terms of the wave velocity. In numerical simulations, stable soliton-like solutions (experiencing elastic interactions) propagate within a given range of velocities, at the edge of which they tend to the peakon solution. (c) 2005 Elsevier B.V. All rights reserved.
Abstract:
Many years ago Zel'dovich showed how the Lagrange condition in the theory of differential equations can be utilized in the perturbation theory of quantum mechanics. Zel'dovich's method enables us to circumvent the summation over intermediate states. Compared with other similar methods, in particular the logarithmic perturbation expansion method, this relatively unknown method of Zel'dovich has a remarkable advantage in dealing with excited states: the ground and excited states can all be treated in the same way. The nodes of the unperturbed wavefunction do not give rise to any complication.
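The contrast at stake can be made explicit with standard textbook formulas (a sketch; this uses the closely related Dalgarno-Lewis formulation, not Zel'dovich's own derivation):

```latex
% Conventional Rayleigh-Schrodinger theory needs a sum over all intermediate states:
\[
  E_n^{(2)} \;=\; \sum_{m \neq n}
    \frac{|\langle m \,|\, V \,|\, n \rangle|^2}{E_n^{(0)} - E_m^{(0)}} .
\]
% An ODE-based route (Dalgarno--Lewis, similar in spirit to Zel'dovich's)
% instead solves directly for the first-order wavefunction correction,
\[
  \bigl(H_0 - E_n^{(0)}\bigr)\,\psi_n^{(1)}
    \;=\; \bigl(E_n^{(1)} - V\bigr)\,\psi_n^{(0)} ,
\]
% so the summation over intermediate states never appears.
```

The abstract's point is that Zel'dovich's variant achieves the same economy while treating ground and excited states uniformly, with no difficulty at the nodes of the unperturbed wavefunction.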
Abstract:
Group-theoretical techniques and fundamental results from number theory are used to allow for the construction of exact projectors in finite-dimensional spaces. These operators are shown to make use only of discrete variables, which play the role of discrete generator coordinates, and their application to number-symmetry restoration is carried out on a nuclear BCS wave function which explicitly violates that symmetry. © 1999 Published by Elsevier Science B.V. All rights reserved.
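As an illustration of how discrete variables can yield an exact projector in a finite-dimensional space, here is a minimal sketch of particle-number projection by a discrete sum over gauge angles; the tiny Fock space and target particle number are assumptions, and the construction is the generic one rather than the paper's specific group-theoretical recipe:

```python
# A minimal sketch of an exact particle-number projector built from a
# *discrete* set of gauge angles:
#   P_N = (1/L) * sum_k exp(-i*phi_k*N) * exp(i*phi_k*Nhat),  phi_k = 2*pi*k/L.
# With L exceeding the maximal particle number, the discrete sum is exact.
import numpy as np

n_levels = 4                       # single-particle levels (Fock dim = 2**n_levels)
dim = 2 ** n_levels
# Number operator is diagonal in the occupation basis: eigenvalue = popcount(index)
occ = np.array([bin(i).count("1") for i in range(dim)])

N_target = 2                       # particle number to restore (assumption)
L = n_levels + 1                   # enough discrete angles for exactness
phis = 2 * np.pi * np.arange(L) / L
P = sum(np.exp(-1j * phi * N_target) * np.diag(np.exp(1j * phi * occ))
        for phi in phis) / L

# Sanity checks: P is idempotent and keeps exactly the N_target-particle states.
assert np.allclose(P @ P, P)
assert np.allclose(np.real(np.diag(P)), (occ == N_target).astype(float))
print(f"projected subspace dimension = {int(round(np.real(np.trace(P))))}")
```

Applied to a BCS-type state, which mixes components of different particle number, such a projector filters out exactly the component with the desired number, restoring the broken symmetry.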
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Introduction: Advances in biotechnology have shed light on many biological processes. In biological networks, nodes are used to represent the function of individual entities within a system and have historically been studied in isolation. Network structure adds edges that enable communication between nodes. An emerging field combines node function and network structure to yield network function. One of the most complex networks known in biology is the neural network within the brain. Modeling neural function will require an understanding of networks, dynamics, and neurophysiology. This work develops modeling techniques that operate at this complex intersection. Methods: Spatial game theory was developed by Nowak in the context of modeling evolutionary dynamics, that is, the way in which species evolve over time. Spatial game theory offers a two-dimensional view of analyzing the state of neighbors and updating based on the surroundings. Our work builds upon this foundation by studying evolutionary game-theory networks with respect to neural networks. The novel concept is that neurons may adopt a particular strategy that will allow propagation of information; the strategy may therefore act as the mechanism for gating. Furthermore, the strategy of a neuron, as in a real brain, is impacted by the strategy of its neighbors. The techniques of spatial game theory already established by Nowak are repeated to explain two basic cases and to validate the implementation of the code. Two novel modifications, introduced in Chapters 3 and 4, build on this network and may reflect neural networks; a sketch of the basic setup with the first modification appears below. Results: The introduction of two novel modifications, mutation and rewiring, in large parametric studies resulted in dynamics in which an intermediate number of nodes fire at any given time. Further, even small mutation rates result in different dynamics, more representative of the ideal state hypothesized. Conclusions: In both modifications to Nowak's model, the results demonstrate that the network does not become locked into a particular global state of passing all information or blocking all information. It is hypothesized that normal brain function occurs within this intermediate range and that a number of diseases are the result of moving outside of this range.
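For reference, a minimal sketch of a Nowak-May style spatial game with the 'mutation' modification the thesis describes (grid size, payoff parameter, and mutation rate are illustrative assumptions, not the thesis's settings):

```python
# A minimal Nowak-May spatial game with mutation: each cell cooperates (1) or
# defects (0), collects prisoner's-dilemma payoffs from its neighbors, imitates
# its best-scoring neighbor, then flips strategy with small probability mu.
import numpy as np

rng = np.random.default_rng(1)
n, b, mu, steps = 50, 1.6, 0.01, 100      # grid side, defector advantage, mutation rate

grid = rng.integers(0, 2, size=(n, n))    # 1 = cooperate, 0 = defect

def rolled_neighbors(a):
    """The four von Neumann neighbors of every cell (periodic boundary)."""
    return [np.roll(a, s, axis=ax) for ax in (0, 1) for s in (1, -1)]

for _ in range(steps):
    coop_nbrs = sum(rolled_neighbors(grid))
    # Weak PD payoffs: cooperators earn 1 per cooperating neighbor; defectors earn b.
    payoff = np.where(grid == 1, coop_nbrs, b * coop_nbrs).astype(float)
    # Imitate the best-scoring cell in the neighborhood (including self).
    best, best_strat = payoff.copy(), grid.copy()
    for nb_pay, nb_strat in zip(rolled_neighbors(payoff), rolled_neighbors(grid)):
        better = nb_pay > best
        best = np.where(better, nb_pay, best)
        best_strat = np.where(better, nb_strat, best_strat)
    grid = best_strat
    # Mutation: rare random flips keep the lattice off all-or-none states.
    flip = rng.random((n, n)) < mu
    grid = np.where(flip, 1 - grid, grid)

print(f"fraction cooperating after {steps} steps: {grid.mean():.2f}")
```

Even a small mu keeps the lattice from locking into all-cooperate or all-defect, which mirrors the intermediate regime the thesis associates with normal function.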
Relative predicativity and dependent recursion in second-order set theory and higher-order theories
Abstract:
This article reports that some of the robustness of the notions of predicativity and of autonomous progression breaks down if, as the given infinite totality, we choose mathematical entities other than the traditional ω. Namely, the equivalence between the normal transfinite recursion scheme and the new dependent transfinite recursion scheme, which does hold in the context of subsystems of second-order number theory, does not hold in the context of subsystems of second-order set theory where the universe V of sets is treated as the given totality (nor in the contexts of (n+3)-th order number or set theories, where the class of all (n+2)-th order objects is treated as the given totality).
Abstract:
A new research project has quite recently been launched to clarify how systems in second-order set theory extending NBG (as well as those in (n+3)-th order number theory extending the so-called Bernays-Gödel expansion of full (n+2)-th order number theory, etc.) differ from systems in second-order number theory extending ACA_0. In this article, we establish the equivalence between $\Delta^1_0$-LFP and $\Delta^1_0$-FP, which assert the existence of a least and of a (not necessarily least) fixed point, respectively, for positive elementary operators (or between $\Delta^{n+2}_0$-LFP and $\Delta^{n+2}_0$-FP). Our proof also shows the equivalence between $\mathrm{ID}_1$ and $\widehat{\mathrm{ID}}_1$, both of which are defined in the standard way but with the starting theory PA replaced by ZFC (or by full (n+2)-th order number theory with global well-ordering).
Abstract:
Increasing evidence indicates that the tumor microenvironment (TME) is crucial in tumor survival and metastasis. Inflammatory cells accumulate around tumors and, strangely, appear to be permissive to their growth. One key stromal cell is the mast cell (MC), which can secrete numerous pro- and antitumor molecules. We investigated the presence and degranulation state of MC in pancreatic ductal adenocarcinoma (PDAC) as compared to acute pancreatitis (AP). Three different detection methods, (a) toluidine blue staining as well as immunohistochemistry for (b) tryptase and (c) c-kit, were utilized to assess the number and extent of degranulation of MC in PDAC tissue (n=7), uninvolved pancreatic tissue derived from tumor-free margins (n=7), and tissue from AP (n=4). The number of MC detected with all three methods was significantly increased in PDAC, as compared to normal pancreatic tissue derived from tumor-free margins (p<0.05). The highest number of MC was identified by c-kit: 22.2±7.5 per high-power field (HPF) in PDAC vs 9.7±5.1 per HPF in normal tissue. Contrary to MC in AP, where most of the detected MC were found degranulated, MC in PDAC appeared intact. In conclusion, MC are increased in number, but not degranulated, in PDAC, suggesting that they may contribute to cancer growth by permitting selective release of pro-tumorigenic molecules.
Abstract:
The objective of this thesis is to study the distribution of the number of principal ideals generated by an irreducible element in an algebraic number field, namely in the (generally non-unique-factorization) ring of integers of such a field. In particular we investigate the size of M(x), defined as $M(x) = \sum_{\substack{(\alpha),\ \alpha\ \mathrm{irred.} \\ |N(\alpha)| \le x}} 1$, where x is any positive real number and N(α) is the norm of α. We finally obtain asymptotic results for M(x).