9 results for Intervals of singularity

at National Center for Biotechnology Information - NCBI


Relevance:

90.00%

Publisher:

Abstract:

This paper deals with pattern recognition of the shape of the boundary of closed figures on the basis of a circular sequence of measurements taken on the boundary at equal intervals of a suitably chosen argument with an arbitrary starting point. A distance measure between two boundaries is defined in such a way that it has zero value when the associated sequences of measurements coincide by shifting the starting point of one of the sequences. Such a distance measure, which is invariant to the starting point of the sequence of measurements, is used in identification or discrimination by the shape of the boundary of a closed figure. The mean shape of a given set of closed figures is defined, and tests of significance of differences in mean shape between populations are proposed.
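The shift-invariant distance described above can be sketched in a few lines. The function below is a hypothetical illustration (the minimum, over all cyclic shifts of one sequence, of a Euclidean distance), not the paper's exact definition:

```python
import numpy as np

def shift_invariant_distance(a, b):
    """Distance between two circular sequences of boundary measurements
    that is zero when one is a cyclic shift of the other (hypothetical
    illustration, not the paper's exact definition)."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    # Try every starting point of b and keep the best match.
    return min(np.linalg.norm(a - np.roll(b, k)) for k in range(len(b)))

# The same boundary measured from two different starting points:
a = np.array([1.0, 2.0, 3.0, 4.0])
b = np.roll(a, 2)
print(shift_invariant_distance(a, b))  # 0.0
```

Because every shift is tried, the result does not depend on where the measurement sequence begins, which is the invariance property the abstract requires.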

Relevance:

90.00%

Publisher:

Abstract:

A large part of the pre-Columbian Maya book known as the Dresden Codex is concerned with an exploration of commensurate relationships among celestial cycles and their relationship to other, nonastronomical cycles of cultural interest. As has long been known, pages 43b–45b of the Codex are concerned with the synodic cycle of Mars. New work reported here with another part of the Codex, a complex table on pages 69–74, reveals a concern on the part of the ancient Maya astronomers with the sidereal motion of Mars as well as with its synodic cycle. Two kinds of empiric sidereal intervals of Mars were used, a long one (702 days) that included a retrograde loop and a short one that did not. The use of these intervals, which is indicated by the documents in the Dresden Codex, permitted the tracking of Mars across the zodiac and the relating of its movements to the terrestrial seasons and to the 260-day sacred calendar. While Kepler solved the sidereal problem of Mars by proposing an elliptical heliocentric orbit, anonymous but equally ingenious Maya astronomers discovered a pair of time cycles that not only accurately described the planet's motion, but also related it to other cosmic and terrestrial concerns.
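The commensurability at stake can be checked arithmetically. The sketch below uses the 702-day long sidereal interval quoted above together with the 780-day canonical Maya Mars synodic interval (3 × 260, the standard reading of the pages 43b–45b table, supplied here from general knowledge rather than from this abstract):

```python
from fractions import Fraction

TZOLKIN = 260        # days in the 260-day sacred calendar
SYNODIC = 780        # canonical Maya Mars synodic interval (3 x 260)
SIDEREAL_LONG = 702  # long empiric sidereal interval quoted above

# Each interval is a simple rational multiple of the sacred calendar:
print(Fraction(SYNODIC, TZOLKIN))        # 3     -> 1 synodic run = 3 rounds
print(Fraction(SIDEREAL_LONG, TZOLKIN))  # 27/10 -> 10 long intervals = 27 rounds
print(10 * SIDEREAL_LONG == 27 * TZOLKIN)  # True: both equal 7020 days
```

Small whole-number ratios like these are what allowed the Maya astronomers to relate the planet's motion to the 260-day calendar.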

Relevance:

90.00%

Publisher:

Abstract:

The rate- and state-dependent constitutive formulation for fault slip characterizes an exceptional variety of materials over a wide range of sliding conditions. This formulation provides a unified representation of diverse sliding phenomena including slip weakening over a characteristic sliding distance Dc, apparent fracture energy at a rupture front, time-dependent healing after rapid slip, and various other transient and slip rate effects. Laboratory observations and theoretical models both indicate that earthquake nucleation is accompanied by long intervals of accelerating slip. Strains from the nucleation process on buried faults generally could not be detected if laboratory values of Dc apply to faults in nature. However, scaling of Dc is presently an open question and the possibility exists that measurable premonitory creep may precede some earthquakes. Earthquake activity is modeled as a sequence of earthquake nucleation events. In this model, earthquake clustering arises from sensitivity of nucleation times to the stress changes induced by prior earthquakes. The model gives the characteristic Omori aftershock decay law and assigns physical interpretation to aftershock parameters. The seismicity formulation predicts that large changes of earthquake probabilities result from stress changes. Two mechanisms for foreshocks are proposed that describe the observed frequency of occurrence of foreshock-mainshock pairs by time and magnitude. With the first mechanism, foreshocks represent a manifestation of earthquake clustering in which the stress change at the time of the foreshock increases the probability of earthquakes at all magnitudes including the eventual mainshock. With the second model, accelerating fault slip on the mainshock nucleation zone triggers foreshocks.
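The Omori decay law mentioned above has a simple closed form. The snippet below evaluates the modified Omori rate n(t) = K/(c + t)^p with illustrative parameter values; K, c and p are empirical constants, not values taken from the paper:

```python
import math

def omori_rate(t, K=100.0, c=0.1, p=1.0):
    """Modified Omori law: aftershock rate at time t (days) after the
    mainshock. K, c and p are empirical parameters; these values are
    illustrative, not fitted to any catalog."""
    return K / (c + t) ** p

# The rate decays roughly as 1/t ...
print(omori_rate(1.0) > omori_rate(10.0) > omori_rate(100.0))  # True

# ... and for p = 1 the expected count in [t1, t2] has a closed form:
# K * log((c + t2) / (c + t1)).
expected = 100.0 * math.log((0.1 + 10.0) / (0.1 + 0.0))
print(round(expected, 1))  # 461.5
```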

Relevance:

90.00%

Publisher:

Abstract:

The paleontological record of the lower and middle Paleozoic Appalachian foreland basin demonstrates an unprecedented level of ecological and morphological stability on geological time scales. Some 70-80% of fossil morphospecies within assemblages persist in similar relative abundances in coordinated packages lasting as long as 7 million years despite evidence for environmental change and biotic disturbances. These intervals of stability are separated by much shorter periods of ecological and evolutionary change. This pattern appears widespread in the fossil record. Existing concepts of the evolutionary process are unable to explain this uniquely paleontological observation of faunawide coordinated stasis. A principle of evolutionary stability that arises from the ecosystem is explored here. We propose that hierarchical ecosystem theory, when extended to geological time scales, can explain long-term paleoecological stability as the result of ecosystem organization in response to high-frequency disturbance. The accompanying stability of fossil morphologies results from "ecological locking," in which selection is seen as a high-rate response of populations that is hierarchically constrained by lower-rate ecological processes. When disturbance exceeds the capacity of the system, ecological crashes remove these higher-level constraints, and evolution is free to proceed at high rates of directional selection during the organization of a new stable ecological hierarchy.

Relevance:

90.00%

Publisher:

Abstract:

It has become clear that many organisms possess the ability to regulate their mutation rate in response to environmental conditions. So the question of finding an optimal mutation rate must be replaced by that of finding an optimal mutation schedule. We show that this task cannot be accomplished with standard population-dynamic models. We then develop a "hybrid" model for populations experiencing time-dependent mutation that treats population growth as deterministic but the time of first appearance of new variants as stochastic. We show that the hybrid model agrees well with a Monte Carlo simulation. From this model, we derive a deterministic approximation, a "threshold" model, that is similar to standard population dynamic models but differs in the initial rate of generation of new mutants. We use these techniques to model antibody affinity maturation by somatic hypermutation. We had previously shown that the optimal mutation schedule for the deterministic threshold model is phasic, with periods of mutation between intervals of mutation-free growth. To establish the validity of this schedule, we now show that the phasic schedule that optimizes the deterministic threshold model significantly improves upon the best constant-rate schedule for the hybrid and Monte Carlo models.
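The "hybrid" idea, deterministic growth combined with a stochastic time of first appearance of new variants, can be sketched as a simple Monte Carlo draw. The model below (exponential wild-type growth, mutants arising as an inhomogeneous Poisson process with rate μ·N(t)) is an illustrative stand-in, not the paper's model:

```python
import math
import random

def first_mutant_time(mu, r=1.0, n0=1.0, dt=0.01, t_max=50.0):
    """One Monte Carlo draw of the time at which the first mutant appears,
    when the wild-type population grows deterministically as n0*exp(r*t)
    and mutants arise as a Poisson process with rate mu * N(t).
    Illustrative stand-in for the 'hybrid' model, not the paper's version."""
    t = 0.0
    while t < t_max:
        appearance_prob = mu * n0 * math.exp(r * t) * dt  # rate * dt
        if random.random() < appearance_prob:
            return t
        t += dt
    return None  # no mutant appeared by t_max

random.seed(0)
draws = [first_mutant_time(mu=1e-4) for _ in range(200)]
draws = [t for t in draws if t is not None]
print(round(sum(draws) / len(draws), 1))  # mean first-appearance time
```

Averaging many such draws is what lets a stochastic appearance time be compared against a deterministic ("threshold") approximation, as the abstract describes.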

Relevance:

80.00%

Publisher:

Abstract:

Geological, geophysical, and geochemical data support a theory that Earth experienced several intervals of intense, global glaciation (“snowball Earth” conditions) during Precambrian time. This snowball model predicts that postglacial, greenhouse-induced warming would lead to the deposition of banded iron formations and cap carbonates. Although global glaciation would have drastically curtailed biological productivity, melting of the oceanic ice would also have induced a cyanobacterial bloom, leading to an oxygen spike in the euphotic zone and to the oxidative precipitation of iron and manganese. A Paleoproterozoic snowball Earth at 2.4 Giga-annum before present (Ga) immediately precedes the Kalahari Manganese Field in southern Africa, suggesting that this rapid and massive change in global climate was responsible for its deposition. As large quantities of O2 are needed to precipitate this Mn, photosystem II and oxygen radical protection mechanisms must have evolved before 2.4 Ga. This geochemical event may have triggered a compensatory evolutionary branching in the Fe/Mn superoxide dismutase enzyme, providing a Paleoproterozoic calibration point for studies of molecular evolution.

Relevance:

80.00%

Publisher:

Abstract:

Magnetoencephalographic responses recorded from auditory cortex evoked by brief and rapidly successive stimuli differed between adults with poor vs. good reading abilities in four important ways. First, the response amplitude evoked by short-duration acoustic stimuli was stronger in the post-stimulus time range of 150–200 ms in poor readers than in normal readers. Second, response amplitude to rapidly successive and brief stimuli that were identical or that differed significantly in frequency were substantially weaker in poor readers compared with controls, for interstimulus intervals of 100 or 200 ms, but not for an interstimulus interval of 500 ms. Third, this neurological deficit closely paralleled subjects’ ability to distinguish between and to reconstruct the order of presentation of those stimulus sequences. Fourth, the average distributed response coherence evoked by rapidly successive stimuli was significantly weaker in the β- and γ-band frequency ranges (20–60 Hz) in poor readers, compared with controls. These results provide direct electrophysiological evidence supporting the hypothesis that reading disabilities are correlated with the abnormal neural representation of brief and rapidly successive sensory inputs, manifested in this study at the entry level of the cortical auditory/aural speech representational system(s).

Relevance:

80.00%

Publisher:

Abstract:

Requirements for testing include advance specification of the conditional rate density (probability per unit time, area, and magnitude) or, alternatively, probabilities for specified intervals of time, space, and magnitude. Here I consider testing fully specified hypotheses, with no parameter adjustments or arbitrary decisions allowed during the test period. Because it may take decades to validate prediction methods, it is worthwhile to formulate testable hypotheses carefully in advance. Earthquake prediction generally implies that the probability will be temporarily higher than normal. Such a statement requires knowledge of "normal behavior"; that is, it requires a null hypothesis. Hypotheses can be tested in three ways: (i) by comparing the number of actual earthquakes to the number predicted, (ii) by comparing the likelihood score of actual earthquakes to the predicted distribution, and (iii) by comparing the likelihood ratio to that of a null hypothesis. The first two tests are purely self-consistency tests, while the third is a direct comparison of two hypotheses. Predictions made without a statement of probability are very difficult to test, and any test must be based on the ratio of earthquakes in and out of the forecast regions.
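Test (i), comparing the number of actual earthquakes to the number predicted, can be sketched as a two-sided "number test". The implementation below assumes Poisson-distributed counts, an assumption of ours rather than a statement of the abstract:

```python
import math

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

def n_test(observed, predicted):
    """Two-sided 'number test': how surprising is the observed count if
    event counts are Poisson with mean `predicted`? Returns a p-value;
    the Poisson assumption is ours, not stated in the abstract."""
    p_low = sum(poisson_pmf(k, predicted) for k in range(observed + 1))
    p_high = 1.0 - sum(poisson_pmf(k, predicted) for k in range(observed))
    return min(1.0, 2.0 * min(p_low, p_high))

# A forecast of 10 events is consistent with observing 8 ...
print(n_test(8, 10.0) > 0.05)   # True
# ... but not with observing 25.
print(n_test(25, 10.0) > 0.05)  # False
```

Tests (ii) and (iii) would replace the raw count with log-likelihood scores of the observed catalog under the forecast and under a null hypothesis.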

Relevance:

80.00%

Publisher:

Abstract:

On fine scales, caustics produced with white light show vividly colored diffraction fringes. For caustics described by the elementary catastrophes of singularity theory, the colors are characteristic of the type of singularity. We study the diffraction colors of the fold and cusp catastrophes. The colors can be simulated computationally as the superposition of monochromatic patterns for different wavelengths. Far from the caustic, where the luminosity contrast is negligible, the fringe colors persist; an asymptotic theory explains why. Experiments with caustics produced by refraction through irregular bathroom-window glass show good agreement with theory. Colored fringes near the cusp reveal fine lines that are not present in any of the monochromatic components; these lines are explained in terms of partial decoherence between rays with widely differing path differences.
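The superposition of monochromatic patterns can be illustrated numerically. Far from a fold caustic the fringes are approximately sinusoidal with a spacing that scales as λ^(2/3) (Airy asymptotics for the fold); the sketch below superposes three such patterns to show why the maxima of different colors drift apart. The scaling and all parameter values are illustrative, not the paper's computation:

```python
import numpy as np

x = np.linspace(0.5, 5.0, 1000)   # distance from the fold caustic (arb. units)
wavelengths = {"blue": 0.45, "green": 0.55, "red": 0.65}  # micrometres

# Monochromatic fringe patterns: far from the fold the intensity is
# approximately sinusoidal with a period scaling as wavelength**(2/3),
# so the maxima of different colors fall in different places.
patterns = {name: np.cos(np.pi * x / lam ** (2.0 / 3.0)) ** 2
            for name, lam in wavelengths.items()}

# The "white light" pattern is the superposition of the three colors;
# because the component maxima do not coincide, the fringes are colored
# rather than a simple bright/dark system.
white = sum(patterns.values())

spacings = {name: lam ** (2.0 / 3.0) for name, lam in wavelengths.items()}
print(spacings["red"] > spacings["green"] > spacings["blue"])  # True
```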