948 results for Algebraic decoding


Relevance:

10.00%

Abstract:

Certain algebraic combinations of single scattering albedo and solar radiation reflected from, or transmitted through, vegetation canopies do not vary with wavelength. These “spectrally invariant relationships” are the consequence of wavelength independence of the extinction coefficient and scattering phase function in vegetation. In general, this wavelength independence does not hold in the atmosphere, but in cloud-dominated atmospheres the total extinction and total scattering phase function vary only weakly with wavelength. This paper identifies the atmospheric conditions under which the spectrally invariant approximation can accurately describe the extinction and scattering properties of cloudy atmospheres. The validity of the assumptions and the accuracy of the approximation are tested with 1D radiative transfer calculations using publicly available radiative transfer models: Discrete Ordinate Radiative Transfer (DISORT) and Santa Barbara DISORT Atmospheric Radiative Transfer (SBDART). It is shown for cloudy atmospheres with cloud optical depth above 3, and for spectral intervals that exclude strong water vapor absorption, that the spectrally invariant relationships found in vegetation canopy radiative transfer are valid to better than 5%. The physics behind this phenomenon, its mathematical basis, and possible applications to remote sensing and climate are discussed.
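
The spectral-invariant relationship at the heart of this abstract can be sketched numerically. Assuming the standard vegetation-canopy form R(λ)/ω(λ) = ρ + p·R(λ), where ω is the single scattering albedo and p is the wavelength-independent recollision probability (the particular values of p, ρ and the synthetic albedo spectrum below are illustrative assumptions, not data from the paper):

```python
import numpy as np

# Spectral-invariant form from vegetation canopy radiative transfer:
# R(lam) = omega(lam) * rho / (1 - p * omega(lam)), which rearranges to
# R(lam)/omega(lam) = rho + p * R(lam): linear in R, with wavelength-
# independent slope p ("recollision probability") and intercept rho.
p_true, rho_true = 0.85, 0.04                  # assumed illustrative values
omega = np.linspace(0.4, 0.99, 25)             # synthetic albedo spectrum
R = omega * rho_true / (1.0 - p_true * omega)  # synthetic reflectance

# Recover p and rho from a single linear fit of R/omega against R.
p_fit, rho_fit = np.polyfit(R, R / omega, 1)
print(p_fit, rho_fit)   # ≈ 0.85, 0.04
```

Because the coefficients of the linear relationship do not depend on wavelength, one fit across the whole spectrum recovers p and ρ; this is the sense in which the combination is "spectrally invariant".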

Relevance:

10.00%

Abstract:

We examine differential equations where nonlinearity is a result of the advection part of the total derivative or the use of quadratic algebraic constraints between state variables (such as the ideal gas law). We show that these types of nonlinearity can be accounted for in the tangent linear model by a suitable choice of the linearization trajectory. Using this optimal linearization trajectory, we show that the tangent linear model can be used to reproduce the exact nonlinear error growth of perturbations for more than 200 days in a quasi-geostrophic model and more than (the equivalent of) 150 days in the Lorenz 96 model. We introduce an iterative method, purely based on tangent linear integrations, that converges to this optimal linearization trajectory. The main conclusion from this article is that this iterative method can be used to account for nonlinearity in estimation problems without using the nonlinear model. We demonstrate this by performing forecast sensitivity experiments in the Lorenz 96 model and show that we are able to estimate analysis increments that improve the two-day forecast using only four backward integrations with the tangent linear model. Copyright © 2011 Royal Meteorological Society
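
The Lorenz 96 model named in this abstract, and its tangent linear model (TLM), can be sketched as follows. The forcing, state size and integration settings are illustrative assumptions; the script only verifies that the TLM reproduces the nonlinear growth of a small perturbation over a short window, not the paper's optimal-trajectory iteration:

```python
import numpy as np

def l96(x, F=8.0):
    # Lorenz 96 tendencies: dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def l96_tlm(x, dx):
    # Tangent linear model: the Jacobian of l96 at state x applied to dx
    return ((np.roll(dx, -1) - np.roll(dx, 2)) * np.roll(x, 1)
            + (np.roll(x, -1) - np.roll(x, 2)) * np.roll(dx, 1) - dx)

rng = np.random.default_rng(0)
x  = 8.0 + rng.standard_normal(40)     # base state (40 variables, assumed)
dx = 1e-6 * rng.standard_normal(40)    # small initial perturbation

xa, xb, d = x.copy(), x + dx, dx.copy()
dt = 0.005
for _ in range(200):                   # one model time unit, forward Euler
    d  = d + dt * l96_tlm(xa, d)       # TLM propagated along the unperturbed run
    xa = xa + dt * l96(xa)             # unperturbed nonlinear trajectory
    xb = xb + dt * l96(xb)             # perturbed nonlinear trajectory

# For an infinitesimal perturbation the TLM matches the nonlinear difference.
rel_err = np.linalg.norm((xb - xa) - d) / np.linalg.norm(d)
print(rel_err)   # small: the TLM tracks the nonlinear error growth
```

The TLM step is the exact linearization of the Euler step of the nonlinear model, so the mismatch is governed only by the size of the initial perturbation, which is the property the abstract exploits over much longer windows.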

Relevance:

10.00%

Abstract:

In 'Avalanche', an object is lowered, players staying in contact throughout. Normally the task is easily accomplished. However, with larger groups counter-intuitive behaviours appear. The paper proposes a formal theory for the underlying causal mechanisms. The aim is not only to provide an explicit, testable hypothesis for the source of the observed modes of behaviour, but also to exemplify the contribution that formal theory building can make to understanding complex social phenomena. Mapping reveals the importance of geometry to the Avalanche game; each player has a pair of balancing loops, one involved in lowering the object, the other ensuring contact. For more players, sets of balancing loops interact and these can allow dominance by reinforcing loops, causing the system to chase upwards towards an ever-increasing goal. However, a series of other effects concerning human physiology and behaviour (HPB) is posited as playing a role. The hypothesis is therefore rigorously tested using simulation. For simplicity a 'One Degree of Freedom' case is examined, allowing all of the effects to be included whilst rendering the analysis more transparent. Formulation and experimentation with the model gives insight into the behaviours. Multi-dimensional rate/level analysis indicates that there is only a narrow region in which the system is able to move downwards. Model runs reproduce the single 'desired' mode of behaviour and all three of the observed 'problematic' ones. Sensitivity analysis gives further insight into the system's modes and their causes. Behaviour is seen to arise only when the geometric effects apply (number of players greater than degrees of freedom of the object) in combination with a range of HPB effects. An analogy exists between the co-operative behaviour required here and various examples: conflicting strategic objectives in organizations, the Prisoners' Dilemma and integrated bargaining situations.
Additionally, the game may be relatable in more direct algebraic terms to situations involving companies in which the resulting behaviours are mediated by market regulations. Finally, comment is offered on the inadequacy of some forms of theory building and the case is made for formal theory building involving the use of models, analysis and plausible explanations to create deep understanding of social phenomena.

Relevance:

10.00%

Abstract:

The concept of slow vortical dynamics and its role in theoretical understanding is central to geophysical fluid dynamics. It leads, for example, to “potential vorticity thinking” (Hoskins et al. 1985). Mathematically, one imagines an invariant manifold within the phase space of solutions, called the slow manifold (Leith 1980; Lorenz 1980), to which the dynamics are constrained. Whether this slow manifold truly exists has been a major subject of inquiry over the past 20 years. It has become clear that an exact slow manifold is an exceptional case, restricted to steady or perhaps temporally periodic flows (Warn 1997). Thus the concept of a “fuzzy slow manifold” (Warn and Ménard 1986) has been suggested. The idea is that nearly slow dynamics will occur in a stochastic layer about the putative slow manifold. The natural question then is, how thick is this layer? In a recent paper, Ford et al. (2000) argue that Lighthill emission—the spontaneous emission of freely propagating acoustic waves by unsteady vortical flows—is applicable to the problem of balance, with the Mach number Ma replaced by the Froude number F, and that it is a fundamental mechanism for this fuzziness. They consider the rotating shallow-water equations and find emission of inertia–gravity waves at O(F²). This is rather surprising at first sight, because several studies of balanced dynamics with the rotating shallow-water equations have gone beyond second order in F, and found only an exponentially small unbalanced component (Warn and Ménard 1986; Lorenz and Krishnamurthy 1987; Bokhove and Shepherd 1996; Wirosoetisno and Shepherd 2000). We have no technical objection to the analysis of Ford et al. (2000), but wish to point out that it depends crucially on R ≫ 1, where R is the Rossby number. This condition requires the ratio of the characteristic length scale of the flow L to the Rossby deformation radius LR to go to zero in the limit F → 0. This is the low Froude number scaling of Charney (1963), which, while originally designed for the Tropics, has been argued to be also relevant to mesoscale dynamics (Riley et al. 1981). If L/LR is fixed, however, then F → 0 implies R → 0, which is the standard quasigeostrophic scaling of Charney (1948; see, e.g., Pedlosky 1987). In this limit there is reason to expect the fuzziness of the slow manifold to be “exponentially thin,” and balance to be much more accurate than is consistent with (algebraic) Lighthill emission.

Relevance:

10.00%

Abstract:

To date, only one study has investigated educational attainment in poor (reading) comprehenders, providing evidence of poor performance on national UK school tests at age 11 years relative to peers (Cain & Oakhill, 2006). In the present study, we adopted a longitudinal approach, tracking attainment on such tests from 11 years to the end of compulsory schooling in the UK (age 16 years). We aimed to investigate the proposal that educational weaknesses (defined as poor performance on national assessments) might become more pronounced over time, as the curriculum places increasing demands on reading comprehension. Participants comprised 15 poor comprehenders and 15 controls; groups were matched for chronological age, nonverbal reasoning ability and decoding skill. Children were identified at age 9 years using standardised measures of nonverbal reasoning, decoding and reading comprehension. These measures, along with a measure of oral vocabulary knowledge, were repeated at age 11 years. Data on educational attainment were collected from all participants (N = 30) at age 11 and from a subgroup (n = 21) at 16 years. Compared to controls, educational attainment in poor comprehenders was lower at ages 11 and 16 years, an effect that was significant at 11 years. When poor comprehenders were compared to national performance levels, they showed significantly lower performance at both time points. Low educational attainment was not evident for all poor comprehenders. Nonetheless, our findings point to a link between reading comprehension difficulties in mid to late childhood and poor educational outcomes at ages 11 and 16 years. At these ages, pupils in the UK are making key transitions: they move from primary to secondary schools at 11, and out of compulsory schooling at 16.

Relevance:

10.00%

Abstract:

This paper explores a group of Singaporean English language teachers’ knowledge and beliefs about critical literacy, as well as their perspectives on how best to teach literacy and critical literacy in Singapore schools. A face-to-face survey of 58 English language teachers was conducted using open-ended questions. The survey covered various topics related to literacy instruction, including text decoding, meaning construction and critical analysis of texts. The participating teachers believed strongly that reading and writing are transactional and interactional practices. However, they were less certain in their beliefs about teaching critical literacy, including the critical, analytical and evaluative aspects of text reading. Some teachers saw a conflict between spending time on teaching critical literacy and preparing students to pass their exams. As critical literacy is not assessed in examinations, they found it difficult to justify spending time teaching it. The results suggest that the teachers’ belief systems are strongly influenced by the broad macrostructure of the educational system in Singapore and by their own educational experiences.

Relevance:

10.00%

Abstract:

Implicit dynamic-algebraic equations, known in control theory as descriptor systems, arise naturally in many applications. Such systems may not be regular (they are often referred to as singular). In that case the equations may not have unique solutions for consistent initial conditions and arbitrary inputs, and the system may not be controllable or observable. Many control systems can be regularized by proportional and/or derivative feedback. We present an overview of mathematical theory and numerical techniques for regularizing descriptor systems using feedback controls. The aim is to provide stable numerical techniques for analyzing and constructing regular control and state estimation systems and for ensuring that these systems are robust. State and output feedback designs for regularizing linear time-invariant systems are described, including methods for disturbance decoupling and mixed output problems. Extensions of these techniques to time-varying linear and nonlinear systems are discussed in the final section.
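
The regularization-by-feedback idea can be illustrated on a minimal descriptor system. Assuming derivative feedback u = v − Gẋ, the system Eẋ = Ax + Bu becomes (E + BG)ẋ = Ax + Bv, and it suffices to choose G so that E + BG is nonsingular (the matrices below are invented for illustration):

```python
import numpy as np

# Descriptor system  E x' = A x + B u  with singular E.
E = np.array([[1.0, 0.0],
              [0.0, 0.0]])   # rank deficient: the second equation is algebraic
A = np.array([[0.0, 1.0],
              [1.0, 1.0]])
B = np.array([[0.0],
              [1.0]])

# Derivative feedback u = v - G x' turns  E x' = A x + B u  into
# (E + B G) x' = A x + B v.  Pick G so that E + B G is nonsingular.
G = np.array([[0.0, 1.0]])
E_reg = E + B @ G            # here E + B G equals the identity matrix

print(np.linalg.matrix_rank(E), np.linalg.matrix_rank(E_reg))   # 1 2
```

With E + BG invertible, the closed-loop system is an ordinary (regular) ODE system, so unique solvability for consistent initial conditions is restored.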

Relevance:

10.00%

Abstract:

This paper reviews the literature concerning the practice of using Online Analytical Processing (OLAP) systems to recall information stored by Online Transactional Processing (OLTP) systems. Such a review provides a basis for discussion of the need for information recalled through OLAP systems to maintain the contexts of transactions held in the data captured by the respective OLTP system. The paper observes an industry trend in which OLTP systems process information into data that are then stored in databases without the business rules that were used to process them. This necessitates a practice whereby sets of business rules are used to extract, cleanse, transform and load data from disparate OLTP systems into OLAP databases to support the requirements for complex reporting and analytics. These sets of business rules are usually not the same as the business rules used to capture the data in the particular OLTP systems. The paper argues that differences between the business rules used to interpret the same data sets risk gaps in semantics between the information captured by OLTP systems and the information recalled through OLAP systems. Literature concerning the modelling of business transaction information as facts with context, as part of the modelling of information systems, was reviewed to identify design trends that contribute to the design quality of OLTP and OLAP systems. The paper then argues that the quality of OLTP and OLAP systems design depends critically on the capture of facts with associated context; the encoding of facts with context into data with business rules; the storage and sourcing of data with business rules; the decoding of data with business rules back into facts with context; and the recall of facts with associated contexts.
The paper proposes UBIRQ, a design model to aid the co-design of data and business-rule storage for OLTP and OLAP purposes. The proposed design model provides the opportunity to implement and use multi-purpose databases and business-rule stores for OLTP and OLAP systems. Such implementations would enable OLTP systems to record and store data together with executions of business rules, allowing both OLTP and OLAP systems to query data with the business rules used to capture them, thereby ensuring that information recalled via OLAP systems preserves the contexts of transactions as per the data captured by the respective OLTP system.
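
The core proposal, recalling data with the same business rules used to capture them, can be sketched in a few lines. Everything here (the rule names, the VAT-style rules, the in-memory store) is a hypothetical illustration, not the UBIRQ model itself:

```python
# Hypothetical sketch: store each fact together with the id of the business
# rule that encoded it, so any later (OLAP-style) recall can decode the data
# with the *same* rule, even after the rules change.
RULES = {
    "net_price_v1": {"encode": lambda gross: round(gross / 1.20, 2),  # strip 20% VAT
                     "decode": lambda net: round(net * 1.20, 2)},
    "net_price_v2": {"encode": lambda gross: round(gross / 1.25, 2),  # VAT changed
                     "decode": lambda net: round(net * 1.25, 2)},
}

store = []  # each row carries the rule id alongside the encoded data

def capture(gross_amount, rule_id):            # OLTP-side capture
    store.append({"net": RULES[rule_id]["encode"](gross_amount),
                  "rule": rule_id})

def recall():                                  # OLAP-side recall
    # Decode every row with the rule recorded at capture time, not a global one.
    return [RULES[row["rule"]]["decode"](row["net"]) for row in store]

capture(120.0, "net_price_v1")
capture(125.0, "net_price_v2")
print(recall())   # [120.0, 125.0] — original amounts preserved across rule changes
```

Decoding with a single "current" rule instead of the recorded one would silently corrupt the older row, which is exactly the semantic gap the abstract warns about.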

Relevance:

10.00%

Abstract:

Let L be a number field and let E/L be an elliptic curve with complex multiplication by the ring of integers O_K of an imaginary quadratic field K. We use class field theory and results of Skorobogatov and Zarhin to compute the transcendental part of the Brauer group of the abelian surface E × E. The results for the odd order torsion also apply to the Brauer group of the K3 surface Kum(E × E). We describe explicitly the elliptic curves E/Q with complex multiplication by O_K such that the Brauer group of E × E contains a transcendental element of odd order. We show that such an element gives rise to a Brauer–Manin obstruction to weak approximation on Kum(E × E), while there is no obstruction coming from the algebraic part of the Brauer group.

Relevance:

10.00%

Abstract:

In this paper we investigate the classification of mappings up to K-equivalence. We give several results of this type. We study semialgebraic deformations up to semialgebraic C^0 K-equivalence and bi-Lipschitz K-equivalence. We give an algebraic criterion for bi-Lipschitz K-triviality in terms of the semi-integral closure (Theorem 3.5). We also give a new proof of a result of Nishimura: we show that two germs of smooth mappings f, g : R^n → R^n, finitely determined with respect to K-equivalence, are C^0 K-equivalent if and only if they have the same degree in absolute value.

Relevance:

10.00%

Abstract:

We use an inequality due to Bochnak and Łojasiewicz, which follows from the Curve Selection Lemma of real algebraic geometry, in order to prove that, given a C^r function f : U ⊂ R^m → R, we have lim_{y → x, y ∈ crit(f)} |f(y) − f(x)| / |y − x|^r = 0 for all x ∈ crit(f)′ ∩ U, where crit(f) = {x ∈ U : df(x) = 0}. This shows that the so-called Morse decomposition of the critical set, used in the classical proof of the Morse–Sard theorem, is not necessary: the conclusion of the Morse decomposition lemma holds for the whole critical set. We use this result to give a simple proof of the classical Morse–Sard theorem (with sharp differentiability assumptions).

Relevance:

10.00%

Abstract:

In this paper we present results for the systematic study of reversible-equivariant vector fields - namely, in the simultaneous presence of symmetries and reversing symmetries - by employing algebraic techniques from invariant theory for compact Lie groups. The Hilbert–Poincaré series and their associated Molien formulae are introduced, and we prove the character formulae for the computation of dimensions of spaces of homogeneous anti-invariant polynomial functions and reversible-equivariant polynomial mappings. A symbolic algorithm is obtained for the computation of generators for the module of reversible-equivariant polynomial mappings over the ring of invariant polynomials. We show that this computation can be obtained directly from a well-known situation, namely from the generators of the ring of invariants and the module of the equivariants. © 2008 Elsevier B.V. All rights reserved.
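
The Molien formula mentioned in this abstract can be evaluated symbolically for the simplest nontrivial example. The group below (Z_2 acting on R^2 by x ↦ −x) and the reading of the coefficients are an illustration chosen here, not the paper's computation:

```python
from sympy import Rational, eye, symbols

t = symbols('t')

def molien(group):
    # Molien series  M(t) = (1/|G|) * sum_g 1/det(I - t*g): its degree-d
    # coefficient is the dimension of the space of invariant homogeneous
    # polynomials of degree d.
    n = group[0].shape[0]
    return sum(Rational(1) / (eye(n) - t * g).det() for g in group) / len(group)

# Z_2 acting on R^2 by x -> -x: the two group elements as matrices.
group = [eye(2), -eye(2)]
M = molien(group)
coeffs = M.series(t, 0, 6).removeO()
print(coeffs)   # series begins 1 + 3*t**2 + 5*t**4
```

The degree-2 coefficient 3 matches the three quadratic invariants x², xy, y² (every quadratic is invariant under x ↦ −x, while no odd-degree polynomial is), which is the kind of dimension count the character formulae in the paper generalize.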

Relevance:

10.00%

Abstract:

Moving-least-squares (MLS) surfaces undergoing large deformations need periodic regeneration of the point set (point-set resampling) so as to keep the point-set density quasi-uniform. Previous work by the authors dealt with algebraic MLS surfaces, and proposed a resampling strategy based on defining the new points at the intersections of the MLS surface with a suitable set of rays. That strategy has very low memory requirements and is easy to parallelize. In this article new resampling strategies with reduced CPU-time cost are explored. The basic idea is to choose as the set of rays the lines of a regular Cartesian grid, and to fully exploit this grid: as a data structure for search queries, as a spatial structure for traversing the surface in a continuation-like algorithm, and also as an approximation grid for an interpolated version of the MLS surface. It is shown that in this way a very simple and compact resampling technique is obtained, which cuts the resampling cost by half with affordable memory requirements.
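
The ray–grid intersection strategy can be sketched in 2D. A simple analytic implicit (the unit circle) stands in for a genuine MLS surface, and the grid lines, search interval and tolerances are illustrative assumptions:

```python
import numpy as np

def implicit(p):
    # Stand-in for an algebraic MLS surface: the unit circle x^2 + y^2 - 1 = 0.
    return p[0]**2 + p[1]**2 - 1.0

def bisect(f, a, b, iters=60):
    # Root of f on the segment a->b, assuming f(a) and f(b) differ in sign.
    for _ in range(iters):
        m = 0.5 * (a + b)
        if f(a) * f(m) <= 0.0:
            b = m
        else:
            a = m
    return 0.5 * (a + b)

# Rays = horizontal lines of a regular Cartesian grid; the resampled points
# are the intersections of those lines with the implicit surface.
points = []
for y in np.linspace(-0.9, 0.9, 7):            # grid lines crossing the circle
    xs = np.linspace(-1.95, 1.95, 40)          # grid nodes along each line
    vals = [implicit(np.array([x, y])) for x in xs]
    for i in range(len(xs) - 1):
        if vals[i] * vals[i + 1] < 0.0:        # sign change: the line crosses
            points.append(bisect(implicit,
                                 np.array([xs[i], y]),
                                 np.array([xs[i + 1], y])))

# Every resampled point should lie on the surface (here: on the unit circle).
residual = max(abs(implicit(p)) for p in points)
print(len(points), residual)   # 14 intersections, residual ~ 0
```

The same grid that supplies the rays also localizes the sign changes, which is the "fully exploit this grid" idea: no separate spatial search structure is needed.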

Relevance:

10.00%

Abstract:

Partition of Unity Implicits (PUI) have recently been introduced for surface reconstruction from point clouds. In this work, we propose a PUI method that employs a set of well-observed solutions in order to produce geometrically pleasant results without requiring time-consuming or mathematically overloaded computations. One feature of our technique is the use of multivariate orthogonal polynomials in the least-squares approximation, which allows the recursive refinement of the local fittings in terms of the degree of the polynomial. However, since the use of high-order approximations based only on the number of available points is not reliable, we introduce the concept of coverage domain. In addition, the method relies on the use of an algebraically defined triangulation to handle two important tasks in PUI: the spatial decomposition and an adaptive polygonization. As the spatial subdivision is based on tetrahedra, the generated mesh may present poorly-shaped triangles, which are improved in this work by means of a specific vertex displacement technique. Furthermore, we also address sharp features and raw data treatment. A further contribution is based on the PUI locality property, which leads to an intuitive scheme for improving or repairing the surface by means of editing local functions.
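
The recursive degree refinement enabled by orthogonal polynomials can be sketched in 1D with NumPy's Legendre basis. The sample function, tolerance and degree cap are assumptions for illustration, not the paper's setup:

```python
import numpy as np
from numpy.polynomial import legendre

# Illustrative 1D analogue of refining a local least-squares fit by raising
# the degree of an orthogonal-polynomial basis until the residual is small.
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(-1.0, 1.0, 80))       # "local" sample points
y = np.cos(2.0 * x) + 0.3 * x                 # smooth height field to fit

tol, max_deg = 1e-3, 12
for deg in range(1, max_deg + 1):
    coef = legendre.legfit(x, y, deg)         # least squares in the Legendre basis
    resid = np.max(np.abs(legendre.legval(x, coef) - y))
    if resid < tol:                           # refine only while the fit is poor
        break

print(deg, resid)   # a modest degree already meets the tolerance here
```

Because the basis is orthogonal, raising the degree reuses rather than destabilizes the lower-order fit, which is what makes this kind of recursive refinement cheap; the coverage-domain check in the paper guards the step this sketch omits, namely whether the local points support a high degree at all.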