936 results for Algebraic geometric code
Abstract:
Modern methods of spawning new technological motifs are not appropriate when the aim is to realize artificial life as an actual real-world entity in its own right (Pattee 1995; Brooks 2006; Chalmers 1995). Many fundamental aspects of such a machine are absent from common methods, which generally lack methodologies of construction. In this paper we combine classical and modern studies in an attempt to realize an artificial life form from first principles. A model of an algorithm is introduced, its methodology of construction is presented, and the fundamental source from which it sprang is discussed.
Abstract:
The variogram is essential for local estimation and mapping of any variable by kriging. The variogram itself must usually be estimated from sample data. The sampling density is a compromise between precision and cost, but it must be sufficiently dense to encompass the principal spatial sources of variance. A nested, multi-stage sampling scheme with separating distances increasing in geometric progression from stage to stage will do that. The data may then be analyzed by a hierarchical analysis of variance to estimate the components of variance for every stage, and hence for every lag. By accumulating the components, starting from the shortest lag, one obtains a rough variogram for modest effort. For balanced designs the analysis of variance is optimal; for unbalanced ones, however, these estimators are not necessarily the best, and analysis by residual maximum likelihood (REML) will usually be preferable. The paper summarizes the underlying theory and illustrates its application with data from three surveys: one in which the design had four stages and was balanced, and two implemented with unbalanced designs to economize when there were more stages. A Fortran program is available for the analysis of variance, and code for the REML analysis is listed in the paper. (c) 2005 Elsevier Ltd. All rights reserved.
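The balanced case described in this abstract can be sketched briefly. The following is a minimal illustration (not the paper's Fortran program) of a hierarchical analysis of variance for a balanced, fully nested design with three stages: mean squares are computed for each stage, variance components are recovered from their expected values, and the components are accumulated from the shortest lag upward to give the rough variogram. The array shape and data values are hypothetical.

```python
import numpy as np

def nested_anova_components(data):
    """Variance components for a balanced, fully nested 3-stage design.
    data has shape (a, b, c): a stage-1 groups, b stage-2 subgroups per
    group, c stage-3 observations (the finest spacing) per subgroup."""
    a, b, c = data.shape
    grand = data.mean()
    m1 = data.mean(axis=(1, 2))            # stage-1 group means
    m2 = data.mean(axis=2)                 # stage-2 subgroup means

    # Sums of squares for each stage of the hierarchy
    ss1 = b * c * np.sum((m1 - grand) ** 2)
    ss2 = c * np.sum((m2 - m1[:, None]) ** 2)
    ss3 = np.sum((data - m2[:, :, None]) ** 2)

    # Mean squares
    ms1 = ss1 / (a - 1)
    ms2 = ss2 / (a * (b - 1))
    ms3 = ss3 / (a * b * (c - 1))

    # Expected mean squares for the balanced nested model:
    #   E[MS3] = s3,  E[MS2] = s3 + c*s2,  E[MS1] = s3 + c*s2 + b*c*s1
    s3 = ms3
    s2 = (ms2 - ms3) / c
    s1 = (ms1 - ms2) / (b * c)
    return np.array([s1, s2, s3])

def rough_variogram(components):
    """Accumulate components from the finest stage (shortest lag) upward:
    gamma(lag_k) is the sum of all components at and below stage k."""
    return np.cumsum(components[::-1])

# Hypothetical balanced data: 2 main stations, 2 substations, 2 points each
data = np.array([[[1.0, 2.0], [3.0, 4.0]],
                 [[5.0, 6.0], [7.0, 8.0]]])
comps = nested_anova_components(data)      # [7.0, 1.75, 0.5]
gamma = rough_variogram(comps)             # [0.5, 2.25, 9.25]
```

Note that for unbalanced designs these moment estimators lose their optimality, which is why the abstract recommends REML in that case.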
Abstract:
Dissolution rates were calculated for a range of grain sizes of anorthite and biotite dissolved under far-from-equilibrium conditions at pH 3, T = 20 degrees C. Dissolution rates were normalized to initial and final BET surface area, geometric surface area, mass and (for biotite only) geometric edge surface area. Constant (within error) dissolution rates were obtained only by normalizing to initial BET surface area for biotite. The normalizing term that gave the smallest variation about the mean for anorthite was initial BET surface area. In field studies, only the current (final) surface area is measurable. In this study, final geometric surface area gave the smallest variation for anorthite dissolution rates, and final geometric edge surface area for biotite dissolution rates. (c) 2005 Published by Elsevier B.V.
Abstract:
In this paper I provide a critical discussion of Foucault's work on government and governmentality. I argue that geographers have tended to overlook the ways in which practices of self-government and subjectification are performed in relation to programmes of government, and suggest that they should examine the technical devices which are embedded in networks of government. Drawing upon these observations I suggest how geographers might proceed, tracing the geographies of a specific artefact: the British government's 1958 Motorway Code. I examine how the code was designed to serve as a technology of government that could shape the conduct of fairly mobile and distant subjects, enabling them to govern their conduct and the movements of their vehicles.