Abstract:
This study aimed to evaluate the spatial variability of leaf macro- and micronutrient contents. The 5-year-old citrus orchard, planted at a regular spacing of 8 x 7 m, was managed under drip irrigation. Leaf samples were collected from each plant and analyzed in the laboratory. Data were analyzed using the software R, version 2.5.1 (Copyright (C) 2007), with the geostatistics package geoR. All macro- and micronutrient contents studied fitted a normal distribution and showed spatial dependence. The best-fit models, based on likelihood, for the macro- and micronutrients were the spherical and the Matérn. For the macronutrients nitrogen, phosphorus, potassium, calcium, magnesium and sulfur, minimum distances between samples of 37, 58, 29, 63, 46 and 15 m, respectively, are suggested, while for the micronutrients boron, copper, iron, manganese and zinc the suggested distances are 29, 9, 113, 35 and 14 m, respectively.
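The semivariogram estimation behind this kind of sampling-distance analysis can be sketched in a few lines. This is a minimal, illustrative Python version of the classical (Matheron) estimator on invented data; the study itself used R with geoR, and the grid, values and bin choices below are assumptions for illustration only:

```python
import math
import random

def empirical_semivariogram(points, values, bin_width, max_dist):
    """Classical (Matheron) estimator:
    gamma(h) = (1 / (2 N(h))) * sum of (z_i - z_j)^2 over pairs in each bin."""
    n_bins = int(max_dist / bin_width)
    sums = [0.0] * n_bins
    counts = [0] * n_bins
    n = len(points)
    for i in range(n):
        for j in range(i + 1, n):
            h = math.dist(points[i], points[j])
            if h >= max_dist:
                continue
            b = int(h / bin_width)
            sums[b] += (values[i] - values[j]) ** 2
            counts[b] += 1
    # return (bin midpoint, gamma) for every non-empty bin
    return [((b + 0.5) * bin_width, sums[b] / (2 * counts[b]))
            for b in range(n_bins) if counts[b] > 0]

# Invented example: a 10 x 10 block of trees on the 8 x 7 m planting grid,
# with a smooth spatial trend plus noise standing in for a nutrient content
random.seed(1)
pts = [(8.0 * i, 7.0 * j) for i in range(10) for j in range(10)]
vals = [0.05 * (x + y) + random.gauss(0, 0.5) for x, y in pts]
gamma = empirical_semivariogram(pts, vals, bin_width=10.0, max_dist=60.0)
for h, g in gamma:
    print(f"h = {h:4.1f} m  gamma = {g:.3f}")
```

The lag distance at which gamma(h) levels off (the range of the fitted spherical or Matérn model) is what translates into a suggested minimum distance between samples.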
Abstract:
The air dry-bulb temperature (tdb), as well as the black globe humidity index (BGHI), exerts great influence on the development of broiler chickens during their heating phase. Therefore, the aim of this study was to analyze the structure and magnitude of the spatial variability of tdb and BGHI, using geostatistical tools such as semivariogram analysis and kriging maps. The experiment was conducted in 2010 in the west mesoregion of the state of Minas Gerais, in a commercial broiler house whose heating system consisted of two furnaces that heat the air indirectly, during the first 14 days of the birds' life. The data were recorded at five-minute intervals between 8 a.m. and 10 a.m. The variables were evaluated by variograms fitted by residual maximum likelihood (REML), testing the spherical and exponential models. Kriging maps were generated from the best-fitting variogram model. It was possible to characterize the variability of tdb and BGHI and to observe their spatial dependence using geostatistical techniques. In addition, the use of geostatistics and distribution maps made it possible to identify problems with the heating system in regions of the broiler house that may harm the development of the chicks.
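BGHI is commonly computed from the black globe temperature and the dew point temperature using the Buffington et al. (1981) formulation. The abstract does not state which formulation the authors used, so the constants below are an assumption based on that commonly cited version:

```python
def bghi(black_globe_temp_c, dew_point_temp_c):
    """Black Globe Humidity Index (Buffington et al., 1981):
    BGHI = t_bg + 0.36 * t_dp + 41.5, with temperatures in deg C.
    Assumed formulation -- the study may have used a different one."""
    return black_globe_temp_c + 0.36 * dew_point_temp_c + 41.5

# Hypothetical brooding-phase reading near a furnace outlet
print(round(bghi(32.0, 20.0), 1))  # 80.7
```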
Management zones using fuzzy clustering based on spatial-temporal variability of soil and corn yield
Abstract:
Clustering soil and crop data can be used as a basis for defining management zones because the data are grouped into clusters based on the similar interaction of these variables. Therefore, the objective of this study was to identify management zones using fuzzy c-means clustering analysis based on the spatial and temporal variability of soil attributes and corn yield. The study site (18 by 250 m in size) was located in Jaboticabal, São Paulo, Brazil. Corn yield was measured in one hundred 4.5 by 10-m cells along four parallel transects (25 observations per transect) over five growing seasons between 2001 and 2010. Soil chemical and physical attributes were measured. The SAS procedure MIXED was used to identify which variable(s) most influenced the spatial variability of corn yield over the five study years. Base saturation (BS) was the variable most closely related to corn yield; thus, semivariogram models were fitted for BS and corn yield, and the data values were kriged. The Management Zone Analyst software was used to run the fuzzy c-means clustering algorithm. The optimum number of management zones can change over time, as can the degree of agreement between the BS and corn yield management zone maps. It is therefore very important to take the temporal variability of crop yield and soil attributes into account in order to delineate management zones accurately.
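The fuzzy c-means algorithm implemented by Management Zone Analyst alternates between updating cluster centers and fuzzy memberships. A minimal pure-Python sketch on invented (BS, yield) pairs; the data, fuzziness exponent and zone count below are assumptions for illustration:

```python
import math
import random

def fuzzy_c_means(data, c=2, m=2.0, iters=100, seed=0):
    """Minimal fuzzy c-means: returns (centers, memberships).
    data: list of feature vectors; c: number of zones; m: fuzziness."""
    rng = random.Random(seed)
    n, d = len(data), len(data[0])
    # random initial memberships, each row normalised to sum to 1
    u = []
    for _ in range(n):
        row = [rng.random() for _ in range(c)]
        s = sum(row)
        u.append([w / s for w in row])
    for _ in range(iters):
        # centers are membership-weighted means of the data
        centers = []
        for k in range(c):
            w = [u[i][k] ** m for i in range(n)]
            tot = sum(w)
            centers.append([sum(w[i] * data[i][j] for i in range(n)) / tot
                            for j in range(d)])
        # memberships from inverse distance ratios to each center
        for i in range(n):
            dists = [max(math.dist(data[i], centers[k]), 1e-12)
                     for k in range(c)]
            for k in range(c):
                u[i][k] = 1.0 / sum((dists[k] / dists[j]) ** (2.0 / (m - 1.0))
                                    for j in range(c))
    return centers, u

# Invented "zones": low-BS/low-yield cells vs high-BS/high-yield cells
rng = random.Random(42)
data = ([[rng.gauss(40, 3), rng.gauss(6, 0.5)] for _ in range(50)] +
        [[rng.gauss(70, 3), rng.gauss(10, 0.5)] for _ in range(50)])
centers, u = fuzzy_c_means(data, c=2)
print(sorted(round(c[0]) for c in centers))  # approximately [40, 70]
```

In practice the optimum number of zones c is chosen by indices such as the fuzziness performance index, which is where the year-to-year changes reported in the abstract come in.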
Abstract:
Single-photon emission computed tomography (SPECT) is a non-invasive imaging technique, which provides information reporting the functional states of tissues. SPECT imaging has been used as a diagnostic tool in several human disorders and can be used in animal models of diseases for physiopathological, genomic and drug discovery studies. However, most of the experimental models used in research involve rodents, which are at least one order of magnitude smaller in linear dimensions than man. Consequently, images of targets obtained with conventional gamma-cameras and collimators have poor spatial resolution and statistical quality. We review the methodological approaches developed in recent years to obtain images of small targets with good spatial resolution and sensitivity. Multipinhole, coded-mask- and slit-based collimators are presented as alternative approaches to improve image quality. In combination with appropriate decoding algorithms, these collimators permit a significant reduction of the time needed to register the projections used to make 3-D representations of the volumetric distribution of the target's radiotracers. Simultaneously, they can be used to minimize the artifacts and blurring that arise when single-pinhole collimators are used. Representative images are presented to illustrate the use of these collimators. We also comment on the use of coded masks to attain tomographic resolution with a single projection, as discussed by some investigators since their introduction for obtaining near-field images. We conclude this review by showing that the use of appropriate hardware and software tools adapted to conventional gamma-cameras can be of great help in obtaining relevant functional information in experiments using small animals.
Differential effects of aging on spatial contrast sensitivity to linear and polar sine-wave gratings
Abstract:
Changes in visual function beyond high-contrast acuity are known to take place during normal aging. We determined whether sensitivity to linear sine-wave gratings and to an elementary stimulus preferentially processed in extrastriate areas could be distinctively affected by aging. We measured spatial contrast sensitivity twice for concentric polar (Bessel) and vertical linear gratings of 0.6, 2.5, 5, and 20 cycles per degree (cpd) in two age groups (20-30 and 60-70 years). All participants were free of identifiable ocular disease and had normal or corrected-to-normal visual acuity. Participants were more sensitive to Cartesian than to polar gratings at all frequencies tested, and the younger adult group was more sensitive to all stimuli tested. Significant differences between the sensitivities of the two groups were found for linear (only 20 cpd; P<0.01) and polar gratings (all frequencies tested; P<0.01). The young adult group was significantly more sensitive to linear than to circular gratings at 20 cpd. The older adult group was significantly more sensitive to linear than to circular gratings at all spatial frequencies except 20 cpd. The results suggest that sensitivity to the two kinds of stimuli is affected differently by aging. We suggest that neural changes in the aging brain are important determinants of this difference and discuss the results according to current models of human aging.
Hydraulic and fluvial geomorphological models for a bedrock channel reach of the Twenty Mile Creek
Abstract:
Bedrock channels have been considered challenging geomorphic settings for the application of numerical models. Bedrock fluvial systems exhibit boundaries that are typically less mobile than those of alluvial systems, yet they are still dynamic systems with a high degree of spatial and temporal variability. To understand the variability of fluvial systems, numerical models have been developed to quantify flow magnitudes and patterns as the driving force for geomorphic change. Two types of numerical model were assessed for their efficacy in examining the bedrock channel system consisting of a high-gradient portion of the Twenty Mile Creek in the Niagara Region of Ontario, Canada. A one-dimensional (1-D) flow model that utilizes energy equations, HEC-RAS, was used to determine velocity distributions through the study reach for the mean annual flood (MAF), the 100-year return flood and the 1,000-year return flood. A two-dimensional (2-D) flow model that makes use of the Navier-Stokes equations, RMA2, was created with the same objectives. The 2-D modeling effort was not successful due to the spatial complexity of the system (high slope and high variance). The successful 1-D model runs were further extended using very high resolution geospatial interpolations inherent to the HEC-RAS extension, HEC-GeoRAS. The modeled velocity data then formed the basis for a geomorphological analysis focused on large particles (boulders) and the forces needed to mobilize them. Several existing boulders were examined by collecting detailed measurements to derive three-dimensional physical models for the application of fluid and solid mechanics to predict movement in the study reach. An imaginary unit cuboid (1 metre by 1 metre by 1 metre) boulder was also envisioned to determine the general propensity for the movement of such a boulder through the bedrock system.
The efforts and findings of this study provide a standardized means for the assessment of large particle movement in a bedrock fluvial system. Further efforts may expand upon this standardization by modeling differing boulder configurations (platy boulders, etc.) at a high level of resolution.
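The kind of force balance used to predict boulder mobilization can be illustrated for the unit cuboid: sliding begins roughly when fluid drag exceeds friction acting on the submerged weight. This is a simplified sketch, not the study's actual mechanics; the drag coefficient, friction coefficient and densities are generic assumed values, and lift, protrusion and interlocking are ignored:

```python
import math

def sliding_threshold_velocity(volume, face_area, rho_s=2650.0, rho=1000.0,
                               mu=0.6, cd=1.0, g=9.81):
    """Velocity at which drag on a fully submerged block balances sliding
    friction on its submerged weight:
        0.5 * rho * cd * A * U^2 = mu * (rho_s - rho) * g * V
    All coefficient values here are generic assumptions, and lift,
    protrusion and interlocking are ignored in this sketch."""
    return math.sqrt(2.0 * mu * (rho_s - rho) * g * volume
                     / (rho * cd * face_area))

# Unit cuboid: V = 1 m^3 with a 1 m^2 face presented to the flow
u_c = sliding_threshold_velocity(volume=1.0, face_area=1.0)
print(f"threshold velocity ~ {u_c:.1f} m/s")
```

Comparing such a threshold against the modeled velocity distributions for the MAF and return floods indicates where in the reach a boulder of given size could plausibly move.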
Abstract:
Basic relationships between certain regions of space are formulated in natural language in everyday situations. For example, a customer specifies the outline of his future home to the architect by indicating which rooms should be close to each other. Qualitative spatial reasoning, as an area of artificial intelligence, tries to develop a theory of space based on similar notions. In formal ontology and in ontological computer science, mereotopology is a first-order theory, embodying mereological and topological concepts, of the relations among wholes, parts, parts of parts, and the boundaries between parts. We introduce abstract relation algebras and present their structural properties as well as their connection to algebras of binary relations. This is followed by details of the expressiveness of algebras of relations for region-based models. Mereotopology has been the main basis for most region-based theories of space. Since its earliest inception, many theories of mereotopology have been proposed in artificial intelligence, among which the Region Connection Calculus is the most prominent. The expressiveness of the Region Connection Calculus in relational logic is far greater than its original eight base relations might suggest. In this thesis we formulate ways to automatically generate representable relation algebras from spatial data based on the Region Connection Calculus. The generation of new algebras is a two-pronged approach involving the splitting of existing relations to form new algebras and the refinement of the newly generated algebras. We present an implementation of a system that automates these steps and provides an effective and convenient interface for defining new spatial relations and generating representable relation algebras.
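The algebras-of-binary-relations machinery mentioned above rests on two concrete operations, composition and converse, which are easy to state directly on relations represented as sets of pairs. A toy Python sketch (the regions and the part-of relation are invented for illustration):

```python
def compose(r, s):
    """R ; S = {(a, c) : there is some b with (a, b) in R and (b, c) in S}."""
    return {(a, c) for (a, b1) in r for (b2, c) in s if b1 == b2}

def converse(r):
    """R^ = {(b, a) : (a, b) in R}."""
    return {(b, a) for (a, b) in r}

# Invented regions: a proper-part-of relation on a kitchen, a house, a lot
part_of = {("kitchen", "house"), ("house", "lot")}
print(compose(part_of, part_of))  # {('kitchen', 'lot')}
print(converse(part_of) == {("house", "kitchen"), ("lot", "house")})  # True
```

In a representable relation algebra the abstract composition and converse operations behave exactly like these set-theoretic ones; checking that splits of RCC base relations still satisfy the algebra axioms is what the refinement step automates.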
Abstract:
Artifacts made by humans, such as items of furniture and houses, exhibit an enormous amount of variability in shape. In this paper, we concentrate on models of the shapes of objects that are made up of fixed collections of sub-parts whose dimensions and spatial arrangement exhibit variation. Our goals are: to learn these models from data and to use them for recognition. Our emphasis is on learning and recognition from three-dimensional data, to test the basic shape-modeling methodology. In this paper we also demonstrate how to use models learned in three dimensions for recognition of two-dimensional sketches of objects.
Abstract:
A version of Matheron's discrete Gaussian model is applied to cell composition data. The examples are map patterns of felsic metavolcanics in two different areas. Q-Q plots of the model for cell values representing the proportion of each 10 km x 10 km cell area underlain by this rock type are approximately linear, and the line of best fit can be used to estimate the parameters of the model. It is also shown that the felsic metavolcanics in the Abitibi area of the Canadian Shield can be modeled as a fractal.
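The Q-Q construction described here pairs sorted cell values with standard-normal quantiles; an approximately straight plot supports the model, and the fitted line's intercept and slope estimate the location and scale parameters. A minimal sketch on invented stand-in data (Python's statistics.NormalDist supplies the inverse normal CDF; the discrete Gaussian model itself involves transformations not shown here):

```python
import random
import statistics

def normal_qq(data):
    """Pair sorted data with standard-normal quantiles at plotting
    positions p_i = (i + 0.5) / n; a straight plot suggests normality,
    with slope ~ sigma and intercept ~ mu."""
    xs = sorted(data)
    n = len(xs)
    nd = statistics.NormalDist()
    return [(nd.inv_cdf((i + 0.5) / n), xs[i]) for i in range(n)]

# Invented stand-in for (transformed) cell proportions
random.seed(0)
sample = [random.gauss(0.25, 0.1) for _ in range(200)]
pts = normal_qq(sample)

# Least-squares line through the Q-Q points estimates the parameters
qbar = sum(q for q, _ in pts) / len(pts)
xbar = sum(x for _, x in pts) / len(pts)
slope = (sum((q - qbar) * (x - xbar) for q, x in pts)
         / sum((q - qbar) ** 2 for q, _ in pts))
print(f"mu ~ {xbar:.2f}, sigma ~ {slope:.2f}")
```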
Abstract:
We examine the empirical validity of Schelling's models of racial residential segregation applied to the case of Chicago. Most of the empirical literature has focused exclusively on the single-neighborhood model, also known as the tipping point model, and neglected a multi-neighborhood or unified approach. The multi-neighborhood approach introduces spatial interaction across neighborhoods; in particular, we look at spatial interaction across neighborhoods sharing a border. An initial exploration of the data indicates that spatial contiguity may be relevant to properly analyze the so-called tipping of predominantly non-Hispanic white neighborhoods to predominantly minority neighborhoods within a decade. We introduce an econometric model that combines a threshold-effects approach to estimating the tipping point with a spatial autoregressive model. The estimation results dispute the existence of a tipping point, that is, a discontinuous change in the growth rate of the non-Hispanic white population caused by a small increase in the minority share of the neighborhood. In addition, we find that the racial distance between a neighborhood and its surrounding neighborhoods has an important effect on the dynamics of racial segregation in Chicago.
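The threshold-effects component of such a model can be illustrated with a grid search for the breakpoint that best splits neighborhoods into two growth regimes. This sketch omits the spatial autoregressive term entirely and uses simulated data with a built-in discontinuity, so it shows the estimation idea rather than the paper's actual model:

```python
import random

def estimate_threshold(share, growth, candidates):
    """Grid-search threshold regression: choose the candidate tipping
    point tau minimising the SSE of a two-regime mean model for the
    growth of the non-Hispanic white population."""
    best_tau, best_sse = None, float("inf")
    for tau in candidates:
        lo = [g for s, g in zip(share, growth) if s < tau]
        hi = [g for s, g in zip(share, growth) if s >= tau]
        if not lo or not hi:
            continue
        mlo, mhi = sum(lo) / len(lo), sum(hi) / len(hi)
        sse = (sum((g - mlo) ** 2 for g in lo)
               + sum((g - mhi) ** 2 for g in hi))
        if sse < best_sse:
            best_tau, best_sse = tau, sse
    return best_tau

# Simulated neighborhoods with a discontinuity built in at a 30% share
random.seed(3)
share = [random.random() for _ in range(400)]
growth = [(0.05 if s < 0.30 else -0.10) + random.gauss(0, 0.02)
          for s in share]
tau_hat = estimate_threshold(share, growth,
                             [i / 100 for i in range(5, 95)])
print(tau_hat)
```

With real data the question the paper raises is precisely whether any such tau improves the fit significantly over a continuous specification once spatial spillovers are accounted for.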
Abstract:
Detailed knowledge of waterfowl abundance and distribution across Canada is lacking, which limits our ability to effectively conserve and manage their populations. We used 15 years of data from an aerial transect survey to model the abundance of 17 species or species groups of ducks within southern and boreal Canada. We included 78 climatic, hydrological, and landscape variables in Boosted Regression Tree models, allowing flexible response curves and multiway interactions among variables. We assessed the predictive performance of the models using four metrics and calculated uncertainty as the coefficient of variation of predictions across 20 replicate models. Maps of predicted relative abundance were generated from the resulting models, and they largely match the spatial patterns evident in the transect data. We observed two main distribution patterns: a concentrated prairie-parkland distribution and a more dispersed pan-Canadian distribution. These patterns were congruent with the relative importance of predictor variables and the model evaluation statistics between the two groups of distributions. Most species had a hydrological variable as the most important predictor, although the specific hydrological variable differed somewhat among species. In some cases, important variables had clear ecological interpretations; in others, e.g., topographic roughness, they may simply reflect chance correlations between species distributions and environmental variables identified by the model-building process. Given the performance of our models, we suggest that the resulting prediction maps can be used in future research and to guide conservation activities, particularly within the bounds of the survey area.
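Boosted Regression Trees fit an ensemble of small trees to successive residuals, each added with shrinkage. A toy single-predictor version with depth-one trees (stumps) shows the mechanics; the real models used 78 predictors and multiway interactions, and the step-shaped "abundance response" below is invented:

```python
def fit_stump(x, y):
    """Best single-split (depth-one) regression tree on one feature,
    minimising squared error."""
    order = sorted(range(len(x)), key=lambda i: x[i])
    best = None
    for k in range(1, len(x)):
        left = [y[order[i]] for i in range(k)]
        right = [y[order[i]] for i in range(k, len(x))]
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((v - ml) ** 2 for v in left)
               + sum((v - mr) ** 2 for v in right))
        if best is None or sse < best[0]:
            best = (sse, x[order[k - 1]], ml, mr)
    _, thr, ml, mr = best
    return lambda v: ml if v <= thr else mr

def boost(x, y, n_trees=50, lr=0.3):
    """Gradient boosting for squared loss: each stump is fitted to the
    residuals of the ensemble so far, then added with shrinkage lr."""
    pred = [0.0] * len(x)
    stumps = []
    for _ in range(n_trees):
        resid = [yi - pi for yi, pi in zip(y, pred)]
        s = fit_stump(x, resid)
        stumps.append(s)
        pred = [p + lr * s(v) for p, v in zip(pred, x)]
    return lambda v: sum(lr * s(v) for s in stumps)

# Invented step-shaped abundance response to a hydrological gradient
x = [i / 50 for i in range(51)]
y = [1.0 if v < 0.5 else 3.0 for v in x]
model = boost(x, y)
print(round(model(0.2), 2), round(model(0.8), 2))  # 1.0 3.0
```

Deeper trees allow the interactions among predictors that the study exploits; stumps keep the illustration additive and short.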
Abstract:
Although climate models have been improving in accuracy and efficiency over the past few decades, it now seems that these incremental improvements may be slowing. As tera/petascale computing becomes massively parallel, our legacy codes are less suitable, and even with the increased resolution that we are now beginning to use, these models cannot represent the multiscale nature of the climate system. This paper argues that it may be time to reconsider the use of adaptive mesh refinement for weather and climate forecasting in order to achieve good scaling and representation of the wide range of spatial scales in the atmosphere and ocean. Furthermore, the challenge of introducing living organisms and human responses into climate system models is only just beginning to be tackled. We do not yet have a clear framework in which to approach the problem, but it is likely to cover such a huge number of different scales and processes that radically different methods may have to be considered. The challenges of multiscale modelling and petascale computing provide an opportunity to consider a fresh approach to numerical modelling of the climate (or Earth) system, which takes advantage of the computational fluid dynamics developments in other fields and brings new perspectives on how to incorporate Earth system processes. This paper reviews some of the current issues in climate (and, by implication, Earth) system modelling, and asks the question whether a new generation of models is needed to tackle these problems.
Abstract:
Cascade is a multi-institution project studying the temporal and spatial organization of tropical convective systems. While cloud-resolving numerical models can reproduce the observed diurnal cycle of such systems, they are sensitive to the chosen resolution. As part of this effort, we are comparing results from the Met Office Unified Model to data from the Geostationary Earth Radiation Budget (GERB) satellite instrument over the African Monsoon Multidisciplinary Analysis (AMMA) region of North Africa. We use a variety of mathematical techniques to study the outgoing radiation and the evolution of properties such as the cloud size distribution. The effectiveness of various model resolutions is tested with a view to determining the optimum balance between resolution and the need to reproduce the observations.
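One common way to derive a cloud size distribution from a thresholded model or satellite field is connected-component labelling. Whether Cascade used exactly this procedure is not stated in the abstract, so the sketch below is a generic illustration on an invented grid:

```python
def cloud_sizes(field, threshold):
    """Label 4-connected components of grid cells at or above `threshold`
    and return their sizes -- a minimal cloud-size-distribution estimate."""
    rows, cols = len(field), len(field[0])
    seen = [[False] * cols for _ in range(rows)]
    sizes = []
    for r in range(rows):
        for c in range(cols):
            if seen[r][c] or field[r][c] < threshold:
                continue
            # flood-fill one cloud with an explicit stack
            stack, size = [(r, c)], 0
            seen[r][c] = True
            while stack:
                i, j = stack.pop()
                size += 1
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if (0 <= ni < rows and 0 <= nj < cols
                            and not seen[ni][nj]
                            and field[ni][nj] >= threshold):
                        seen[ni][nj] = True
                        stack.append((ni, nj))
            sizes.append(size)
    return sorted(sizes)

# Invented toy field with two three-cell clouds and one two-cell cloud
grid = [[0, 0, 1, 1, 0],
        [0, 0, 1, 0, 0],
        [1, 0, 0, 0, 1],
        [1, 0, 0, 1, 1]]
print(cloud_sizes(grid, threshold=1))  # [2, 3, 3]
```

Repeating this at several grid resolutions gives a direct way to quantify how the simulated size distribution converges toward the observed one.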