923 results for FRACTAL DESCRIPTORS
Abstract:
Dynamic scaling and fractal behaviour of spinodal phase separation are studied in a binary polymer mixture of poly(methyl methacrylate) (PMMA) and poly(styrene-co-acrylonitrile) (SAN). In the later stages of spinodal phase separation, a simple dynamic scaling law was found for the scattering function: S(q,t) ≈ q_m^{-3} S̃(q/q_m), where q_m is the position of the scattering maximum. The possibility of using fractal theory to describe the complex morphology of spinodal phase separation is discussed. In phase separation, the morphology exhibits strong self-similarity. The two-dimensional image obtained by optical microscopy can be analysed within the framework of fractal concepts. The results give a fractal dimension of 1.64. This implies that the fractal structure may be the reason for the dynamic scaling behaviour of the structure function.
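As an illustration of how a fractal dimension of this kind can be estimated from a binarized two-dimensional micrograph, the following box-counting sketch (a generic estimator in Python/NumPy, not the authors' exact procedure; the input image here is synthetic) counts occupied boxes at several scales and fits the log-log slope.

import numpy as np

def box_counting_dimension(binary_image, sizes=(2, 4, 8, 16, 32, 64)):
    """Estimate the box-counting fractal dimension of a 2-D binary array."""
    counts = []
    for s in sizes:
        # Trim the image so it tiles exactly into s x s boxes.
        h, w = (binary_image.shape[0] // s) * s, (binary_image.shape[1] // s) * s
        trimmed = binary_image[:h, :w]
        # Count the boxes that contain at least one occupied pixel.
        boxes = trimmed.reshape(h // s, s, w // s, s).any(axis=(1, 3))
        counts.append(boxes.sum())
    # D is minus the slope of log N(s) versus log s.
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope

# Hypothetical input: a thresholded optical micrograph of the phase-separated
# domains would replace this random pattern.
rng = np.random.default_rng(0)
domain_image = rng.random((512, 512)) > 0.5
print(box_counting_dimension(domain_image))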
Abstract:
Fractal behaviour of ramified domains in the late stage of spinodal phase separation in a binary polymer blend of poly(vinyl acetate) with poly(methyl methacrylate) was investigated by optical microscopy. In the late stage of the spinodal decomposition, the fractal dimension D is about 1.64. This implies that some anomalous properties of the irregular structure may be explained by fractal concepts.
Abstract:
The objective of this study was to explore the applicability of fractal theory to the study of spatial variability in soil aggregation. The fractal dimension D, together with RL, the parameter that estimates the size of the largest aggregate, were used as fragmentation descriptors. These values, estimated at different locations within the experimental area, were interpolated using ordinary kriging, and isoline maps were constructed.
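A minimal sketch of the interpolation step described above, assuming point estimates of the fractal dimension D at known coordinates and using the pykrige package with a spherical variogram (the package choice, coordinates, D values, and variogram model are illustrative assumptions, not details from the study):

import numpy as np
from pykrige.ok import OrdinaryKriging

# Hypothetical sample locations and fractal-dimension estimates.
x = np.array([0.0, 10.0, 20.0, 30.0, 40.0])
y = np.array([0.0, 15.0, 5.0, 25.0, 10.0])
d = np.array([2.61, 2.70, 2.55, 2.66, 2.73])

# Ordinary kriging of D over the experimental area.
ok = OrdinaryKriging(x, y, d, variogram_model="spherical")

# Interpolate onto a regular grid; the gridded field can then be contoured
# to produce isoline maps of D.
grid_x = np.linspace(0.0, 40.0, 41)
grid_y = np.linspace(0.0, 25.0, 26)
d_grid, kriging_variance = ok.execute("grid", grid_x, grid_y)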
Abstract:
R. Zwiggelaar and C.R. Bull, 'Optical determination of fractal dimensions using Fourier transforms', Optical Engineering 34 (5), 1325-1332 (1995)
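The Fourier-transform route to a fractal dimension referenced above can be sketched as follows: for a surface whose radially averaged power spectrum follows a power law P(f) ∝ f^(-β), an fBm-like surface has fractal dimension D = (8 − β)/2. The code below is a generic digital estimator of β from a grey-level image, not the optical implementation described in the cited paper.

import numpy as np

def spectral_fractal_dimension(image):
    """Estimate D from the slope of the radially averaged power spectrum
    (assumes fractional-Brownian-motion-like scaling of the surface)."""
    ny, nx = image.shape
    spectrum = np.fft.fftshift(np.fft.fft2(image - image.mean()))
    power = np.abs(spectrum) ** 2
    # Integer radial frequency index of every spectral sample.
    ky, kx = np.indices((ny, nx))
    r = np.hypot(ky - ny // 2, kx - nx // 2).astype(int)
    # Radially averaged power spectrum.
    counts = np.bincount(r.ravel())
    radial_power = np.bincount(r.ravel(), weights=power.ravel()) / np.maximum(counts, 1)
    freqs = np.arange(1, min(ny, nx) // 2)  # skip the zero-frequency bin
    slope, _ = np.polyfit(np.log(freqs), np.log(radial_power[freqs]), 1)
    beta = -slope
    return (8.0 - beta) / 2.0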
Abstract:
The World Wide Web (WWW or Web) is growing rapidly on the Internet. Web users want fast response times and easy access to an enormous variety of information across the world. Thus, performance is becoming a main issue in the Web. Fractals have been used to study fluctuating phenomena in many different disciplines, from the distribution of galaxies in astronomy to complex physiological control systems. The Web is also a complex, irregular, and random system. In this paper, we look at the document reference pattern at Internet Web servers and use fractal-based models to understand aspects (e.g. caching schemes) that affect Web performance.
Abstract:
Establishing correspondences among object instances is still challenging in multi-camera surveillance systems, especially when the cameras' fields of view are non-overlapping. Spatiotemporal constraints can help in solving the correspondence problem but still leave a wide margin of uncertainty. One way to reduce this uncertainty is to use appearance information about the moving objects in the site. In this paper we present the preliminary results of a new method that can capture salient appearance characteristics at each camera node in the network. A Latent Dirichlet Allocation (LDA) model is created and maintained at each node in the camera network. Each object is encoded in terms of the LDA bag-of-words model for appearance. The encoded appearance is then used to establish probable matches across cameras. Preliminary experiments are conducted on a dataset of 20 individuals, and a comparison against Madden's I-MCHR is reported.
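A rough sketch of the bag-of-words/LDA encoding step described above, using scikit-learn; the descriptor type, vocabulary size, number of topics, and similarity measure are illustrative assumptions rather than details taken from the paper.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(0)

# Hypothetical local appearance descriptors (e.g. colour/texture patches)
# extracted from the 20 individuals observed at one camera node.
descriptors_per_object = [rng.random((200, 16)) for _ in range(20)]

# 1. Build a visual vocabulary by clustering all descriptors.
vocabulary = KMeans(n_clusters=64, random_state=0).fit(np.vstack(descriptors_per_object))

# 2. Encode each object as a bag-of-visual-words histogram.
histograms = np.array([
    np.bincount(vocabulary.predict(d), minlength=64)
    for d in descriptors_per_object
])

# 3. Fit an LDA model at this node and represent each object by its topic
#    distribution; matching across cameras can then compare these
#    distributions (e.g. by histogram intersection or cosine similarity).
lda = LatentDirichletAllocation(n_components=8, random_state=0)
topic_mix = lda.fit_transform(histograms)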
Abstract:
The objective of spatial downscaling strategies is to increase the information content of coarse datasets at smaller scales. In the case of quantitative precipitation estimation (QPE) for hydrological applications, the goal is to close the scale gap between the spatial resolution of coarse datasets (e.g., gridded satellite precipitation products at resolution L × L) and the high resolution (l × l; L ≫ l) necessary to capture the spatial features that determine the spatial variability of water flows and water stores in the landscape. In essence, the downscaling process consists of weaving subgrid-scale heterogeneity over a desired range of wavelengths into the original field. The defining question is, which properties, statistical and otherwise, of the target field (the known observable at the desired spatial resolution) should be matched, with the caveat that downscaling methods should be as general as possible and therefore ideally free of case-specific constraints and/or calibration requirements? Here, the attention is focused on two simple fractal downscaling methods, using iterated function systems (IFS) and fractal Brownian surfaces (FBS), that meet this requirement. The two methods were applied to spatially disaggregate 27 summertime convective storms in the central United States during 2007 at three consecutive times (1800, 2100, and 0000 UTC, thus 81 fields overall) from the Tropical Rainfall Measuring Mission (TRMM) version 6 (V6) 3B42 precipitation product (~25-km grid spacing) to the same resolution as the NCEP stage IV products (~4-km grid spacing). Results from bilinear interpolation are used as the control. A fundamental distinction between IFS and FBS is that the latter implies a distribution of downscaled fields and thus an ensemble solution, whereas the former provides a single solution. The downscaling effectiveness is assessed using fractal measures (the spectral exponent β, fractal dimension D, Hurst coefficient H, and roughness amplitude R) and traditional operational skill scores [false alarm rate (FR), probability of detection (PD), threat score (TS), and Heidke skill score (HSS)], as well as bias and the root-mean-square error (RMSE). The results show that both IFS and FBS fractal interpolation perform well with regard to operational skill scores, and they meet the additional requirement of generating structurally consistent fields. Furthermore, confidence intervals can be directly generated from the FBS ensemble. The results were used to diagnose errors relevant for hydrometeorological applications, in particular a spatial displacement with a characteristic length of at least 50 km (2500 km²) in the location of peak rainfall intensities for the cases studied. © 2010 American Meteorological Society.
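One of the two ingredients above, a fractal (fractional) Brownian surface, can be synthesized spectrally. The sketch below is a generic spectral-synthesis construction for a chosen Hurst coefficient H, not the specific TRMM-to-stage-IV workflow of the study; how such a surface would be combined with the coarse precipitation field is likewise left out.

import numpy as np

def fractional_brownian_surface(n=256, hurst=0.7, seed=0):
    """Generate an n x n fBm-like surface by spectral synthesis
    (power spectrum decaying as f^-(2H + 2))."""
    rng = np.random.default_rng(seed)
    k = np.fft.fftfreq(n)
    kx, ky = np.meshgrid(k, k)
    radius = np.hypot(kx, ky)
    radius[0, 0] = 1.0                    # avoid division by zero at DC
    amplitude = radius ** -(hurst + 1.0)  # spectral amplitude ~ f^-(H+1)
    amplitude[0, 0] = 0.0                 # remove the mean component
    phase = rng.uniform(0.0, 2.0 * np.pi, size=(n, n))
    surface = np.fft.ifft2(amplitude * np.exp(1j * phase)).real
    return (surface - surface.mean()) / surface.std()

# Example: one member of a downscaling ensemble could be built by perturbing
# an interpolated coarse field with such a surface (an illustrative use only).
field = fractional_brownian_surface(256, hurst=0.7)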
Abstract:
The work will follow a classroom-workshop format, with practical worksheets that lead teachers to investigate the topics in Cabri. The workshop is aimed at secondary- and tertiary-level teachers who wish to incorporate the relevant topic of fractals into the curriculum. By watching a video on fractals and reading texts on the subject, participants will be invited to explore this new world, in which the simplicity of a geometric element can give rise to intricate and enigmatic forms.
Abstract:
We present an activity that relates fractals, and more specifically the fractal dimension, to cities. We make a brief incursion into the concepts of fractal and fractal dimension before moving on to an example and a proposed activity in which we show a possible sequence of steps for estimating the fractal dimension of a city's outline. We present the results obtained by students in the 4th year of ESO (Spanish compulsory secondary education) when computing the fractal dimension of the outlines of the towns the school's students come from, with the aim of comparing the “roughness” of all of them.
Abstract:
The first part was devoted to the concept of a fractal, its dimension, and the generation of some types of fractals (linear deterministic fractals and iterated function systems), together with an exhaustive study of the Sierpinski triangle. Here we continue with other ways of generating fractals.
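As a small worked example of generating a fractal with an iterated function system of the kind mentioned above, the following chaos-game sketch (a standard construction, not taken from the article itself) produces points of the Sierpinski triangle:

import numpy as np

rng = np.random.default_rng(0)

# Vertices of the triangle; each IFS map moves a point halfway
# towards one of the vertices.
vertices = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, np.sqrt(3.0) / 2.0]])

point = np.array([0.2, 0.3])
points = []
for _ in range(50_000):
    point = (point + vertices[rng.integers(3)]) / 2.0
    points.append(point.copy())
points = np.array(points)

# points now approximates the Sierpinski triangle, whose fractal dimension
# is log 3 / log 2 ≈ 1.585; it can be plotted with, e.g., matplotlib.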
Abstract:
This work offers an overview of fractal geometry and its applications. Its didactic possibilities are analysed through a compilation, synthesis, and adaptation of its main concepts, so that they become accessible to secondary school students. It consists of two parts; this first article is devoted mainly to the concept of a fractal, its dimension, and the generation of some types of fractals, through activities designed especially for students at that stage.
Abstract:
Fractal image compression is a relatively recent image compression method. Its extension to a sequence of motion images is important in video compression applications. There are two basic fractal compression methods, namely the cube-based and the frame-based methods, that are commonly used in industry. However, both methods have advantages and disadvantages. This paper proposes a hybrid algorithm that combines the advantages of the two methods in order to produce a good compression algorithm for the video industry. Experimental results show that the hybrid algorithm improves both the compression ratio and the quality of the decompressed images.
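At the heart of both the cube-based and the frame-based approaches lies the same search: for each range block, find a (downsampled) domain block and an affine grey-level map that reproduce it with minimal error. Below is a minimal single-frame sketch of that matching step, a generic fractal encoder rather than the hybrid algorithm proposed in the paper; the block sizes and the non-overlapping domain pool are simplifying assumptions.

import numpy as np

def encode_frame(frame, range_size=4, domain_size=8):
    """Toy fractal encoder: map each range block to the best contractive
    (scaled and offset) domain block taken from the same frame."""
    h, w = frame.shape
    # Downsample every candidate domain block to the range-block size.
    domains = []
    for y in range(0, h - domain_size + 1, domain_size):
        for x in range(0, w - domain_size + 1, domain_size):
            block = frame[y:y + domain_size, x:x + domain_size].astype(float)
            small = block.reshape(range_size, 2, range_size, 2).mean(axis=(1, 3))
            domains.append(((y, x), small))

    code = []
    for y in range(0, h, range_size):
        for x in range(0, w, range_size):
            r = frame[y:y + range_size, x:x + range_size].astype(float)
            best = None
            for (dy, dx), d in domains:
                # Least-squares contrast s and brightness o for r ≈ s * d + o.
                s, o = np.polyfit(d.ravel(), r.ravel(), 1)
                err = np.sum((s * d + o - r) ** 2)
                if best is None or err < best[0]:
                    best = (err, (dy, dx), s, o)
            code.append(((y, x), best[1], best[2], best[3]))
    return code

# Example with a small random frame (dimensions assumed to be multiples of 8).
frame = np.random.default_rng(0).integers(0, 256, size=(32, 32))
code = encode_frame(frame)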
Abstract:
Fractal video compression is a relatively new video compression method. Its attraction is due to the high compression ratio and the simple decompression algorithm. However, its computational complexity is high, and parallel algorithms on high-performance machines are one way to address this. In this study we partition the matching search, which accounts for the majority of the work in a fractal video compression process, into small tasks and implement them in two distributed computing environments, one using DCOM and the other using .NET Remoting technology, based on a local area network consisting of loosely coupled PCs. Experimental results show that the parallel algorithm is able to achieve a high speedup in these distributed environments.
Abstract:
Fractal image compression is a relatively recent image compression method, which is simple to use and often leads to a high compression ratio. These advantages make it suitable for situations with a single encoding and many decodings, as required in video on demand, archive compression, etc. There are two fundamental fractal compression methods, namely the cube-based and the frame-based methods, that are commonly studied. However, both methods have advantages and disadvantages. This paper gives an extension of the fundamental compression methods based on the concept of adaptive partition. Experimental results show that the algorithms based on adaptive partition can obtain a much higher compression ratio than algorithms based on a fixed partition while maintaining the quality of the decompressed images.
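The adaptive-partition idea can be illustrated with a quadtree scheme: a block is accepted if it can be represented within a tolerance, otherwise it is split into four sub-blocks and the test repeats. The sketch below uses a deliberately simple stand-in error measure (the residual after removing the block mean) purely to show the recursive partitioning; in the actual method the acceptance test would be the error of the fractal domain-block match.

import numpy as np

def adaptive_partition(frame, y, x, size, tol, min_size=4):
    """Recursively split a block until the stand-in matching error falls
    below tol or the minimum block size is reached; returns the leaf blocks."""
    block = frame[y:y + size, x:x + size].astype(float)
    error = np.mean((block - block.mean()) ** 2)  # stand-in for the fractal matching error
    if error <= tol or size <= min_size:
        return [(y, x, size)]
    half = size // 2
    leaves = []
    for dy in (0, half):
        for dx in (0, half):
            leaves.extend(adaptive_partition(frame, y + dy, x + dx, half, tol, min_size))
    return leaves

# Smooth regions end up as large blocks and detailed regions as small ones,
# which is what lets an adaptive partition raise the compression ratio.
frame = np.random.default_rng(0).integers(0, 256, size=(64, 64))
print(len(adaptive_partition(frame, 0, 0, 64, tol=100.0)))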
Abstract:
The intrinsically independent features of the optimal codebook-cube search process in fractal video compression systems are examined and exploited. The design of a suitable parallel algorithm reflecting this concept is presented. The Message Passing Interface (MPI) is chosen as the communication tool for the implementation of the parallel algorithm on distributed-memory parallel computers. Experimental results show that the parallel algorithm is able to reduce the compression time and achieve a high speed-up without changing the compression ratio or the quality of the decompressed image. A scalability test was also performed, and the results show that this parallel algorithm is scalable.
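Because each range block's domain search is independent, the search can be distributed with MPI. The mpi4py sketch below is an illustrative decomposition (round-robin assignment of range blocks to ranks, results gathered at the root), not the paper's implementation; the frame contents and block sizes are assumptions.

import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# The root rank creates (or reads) the frame and broadcasts it, since every
# rank needs the whole frame to search the domain pool.
frame = np.random.default_rng(0).integers(0, 256, size=(64, 64)) if rank == 0 else None
frame = comm.bcast(frame, root=0)

def best_domain_for(frame, y, x, rs=4, ds=8):
    """Exhaustive search for the best (scale, offset) domain match of one range block."""
    r = frame[y:y + rs, x:x + rs].astype(float)
    best = None
    for dy in range(0, frame.shape[0] - ds + 1, ds):
        for dx in range(0, frame.shape[1] - ds + 1, ds):
            d = frame[dy:dy + ds, dx:dx + ds].reshape(rs, 2, rs, 2).mean(axis=(1, 3))
            s, o = np.polyfit(d.ravel(), r.ravel(), 1)
            err = np.sum((s * d + o - r) ** 2)
            if best is None or err < best[0]:
                best = (err, (dy, dx), s, o)
    return ((y, x), best)

# Round-robin assignment of the independent matching tasks to MPI ranks.
blocks = [(y, x) for y in range(0, 64, 4) for x in range(0, 64, 4)]
local_code = [best_domain_for(frame, y, x) for y, x in blocks[rank::size]]

# Gather the partial codebooks on the root rank.
gathered = comm.gather(local_code, root=0)
if rank == 0:
    full_code = [entry for part in gathered for entry in part]
    print(len(full_code), "range blocks encoded")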