307 results for MULTISCALE


Relevance:

10.00%

Publisher:

Abstract:

The synthesis of so-called photorealistic images requires numerically evaluating how light and matter physically interact, which, despite the impressive and ever-increasing computing power available today, is still far from a trivial task for our computers. This is largely due to the way we represent objects: reproducing the subtle interactions that lead to the perception of detail requires modelling phenomenal amounts of geometry. At render time, this complexity inexorably leads to heavy input/output requests which, coupled with the evaluation of complex filtering operators, make the computation times needed to produce flawless images wholly unreasonable. To overcome these limitations under current constraints, a multiscale representation of matter must be derived. In this thesis, we build such a representation for matter whose interface is a perturbed surface, a configuration typically built from elevation maps in computer graphics. We derive our representation within the framework of microfacet theory (originally designed to model the reflectance of rough surfaces), which we first present and then extend in two steps. First, we make the theory applicable across multiple observation scales by generalizing it to non-centered microfacet statistics. Second, we derive an inversion procedure capable of reconstructing microfacet statistics from the reflection responses of an arbitrary material in retroreflective configurations. We show how this extended theory can be exploited to derive a general and efficient approximate resampling operator for elevation maps that (a) preserves the anisotropy of light transport at any resolution, (b) can be applied ahead of rendering and stored in MIP maps to drastically reduce the number of input/output requests, and (c) considerably simplifies per-pixel filtering operations, all of which leads to shorter rendering times. To validate our operator and demonstrate its efficiency, we synthesize antialiased photorealistic images and compare them against reference images. In addition, we provide a complete C++ implementation throughout the dissertation to ease reproduction of the results. We conclude with a discussion of the limitations of our approach, and of the obstacles that remain before an even more general multiscale representation of matter can be derived.
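
The complete C++ implementation referred to above lives in the dissertation itself and is not reproduced here. As a rough sketch of the kind of pre-render resampling operator described, the following Python/NumPy fragment MIP-averages the first and second moments of heightfield slopes, from which an anisotropic slope covariance (a stand-in for the non-centered microfacet statistics) can be recovered at any level. The moment-based reduction and all names are illustrative assumptions in the spirit of LEAN/LEADR-style filtering, not the thesis's exact operator.

```python
import numpy as np

def slope_moments(height, texel_size=1.0):
    """Per-texel first and second moments of the heightfield slopes."""
    gy, gx = np.gradient(height, texel_size)              # slopes along y and x
    return np.stack([gx, gy, gx * gx, gy * gy, gx * gy], axis=-1)

def downsample(moments):
    """One MIP level: 2x2 box-average of the moment maps (even resolutions assumed)."""
    h, w, c = moments.shape
    return moments.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

def mip_chain(height):
    """Pyramid of slope statistics, built once before rendering."""
    levels = [slope_moments(height)]
    while levels[-1].shape[0] > 1:
        levels.append(downsample(levels[-1]))
    return levels

# At any level, an anisotropic roughness proxy is recovered from the averaged
# moments via Var(s) = E[s^2] - E[s]^2 (and the cross term for the anisotropy).
lods = mip_chain(np.random.default_rng(0).standard_normal((64, 64)))
mx, my, mxx, myy, mxy = lods[3][0, 0]
cov = np.array([[mxx - mx * mx, mxy - mx * my],
                [mxy - mx * my, myy - my * my]])
```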

Relevance:

10.00%

Publisher:

Abstract:

Mesh generation is an important step in many numerical methods. We present the "Hierarchical Graph Meshing" (HGM) method as a novel approach to mesh generation, based on algebraic graph theory. The HGM method can be used to systematically construct configurations exhibiting multiple hierarchies and complex symmetry characteristics. The hierarchical description of structures provided by the HGM method can be exploited to increase the efficiency of multiscale and multigrid methods. In this paper, the HGM method is employed for the systematic construction of super carbon nanotubes of arbitrary order, which present a pertinent example of structurally and geometrically complex, yet highly regular, structures. The HGM algorithm is computationally efficient and exhibits good scaling characteristics. In particular, it scales linearly for super carbon nanotube structures and works much faster than geometry-based methods employing neighborhood search algorithms. Its modular character makes it conducive to automation. For the generation of a mesh, the information about the geometry of the structure in a given configuration is added in a way that relates geometric symmetries to structural symmetries. The intrinsically hierarchical description of the resulting mesh greatly reduces the effort of determining mesh hierarchies for multigrid and multiscale applications and helps to exploit symmetry-related methods in the mechanical analysis of complex structures.
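
The paper defines the HGM method through algebraic graph operations; the abstract gives no code, so the following Python fragment is only a minimal sketch of the hierarchical idea: every node of a coarse "parent" graph is expanded into a copy of a "motif" graph, and parent edges are wired between designated port nodes. The port-wiring rule and all names are illustrative assumptions, not the paper's construction.

```python
def expand(parent_edges, motif_edges, ports):
    """Replace every node of the parent graph with a copy of the motif graph.

    parent_edges: (u, v) pairs of the coarse graph
    motif_edges:  (a, b) pairs inside one motif copy
    ports:        motif nodes where neighbouring copies attach (illustrative
                  rule: the k-th edge of a parent node uses port k mod len(ports))
    """
    nodes = {u for e in parent_edges for u in e}
    edges = [((u, a), (u, b)) for u in nodes for a, b in motif_edges]
    degree = {u: 0 for u in nodes}
    for u, v in parent_edges:
        edges.append(((u, ports[degree[u] % len(ports)]),
                      (v, ports[degree[v] % len(ports)])))
        degree[u] += 1
        degree[v] += 1
    return edges

# A hexagonal ring motif expanded over a triangle gives one hierarchy level;
# feeding the result back into expand() produces the next level, which is the
# sense in which the mesh description stays intrinsically hierarchical.
ring = [(i, (i + 1) % 6) for i in range(6)]
level1 = expand([(0, 1), (1, 2), (2, 0)], ring, ports=[0, 2, 4])
```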

Relevance:

10.00%

Publisher:

Abstract:

This paper describes a general, trainable architecture for object detection that has previously been applied to face and people detection, with a new application to car detection in static images. Our technique is a learning-based approach that uses a set of labeled training data from which an implicit model of an object class -- here, cars -- is learned. Instead of pixel representations, which may be noisy and therefore fail to provide a compact representation for learning, our training images are transformed from pixel space to that of Haar wavelets, which respond to local, oriented, multiscale intensity differences. These feature vectors are then used to train a support vector machine classifier. The detection of cars in images is an important step in applications such as traffic monitoring, driver assistance systems, and surveillance. We show several examples of car detection on out-of-sample images and present an ROC curve that highlights the performance of our system.
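
The abstract describes the pipeline at a high level; the sketch below, which assumes PyWavelets' Haar transform in place of the paper's wavelet code and a quadratic-kernel scikit-learn SVM in place of its classifier (random arrays stand in for labeled car/non-car windows), shows the shape of the feature-extraction and training steps.

```python
import numpy as np
import pywt                                  # PyWavelets
from sklearn.svm import SVC

def haar_features(window, levels=3):
    """Local, oriented, multiscale Haar responses for one grayscale window."""
    coeffs = pywt.wavedec2(window, "haar", level=levels)
    details = [c for level in coeffs[1:] for c in level]   # H, V, D per scale
    return np.concatenate([np.abs(c).ravel() for c in details])

# Toy stand-in data: 64x64 windows labeled car (1) / non-car (0).
rng = np.random.default_rng(0)
windows = rng.random((200, 64, 64))
labels = rng.integers(0, 2, 200)

X = np.vstack([haar_features(w) for w in windows])
clf = SVC(kernel="poly", degree=2).fit(X, labels)     # polynomial kernel assumed
print(clf.predict(X[:5]))
```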

Relevance:

10.00%

Publisher:

Abstract:

This paper presents a new paradigm for signal reconstruction and superresolution, Correlation Kernel Analysis (CKA), that is based on the selection of a sparse set of bases from a large dictionary of class-specific basis functions. The basis functions that we use are the correlation functions of the class of signals we are analyzing. To choose the appropriate features from this large dictionary, we use Support Vector Machine (SVM) regression and compare it to traditional Principal Component Analysis (PCA) for the tasks of signal reconstruction, superresolution, and compression. The testbed we use in this paper is a set of images of pedestrians. This paper also presents results of experiments in which we use a dictionary of multiscale basis functions and then use Basis Pursuit De-Noising to obtain a sparse, multiscale approximation of a signal. The results are analyzed and we conclude that (1) when used with a sparse representation technique, the correlation function is an effective kernel for image reconstruction and superresolution; (2) for image compression, PCA and SVM have different tradeoffs, depending on the particular metric used to evaluate the results; (3) in sparse representation techniques, the L_1 norm is not a good proxy for the true measure of sparsity, L_0; and (4) the L_epsilon norm may be a better error metric for image reconstruction and compression than the L_2 norm, though the exact psychophysical metric should take into account high-order structure in images.
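
As an illustration of the reconstruction task, the sketch below assumes the class-specific kernel is the empirical correlation matrix of the (here random, stand-in) training signals and uses scikit-learn's SVR with a precomputed kernel, so that the support vectors select a sparse set of correlation-function bases; the epsilon parameter plays the role of the sparsity control. This is an assumed reading of CKA, not the paper's exact procedure.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
train = rng.standard_normal((500, 64))     # stand-in for flattened pedestrian images
R = train.T @ train / len(train)           # empirical class correlation function

# Reconstruct a held-out signal as a sparse expansion in correlation-function
# bases: with kernel K[i, j] = R[i, j] over the 64 coordinates, only the
# support vectors contribute basis functions to the reconstruction.
y = rng.standard_normal(64)
svr = SVR(kernel="precomputed", epsilon=0.1, C=10.0).fit(R, y)
y_hat = svr.predict(R)

print("bases kept:", len(svr.support_), "of", len(y),
      "  reconstruction error:", np.linalg.norm(y - y_hat))
```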

Relevance:

10.00%

Publisher:

Abstract:

Common Loon (Gavia immer) is considered an emblematic and ecologically important example of aquatic-dependent wildlife in North America. The northern breeding range of the Common Loon has contracted over the last century as a result of habitat degradation from human disturbance and lakeshore development. We focused on the state of New Hampshire, USA, where a long-term monitoring program conducted by the Loon Preservation Committee has been collecting biological data on the Common Loon since 1976. The Common Loon population in New Hampshire is distributed throughout the state across a wide range of lake-specific habitats, water quality conditions, and levels of human disturbance. We used a multiscale approach to evaluate the association of the Common Loon with breeding habitat within three natural physiographic ecoregions of New Hampshire. These multiple scales reflect Common Loon-specific extents such as territories, home ranges, and lake-landscape influences. We developed ecoregional multiscale models and compared them to single-scale models to evaluate model performance in distinguishing Common Loon breeding habitat. Based on information-theoretic criteria, there is empirical support for both multiscale and single-scale models across all three ecoregions, warranting a model-averaging approach. Our results suggest that the Common Loon responds to both ecological and anthropogenic factors at multiple scales when selecting breeding sites. These multiscale models can be used to identify and prioritize the conservation of preferred nesting habitat for Common Loon populations.
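
The information-theoretic comparison mentioned above follows a standard recipe; the sketch below assumes logistic habitat models fit at each scale, ranks them by AIC, and combines their predictions with Akaike weights. Covariates, scale names, and model form are illustrative stand-ins, not the study's actual variables.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def aic(model, X, y):
    """AIC = -2 log L + 2k for a fitted logistic habitat model."""
    p = model.predict_proba(X)[:, 1]
    loglik = np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))
    return -2 * loglik + 2 * (X.shape[1] + 1)

rng = np.random.default_rng(1)
y = rng.integers(0, 2, 300)                   # breeding presence/absence per lake
scales = {                                    # illustrative covariate sets per scale
    "territory":  rng.standard_normal((300, 3)),
    "home_range": rng.standard_normal((300, 4)),
    "landscape":  rng.standard_normal((300, 5)),
}
models = {s: LogisticRegression(max_iter=1000).fit(X, y) for s, X in scales.items()}
aics = np.array([aic(models[s], X, y) for s, X in scales.items()])

# Akaike weights: w_i = exp(-delta_i / 2) / sum_j exp(-delta_j / 2).
delta = aics - aics.min()
w = np.exp(-delta / 2) / np.exp(-delta / 2).sum()

# Model-averaged prediction of breeding-habitat suitability.
suitability = w @ np.array([models[s].predict_proba(X)[:, 1]
                            for s, X in scales.items()])
```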

Relevance:

10.00%

Publisher:

Abstract:

The Short-eared Owl (Asio flammeus) is an open-country species breeding in the northern United States and Canada, and has likely experienced a long-term, range-wide, and substantial decline. However, neither the cause nor the magnitude of the decline is well understood. We set out to address the first two of six previously proposed conservation priorities for this species: (1) better define habitat use and (2) improve population monitoring. We recruited 131 volunteers to survey over 6.2 million ha within the state of Idaho for Short-eared Owls during the 2015 breeding season. We surveyed 75 transects, 71 of them twice, and detected Short-eared Owls on 27 transects. We performed multiscale occupancy modeling to identify habitat associations and multiscale abundance modeling to generate a state-wide population estimate. Our results suggest that within the state of Idaho, Short-eared Owls are more often found in areas with marshland or riparian habitat or areas with greater amounts of sagebrush habitat at the 1750 ha transect scale. At the 50 ha point scale, Short-eared Owls tend to associate positively with fallow and bare-dirt agricultural land and negatively with grassland. Cropland was not selected at the broader transect scale, suggesting that Short-eared Owls may prefer more heterogeneous landscapes. On the surface, our results may seem contradictory to the presumed land use of a "grassland" species; however, the grasslands of the Intermountain West, consisting largely of invasive cheatgrass (Bromus tectorum), lack the complex structure shown to be preferred by these owls. We suggest that this local adaptation to agriculture represents the next-best habitat after their historical native habitat. Regardless, we have confirmed regional differences that should be considered in conservation planning for this species. Lastly, our results demonstrate the feasibility, efficiency, and effectiveness of utilizing public participation in scientific research to achieve a robust sampling methodology across the broad geography of the Intermountain West.
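
The occupancy modeling mentioned above builds on the standard MacKenzie-style likelihood; the sketch below fits only its transect level, assuming constant occupancy psi and per-visit detection p with the two visits described. The toy detection histories and the omission of the point-scale "use" level are simplifying assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

# Toy detection histories: 75 transects x 2 visits (1 = Short-eared Owl detected).
rng = np.random.default_rng(2)
hist = rng.integers(0, 2, (75, 2)) * rng.integers(0, 2, (75, 1))

def neg_loglik(theta):
    """Single-season occupancy: psi = occupancy, p = per-visit detection."""
    psi, p = expit(theta)                       # constrain both to (0, 1)
    det = hist.sum(axis=1)
    n = hist.shape[1]
    # Detected at least once: occupied and that history observed.
    # Never detected: either occupied but missed on every visit, or unoccupied.
    lik = np.where(det > 0,
                   psi * p**det * (1 - p)**(n - det),
                   psi * (1 - p)**n + (1 - psi))
    return -np.log(lik).sum()

fit = minimize(neg_loglik, x0=[0.0, 0.0])
psi_hat, p_hat = expit(fit.x)
print(f"occupancy ~ {psi_hat:.2f}, detection ~ {p_hat:.2f}")
```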

Relevance:

10.00%

Publisher:

Abstract:

Although climate models have been improving in accuracy and efficiency over the past few decades, it now seems that these incremental improvements may be slowing. As tera/petascale computing becomes massively parallel, our legacy codes are less suitable, and even with the increased resolution that we are now beginning to use, these models cannot represent the multiscale nature of the climate system. This paper argues that it may be time to reconsider the use of adaptive mesh refinement for weather and climate forecasting in order to achieve good scaling and representation of the wide range of spatial scales in the atmosphere and ocean. Furthermore, the challenge of introducing living organisms and human responses into climate system models is only just beginning to be tackled. We do not yet have a clear framework in which to approach the problem, but it is likely to cover such a huge number of different scales and processes that radically different methods may have to be considered. The challenges of multiscale modelling and petascale computing provide an opportunity to consider a fresh approach to numerical modelling of the climate (or Earth) system, which takes advantage of the computational fluid dynamics developments in other fields and brings new perspectives on how to incorporate Earth system processes. This paper reviews some of the current issues in climate (and, by implication, Earth) system modelling, and asks the question whether a new generation of models is needed to tackle these problems.

Relevance:

10.00%

Publisher:

Abstract:

Multiscale modeling is emerging as one of the key challenges in mathematical biology. However, the recent rapid increase in the number of modeling methodologies being used to describe cell populations has raised a number of interesting questions. For example, at the cellular scale, how can the appropriate discrete cell-level model be identified in a given context? Additionally, how can the many phenomenological assumptions used in the derivation of models at the continuum scale be related to individual cell behavior? In order to begin to address such questions, we consider a discrete one-dimensional cell-based model in which cells are assumed to interact via linear springs. From the discrete equations of motion, the continuous Rouse [P. E. Rouse, J. Chem. Phys. 21, 1272 (1953)] model is obtained. This formalism readily allows the definition of a cell number density for which a nonlinear "fast" diffusion equation is derived. Excellent agreement is demonstrated between the continuum and discrete models. Subsequently, via the incorporation of cell division, we demonstrate that the derived nonlinear diffusion model is robust to the inclusion of more realistic biological detail. In the limit of stiff springs, where cells can be considered to be incompressible, we show that cell velocity can be directly related to cell production. This assumption is frequently made in the literature but our derivation places limits on its validity. Finally, the model is compared with a model of a similar form recently derived for a different discrete cell-based model and it is shown how the different diffusion coefficients can be understood in terms of the underlying assumptions about cell behavior in the respective discrete models.
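
The discrete model lends itself to a very short simulation; the sketch below assumes overdamped dynamics, so each interior cell boundary moves in proportion to the net force from its two linear springs (the natural spring length cancels for interior boundaries). Parameter values and the initial condition are illustrative.

```python
import numpy as np

def step(x, k=1.0, eta=1.0, dt=1e-3):
    """Overdamped spring chain: eta * dx_i/dt = k * (x_{i+1} - 2 x_i + x_{i-1}),
    with the two outermost boundaries pinned."""
    f = np.zeros_like(x)
    f[1:-1] = k * (x[2:] - 2 * x[1:-1] + x[:-2])
    return x + dt * f / eta

# Cell boundaries start compressed on the left; their relaxation mimics the
# spreading described by the derived nonlinear "fast" diffusion equation.
x = np.concatenate([np.linspace(0.0, 0.5, 30), np.linspace(0.55, 2.0, 30)])
for _ in range(20_000):
    x = step(x)

density = 1.0 / np.diff(x)   # cell number density ~ inverse boundary spacing
```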

Relevance:

10.00%

Publisher:

Abstract:

A new man-made target tracking algorithm that integrates features from FLIR (Forward-Looking InfraRed) image sequences is presented, based on particle filtering. First, a multiscale fractal feature is used to enhance targets in FLIR images. Second, the gray-space feature is defined by the Bhattacharyya distance between the intensity histograms of the reference target and a sample target from the MFF (Multiscale Fractal Feature) image. Third, the motion feature is obtained by differencing two MFF images. Fourth, a fusion coefficient for integrating the features is obtained automatically by an online feature selection method based on fuzzy logic. Finally, a particle filtering framework is developed to carry out the target tracking. Experimental results show that the proposed algorithm can accurately track weak or small man-made targets in FLIR images with complicated backgrounds. The algorithm is effective, robust, and suitable for real-time tracking.
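
The fusion step can be made concrete with a short sketch: the weight of each particle combines a gray-space cue (Bhattacharyya similarity between intensity histograms of the MFF image) and a motion cue from MFF frame differencing. A fixed fusion coefficient alpha stands in for the paper's fuzzy-logic online selection, and particles are assumed to lie away from the image border.

```python
import numpy as np

def bhattacharyya(p, q):
    """Similarity between two normalized intensity histograms."""
    return np.sum(np.sqrt(p * q))

def particle_weights(mff, motion, ref_hist, particles, alpha=0.6, box=8):
    """Weight particles (x, y) by fused gray-space and motion cues; alpha is a
    fixed fusion coefficient standing in for the fuzzy-logic selection."""
    w = np.empty(len(particles))
    for i, (x, y) in enumerate(particles):
        patch = mff[y - box:y + box, x - box:x + box]
        hist, _ = np.histogram(patch, bins=16, range=(0.0, 1.0))
        hist = hist / max(hist.sum(), 1)
        gray = bhattacharyya(hist, ref_hist)                    # gray-space feature
        move = motion[y - box:y + box, x - box:x + box].mean()  # motion feature
        w[i] = alpha * gray + (1 - alpha) * move
    return w / w.sum()

# mff: MFF-enhanced frame scaled to [0, 1]; motion: |difference| of two MFF
# frames; ref_hist: normalized 16-bin histogram of the reference target.
```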

Relevance:

10.00%

Publisher:

Abstract:

Starting from the classical Saltzman two-dimensional convection equations, we derive, via a severe spectral truncation, a minimal 10-ODE system that includes the thermal effect of viscous dissipation. Neglecting this process leads to a dynamical system containing a decoupled generalized Lorenz system. Including it breaks an important symmetry and couples the dynamics of fast and slow variables, with ensuing modifications to the structural properties of the attractor and of the spectral features. When the relevant nondimensional number (the Eckert number Ec) is different from zero, an additional time scale of O(1/Ec) is introduced in the system, as shown by standard multiscale analysis and made clear by several pieces of numerical evidence. Moreover, the system is ergodic and hyperbolic, the slow variables feature long-term memory with 1/f^(3/2) power spectra, and the fast variables feature amplitude modulation. Increasing the strength of the thermal-viscous feedback has a stabilizing effect, as both the metric entropy and the Kaplan-Yorke attractor dimension decrease monotonically with Ec. The analyzed system features very rich dynamics: it overcomes some of the limitations of the Lorenz system and might have prototypical value for relevant processes in complex-system dynamics, such as the interaction between slow and fast variables, the presence of long-term memory, and the associated extreme-value statistics. This analysis shows how neglecting the coupling of slow and fast variables solely on the basis of scale analysis can be catastrophic: it leads to spurious invariances that affect essential dynamical properties (ergodicity, hyperbolicity) and cause the model to lose the ability to describe intrinsically multiscale processes.
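
The full 10-ODE system appears in the paper rather than the abstract, so the sketch below integrates only the decoupled generalized-Lorenz core recovered when Ec = 0 (written here as the classical Lorenz system) and computes the kind of power spectrum used to diagnose the slow variables' long-term memory once the Ec != 0 coupling is restored. Parameter values are the classical ones, assumed for illustration.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.signal import periodogram

def lorenz(t, s, sigma=10.0, r=28.0, b=8.0 / 3.0):
    """Lorenz system, standing in for the decoupled core at Ec = 0."""
    x, y, z = s
    return [sigma * (y - x), x * (r - z) - y, x * y - b * z]

t = np.arange(0.0, 500.0, 0.01)
sol = solve_ivp(lorenz, (t[0], t[-1]), [1.0, 1.0, 1.0], t_eval=t)
x = sol.y[0][t >= 100.0]                    # discard the transient

# Spectral diagnostic: for the slow variables of the coupled (Ec != 0) system,
# the paper reports power spectra scaling as 1/f^(3/2) at low frequency.
f, pxx = periodogram(x, fs=100.0)
```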

Relevance:

10.00%

Publisher:

Abstract:

Research into understanding bacterial chemotactic systems has become a paradigm for Systems Biology. Experimental and theoretical researchers have worked hand in hand for over 40 years to understand the intricate behavior driving bacterial species, in particular how such small creatures, usually no more than 5 µm in length, detect and respond to small changes in their extracellular environment. In this review we highlight the importance that theoretical modeling has played in providing new insight into bacterial chemotaxis. We begin with an overview of the bacterial chemotaxis sensory response, before reviewing the role of theoretical modeling in understanding elements of the system at the single-cell scale and the features underpinning multiscale extensions to population models. (WIREs Syst Biol Med 2012, doi: 10.1002/wsbm.1168)
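
At the single-cell scale, the canonical model object in this literature is the run-and-tumble walker; the sketch below is a minimal, illustrative version in which the tumble rate drops while the attractant concentration experienced by the cell is rising. The linear gradient, rates, and gain are assumptions, not taken from any specific model in the review.

```python
import numpy as np

def run_and_tumble(steps=5000, v=1.0, dt=0.01, base_rate=1.0, gain=50.0, seed=0):
    """Biased random walk in a linear attractant field c(x, y) = x: runs are
    extended (tumbling suppressed) when the sensed concentration increases."""
    rng = np.random.default_rng(seed)
    pos, last_c = np.zeros(2), 0.0
    heading = rng.uniform(0.0, 2.0 * np.pi)
    traj = np.empty((steps, 2))
    for i in range(steps):
        pos += v * dt * np.array([np.cos(heading), np.sin(heading)])
        c = pos[0]                                       # attractant concentration
        rate = base_rate * np.exp(-gain * (c - last_c))  # crude temporal sensing
        if rng.random() < rate * dt:
            heading = rng.uniform(0.0, 2.0 * np.pi)      # tumble: new direction
        last_c = c
        traj[i] = pos
    return traj

print("net drift up the gradient:", run_and_tumble()[-1, 0])
```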

Relevance:

10.00%

Publisher:

Abstract:

An automated cloud band identification procedure is developed that captures the meteorology of cloud band events over southern Africa. This "metbot" is built upon a connected component labelling method that enables blob detection in various atmospheric fields. Outgoing longwave radiation is used to flag candidate cloud band days by thresholding the data and requiring detected blobs to have sufficient latitudinal extent and exhibit positive tilt. The Laplacian operator is applied to gridded reanalysis variables to highlight other features of meteorological interest. The ability of this methodology to capture the significant meteorology and rainfall of these synoptic systems is tested in a case study. The usefulness of the metbot in understanding event-to-event similarities of meteorological features is demonstrated, highlighting features that previous studies have noted as key ingredients in cloud band development in the region. Moreover, this allows the presentation of a composite cloud band life cycle for southern African events. The potential of the metbot for studying multiscale interactions is discussed, emphasising its key strength: the ability to retain details of extreme and infrequent events. It automatically builds a database that is ideal for research questions focused on the influence of intraseasonal-to-interannual variability processes on synoptic events. Application of the method to convergence zone studies and atmospheric river descriptions is suggested. In conclusion, a relation-building metbot can retain details that are often lost with object-based methods but are crucial in case studies. Capturing and summarising these details may be necessary to develop a deeper process-level understanding of multiscale interactions.
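
The detection core of such a procedure is compact; the sketch below assumes scipy's connected-component labelling on a thresholded OLR grid, followed by the latitudinal-extent and tilt checks described above. The threshold, minimum extent, and tilt sign convention are illustrative.

```python
import numpy as np
from scipy import ndimage

def cloud_band_blobs(olr, lat, lon, thresh=220.0, min_lat_extent=15.0):
    """Label cold-cloud blobs in an OLR field (lat x lon grid, W m^-2) and keep
    those spanning enough latitude with positive tilt."""
    labels, n = ndimage.label(olr < thresh)          # deep-convection mask
    bands = []
    for k in range(1, n + 1):
        rows, cols = np.nonzero(labels == k)
        blob_lat, blob_lon = lat[rows], lon[cols]
        if blob_lat.max() - blob_lat.min() < min_lat_extent:
            continue                                 # insufficient latitudinal extent
        tilt = np.polyfit(blob_lat, blob_lon, 1)[0]  # dlon/dlat of the blob
        if tilt > 0:                                 # sign convention illustrative
            bands.append(k)
    return labels, bands

# lat and lon are 1D coordinate arrays matching the two axes of the olr grid.
```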

Relevance:

10.00%

Publisher:

Abstract:

We use new neutron scattering instrumentation to follow, in a single quantitative time-resolving experiment, the three key scales of structural development that accompany the crystallisation of synthetic polymers. These length scales span three orders of magnitude of the scattering vector. The study of polymer crystallisation dates back to the pioneering experiments of Keller and others, who discovered the chain-folded nature of the thin lamellar crystals normally found in synthetic polymers. The inherent connectivity of polymers makes their crystallisation a multiscale transformation. Much understanding has developed over the intervening fifty years, but the process has remained something of a mystery. There are three key length scales: the chain-folded lamellar thickness is ~10 nm, the crystal unit cell is ~1 nm, and the detail of the chain conformation is ~0.1 nm. In previous work these length scales have been addressed using different instrumentation or were coupled using compromised geometries. More recently, researchers have attempted to exploit coupled time-resolved small-angle and wide-angle x-ray experiments. These turned out to be challenging experiments, largely owing to the difficulty of placing the scattering intensity on an absolute scale. They did, however, raise the possibility of new phenomena in the very early stages of crystallisation. Although there is now considerable doubt about such experiments, they drew attention to the basic question of how crystallisation proceeds in long-chain molecules. We have used NIMROD on the second target station at ISIS to follow all three length scales in a time-resolving manner for poly(ε-caprolactone). The technique can provide a single set of data from 0.01 to 100 Å⁻¹ on the same vertical scale. We analyse the results using a multiple-scale model of the crystallisation process in polymers.

Relevance:

10.00%

Publisher:

Abstract:

During the winter of 2013/14, much of the UK experienced repeated intense rainfall events and flooding, with considerable impact on property and transport infrastructure. A key question is whether the burning of fossil fuels is changing the frequency of such extremes, and if so to what extent. We assess the scale of the winter flooding before reviewing a broad range of Earth-system drivers affecting UK rainfall. Some drivers can potentially be disregarded for these specific storms, whereas others are likely to have increased their risk of occurrence. We discuss the requirements of hydrological models to transform rainfall into river flows and flooding. To determine whether flood risk is changing in general, we argue that accurate modelling needs to capture evolving understanding of the interactions of UK rainfall with a broad set of factors, including changes to multiscale atmospheric, oceanic, solar, and sea-ice features, as well as land use and demographics. Ensembles of such model simulations may be needed to build probability distributions of extremes for both pre-industrial and contemporary concentration levels of atmospheric greenhouse gases.