912 results for "fundamental principles and applications"
Abstract:
The sampling of a solid angle is a fundamental operation in realistic image synthesis, where the rendering equation describing light propagation in closed domains is solved. Monte Carlo methods for solving the rendering equation sample the solid angle subtended by the unit hemisphere or unit sphere in order to perform the numerical integration of the rendering equation. In this work we consider the problem of generating uniformly distributed random samples over the hemisphere and sphere. Our aim is to construct and study a parallel sampling scheme for the hemisphere and sphere. First we apply the symmetry property to partition the hemisphere and sphere: the domain of the solid angle subtended by a hemisphere is divided into a number of equal sub-domains, each representing the solid angle subtended by an orthogonal spherical triangle with fixed vertices and computable parameters. We then introduce two new algorithms for sampling orthogonal spherical triangles. Both algorithms are based on a transformation of the unit square, and, similarly to Arvo's algorithm for sampling an arbitrary spherical triangle, they accommodate stratified sampling. We derive the necessary transformations for the algorithms. The first sampling algorithm generates a sample by mapping the unit square onto the orthogonal spherical triangle. The second algorithm directly computes the unit radius vector of a sampling point inside the orthogonal spherical triangle. The sampling of the total hemisphere and sphere is performed in parallel for all sub-domains simultaneously by using the symmetry property of the partitioning. The applicability of the corresponding parallel sampling scheme to Monte Carlo and quasi-Monte Carlo solution of the rendering equation is discussed.
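As background for the sampling problem above, a minimal sketch of the standard area-preserving map from the unit square to a uniform distribution on the hemisphere is shown below. This is not the paper's spherical-triangle algorithm (which is more elaborate); the function name `sample_hemisphere` is illustrative.

```python
import math

def sample_hemisphere(u, v):
    """Map a point (u, v) of the unit square to a uniformly
    distributed direction on the unit upper hemisphere (z >= 0).
    The map is area-preserving because, under the uniform
    spherical measure, z = cos(theta) is uniform in [0, 1]."""
    z = u                                  # cos(theta), uniform in [0, 1]
    r = math.sqrt(max(0.0, 1.0 - z * z))   # radius of the circle at height z
    phi = 2.0 * math.pi * v                # azimuth, uniform in [0, 2*pi)
    return (r * math.cos(phi), r * math.sin(phi), z)
```

Because each input coordinate maps monotonically to one spherical coordinate, stratifying (u, v) on the unit square stratifies the resulting directions, which is the property the abstract's algorithms preserve for spherical triangles.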
Abstract:
An automatic nonlinear predictive model-construction algorithm is introduced based on forward regression and the predicted-residual-sums-of-squares (PRESS) statistic. The proposed algorithm is based on the fundamental concept of evaluating a model's generalisation capability through cross-validation. This is achieved by using the PRESS statistic as a cost function to optimise the model structure. In particular, the proposed algorithm is developed with the aim of achieving computational efficiency, so that the computational effort, which would usually be extensive in the computation of the PRESS statistic, is reduced or minimised. The computation of PRESS is simplified by avoiding a matrix inversion through the orthogonalisation procedure inherent in forward regression, and is further reduced significantly by the introduction of a forward-recursive formula. Based on the properties of the PRESS statistic, the proposed algorithm achieves a fully automated procedure without resorting to a separate validation data set for iterative model evaluation. Numerical examples demonstrate the efficacy of the algorithm.
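The key identity that makes PRESS cheap to compute is that the leave-one-out residual equals the ordinary residual divided by (1 - h_ii), where h_ii is the i-th leverage. The sketch below uses a QR factorisation to obtain the leverages rather than the paper's forward-regression orthogonalisation, so treat it as an illustration of the idea, not the proposed algorithm; the function name `press_statistic` is assumed.

```python
import numpy as np

def press_statistic(X, y):
    """Predicted residual sum of squares (PRESS) for a linear
    model y ~ X beta, computed without refitting n times: the
    leave-one-out residual is e_i / (1 - h_ii), where h_ii is
    the i-th diagonal entry of the hat matrix."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ beta                 # ordinary residuals
    # leverages via the thin QR factorisation (no explicit inverse):
    # H = Q Q^T, so h_ii is the squared norm of the i-th row of Q
    Q, _ = np.linalg.qr(X)
    h = np.sum(Q * Q, axis=1)
    loo = e / (1.0 - h)              # leave-one-out residuals
    return float(np.sum(loo ** 2))
```

This evaluates the full cross-validation cost in a single fit, which is the property the abstract's recursive formulation refines further inside the forward-selection loop.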
Abstract:
Focuses on recent advances in research on block copolymers, covering chemistry (synthesis), physics (phase behaviors, rheology, modeling), and applications (melts and solutions). Written by a team of internationally respected scientists from industry and academia, this text compiles and reviews the expanse of research that has taken place over the last five years in one accessible resource. Ian Hamley is a world-leading scientist in the field of block copolymer research. The book presents recent advances in the area, covering chemistry, physics, and applications; provides broad coverage from synthesis through fundamental physics to applications; and examines the potential of block copolymers in nanotechnology as self-assembling soft materials.
Abstract:
We propose a Nyström/product integration method for a class of second-kind integral equations on the real line which arise in problems of two-dimensional scalar and elastic wave scattering by unbounded surfaces. Stability and convergence of the method are established, with convergence rates dependent on the smoothness of the components of the kernel. The method is applied to the problem of acoustic scattering by a sound-soft one-dimensional surface which is the graph of a function f, and superalgebraic convergence is established in the case when f is infinitely smooth. Numerical results are presented illustrating this behavior for the case when f is periodic (the diffraction-grating case). The Nyström method for this problem is stable and convergent uniformly with respect to the period of the grating, in contrast to standard integral equation methods for diffraction gratings, which fail at a countable set of grating periods.
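For readers unfamiliar with the approach, the Nyström idea can be illustrated in a toy setting: a second-kind Fredholm equation on [0, 1] discretised with the trapezoidal rule (the paper's method uses product integration on the real line, which is considerably more delicate). The function name `nystrom_solve` and the test kernel are assumptions for illustration.

```python
import numpy as np

def nystrom_solve(k, f, n):
    """Nystrom discretisation of the second-kind integral equation
        u(x) - int_0^1 k(x, y) u(y) dy = f(x)
    on [0, 1]: replace the integral by the composite trapezoidal
    rule and collocate at the quadrature nodes, giving the linear
    system (I - K W) u = f at the nodes."""
    x = np.linspace(0.0, 1.0, n)
    w = np.full(n, 1.0 / (n - 1))          # trapezoidal weights
    w[0] *= 0.5
    w[-1] *= 0.5
    K = np.array([[k(xi, yj) for yj in x] for xi in x])
    A = np.eye(n) - K * w                  # column j scaled by w_j
    u = np.linalg.solve(A, f(x))
    return x, u
```

The convergence rate of the scheme is inherited from the quadrature rule, which is why the smoothness of the kernel components governs the rates in the abstract.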
Abstract:
Following trends in operational weather forecasting, where ensemble prediction systems (EPS) are now increasingly the norm, flood forecasters are beginning to experiment with similar ensemble methods. Most of the effort to date has focused on the substantial technical challenges of developing coupled rainfall-runoff systems to represent the full cascade of uncertainties involved in predicting future flooding. As a consequence, much less attention has been given to the communication and eventual use of EPS flood forecasts. Drawing on interviews and other research with operational flood forecasters from across Europe, this paper highlights a number of challenges to communicating and using ensemble flood forecasts operationally. It is shown that operational flood forecasters understand the skill, operational limitations, and informational value of EPS products in a variety of different and sometimes contradictory ways. Despite the efforts of forecasting agencies to design effective ways to communicate EPS forecasts to non-experts, operational flood forecasters were often skeptical about the ability of forecast recipients to understand or use them appropriately. It is argued that better training and closer contact between operational flood forecasters and EPS system designers can help ensure that the uncertainty represented by EPS forecasts is presented in ways that are most appropriate and meaningful for their intended consumers. Some fundamental political and institutional challenges to using ensembles are also highlighted, such as differing attitudes to false alarms and to responsibility for the management of blame in the event of poor or mistaken forecasts. Copyright © 2010 Royal Meteorological Society.
Abstract:
The need to source live human tissues for research and clinical applications has been a major driving force for the development of new biomaterials. Ideally, these should elicit the formation of scaffold-free tissues with native-like structure and composition. In this study, we describe a biologically interactive coating that combines the fabrication and subsequent self-release of live purposeful tissues using template–cell–environment feedback. This smart coating was formed from a self-assembling peptide amphiphile comprising a protease-cleavable sequence contiguous with a cell attachment and signaling motif. This multifunctional material was subsequently used not only to instruct human corneal or skin fibroblasts to adhere and deposit discrete multiple layers of native extracellular matrix but also to govern their own self-directed release from the template solely through the action of endogenous metalloproteases. Tissues recovered through this physiologically relevant process were carrier-free and structurally and phenotypically equivalent to their natural counterparts. This technology contributes to a new paradigm in regenerative medicine, whereby materials are able to actively direct and respond to cell behavior. The novel application of such materials as a coating capable of directing the formation and detachment of complex tissues solely under physiological conditions can have broad use in fundamental research and in future cell and tissue therapies.
Abstract:
Document engineering is the computer science discipline that investigates systems for documents in any form and in all media. As with the relationship between software engineering and software, document engineering is concerned with principles, tools and processes that improve our ability to create, manage, and maintain documents (http://www.documentengineering.org). The ACM Symposium on Document Engineering is an annual meeting of researchers active in document engineering; it is sponsored by ACM through the ACM SIGWEB Special Interest Group. In this editorial, we first point to work carried out in the context of document engineering that is directly related to multimedia tools and applications. We conclude with a summary of the papers presented in this special issue.
Abstract:
The literature reports research efforts allowing the editing of interactive TV multimedia documents by end-users. In this article we propose complementary contributions relating to end-user generated interactive video, video tagging, and collaboration. In earlier work we proposed the watch-and-comment (WaC) paradigm as the seamless capture of an individual's comments so that corresponding annotated interactive videos can be automatically generated. As a proof of concept, we implemented a prototype application, the WACTOOL, that supports the capture of digital ink and voice comments over individual frames and segments of the video, producing a declarative document that specifies both the structure and the synchronization of the different media streams. In this article, we extend the WaC paradigm in two ways. First, user-video interactions are associated with edit commands and digital ink operations. Second, focusing on collaboration and distribution issues, we employ annotations as simple containers for context information, using them as tags in order to organize, store, and distribute information in a P2P-based multimedia capture platform. We highlight the design principles of the watch-and-comment paradigm and demonstrate related results, including the current version of the WACTOOL and its architecture. We also illustrate how an interactive video produced by the WACTOOL can be rendered in an interactive video environment, the Ginga-NCL player, and include results from a preliminary evaluation.
Abstract:
Given two maps h : X × K → ℝ and g : X → K such that h(x, g(x)) = 0 for all x ∈ X, we consider the equilibrium problem of finding x̃ ∈ X such that h(x̃, g(x)) ≥ 0 for every x ∈ X. This question is related to a coincidence problem.
A bivariate regression model for matched paired survival data: local influence and residual analysis
Abstract:
The use of bivariate distributions plays a fundamental role in survival and reliability studies. In this paper, we consider a location-scale model for bivariate survival times in which a copula is proposed to model the dependence of bivariate survival data. For the proposed model, we consider inferential procedures based on maximum likelihood. Gains in efficiency from bivariate models are also examined in the censored-data setting. For different parameter settings, sample sizes, and censoring percentages, various simulation studies are performed to assess the performance of the bivariate regression model for matched paired survival data. Sensitivity analysis methods such as local and total influence are presented and derived under three perturbation schemes. The martingale marginal and deviance marginal residual measures are used to check the adequacy of the model. Furthermore, we propose a new measure, which we call the modified deviance component residual. The methodology in the paper is illustrated on a lifetime data set for kidney patients.
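To make the copula idea concrete, the sketch below simulates matched-pair survival times whose dependence is induced by a Clayton copula with exponential margins. The abstract does not specify the copula family or margins, so the Clayton choice, the function name `clayton_bivariate_exponential`, and the parameters θ, λ1, λ2 are all illustrative assumptions.

```python
import numpy as np

def clayton_bivariate_exponential(n, theta, lam1, lam2, seed=0):
    """Simulate n matched-pair survival times (T1, T2) whose
    dependence is driven by a Clayton copula with parameter
    theta > 0 (illustrative choice of family), mapped to
    exponential margins with rates lam1 and lam2.
    Uses the conditional inversion method: V is drawn from the
    conditional copula quantile given U = u."""
    rng = np.random.default_rng(seed)
    u = rng.uniform(size=n)
    w = rng.uniform(size=n)
    # conditional quantile of the Clayton copula given U = u
    v = ((w ** (-theta / (1.0 + theta)) - 1.0) * u ** (-theta) + 1.0) ** (-1.0 / theta)
    t1 = -np.log(1.0 - u) / lam1     # inverse-CDF transform to exponential margins
    t2 = -np.log(1.0 - v) / lam2
    return t1, t2
```

Simulations of this kind are how one would generate data for the paper's comparison of parameter settings, sample sizes, and censoring percentages; a censoring mechanism would be layered on top.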
Abstract:
We analyze the stability properties of equilibrium solutions and the periodicity of orbits in a two-dimensional dynamical system whose orbits mimic the evolution of the price of an asset and the excess demand for that asset. The construction of the system is grounded in a heterogeneous interacting agent model for a single risky-asset market. An advantage of this construction procedure is that the resulting dynamical system becomes a macroscopic market model which mirrors market quantities and qualities that would typically be taken into account solely at the microscopic level of modeling. The system's parameters correspond to: (a) the proportion of speculators in a market; (b) the traders' speculative trend; (c) the degree of heterogeneity of the idiosyncratic evaluations of the market agents with respect to the asset's fundamental value; and (d) the strength of the feedback of the population excess demand on the asset price update increment. This correspondence allows us to employ our results to infer plausible causes for the emergence of price and demand fluctuations in a real asset market. The use of dynamical systems to study the evolution of stochastic models of socio-economic phenomena is quite usual in the area of heterogeneous interacting agent models. However, in the vast majority of the cases present in the literature, these dynamical systems are one-dimensional. Our work is among the few in the area that construct and study analytically a two-dimensional dynamical system and apply it to the explanation of socio-economic phenomena.
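A generic two-dimensional price/excess-demand map of this flavour can be iterated as below. The functional forms, the function name `step`, and all parameter values are purely illustrative assumptions, not the authors' system; they merely echo the roles of parameters (a)–(d): demand feeds back on the price, fundamentalist demand pulls toward a fundamental value, and a persistence term carries over past demand.

```python
def step(p, D, beta=0.5, kappa=0.8, chi=0.3, p_star=1.0):
    """One iteration of a toy price / excess-demand map:
    the price moves with the excess demand (strength beta),
    demand is pulled toward the fundamental value p_star
    (strength kappa), and chi carries over past demand.
    All forms and parameter values are illustrative."""
    p_next = p + beta * D
    D_next = kappa * (p_star - p) + chi * D
    return p_next, D_next

# iterate from a mispriced start; for these parameter values the
# fixed point (p_star, 0) is a stable spiral, so the orbit decays
p, D = 1.2, 0.0
for _ in range(300):
    p, D = step(p, D)
```

Linearising such a map at its fixed point and checking whether the Jacobian's eigenvalues lie inside the unit circle is the standard route to the stability analysis the abstract describes.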