54 results for "Illumination subspace"
Abstract:
Ensemble-based data assimilation is rapidly proving itself as a computationally efficient and skilful assimilation method for numerical weather prediction, and can provide a viable alternative to more established variational assimilation techniques. However, a fundamental shortcoming of ensemble techniques is that the resulting analysis increments can only span a limited subspace of the state space, whose dimension is less than the ensemble size. This limits the amount of observational information that can effectively constrain the analysis. This paper presents a data selection strategy, usable with both stochastic and deterministic ensemble filters, that aims to assimilate only the observational components that matter most. This avoids unnecessary computations, reduces round-off errors and minimizes the risk of importing observation bias into the analysis. When an ensemble-based assimilation technique is used to assimilate high-density observations, the data selection procedure allows the use of larger localization domains, which may lead to a more balanced analysis. Results from applying this data selection technique to a two-dimensional linear and a nonlinear advection model, using both in situ and remote sounding observations, are discussed.
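As a minimal illustration of the subspace limitation (all sizes, operators and data below are synthetic, not from the paper), a stochastic ensemble Kalman update can be written in a few lines; the final rank check confirms that every analysis increment is a linear combination of the forecast perturbations, so at most ensemble-size-minus-one directions of the state space are ever corrected:

```python
import numpy as np

rng = np.random.default_rng(0)
n_state, n_ens, n_obs = 100, 10, 40                # illustrative sizes

X = rng.normal(size=(n_state, n_ens))              # forecast ensemble
H = rng.normal(size=(n_obs, n_state))              # linear observation operator
R = 0.5 * np.eye(n_obs)                            # observation-error covariance
y = rng.normal(size=n_obs)                         # observation vector

A = X - X.mean(axis=1, keepdims=True)              # ensemble perturbations, rank <= n_ens - 1
Pf = A @ A.T / (n_ens - 1)                         # sample forecast covariance
K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)     # Kalman gain

# Stochastic update: each member assimilates perturbed observations.
Y = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T
increments = K @ (Y - H @ X)

# The increments live in the column space of A: augmenting A with them
# does not raise the rank.
print(np.linalg.matrix_rank(A),
      np.linalg.matrix_rank(np.hstack([A, increments])))   # both n_ens - 1
```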
Abstract:
From geostationary satellite observations of equatorial Africa and the equatorial east Atlantic during May and June 2000, we explore the radiative forcing by deep convective cloud systems in these regions. Deep convective clouds (DCCs) are associated with a mean radiative forcing, relative to non–deep convective areas, of −39 W m−2 over the Atlantic Ocean and +13 W m−2 over equatorial Africa (±10 W m−2 in both cases). We show that over land the timing of the daily cycle of convection relative to the daily cycle in solar illumination and surface temperature significantly affects the mean radiative forcing by DCCs. Displacing the daily cycle of DCC coverage by 2 hours changes their overall radiative effect by ∼10 W m−2, with implications for the simulation of the radiative balance in this region. Because the minimum DCC cover over land occurs close to local noon, the mean radiative forcing is nearly maximized.
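As a toy illustration of the timing effect (shortwave only; every number below is assumed for illustration and is not the paper's data), shifting an idealized diurnal cycle of DCC cover relative to an idealized solar cycle changes the time-mean forcing:

```python
import numpy as np

t = np.linspace(0.0, 24.0, 1441)                   # local time, hours
# Idealized insolation: daylight from 06 to 18 local time, 1000 W m-2 peak.
solar = 1000.0 * np.maximum(0.0, np.sin(np.pi * (t - 6.0) / 12.0))

def dcc_cover(t, peak_hour):
    # Idealized diurnal cycle of DCC fractional cover, peaking at peak_hour.
    return 0.10 + 0.05 * np.cos(2.0 * np.pi * (t - peak_hour) / 24.0)

sw_reflectivity = 0.6                              # fraction of insolation reflected by DCC (assumed)
for peak in (18.0, 20.0):                          # evening peak vs. 2 h later
    forcing = -sw_reflectivity * dcc_cover(t, peak) * solar
    print(f"peak at {peak:4.1f} h: mean SW forcing {forcing.mean():6.1f} W m-2")
```

Shifting the cover peak later, away from the midday insolation maximum, leaves less cloud overhead while the sun is strong, so the time-mean shortwave forcing weakens.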
Abstract:
Resetting of previously accumulated optically stimulated luminescence (OSL) signals during sediment transport is a fundamental requirement for reliable optical dating. The completeness of optical resetting of 46 modern-age quartz samples from a variety of depositional environments was examined. All equivalent dose (De) estimates were small, with the majority of aeolian samples close to zero and fluvial samples retaining larger residual doses. The OSL signal of quartz originates from several trap types with different rates of charge loss during illumination. As such, incomplete bleaching may be identifiable as an increase in De from easy-to-bleach through to hard-to-bleach components. For all modern fluvial samples with non-zero De values, SAR De(t) analysis and component-resolved linearly modulated OSL (LM OSL) De estimates showed this to be the case, implying incomplete resetting of previously accumulated charge. LM OSL measurements were also made to investigate the extent of bleaching of the slow components in the natural environment. In the aeolian sediments examined, the natural LM OSL was effectively zero (i.e. all components were fully reset). The slow components of modern fluvial samples displayed measurable residual signals of up to 15 Gy.
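A hedged sketch of the bleaching-rate argument (the burial dose, exposure time and detrapping rates below are assumed, not measured values): because trap types lose charge at very different rates under illumination, a brief light exposure during transport resets the fast component while slower components retain an apparent residual dose, which is why De rises from easy-to-bleach to hard-to-bleach components:

```python
import numpy as np

burial_dose = 15.0                                 # Gy accumulated before transport (assumed)
exposure_s = 30.0                                  # daylight exposure during transport (assumed)
bleach_rates = {"fast": 1.0, "medium": 0.1,        # s^-1, illustrative detrapping rates
                "slow1": 0.01, "slow2": 0.001}

for name, rate in bleach_rates.items():
    # First-order loss of trapped charge under constant illumination.
    residual = burial_dose * np.exp(-rate * exposure_s)
    print(f"{name:6s} residual ~ {residual:6.2f} Gy")
```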
Abstract:
We propose a new class of neurofuzzy construction algorithms that aims to maximize generalization capability, specifically for imbalanced data classification problems, based on leave-one-out (LOO) cross-validation. The algorithms proceed in two stages: first, an initial rule base is constructed by estimating a Gaussian mixture model with analysis-of-variance decomposition from the input data; second, joint weighted least-squares parameter estimation and rule selection are carried out using an orthogonal forward subspace selection (OFSS) procedure. We show how different LOO-based rule selection criteria can be incorporated into OFSS, and advocate maximizing either the leave-one-out area under the receiver operating characteristic (ROC) curve or the leave-one-out F-measure when the data sets exhibit imbalanced class distributions. Extensive comparative simulations illustrate the effectiveness of the proposed algorithms.
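A simplified sketch of the selection criterion (not the paper's OFSS; a plain greedy forward search with logistic regression stands in for the neurofuzzy rule base): candidate terms are scored by the pooled leave-one-out AUC, the criterion the abstract advocates for imbalanced data:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import LeaveOneOut, cross_val_predict

# Imbalanced toy problem (sizes and imbalance ratio assumed).
X, y = make_classification(n_samples=80, n_features=8, weights=[0.85, 0.15],
                           random_state=0)

def loo_auc(cols):
    """Pool the leave-one-out predicted probabilities, then score by AUC."""
    proba = cross_val_predict(LogisticRegression(max_iter=1000),
                              X[:, cols], y, cv=LeaveOneOut(),
                              method="predict_proba")[:, 1]
    return roc_auc_score(y, proba)

selected, remaining, best = [], list(range(X.shape[1])), 0.0
while remaining:
    scores = {j: loo_auc(selected + [j]) for j in remaining}
    j_best = max(scores, key=scores.get)
    if scores[j_best] <= best:                     # stop when LOO AUC stalls
        break
    best = scores[j_best]
    selected.append(j_best)
    remaining.remove(j_best)

print(f"selected terms {selected}, LOO AUC {best:.3f}")
```

LOO AUC is a sensible criterion here because, unlike accuracy, it is insensitive to the class prior, so a term is only retained if it improves the ranking of minority-class cases on held-out data.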
Abstract:
A disposable backscatter instrument is described for optical detection of cloud in the atmosphere from a balloon-carried platform. It uses an ultra-bright light emitting diode (LED) illumination source with a photodiode detector. Scattering of the LED light by cloud droplets generates a small optical signal, which is separated from background light fluctuations using a lock-in technique. The signal-to-noise ratio obtained permits cloud detection using the scattered LED light, even in daytime. The response is interpreted in terms of the equivalent visual range within the cloud. The device is lightweight (150 g) and low-power (drawing ∼30 mA), for use alongside a conventional meteorological radiosonde.
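A minimal sketch of the lock-in idea on synthetic data (all amplitudes, frequencies and noise levels below are assumed): a weak signal modulated at the LED frequency is recovered from a much larger, slowly fluctuating background by multiplying with the reference and averaging:

```python
import numpy as np

fs, f_mod, T = 10_000.0, 1_000.0, 10.0        # sample rate (Hz), LED modulation (Hz), record length (s)
t = np.arange(0.0, T, 1.0 / fs)
rng = np.random.default_rng(1)

signal_amp = 0.01                             # weak cloud backscatter amplitude (arb. units, assumed)
background = 5.0 + 0.5 * np.sin(2.0 * np.pi * 0.5 * t) + rng.normal(0.0, 0.2, t.size)
detector = background + signal_amp * np.sin(2.0 * np.pi * f_mod * t)

# Lock-in demodulation: multiply by the reference and average (ideal low-pass).
# Background terms are uncorrelated with the reference and average to ~0.
reference = np.sin(2.0 * np.pi * f_mod * t)
recovered = 2.0 * np.mean(detector * reference)
print(f"recovered amplitude ~ {recovered:.4f} (true value {signal_amp})")
```

The background here is hundreds of times larger than the signal, yet only components at the modulation frequency survive the multiply-and-average step, which is what allows daytime operation.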
Abstract:
Cell membranes are composed of two-dimensional bilayers of amphipathic lipids, which allow lateral movement of the respective membrane components. These components are arranged in an inhomogeneous manner as transient micro- and nanodomains, which are believed to be crucially involved in the regulation of signal transduction pathways in mammalian cells. Because of their small size (diameter 10-200 nm), membrane nanodomains cannot be directly imaged using conventional light microscopy. Here, we present direct visualization of cell membrane nanodomains by helium ion microscopy (HIM). We show that HIM is capable of imaging biological specimens without any conductive coating, and that HIM images clearly allow the identification of nanodomains in the ultrastructure of membranes at 1.5 nm resolution. The shape of these nanodomains is preserved by fixation of the surrounding unsaturated fatty acids, while saturated fatty acids inside the nanodomains are selectively removed. Atomic force microscopy, fluorescence microscopy, 3D structured illumination microscopy and direct stochastic optical reconstruction microscopy provide additional evidence that the structures in the HIM images of cell membranes originate from membrane nanodomains. The nanodomains observed by HIM have an average diameter of 20 nm and are densely arranged, with a minimal nearest-neighbor distance of ~15 nm.
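For illustration only (synthetic points, not the HIM data): the reported spacing statistic is the kind of quantity obtained by computing nearest-neighbor distances between detected domain centers, e.g.:

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(2)
centers = rng.uniform(0.0, 500.0, size=(400, 2))   # synthetic domain centers in a 500 nm field

tree = cKDTree(centers)
dist, _ = tree.query(centers, k=2)                 # k=1 hit is each point itself
nearest = dist[:, 1]                               # distance to the true nearest neighbor
print(f"min {nearest.min():.1f} nm, mean {nearest.mean():.1f} nm")
```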
Abstract:
Proponents of physical intentionality argue that the classic hallmarks of intentionality highlighted by Brentano are also found in purely physical powers. Critics worry that this idea is metaphysically obscure at best, and at worst leads to panpsychism or animism. I examine the debate in detail, finding both confusion and illumination in the physical intentionalist thesis. Analysing a number of the canonical features of intentionality, I show that they all point to one overarching phenomenon of which both the mental and the physical are kinds, namely finality. This is the finality of ‘final causes’, the long-discarded idea of universal action for an end to which recent proponents of physical intentionality are in fact pointing whether or not they realise it. I explain finality in terms of the concept of specific indifference, arguing that in the case of the mental, specific indifference is realised by the process of abstraction, which has no correlate in the case of physical powers. This analysis, I conclude, reveals both the strength and weakness of rational creatures such as us, as well as demystifying (albeit only partly) the way in which powers work.
Abstract:
Tensor clustering is an important tool that exploits the intrinsically rich structure of real-world multiway or tensor datasets. In dealing with such datasets, standard practice is to apply subspace clustering to vectorized multiway data. However, vectorization of tensorial data does not exploit the complete structural information. In this paper, we propose a subspace clustering algorithm that avoids any vectorization step. Our approach is based on a novel heterogeneous Tucker decomposition model that takes cluster membership information into account. We propose a clustering algorithm that alternates between the different modes of the proposed heterogeneous tensor model. All but the last mode have closed-form updates; updating the last mode reduces to optimizing over the multinomial manifold, for which we investigate its second-order Riemannian geometry and propose a trust-region algorithm. Numerical experiments show that the proposed algorithm competes effectively with state-of-the-art clustering algorithms based on tensor factorization.
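A simplified sketch of the overall idea (illustrative, not the paper's algorithm): the non-sample modes are compressed by alternating closed-form updates, after which the per-sample core tensors, which retain the multiway structure that vectorization discards, are clustered; the paper instead couples membership into the decomposition and optimizes the last mode on the multinomial manifold with a trust-region method:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
I, J, N, r1, r2, K = 20, 15, 120, 4, 3, 3       # illustrative sizes
X = rng.normal(size=(I, J, N))                  # synthetic 3-way data; mode 3 = samples

U1 = np.linalg.qr(rng.normal(size=(I, r1)))[0]  # orthonormal factor initializations
U2 = np.linalg.qr(rng.normal(size=(J, r2)))[0]

for _ in range(10):                             # alternate over the factor modes
    # Mode-1 update: leading subspace of the mode-1 unfolding of X
    # compressed along mode 2 (closed form via SVD).
    M1 = np.einsum('ijn,jb->ibn', X, U2).reshape(I, -1)
    U1 = np.linalg.svd(M1, full_matrices=False)[0][:, :r1]
    # Mode-2 update, symmetrically.
    M2 = np.einsum('ijn,ia->ajn', X, U1).transpose(1, 0, 2).reshape(J, -1)
    U2 = np.linalg.svd(M2, full_matrices=False)[0][:, :r2]

# Per-sample core tensors keep the multiway structure; cluster them.
G = np.einsum('ijn,ia,jb->abn', X, U1, U2).reshape(r1 * r2, N).T
labels = KMeans(n_clusters=K, n_init=10, random_state=0).fit_predict(G)
print(np.bincount(labels))
```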
Abstract:
For the last few years, I have been working on an extensive digital model of ancient Rome as it appeared in the early 4th century AD. This sort of visualisation lends itself to many applications in diverse fields: I am currently using it for research work into illumination and sightlines in the ancient city, have licensed it for broadcast in TV documentaries and publication in magazines, and am working with a computer games studio to turn it into an online game where players will be able to walk round the streets and buildings of the entire city (when not engaged in trading with or assassinating one another). Later this year I will be making a free online course, or MOOC, about the architecture of ancient Rome, which will largely be illustrated by this model.