938 results for slots (or shelf space)


Relevance:

30.00%

Publisher:

Abstract:

The MQN-mapplet is a Java application giving access to the structure of small molecules in large databases via color-coded maps of their chemical space. These maps are projections from a 42-dimensional property space defined by 42 integer value descriptors called molecular quantum numbers (MQN), which count different categories of atoms, bonds, polar groups, and topological features and categorize molecules by size, rigidity, and polarity. Despite its simplicity, MQN-space is relevant to biological activities. The MQN-mapplet allows localization of any molecule on the color-coded images, visualization of the molecules, and identification of analogs as neighbors on the MQN-map or in the original 42-dimensional MQN-space. No query molecule is necessary to start the exploration, which may be particularly attractive for nonchemists. To our knowledge, this type of interactive exploration tool is unprecedented for very large databases such as PubChem and GDB-13 (almost one billion molecules). The application is freely available for download at www.gdb.unibe.ch.
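
As an illustration of how analogs can be identified as nearest neighbors in MQN-space, the following sketch searches a set of precomputed 42-dimensional descriptor vectors using the city-block distance. It is a minimal, standalone example with synthetic data, not the mapplet's own code.

```python
# Minimal sketch: nearest-neighbor analog search in a 42-dimensional
# MQN-like descriptor space, assuming the integer descriptor vectors have
# already been computed for each molecule (e.g., exported from a database).
# The city-block (L1) distance is used here for illustration.
import numpy as np

def nearest_neighbors(query, library, k=5):
    """Return indices and distances of the k library entries closest to
    the query vector under the city-block distance."""
    distances = np.abs(library - query).sum(axis=1)
    order = np.argsort(distances)[:k]
    return order, distances[order]

# Toy data: 1000 hypothetical molecules, 42 integer descriptors each.
rng = np.random.default_rng(0)
library = rng.integers(0, 30, size=(1000, 42))
query = library[0] + rng.integers(-2, 3, size=42)   # a slightly perturbed entry

idx, dist = nearest_neighbors(query, library, k=5)
print(idx, dist)
```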

Relevance:

30.00%

Publisher:

Abstract:

Intensity non-uniformity (bias field) correction, contextual constraints on the spatial intensity distribution, and non-spherical cluster shapes in the feature space are incorporated into the fuzzy c-means (FCM) algorithm for segmentation of three-dimensional multi-spectral MR images. The bias field is modeled by a linear combination of smooth polynomial basis functions for fast computation in the clustering iterations. Regularization terms for the neighborhood continuity of either intensity or membership are added to the FCM cost functions. Since the feature space is not isotropic, distance measures other than the Euclidean distance are used to account for the shape and volumetric effects of clusters in the feature space. The performance of segmentation is improved by combining the adaptive FCM scheme with the criteria used in the Gustafson-Kessel (G-K) and Gath-Geva (G-G) algorithms through the inclusion of a cluster scatter measure. The performance of this integrated approach is quantitatively evaluated on normal MR brain images using similarity measures. The improvement in the quality of segmentation obtained with our method is also demonstrated by comparing our results with those produced by FSL (FMRIB Software Library), a software package that is commonly used for tissue classification.
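
For readers unfamiliar with the baseline algorithm, the following is a minimal sketch of standard fuzzy c-means with Euclidean distances. It is not the authors' adaptive variant: the bias-field model, the neighborhood regularization terms, and the G-K/G-G distance measures are all omitted.

```python
# Minimal sketch of standard fuzzy c-means (FCM), the baseline that the
# adaptive scheme described above extends.
import numpy as np

def fcm(X, n_clusters=3, m=2.0, n_iter=100, seed=0):
    """X: (n_samples, n_features) array. Returns (centers, memberships)."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], n_clusters))
    U /= U.sum(axis=1, keepdims=True)              # fuzzy memberships sum to 1
    for _ in range(n_iter):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        # Squared Euclidean distances to each center (isotropic case only).
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2) + 1e-12
        U = 1.0 / (d2 ** (1.0 / (m - 1)))
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

# Toy example: cluster random 2-feature "voxels" into 3 tissue-like classes.
X = np.random.default_rng(1).random((500, 2))
centers, U = fcm(X)
labels = U.argmax(axis=1)
```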

Relevance:

30.00%

Publisher:

Abstract:

We present new interpretations of deglaciation in McMurdo Sound and the western Ross Sea, with observationally based reconstructions of interactions between East and West Antarctic ice at the last glacial maximum (LGM), 16 000, 12 000, 8000 and 4000 BP. At the LGM, East Antarctic ice from Mulock Glacier split: one branch turned westward south of Ross Island, but the other branch rounded Ross Island before flowing southwest into McMurdo Sound. This flow regime, constrained by an ice saddle north of Ross Island, is consistent with the reconstruction of Stuiver and others (1981a). After the LGM, grounding-line retreat was most rapid in areas with greatest water depth, especially along the Victoria Land coast. By 12 000 BP, the ice-flow regime in McMurdo Sound changed to through-flowing Mulock Glacier ice, with lesser contributions from Koettlitz, Blue and Ferrar Glaciers, because the former ice saddle north of Ross Island was replaced by a dome. The modern flow regime was established ~4000 BP. Ice derived from high elevations on the Polar Plateau but now stranded on the McMurdo Ice Shelf, and the pattern of Transantarctic Mountains erratics, support our reconstructions of Mulock Glacier ice rounding Minna Bluff, with all ice from Skelton Glacier ablating south of the bluff. They are inconsistent with Drewry's (1979) LGM reconstruction, which includes Skelton Glacier ice in the McMurdo Sound through-flow. Drewry's (1979) model closely approximates our results for 12 000-4000 BP. Ice-sheet modeling holds promise for determining whether deglaciation proceeded by grounding-line retreat of an ice sheet that was largely stagnant, because it never approached equilibrium flowline profiles after the Ross Ice Shelf grounded, or of a dynamic ice sheet with flowline profiles kept low by active ice streams that extended northward from present-day outlet glaciers after the Ross Ice Shelf grounded.

Relevance:

30.00%

Publisher:

Abstract:

CMOS sensors, or more generally Active Pixel Sensors (APS), are rapidly replacing CCDs in the consumer camera market. Due to significant technological advances during the past years, these devices are starting to compete with CCDs for demanding scientific imaging applications as well, in particular in the astronomy community. CMOS detectors offer a series of inherent advantages over CCDs owing to the structure of their basic pixel cells, each of which contains its own amplifier and readout electronics. The most prominent advantages for space object observations are the extremely fast and flexible readout capabilities, the feasibility of electronic shuttering and precise epoch registration, and the potential to perform image processing operations on-chip and in real time. Here, the major challenges and design drivers for ground-based and space-based optical observation strategies for objects in Earth orbit have been analyzed. CMOS detector characteristics were critically evaluated and compared with the established CCD technology, especially with respect to the above-mentioned observations. Finally, we simulated several observation scenarios for ground- and space-based sensors, assuming different observation and sensor properties. We introduce end-to-end simulations of the ground- and space-based strategies to investigate the orbit determination accuracy and its sensitivity to different values of the frame rate, pixel scale, and astrometric and epoch registration accuracies. Two cases were simulated: a survey with a ground-based sensor observing objects in LEO for surveillance applications, and a statistical survey with a space-based sensor orbiting in LEO observing small-size debris in LEO. The ground-based LEO survey uses a dynamical fence close to the Earth shadow a few hours after sunset. For the space-based scenario, a sensor in a sun-synchronous LEO orbit, always pointing in the anti-sun direction to achieve optimum illumination conditions for small LEO debris, was simulated.
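
As a rough illustration of why frame rate and pixel scale are sensitive parameters for LEO observations, the sketch below estimates the apparent angular rate of a LEO object at zenith and the resulting streak length per exposure. The altitude, exposure time and pixel scale are assumed values for illustration, not those used in the simulations described above.

```python
# Back-of-the-envelope sketch: a LEO object crossing near zenith moves so
# fast that the streak per exposure quickly spans many pixels, which drives
# the choice of frame rate and pixel scale.
import math

MU_EARTH = 398600.4418   # km^3/s^2
R_EARTH = 6378.137       # km

def zenith_angular_rate_arcsec(altitude_km):
    """Apparent angular rate (arcsec/s) of a circular-orbit object observed
    at zenith, ignoring Earth rotation."""
    v = math.sqrt(MU_EARTH / (R_EARTH + altitude_km))   # orbital speed, km/s
    omega = v / altitude_km                              # rad/s at zenith range
    return math.degrees(omega) * 3600.0

rate = zenith_angular_rate_arcsec(800.0)     # roughly 1900 arcsec/s at 800 km
exposure_s = 0.05                            # assumed: 20 frames per second
pixel_scale_arcsec = 2.0                     # assumed plate scale
streak_pixels = rate * exposure_s / pixel_scale_arcsec
print(f"{rate:.0f} arcsec/s -> streak of {streak_pixels:.1f} px per frame")
```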

Relevance:

30.00%

Publisher:

Abstract:

In this thesis, we develop an adaptive framework for Monte Carlo rendering, and more specifically for Monte Carlo Path Tracing (MCPT) and its derivatives. MCPT is attractive because it can handle a wide variety of light transport effects, such as depth of field, motion blur, indirect illumination, participating media, and others, in an elegant and unified framework. However, MCPT is a sampling-based approach, and is only guaranteed to converge in the limit, as the sampling rate grows to infinity. At finite sampling rates, MCPT renderings are often plagued by noise artifacts that can be visually distracting. The adaptive framework developed in this thesis leverages two core strategies to address noise artifacts in renderings: adaptive sampling and adaptive reconstruction. Adaptive sampling consists of increasing the sampling rate on a per-pixel basis, to ensure that each pixel value is below a predefined error threshold. Adaptive reconstruction leverages the available samples on a per-pixel basis, in an attempt to achieve an optimal trade-off between minimizing the residual noise artifacts and preserving the edges in the image. In our framework, we greedily minimize the relative Mean Squared Error (rMSE) of the rendering by iterating over sampling and reconstruction steps. Given an initial set of samples, the reconstruction step aims at producing the rendering with the lowest rMSE on a per-pixel basis, and the next sampling step then further reduces the rMSE by distributing additional samples according to the magnitude of the residual rMSE of the reconstruction. This iterative approach tightly couples the adaptive sampling and adaptive reconstruction strategies, by ensuring that we only sample densely in regions of the image where adaptive reconstruction cannot properly resolve the noise. In a first implementation of our framework, we demonstrate the usefulness of our greedy error minimization using a simple reconstruction scheme leveraging a filterbank of isotropic Gaussian filters. In a second implementation, we integrate a powerful edge-aware filter that can adapt to the anisotropy of the image. Finally, in a third implementation, we leverage auxiliary feature buffers that encode scene information (such as surface normals, position, or texture) to improve the robustness of the reconstruction in the presence of strong noise.
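
To make the sampling step concrete, here is a minimal sketch of distributing an additional sample budget proportionally to a per-pixel residual rMSE estimate. The renderer, the reconstruction filters, and the error estimator itself are outside the scope of this sketch.

```python
# Minimal sketch of the adaptive sampling step described above: given a
# per-pixel estimate of the residual relative MSE after reconstruction,
# distribute a fixed budget of additional samples proportionally to it.
import numpy as np

def distribute_samples(rmse_map, budget):
    """Return an integer per-pixel sample count summing (approximately) to
    `budget`, proportional to the residual rMSE."""
    weights = rmse_map / max(rmse_map.sum(), 1e-12)
    counts = np.floor(weights * budget).astype(int)
    # Hand out the remaining samples to the pixels with the largest error.
    remainder = budget - counts.sum()
    if remainder > 0:
        top = np.argsort(weights, axis=None)[-remainder:]
        counts.flat[top] += 1
    return counts

# Toy example: a 4x4 "image" whose right half is noisier than the left.
rmse = np.array([[0.01] * 2 + [0.1] * 2] * 4)
print(distribute_samples(rmse, budget=64))
```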

Relevance:

30.00%

Publisher:

Abstract:

The flavour of foods is determined by the interaction of taste molecules with receptors in the mouth, and of fragrance or aroma molecules with receptors in the upper part of the nose. Here, we discuss the properties of taste and fragrance molecules, from the public databases Superscent, Flavornet, SuperSweet and BitterDB, taken collectively as flavours, from the perspective of chemical space. We survey simple descriptor profiles in comparison with the public collections ChEMBL (bioactive small molecules), ZINC (commercial drug-like molecules) and GDB-13 (all possible organic molecules up to 13 atoms of C, N, O, S, Cl). A global analysis of the chemical space of flavours is also presented based on molecular quantum numbers (MQN) and SMILES fingerprints (SMIfp). While taste molecules span a very broad property range, fragrances occupy a narrow area of chemical space consisting of generally very small and relatively nonpolar molecules, distinct from standard drug molecules. Proximity searching in chemical space is exemplified as a simple method to facilitate the search for new fragrances.
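
As an example of the kind of simple descriptor profile surveyed here, the sketch below computes a few standard descriptors with RDKit (an assumed toolkit; the paper does not specify one) for a handful of well-known molecules that are not taken from the cited databases.

```python
# Illustrative sketch of a simple descriptor profile for a few example
# taste and aroma molecules, computed with RDKit.
from rdkit import Chem
from rdkit.Chem import Descriptors

examples = {
    "vanillin (aroma)": "O=Cc1ccc(O)c(OC)c1",
    "limonene (aroma)": "CC1=CCC(CC1)C(=C)C",
    "sucrose (taste)": "OCC1OC(OC2(CO)OC(CO)C(O)C2O)C(O)C(O)C1O",
}

for name, smiles in examples.items():
    mol = Chem.MolFromSmiles(smiles)
    print(name,
          f"MW={Descriptors.MolWt(mol):.1f}",
          f"TPSA={Descriptors.TPSA(mol):.1f}",
          f"HBD={Descriptors.NumHDonors(mol)}",
          f"HBA={Descriptors.NumHAcceptors(mol)}",
          f"RotB={Descriptors.NumRotatableBonds(mol)}")
```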

Relevance:

30.00%

Publisher:

Abstract:

In several of his writings, Isaac Newton proposed that physical space is God’s “emanative effect” or “sensorium,” revealing something interesting about the metaphysics underlying his mathematical physics. Newton’s conjectures depart from Plato’s and Aristotle’s metaphysics of space and from classical and Cambridge Neoplatonism. Present-day philosophical concepts of supervenience clarify Newton’s ideas about space and offer a portrait of Newton not only as a mathematical physicist but also as an independent-minded rationalist philosopher.

Relevance:

30.00%

Publisher:

Abstract:

Policy brokers and policy entrepreneurs are assumed to have a decisive impact on policy outcomes. Their access to social and political resources is contingent on their influence on other agents. In social network analysis (SNA), entrepreneurs are often closely associated with brokers, because both are agents presumed to benefit from bridging structural holes; that is, gaining advantage by occupying a strategic position in relational space. Our aim here is twofold: first, to conceptually and operationally differentiate policy brokers from policy entrepreneurs, premised on assumptions in the policy-process literature; and second, via SNA, to use the output of core algorithms in a cross-sectional analysis of political brokerage and political entrepreneurship. We attempt to simplify the use of graph algebra in answering questions relevant to policy analysis by placing each algorithm within its theoretical context. In the methodology employed, we first identify actors and graph their relations of influence within a specific policy event; we then select the most central actors and compare their rank in a series of statistics that capture different aspects of their network advantage. We examine betweenness centrality, positive and negative Bonacich power, Burt’s effective size and constraint, and honest brokerage as paradigmatic measures. We employ two case studies to demonstrate the advantages and limitations of each algorithm for differentiating between brokers and entrepreneurs: one on Swiss climate policy and one on EU competition and transport policy.
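
To make the algorithm comparison concrete, the following sketch computes betweenness centrality and Burt's effective size and constraint on a toy influence graph using networkx (an assumed toolkit; the chapter does not prescribe one). Bonacich power and honest brokerage are not covered by this snippet.

```python
# Illustrative sketch: a small graph where node "B" bridges two otherwise
# separate clusters, i.e., occupies a broker-like position spanning a
# structural hole.
import networkx as nx

G = nx.Graph([
    ("A1", "A2"), ("A2", "A3"), ("A1", "A3"),   # cluster A
    ("C1", "C2"), ("C2", "C3"), ("C1", "C3"),   # cluster C
    ("A2", "B"), ("B", "C2"),                   # bridging ties
])

betweenness = nx.betweenness_centrality(G)
effective_size = nx.effective_size(G)
constraint = nx.constraint(G)

for node in sorted(G, key=betweenness.get, reverse=True):
    print(f"{node}: betweenness={betweenness[node]:.2f}, "
          f"eff.size={effective_size[node]:.2f}, "
          f"constraint={constraint[node]:.2f}")
```

On this toy graph, "B" combines high betweenness, large effective size and low constraint, which is the pattern the chapter associates with advantage from bridging structural holes.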

Relevance:

30.00%

Publisher:

Abstract:

Results of a search for supersymmetry via direct production of third-generation squarks are reported, using 20.3 fb$^{-1}$ of proton-proton collision data at $\sqrt{s} = 8$ TeV recorded by the ATLAS experiment at the LHC in 2012. Two different analysis strategies based on monojet-like and $c$-tagged event selections are carried out to optimize the sensitivity for direct top squark-pair production in the decay channel to a charm quark and the lightest neutralino ($\tilde{t}_1 \to c + \tilde{\chi}_1^0$) across the top squark-neutralino mass parameter space. No excess above the Standard Model background expectation is observed. The results are interpreted in the context of direct pair production of top squarks and presented in terms of exclusion limits in the $(m_{\tilde{t}_1}, m_{\tilde{\chi}_1^0})$ parameter space. A top squark of mass up to about 240 GeV is excluded at 95% confidence level for arbitrary neutralino masses, within the kinematic boundaries. Top squark masses up to 270 GeV are excluded for a neutralino mass of 200 GeV. In a scenario where the top squark and the lightest neutralino are nearly degenerate in mass, top squark masses up to 260 GeV are excluded. The results from the monojet-like analysis are also interpreted in terms of compressed scenarios for top squark-pair production in the decay channel $\tilde{t}_1 \to b + ff' + \tilde{\chi}_1^0$ and sbottom pair production with $\tilde{b}_1 \to b + \tilde{\chi}_1^0$, leading to a similar exclusion for nearly mass-degenerate third-generation squarks and the lightest neutralino. The results in this paper significantly extend previous results at colliders.

Relevance:

30.00%

Publisher:

Abstract:

This chapter aims to overcome the gap existing between case study research, which typically provides qualitative and process-based insights, and national or global inventories that typically offer spatially explicit and quantitative analysis of broader patterns, and thus to present adequate evidence for policymaking regarding large-scale land acquisitions. Therefore, the chapter links spatial patterns of land acquisitions to underlying implementation processes of land allocation. Methodologically linking the described patterns and processes proved difficult, but we have identified indicators that could be added to inventories and monitoring systems to make linkage possible. Combining complementary approaches in this way may help to determine where policy space exists for more sustainable governance of land acquisitions, both geographically and with regard to processes of agrarian transitions. Our spatial analysis revealed two general patterns: (i) relatively large forestry-related acquisitions that target forested landscapes and often interfere with semi-subsistence farming systems; and (ii) smaller agriculture-related acquisitions that often target existing cropland and also interfere with semi-subsistence systems. Furthermore, our meta-analysis of land acquisition implementation processes shows that authoritarian, top-down processes dominate. Initially, the demands of powerful regional and domestic investors tend to override socio-ecological variables, local actors’ interests, and land governance mechanisms. As available land grows scarce, however, and local actors gain experience dealing with land acquisitions, it appears that land investments begin to fail or give way to more inclusive, bottom-up investment models.

Relevance:

30.00%

Publisher:

Abstract:

The currently proposed space debris remediation measures include the active removal of large objects and “just in time” collision avoidance by deviating the objects using, e.g., ground-based lasers. Both techniques require precise knowledge of the attitude state and state changes of the target objects: in the former case, to devise methods to grapple the target by a tug spacecraft; in the latter, to precisely propagate the orbits of potential collision partners, as disturbing forces like air drag and solar radiation pressure depend on the attitude of the objects. Non-resolving optical observations of the magnitude variations, so-called light curves, are a promising technique to determine rotation or tumbling rates and the orientations of the actual rotation axis of objects, as well as their temporal changes. The 1-meter telescope ZIMLAT of the Astronomical Institute of the University of Bern has been used to collect light curves of MEO and GEO objects for a considerable period of time. Recently, light curves of Low Earth Orbit (LEO) targets were acquired as well. We present different observation methods, including active tracking using a CCD subframe readout technique, and the use of a high-speed scientific CMOS camera. Technical challenges when tracking objects with poor orbit predictions, as well as different data reduction methods, are addressed. Results from a survey of abandoned rocket upper stages in LEO, examples of abandoned payloads, and observations of high area-to-mass ratio debris will be presented. Eventually, first results of the analysis of these light curves are provided.
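
One common way to extract a rotation or tumbling period from such magnitude variations is a Lomb-Scargle periodogram. The sketch below applies astropy's implementation to a synthetic, unevenly sampled light curve; it is purely illustrative and is not the data-reduction pipeline used for the ZIMLAT observations.

```python
# Minimal sketch: estimate a spin period from an unevenly sampled light
# curve with a Lomb-Scargle periodogram (synthetic data for illustration).
import numpy as np
from astropy.timeseries import LombScargle

rng = np.random.default_rng(42)
t = np.sort(rng.uniform(0.0, 300.0, 400))     # observation epochs [s]
true_period = 23.0                            # assumed spin period [s]
mag = (0.4 * np.sin(2 * np.pi * t / true_period)
       + 0.05 * rng.standard_normal(t.size))  # magnitude variation + noise

frequency, power = LombScargle(t, mag).autopower()
best_period = 1.0 / frequency[np.argmax(power)]
print(f"recovered period: {best_period:.2f} s (true: {true_period} s)")
```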

Relevance:

30.00%

Publisher:

Abstract:

Today, there is little knowledge of the attitude state of decommissioned intact objects in Earth orbit. Observational means have advanced in the past years, but are still limited with respect to an accurate estimate of motion vector orientations and magnitudes. Such knowledge is needed especially for the preparation of Active Debris Removal (ADR) missions as planned by ESA’s Clean Space initiative, or for contingency scenarios for ESA spacecraft like ENVISAT. ESA's “Debris Attitude Motion Measurements and Modelling” project (ESA Contract No. 40000112447), led by the Astronomical Institute of the University of Bern (AIUB), addresses this problem. The goal of the project is to achieve a good understanding of the attitude evolution and of the relevant internal and external effects that occur. To characterize the attitude state of selected targets in LEO and GTO, multiple observation methods are combined. Optical observations are carried out by AIUB, Satellite Laser Ranging (SLR) is performed by the Space Research Institute of the Austrian Academy of Sciences (IWF), and radar measurements and signal level determination are provided by the Fraunhofer Institute for High Frequency Physics and Radar Techniques (FHR). The In-Orbit Tumbling Analysis tool (ιOTA) is a prototype software tool, currently in development by Hyperschall Technologie Göttingen GmbH (HTG) within the framework of the project. ιOTA will be a highly modular software tool to perform short-term (days), medium-term (months) and long-term (years) propagation of the orbit and attitude motion (six degrees of freedom) of spacecraft in Earth orbit. The simulation takes into account all relevant acting forces and torques, including aerodynamic drag, solar radiation pressure, gravitational influences of Earth, Sun and Moon, eddy current damping, impulse and momentum transfer from space debris or micrometeoroid impact, as well as the optional definition of particular spacecraft-specific influences like tank sloshing, reaction wheel behaviour, magnetic torquer activity and thruster firing. The purpose of ιOTA is to provide high-accuracy short-term simulations to support observers and potential ADR missions, as well as medium- and long-term simulations to study the significance of the particular internal and external influences on the attitude, especially damping factors and momentum transfer. The simulation will also enable the investigation of the altitude dependency of the particular external influences. ιOTA's post-processing modules will generate synthetic measurements for observers and for software validation. The validation of the software will be done by cross-calibration with observations and measurements acquired by the project partners.
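
As a minimal illustration of the rotational-dynamics core that such a propagator must solve, the sketch below integrates Euler's equations for a rigid body with a crude damping torque standing in for effects like eddy current damping. The inertia values and damping coefficient are assumptions, and the snippet is not part of ιOTA.

```python
# Minimal sketch: torque-free rigid-body rotation plus a simple damping
# torque, integrated with scipy's solve_ivp.
import numpy as np
from scipy.integrate import solve_ivp

I = np.array([1200.0, 900.0, 400.0])   # assumed principal moments of inertia [kg m^2]
k_damp = 5.0                            # assumed damping coefficient [N m s]

def euler_rates(t, omega):
    """d(omega)/dt from Euler's equations with a placeholder damping torque."""
    torque = -k_damp * omega
    return (torque - np.cross(omega, I * omega)) / I

omega0 = np.array([0.05, 0.01, 0.12])   # initial body rates [rad/s]
sol = solve_ivp(euler_rates, (0.0, 3600.0), omega0, max_step=1.0)

print("initial |omega| =", np.linalg.norm(omega0))
print("final   |omega| =", np.linalg.norm(sol.y[:, -1]))
```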

Relevance:

30.00%

Publisher:

Abstract:

We investigate the consequences of one extra spatial dimension for the stability and energy spectrum of the non-relativistic hydrogen atom with a potential defined by Gauss' law, i.e. proportional to $1/|x|^2$. The additional spatial dimension is considered to be either infinite or curled up in a circle of radius R. In both cases, the energy spectrum is bounded from below for charges smaller than the same critical value and unbounded from below otherwise. As a consequence of compactification, negative energy eigenstates appear: if R is smaller than a quarter of the Bohr radius, the corresponding Hamiltonian possesses an infinite number of bound states with minimal energy extending at least to the ground state of the hydrogen atom.
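
For reference, the $1/|x|^2$ form follows from standard electrostatics in d spatial dimensions (a textbook argument, not taken from the paper):

```latex
% Gauss' law in d spatial dimensions forces the flux through a (d-1)-sphere
% to be constant, so the field and potential of a point charge scale as
\begin{align*}
  E(r) &\propto \frac{q}{r^{\,d-1}}, &
  V(r) &\propto \int_r^{\infty} E(r')\,\mathrm{d}r' \propto \frac{q}{r^{\,d-2}}
  \qquad (d \ge 3),
\end{align*}
% and for d = 4 (three ordinary dimensions plus one extra) this gives
\begin{equation*}
  V(r) \propto \frac{1}{r^{2}} = \frac{1}{|x|^{2}} .
\end{equation*}
```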