880 results for one-to-many mapping


Relevance: 100.00%

Abstract:

The ERS-1 satellite was launched in July 1991 by the European Space Agency into a polar orbit at about 800 km, carrying a C-band scatterometer. A scatterometer measures the amount of backscatter microwave radiation reflected by small ripples on the ocean surface induced by sea-surface winds, and so provides instantaneous snapshots of wind flow over large areas of the ocean surface, known as wind fields. Inherent in the physics of the observation process is an ambiguity in wind direction; the scatterometer cannot distinguish whether the wind is blowing toward or away from the sensor device. This ambiguity implies that there is a one-to-many mapping between scatterometer data and wind direction. Current operational methods for wind field retrieval are based on the retrieval of wind vectors from satellite scatterometer data, followed by a disambiguation and filtering process that is reliant on numerical weather prediction models. The wind vectors are retrieved by the local inversion of a forward model, mapping scatterometer observations to wind vectors, and minimising a cost function in scatterometer measurement space. This thesis applies a pragmatic Bayesian solution to the problem. The likelihood is a combination of conditional probability distributions for the local wind vectors given the scatterometer data. The prior distribution is a vector Gaussian process that provides the geophysical consistency for the wind field. The wind vectors are retrieved directly from the scatterometer data by using mixture density networks, a principled method for modelling multi-modal conditional probability density functions. The complexity of the mapping and the structure of the conditional probability density function are investigated. A hybrid mixture density network, which incorporates the knowledge that the conditional probability distribution of the observation process is predominantly bi-modal, is developed. The optimal model, which generalises across a swathe of scatterometer readings, is better on key performance measures than the current operational model. Wind field retrieval is approached from three perspectives. The first is a non-autonomous method that confirms the validity of the model by retrieving the correct wind field 99% of the time from a test set of 575 wind fields. The second technique takes as the prediction the maximum a posteriori (MAP) wind field retrieved from the posterior distribution. For the third technique, Markov chain Monte Carlo (MCMC) methods were employed to estimate the mass associated with significant modes of the posterior distribution and to make predictions based on the mode with the greatest associated mass. General methods for sampling from multi-modal distributions were benchmarked against a specific MCMC transition kernel designed for this problem. It was shown that the general methods were unsuitable for this application due to computational expense. On a test set of 100 wind fields the MAP estimate correctly retrieved 72 wind fields, whilst the sampling method correctly retrieved 73 wind fields.
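To make the one-to-many structure concrete, the following minimal sketch (Python with NumPy) inverts a toy scatterometer forward model by minimising a cost function in measurement space. The functional form, beam geometry and constants are illustrative assumptions, not the operational forward model or the mixture density networks used in the thesis; brute-force inversion over a direction grid typically leaves two near-degenerate solutions roughly 180° apart, which is the directional ambiguity described above.

```python
# Illustrative sketch only: a toy scatterometer forward model and its local
# inversion by minimising a cost function in measurement space. All constants
# and the beam geometry are made up for illustration.
import numpy as np

def toy_forward_model(speed, direction_deg, antenna_azimuth_deg):
    """Backscatter (arbitrary units) for one antenna look direction.

    The cos(2*phi) term is what makes the mapping one-to-many: directions
    phi and phi + 180 degrees give almost the same backscatter.
    """
    phi = np.radians(direction_deg - antenna_azimuth_deg)
    return 0.1 * speed**0.6 * (1.0 + 0.4 * np.cos(phi) + 0.8 * np.cos(2.0 * phi))

antenna_azimuths = np.array([45.0, 90.0, 135.0])   # fore, mid, aft beams

def cost(speed, direction_deg, observed):
    predicted = toy_forward_model(speed, direction_deg, antenna_azimuths)
    return np.sum((predicted - observed) ** 2)

# Simulate an observation from a "true" wind vector, then invert it locally
# by brute force over a grid of candidate wind vectors.
true_speed, true_direction = 8.0, 70.0
observed = toy_forward_model(true_speed, true_direction, antenna_azimuths)

speeds = np.linspace(2.0, 20.0, 91)
directions = np.arange(0.0, 360.0, 1.0)
costs = np.array([[cost(s, d, observed) for d in directions] for s in speeds])

# Find local minima over direction at the best-fitting speed: typically two
# ambiguous direction solutions separated by roughly 180 degrees survive.
best_speed_idx = costs.min(axis=1).argmin()
row = costs[best_speed_idx]
local_minima = [d for i, d in enumerate(directions)
                if row[i] <= row[i - 1] and row[i] <= row[(i + 1) % len(directions)]]
print("best-fit speed:", speeds[best_speed_idx])
print("ambiguous direction solutions (deg):", local_minima)
```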

Relevance: 100.00%

Abstract:

Concept mapping involves determining relevant concepts from a free-text input, where concepts are defined in an external reference ontology. This is an important process that underpins many applications for clinical information reporting, derivation of phenotypic descriptions, and a number of state-of-the-art medical information retrieval methods. Concept mapping can be cast into an information retrieval (IR) problem: free-text mentions are treated as queries and concepts from a reference ontology as the documents to be indexed and retrieved. This paper presents an empirical investigation applying general-purpose IR techniques for concept mapping in the medical domain. A dataset used for evaluating medical information extraction is adapted to measure the effectiveness of the considered IR approaches. Standard IR approaches used here are contrasted with the effectiveness of two established benchmark methods specifically developed for medical concept mapping. The empirical findings show that the IR approaches are comparable with one benchmark method but well below the best benchmark.
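As a rough illustration of the IR framing described above, the sketch below indexes a handful of hypothetical ontology concept labels with character n-gram TF-IDF and ranks them against free-text mentions by cosine similarity. The concept list, vectoriser settings and example mentions are assumptions for illustration, not the configuration evaluated in the paper.

```python
# Sketch: concept mapping cast as an IR problem. Concept labels from a toy
# "reference ontology" are indexed as documents; each free-text mention is a
# query, and concepts are ranked by cosine similarity.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical (concept_id, preferred name) pairs standing in for a reference
# ontology such as SNOMED CT or UMLS.
ontology = [
    ("C0020538", "hypertensive disease"),
    ("C0011849", "diabetes mellitus"),
    ("C0027051", "myocardial infarction"),
    ("C0004096", "asthma"),
]
concept_ids, concept_names = zip(*ontology)

# Character n-grams help with the spelling variation typical of clinical text.
vectorizer = TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4))
concept_matrix = vectorizer.fit_transform(concept_names)

def map_mention(mention, top_k=3):
    """Return the top-k candidate concepts for a free-text mention."""
    query_vec = vectorizer.transform([mention])
    scores = cosine_similarity(query_vec, concept_matrix).ravel()
    ranked = scores.argsort()[::-1][:top_k]
    return [(concept_ids[i], concept_names[i], float(scores[i])) for i in ranked]

print(map_mention("hypertension"))
print(map_mention("myocardial infarct"))
```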

Relevance: 100.00%

Abstract:

Many-electron systems confined to a quasi-one-dimensional geometry by a cylindrical distribution of positive charge have been investigated by density functional computations in the unrestricted local spin density approximation. Our investigations have been focused on the low-density regime, in which electrons are localized. The results reveal a wide variety of different charge and spin configurations, including linear and zig-zag chains, single- and double-strand helices, and twisted chains of dimers. The spin-spin coupling turns from weakly antiferromagnetic at relatively high density to weakly ferromagnetic at the lowest densities considered in our computations. The stability of linear chains of localized charge has been investigated by analyzing the radial dependence of the self-consistent potential and by computing the dispersion relation of low-energy harmonic excitations.
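As a generic illustration of the last step mentioned above, the sketch below computes the dispersion relation of low-energy harmonic excitations for a periodic linear chain with nearest-neighbour couplings by diagonalising its dynamical matrix. The spring model is a stand-in assumption, not the electron-electron interaction treated in the paper's density functional calculations.

```python
# Generic sketch: dispersion of small harmonic oscillations of a periodic
# linear chain with nearest-neighbour springs, obtained by diagonalising the
# dynamical matrix and compared with the analytic result
# omega(k) = 2*sqrt(K/m)*|sin(k*a/2)|. Stand-in model for illustration only.
import numpy as np

n_sites, spring_k, mass, a = 64, 1.0, 1.0, 1.0

# Dynamical matrix of the periodic chain.
dyn = np.zeros((n_sites, n_sites))
for i in range(n_sites):
    dyn[i, i] = 2.0 * spring_k / mass
    dyn[i, (i + 1) % n_sites] = -spring_k / mass
    dyn[i, (i - 1) % n_sites] = -spring_k / mass

# Eigenvalues of the dynamical matrix are the squared frequencies.
omega_numeric = np.sqrt(np.abs(np.linalg.eigvalsh(dyn)))

k = 2.0 * np.pi * np.fft.fftfreq(n_sites, d=a)          # allowed wavevectors
omega_analytic = 2.0 * np.sqrt(spring_k / mass) * np.abs(np.sin(k * a / 2.0))

print("max |numeric - analytic| =",
      np.max(np.abs(np.sort(omega_numeric) - np.sort(omega_analytic))))
```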

Relevance: 100.00%

Abstract:

That amphotericin B acts through pore formation at the cell membrane after binding to ergosterol is an accepted dogma about the action mechanism of this antifungal, and this statement is widely found in the literature. Yet after 60 years of investigation, the action mechanism of amphotericin B (AmB) has still not been fully elucidated. Amphotericin B is a polyene compound that is one of the most effective drugs for the treatment of fungal and parasite infections. As stated above, the first mechanism of action described was pore formation after binding to the ergosterol present in the membrane. However, it has also been demonstrated that AmB induces oxidative damage in cells. Moreover, amphotericin B modulates the immune system, and this activity has been related both to the protective effect of the molecule and to its toxicity in the host. This review provides a general overview of the main aspects of this molecule and highlights the multiple effects it has on both fungal and host cells. © 2012 Mesa-Arango, Scorzoni and Zaragoza.

Relevance: 100.00%

Abstract:

* The work is partly supported by RFFI grant 08-07-00062-a

Relevance: 100.00%

Abstract:

Volcanic eruption centres of the mostly 4.5 Ma to 5000 BP Newer Volcanics Province in the Hamilton area of southeastern Australia were examined in detail using a multifaceted approach, including ground truthing and analysis of ArcGIS Total Magnetic Intensity and seamless geology data, NASA Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) digital elevation models and Google Earth satellite image interpretation. Sixteen eruption centres were recognised in the Hamilton area, including three previously unrecorded volcanoes, one of which, the Cas Maar, constitutes the northernmost maar-cone volcanic complex in the Western Plains subprovince. Seven previously allocated eruption centres were called into question based on field and laboratory observations. Three phases of volcanic activity have been suggested by other authors and are interpreted to correlate with ages of >4 Ma, ca 2 Ma and <0.5 Ma, which may be further subdivided based on preservation of outcrop. Geochemical compositions of the dominantly basaltic products become increasingly alkaline and enriched in incompatible elements from Phases 1 to 2, with Phase 3 eruptions both covering the entire geochemical range and extending into increasingly enriched compositions. This research highlights the importance of a multifaceted approach to landform mapping and demonstrates that additional volcanic centres may yet be discovered in the Newer Volcanics Province.

Relevance: 100.00%

Abstract:

The concept of one enzyme-one activity has influenced biochemistry for over half a century. Over 1000 enzymes are now described. Many of them are highly 'specific'. Some of them have been crystallized and their three-dimensional structures determined. They range from 12 to 1000 kDa in molecular weight and possess 124 to several hundred amino acids. They occur as single polypeptides or multiple-subunit proteins. The active sites are assembled on these by appropriate tertiary folding of the polypeptide chain, or by interaction of the constituent subunits. The substrate is held by the side-chains of a few amino acids at the active site on the surface, occupying a tiny fraction of the total area. What is the bulk of the protein behind the active site doing? Do all proteins have only one function each? Why should a protein not have more than one active site on its large surface? Will we discover more than one activity for some proteins? These newer possibilities are emerging and are finding experimental support. Some proteins purified to homogeneity using assay methods for different activities are now recognized to have the same molecular weight and a high degree of homology of amino acid sequence. Obviously they are identical. They represent the phenomenon of one protein-many functions.

Relevance: 100.00%

Abstract:

Mineral exploration programmes around the world use data from remote sensing, geophysics and direct sampling. On a regional scale, the combination of airborne geophysics and ground-based geochemical sampling can aid geological mapping and economic minerals exploration. The fact that airborne geophysical and traditional soil-sampling data are generated at different spatial resolutions means that they are not immediately comparable due to their different sampling density. Several geostatistical techniques, including indicator cokriging and collocated cokriging, can be used to integrate different types of data into a geostatistical model. With increasing numbers of variables the inference of the cross-covariance model required for cokriging can be demanding in terms of effort and computational time. In this paper a Gaussian-based Bayesian updating approach is applied to integrate airborne radiometric data and ground-sampled geochemical soil data to maximise information generated from the soil survey, to enable more accurate geological interpretation for the exploration and development of natural resources. The Bayesian updating technique decomposes the collocated estimate into a product of two models: prior and likelihood models. The prior model is built from primary information and the likelihood model is built from secondary information. The prior model is then updated with the likelihood model to build the final model. The approach allows multiple secondary variables to be simultaneously integrated into the mapping of the primary variable. The Bayesian updating approach is demonstrated using a case study from Northern Ireland, where the history of mineral prospecting for precious and base metals dates from the 18th century. Vein-hosted, strata-bound and volcanogenic occurrences of mineralisation are found. The geostatistical technique was used to improve the resolution of soil geochemistry, collected at one sample per 2 km², by integrating more closely measured airborne geophysical data from the GSNI Tellus Survey, measured over a footprint of 65 × 200 m. The directly measured geochemistry data were considered as primary data in the Bayesian approach and the airborne radiometric data were used as secondary data. The approach produced more detailed updated maps and in particular maximised information on mapped estimates of zinc, copper and lead. Greater delineation of an elongated northwest/southeast trending zone in the updated maps strengthened the potential to investigate strata-bound base metal deposits.
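At its core, the Gaussian Bayesian updating step can be written as a precision-weighted product of a prior and a likelihood model. The sketch below shows that update for a single map location; the numbers, and the interpretation of the prior as kriged soil geochemistry and the likelihood as a radiometrics-derived model, are illustrative assumptions rather than the paper's exact formulation.

```python
# Sketch of Gaussian Bayesian updating at one map location: a prior model
# (e.g. kriged from the sparse soil-geochemistry samples) is combined with a
# likelihood model (e.g. derived from the densely sampled airborne
# radiometrics via a calibration regression). All numbers are illustrative.

def bayesian_update(prior_mean, prior_var, lik_mean, lik_var):
    """Combine two Gaussian models; returns posterior mean and variance.

    The product of two Gaussians is Gaussian, with precision (1/variance)
    equal to the sum of the two precisions.
    """
    post_var = 1.0 / (1.0 / prior_var + 1.0 / lik_var)
    post_mean = post_var * (prior_mean / prior_var + lik_mean / lik_var)
    return post_mean, post_var

# Prior for a (normal-score transformed) zinc value at an unsampled grid node,
# and a likelihood built from the collocated radiometric covariates.
prior = (0.2, 0.8)       # mean, variance from the sparse primary data
likelihood = (0.9, 0.3)  # mean, variance from the secondary data

mean, var = bayesian_update(*prior, *likelihood)
print(f"updated estimate: mean={mean:.3f}, variance={var:.3f}")
```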

Relevance: 100.00%

Abstract:

The problem of projecting multidimensional data into lower dimensions has been pursued by many researchers due to its potential application to data analyses of various kinds. This paper presents a novel multidimensional projection technique based on least square approximations. The approximations compute the coordinates of a set of projected points based on the coordinates of a reduced number of control points with defined geometry. We name the technique Least Square Projections (LSP). From an initial projection of the control points, LSP defines the positioning of their neighboring points through a numerical solution that aims at preserving a similarity relationship between the points given by a metric in mD. In order to perform the projection, a small number of distance calculations are necessary, and no repositioning of the points is required to obtain a final solution with satisfactory precision. The results show the capability of the technique to form groups of points by degree of similarity in 2D. We illustrate that capability through its application to mapping collections of textual documents from varied sources, a strategic yet difficult application. LSP is faster and more accurate than other existing high-quality methods, particularly for the application where it was most extensively tested, namely mapping text sets.
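A simplified sketch of an LSP-style solve is given below: control points are pinned to given 2-D coordinates and every other point is asked, in a least-squares sense, to sit at the centroid of its neighbours' projected positions. The neighbourhood graph, control-point selection and weights are toy assumptions and differ from the choices made in LSP itself.

```python
# Simplified LSP-style projection: solve a least-squares system in which each
# non-control point sits (approximately) at the centroid of its neighbours'
# 2-D positions, while control points are pinned to coordinates from an
# initial projection. Toy data; LSP chooses controls/neighbourhoods more carefully.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))                  # 200 points in mD (m = 10)

# k-nearest-neighbour graph in the original space (brute force for clarity).
k = 8
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
neighbours = np.argsort(d2, axis=1)[:, 1:k + 1]

# Pick a few control points; their first two original coordinates stand in
# for an initial 2-D projection of the controls.
control = rng.choice(len(X), size=15, replace=False)
control_pos = X[control, :2]

n = len(X)
rows, rhs = [], []
for i in range(n):                              # Laplacian-style equations
    r = np.zeros(n)
    r[i] = 1.0
    r[neighbours[i]] -= 1.0 / k
    rows.append(r)
    rhs.append(np.zeros(2))
w = 10.0                                        # weight for control equations
for c, pos in zip(control, control_pos):
    r = np.zeros(n)
    r[c] = w
    rows.append(r)
    rhs.append(w * pos)

A = np.array(rows)
b = np.array(rhs)
Y, *_ = np.linalg.lstsq(A, b, rcond=None)       # Y holds the 2-D layout (n x 2)
print(Y.shape)
```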

Relevance: 100.00%

Abstract:

In elections, majority divisions pave the way to focal manipulations and coordination failures, which can lead to the victory of the wrong candidate. This paper shows how this flaw can be addressed if voter preferences over candidates are sensitive to information. We consider two potential sources of divisions: majority voters may have similar preferences but opposite information about the candidates, or opposite preferences. We show that when information is the source of majority divisions, Approval Voting features a unique equilibrium with full information and coordination equivalence. That is, it produces the same outcome as if both information and coordination problems could be resolved. Other electoral systems, such as Plurality and Two-Round elections, do not satisfy this equivalence. The second source of division is opposite preferences. Whenever the fraction of voters with such preferences is not too large, Approval Voting still satisfies full information and coordination equivalence.

Relevance: 100.00%

Abstract:

Despite the significant recent growth in research relating to instrumental, vocal and composition tuition in higher education, little is known about the diversity of approaches that characterise one-to-one teaching in the Conservatoire, and what counts as optimal practice for educating 21st-century musicians. Through analysis of video-recorded one-to-one lessons that draws on a ‘bottom up’ methodology for characterising pedagogical practices (Taylor, 2012; Taylor et al., 2012), this paper provides empirical evidence about the nature of one-to-one pedagogy in one Australian institution. The research aims (1) to enable a better understanding of current one-to-one conservatoire teaching; and (2) to build and improve upon existing teaching practice using authentic insights gained through systematic investigation. The authors hope the research will lead to a better understanding of the diversity and efficacy of the pedagogical practice within the specific context in which the study was conducted, and beyond, to Conservatoire pedagogy generally.

Relevance: 100.00%

Abstract:

Despite significant investment in school one-to-one device programs, little is known about which aspects of program implementation work and why. Through a comparison of two implementation models, adopter-diffusion and saturation, and using existing data from the One Laptop per Child Australia laptop program, we explored how factors of implementation may affect device diffusion, learning and educational outcomes, and program sustainability in schools. In this article we argue that more focused research into implementation of one-to-one device programs, moving beyond comparisons of “devices versus without devices,” is needed to provide reliable data to inform future program funding and advance this area of research.

Relevance: 100.00%

Abstract:

A single-step solid-phase RIA (SS-SPRIA) developed in our laboratory using hybridoma culture supernatants has been utilised for the quantitation of epitope-paratope interactions. Using SS-SPRIA as a quantitative tool for the assessment of epitope stability, it was found that several assembled epitopes of human chorionic gonadotropin (hCG) are differentially stable to proteolysis and chemical modification. Based on these observations an approach has been developed for identifying the amino acid residues constituting an epitopic region. This approach has now been used to map an assembled epitope at/near the receptor binding region of the hormone. The mapped site forms a part of the seat belt region and the cystine knot region (C34-C38-C88-C90-H106). The carboxy terminal region of the alpha-subunit forms a part of the epitope indicating its proximity to the receptor binding region. These results are in agreement with the reported receptor binding region identified through other approaches and the X-ray crystal structure of hCG.

Relevance: 100.00%

Abstract:

Species distribution modelling (SDM) typically analyses species’ presence together with some form of absence information. Ideally absences comprise observations or are inferred from comprehensive sampling. When such information is not available, pseudo-absences are often generated from the background locations within the study region of interest containing the presences, or else absence is implied through the comparison of presences to the whole study region, e.g. as is the case in Maximum Entropy (MaxEnt) or Poisson point process modelling. However, the choice of which absence information to include can be both challenging and highly influential on SDM predictions (e.g. Oksanen and Minchin, 2002). In practice, the use of pseudo- or implied absences often leads to an imbalance where absences far outnumber presences. This leaves analysis highly susceptible to ‘naughty noughts’: absences that occur beyond the envelope of the species, which can exert strong influence on the model and its predictions (Austin and Meyers, 1996). Also known as ‘excess zeros’, naughty noughts can be estimated via an overall proportion in simple hurdle or mixture models (Martin et al., 2005). However, absences, especially those that occur beyond the species envelope, can often be more diverse than presences. Here we consider an extension to excess zero models. The two-stage approach first exploits the compartmentalisation provided by classification trees (CTs) (as in O’Leary, 2008) to identify multiple sources of naughty noughts and simultaneously delineate several species envelopes. Then SDMs can be fit separately within each envelope, and for this stage, we examine both CTs (as in Falk et al., 2014) and the popular MaxEnt (Elith et al., 2006). We introduce a wider range of model performance measures to improve treatment of naughty noughts in SDM. We retain an overall measure of model performance, the area under the curve (AUC) of the receiver operating characteristic (ROC) curve, but focus on its constituent measures of false negative rate (FNR) and false positive rate (FPR), and how these relate to the threshold in the predicted probability of presence that delimits predicted presence from absence. We also propose error rates more relevant to users of predictions: the false omission rate (FOR), the chance that a predicted absence corresponds to (and hence wastes) an observed presence, and the false discovery rate (FDR), reflecting those predicted (or potential) presences that correspond to absence. A high FDR may be desirable since it could help target future search efforts, whereas zero or low FOR is desirable since it indicates none of the (often valuable) presences have been ignored in the SDM. For illustration, we chose Bradypus variegatus, a species that has previously been published as an exemplar species for MaxEnt, proposed by Phillips et al. (2006). We used CTs to increasingly refine the species envelope, starting with the whole study region (E0) and eliminating more and more potential naughty noughts (E1–E3). When combined with an SDM fit within the species envelope, the best CT SDM had similar AUC and FPR to the best MaxEnt SDM, but otherwise performed better. The FNR and FOR were greatly reduced, suggesting that CTs handle absences better. Interestingly, MaxEnt predictions showed low discriminatory performance, with the most common predicted probability of presence being in the same range (0.00-0.20) for both true absences and presences.
In summary, this example shows that SDMs can be improved by introducing an initial hurdle to identify naughty noughts and partition the envelope before applying SDMs. This improvement was barely detectable via AUC and FPR, yet visible in FOR, FNR, and the comparison of the distributions of predicted probability of presence for presences and absences.
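For reference, the small sketch below computes the error rates discussed above (FPR, FNR, FOR and FDR) from predicted probabilities of presence at a chosen threshold; the observations and probabilities are arbitrary toy values.

```python
# Sketch: confusion-matrix error rates computed from predicted probabilities
# of presence at a chosen threshold. Toy data only.
import numpy as np

def error_rates(observed, prob_presence, threshold=0.5):
    """observed: 0/1 absence/presence labels; prob_presence: model output."""
    predicted = (np.asarray(prob_presence) >= threshold).astype(int)
    observed = np.asarray(observed)
    tp = np.sum((predicted == 1) & (observed == 1))
    fp = np.sum((predicted == 1) & (observed == 0))
    fn = np.sum((predicted == 0) & (observed == 1))
    tn = np.sum((predicted == 0) & (observed == 0))
    return {
        "FPR": fp / (fp + tn),  # false positive rate
        "FNR": fn / (fn + tp),  # false negative rate
        "FOR": fn / (fn + tn),  # false omission rate: presences among predicted absences
        "FDR": fp / (fp + tp),  # false discovery rate: absences among predicted presences
    }

observed = np.array([1, 1, 1, 0, 0, 0, 0, 0, 0, 0])
prob = np.array([0.9, 0.4, 0.7, 0.1, 0.3, 0.6, 0.2, 0.05, 0.15, 0.55])
print(error_rates(observed, prob, threshold=0.5))
```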