58 results for automatically generated meta classifiers with large levels


Relevance:

100.00%

Publisher:

Abstract:

A large volume of visual content is inaccessible until effective and efficient indexing and retrieval of such data is achieved. In this paper, we introduce the DREAM system, a knowledge-assisted, semantic-driven, context-aware visual information retrieval system applied in the film post-production domain. We focus mainly on the automatic labelling and topic-map-related aspects of the framework. The use of context-related collateral knowledge, represented by a novel probabilistic visual keyword co-occurrence matrix, was proven effective in the experiments conducted during system evaluation. The automatically generated semantic labels were fed into the Topic Map Engine, which can automatically construct ontological networks using Topic Maps technology, dramatically enhancing the indexing and retrieval performance of the system towards an even higher semantic level.
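The abstract gives no implementation detail for the co-occurrence matrix; the following minimal sketch shows one plausible way to estimate probabilistic keyword co-occurrence from per-image keyword annotations. The function name, data layout and example keywords are assumptions made for illustration, not the DREAM system's actual code.

from collections import defaultdict
from itertools import combinations

def cooccurrence_probabilities(annotations):
    """Estimate P(keyword_j present | keyword_i present) from per-image keyword sets.

    annotations: list of sets of visual keywords, one set per image.
    Returns a nested dict: probs[i][j] ~ P(j | i).
    """
    keyword_count = defaultdict(int)   # number of images containing keyword i
    pair_count = defaultdict(int)      # number of images containing both i and j

    for keywords in annotations:
        for k in keywords:
            keyword_count[k] += 1
        for a, b in combinations(sorted(keywords), 2):
            pair_count[(a, b)] += 1
            pair_count[(b, a)] += 1

    probs = defaultdict(dict)
    for (a, b), n_ab in pair_count.items():
        probs[a][b] = n_ab / keyword_count[a]
    return probs

# Example: three annotated frames from a hypothetical film shot
frames = [{"car", "street", "night"}, {"car", "street"}, {"street", "crowd"}]
print(cooccurrence_probabilities(frames)["car"])   # e.g. {'night': 0.5, 'street': 1.0}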

Relevance:

100.00%

Publisher:

Abstract:

This paper describes a new method for reconstructing 3D surface points and a wireframe on the surface of a freeform object using a small number, e.g. 10, of 2D photographic images. The images are taken from different viewing directions by a perspective camera with full prior knowledge of the camera configurations. The reconstructed surface points are frontier points and the wireframe is a network of contour generators; both are reconstructed by pairing apparent contours in the 2D images. Unlike previous works, we empirically demonstrate that if the viewing directions are uniformly distributed around the object's viewing sphere, the reconstructed 3D points automatically cluster closely on highly curved parts of the surface and spread widely over smooth or flat parts. The advantage of this property is that the reconstructed points along a surface or a contour generator are not under-sampled or under-represented, because surfaces and contours should be sampled more densely where their curvature is high. The more complex the contour's shape, the greater the number of points required, and the greater the number of points automatically generated by the proposed method. Given that the viewing directions are uniformly distributed, the number and distribution of the reconstructed points depend on the shape, or curvature, of the surface regardless of the size of the surface or of the object. The unique pattern of the reconstructed points and contours may be used in 3D object recognition and measurement without computationally intensive full surface reconstruction. The results are obtained from both computer-generated and real objects. (C) 2007 Elsevier B.V. All rights reserved.
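The method's stated precondition is that the camera viewing directions are uniformly distributed over the object's viewing sphere. The abstract does not say how those directions are chosen; a common way to generate approximately uniform directions, shown here purely as an illustrative assumption, is a Fibonacci (golden-angle) spiral on the sphere.

import math

def fibonacci_sphere_directions(n):
    """Return n approximately uniformly distributed unit vectors on the sphere."""
    directions = []
    golden_angle = math.pi * (3.0 - math.sqrt(5.0))   # ~2.39996 rad
    for i in range(n):
        z = 1.0 - 2.0 * (i + 0.5) / n                 # z spans (-1, 1) evenly
        r = math.sqrt(max(0.0, 1.0 - z * z))          # circle radius at height z
        theta = golden_angle * i
        directions.append((r * math.cos(theta), r * math.sin(theta), z))
    return directions

# e.g. 10 viewing directions, as in the abstract's example
for d in fibonacci_sphere_directions(10):
    print("%.3f %.3f %.3f" % d)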

Relevance:

100.00%

Publisher:

Abstract:

This paper describes a method for reconstructing 3D frontier points, contour generators and surfaces of anatomical objects or smooth surfaces from a small number, e.g. 10, of conventional 2D X-ray images. The X-ray images are taken from different viewing directions with full prior knowledge of the X-ray source and sensor configurations. Unlike previous works, we empirically demonstrate that if the viewing directions are uniformly distributed around the object's viewing sphere, the reconstructed 3D points automatically cluster closely on highly curved parts of the surface and spread widely over smooth or flat parts. The advantage of this property is that the reconstructed points along a surface or a contour generator are not under-sampled or under-represented, because surfaces and contours should be sampled more densely where their curvature is high. The more complex the contour's shape, the greater the number of points required, and the greater the number of points automatically generated by the proposed method. Given that the number of viewing directions is fixed and the viewing directions are uniformly distributed, the number and distribution of the reconstructed points depend on the shape, or curvature, of the surface regardless of the size of the surface or of the object. The technique may be used not only in medicine but also in industrial applications.

Relevance:

100.00%

Publisher:

Abstract:

In 1967 a novel scheme was proposed for controlling processes with large pure time delay (Fellgett et al, 1967) and some of the constituent parts of the scheme were investigated (Swann, 1970; Atkinson et al, 1973). At that time the available computational facilities were inadequate for the scheme to be implemented practically, but with the advent of modern microcomputers the scheme becomes feasible. This paper describes recent work (Mitchell, 1987) in implementing the scheme in a new multi-microprocessor configuration and shows the improved performance it provides compared with conventional three-term controllers.
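The Fellgett et al. scheme itself is not described in the abstract and is not reproduced here. As background only, the minimal sketch below illustrates why a large pure transport delay is troublesome for a conventional three-term controller and shows one generic dead-time compensation arrangement (a Smith-predictor-style internal model); the plant parameters and controller gains are invented for illustration and are not drawn from the paper.

from collections import deque

class DelayedFirstOrderPlant:
    """First-order lag followed by a pure transport delay (discrete time)."""
    def __init__(self, gain=1.0, tau=10.0, delay_steps=20, dt=1.0):
        self.a = dt / (tau + dt)                  # lag update coefficient
        self.gain = gain
        self.y = 0.0
        self.buffer = deque([0.0] * delay_steps)  # models the pure time delay

    def step(self, u):
        self.buffer.append(u)
        u_delayed = self.buffer.popleft()
        self.y += self.a * (self.gain * u_delayed - self.y)
        return self.y

def pi_controller(error, state, kp=0.8, ki=0.05, dt=1.0):
    """Plain two-term (PI) control law; the derivative term is omitted for brevity."""
    state["integral"] += error * dt
    return kp * error + ki * state["integral"]

plant = DelayedFirstOrderPlant()                       # the real, delayed process
model_delayed = DelayedFirstOrderPlant()               # internal model including the delay
model_fast = DelayedFirstOrderPlant(delay_steps=0)     # internal model with the delay removed
state, setpoint, y = {"integral": 0.0}, 1.0, 0.0

for _ in range(300):
    # Smith-predictor-style feedback: control the undelayed model output,
    # corrected by the mismatch between the real plant and the delayed model.
    feedback = model_fast.y + (y - model_delayed.y)
    u = pi_controller(setpoint - feedback, state)
    y = plant.step(u)
    model_delayed.step(u)
    model_fast.step(u)

print(f"output after 300 steps: {y:.3f} (setpoint {setpoint})")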

Relevance:

100.00%

Publisher:

Abstract:

Retinal blurring resulting from the human eye's depth of focus has been shown to assist visual perception. Infinite focal depth within stereoscopically displayed virtual environments may cause undesirable effects; for instance, objects positioned at a distance in front of or behind the observer's fixation point will be perceived in sharp focus with large disparities, thereby causing diplopia. Although published research on the incorporation of synthetically generated Depth of Field (DoF) suggests that it might enhance perceived image quality, no quantitative evidence of perceptual performance gains exists. This may be due to the difficulty of dynamically generating synthetic DoF in which focal distance is actively linked to fixation distance. In this paper, such a system is described. A desktop stereographic display is used to project a virtual scene in which synthetically generated DoF is actively controlled from vergence-derived distance. A performance evaluation experiment was undertaken in which subjects carried out observations in a spatially complex virtual environment consisting of components interconnected by pipes on a distractive background; each subject was tasked with making an observation based on the connectivity of the components. The effects of focal depth variation under static and actively controlled focal distance conditions were investigated. The results and analysis presented show that performance gains may be achieved by the addition of synthetic DoF. The merits of applying synthetic DoF are discussed.
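The abstract's central mechanism is deriving the focal (fixation) distance from vergence and then blurring scene content away from that distance. The relations below are standard geometry and thin-lens optics rather than formulas quoted from the paper, and the interocular distance and lens parameters are assumed values.

import math

def fixation_distance_from_vergence(vergence_deg, ipd_m=0.063):
    """Estimate fixation distance (metres) from the vergence angle between the two
    eyes' lines of sight, assuming symmetric fixation straight ahead.
    ipd_m: interocular (inter-pupillary) distance; 63 mm is a typical adult value."""
    half_angle = math.radians(vergence_deg) / 2.0
    return (ipd_m / 2.0) / math.tan(half_angle)

def blur_circle_diameter(focus_m, object_m, focal_length_m=0.017, aperture_m=0.004):
    """Thin-lens circle-of-confusion diameter for an object away from the focal plane,
    the quantity a synthetic DoF renderer would map to a blur radius."""
    return abs(aperture_m * focal_length_m * (object_m - focus_m)
               / (object_m * (focus_m - focal_length_m)))

d = fixation_distance_from_vergence(3.6)              # roughly 1 m of fixation distance
blur_mm = blur_circle_diameter(d, 0.5) * 1000.0       # blur for an object at 0.5 m
print(f"fixation distance {d:.2f} m, blur circle {blur_mm:.3f} mm")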

Relevance:

100.00%

Publisher:

Abstract:

Parasitic infections cause a myriad of responses in their mammalian hosts, at the immune as well as the metabolic level. A multiplex panel of cytokines and metabolites derived from four parasite-rodent models, namely Plasmodium berghei-mouse, Trypanosoma brucei brucei-mouse, Schistosoma mansoni-mouse, and Fasciola hepatica-rat, was statistically co-analyzed. 1H NMR spectroscopy and multivariate statistical analysis were used to characterize the urine and plasma metabolite profiles in infected and non-infected animals. Each parasite generated a unique metabolic signature in the host. Plasma cytokine concentrations were obtained using the ‘Meso Scale Discovery’ multiplex cytokine assay platform. Multivariate data integration methods were subsequently used to elucidate the component of the metabolic signature associated with inflammation and to determine specific metabolic correlates of parasite-induced changes in plasma cytokine levels. For example, the relative levels of acetyl glycoproteins extracted from the plasma metabolite profile of the P. berghei-infected mice were statistically correlated with IFN-γ, whereas the same cytokine was anti-correlated with glucose levels. Both the metabolic and the cytokine data showed a similar spatial distribution in principal component analysis scores plots constructed for the combined murine data: samples from all infected animals clustered according to parasite species, with the protozoan infections (P. berghei and T. b. brucei) grouping separately from the helminth infection (S. mansoni). For S. mansoni, the main infection-responsive cytokines were IL-4 and IL-5, which covaried with lactate, choline, and D-3-hydroxybutyrate. This study demonstrates that the inherently differential immune response to single- and multicellular parasites not only manifests in cytokine expression but also imprints on the metabolic signature, and it calls for in-depth analysis to further explore direct links between immune features and biochemical pathways.
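The statistical workflow mentioned in the abstract (principal component analysis scores plots of the combined data and correlation of individual metabolites with cytokines) can be illustrated generically. The array shapes, random data and variable choices below are invented purely for illustration and are not the study's pipeline.

import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Hypothetical data: 40 animals x (30 metabolite features + 8 cytokines).
metabolites = rng.normal(size=(40, 30))
cytokines = rng.normal(size=(40, 8))
combined = np.hstack([metabolites, cytokines])

# Autoscale each variable, then project onto the first two principal components.
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(combined))
print(scores.shape)   # (40, 2): one point per animal in the scores plot

# A metabolite-cytokine association (e.g. an acyl glycoprotein signal vs a cytokine)
# can be examined with a simple Pearson correlation.
r = np.corrcoef(metabolites[:, 0], cytokines[:, 0])[0, 1]
print(round(r, 3))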

Relevance:

100.00%

Publisher:

Abstract:

This paper proposes hybrid capital securities as a significant part of senior bank executive incentive compensation in light of Basel III, a new global regulatory standard on bank capital adequacy and liquidity agreed by the members of the Basel Committee on Banking Supervision. The committee developed Basel III in response to the deficiencies in financial regulation revealed by the global financial crisis. Basel III strengthens bank capital requirements and introduces new regulatory requirements on bank liquidity and bank leverage. The hybrid bank capital securities we propose for bank executives’ compensation are preferred shares and subordinated debt, which the June 2004 Basel II regulatory framework recognised as other admissible forms of capital. The past two decades have witnessed a dramatic increase in performance-related pay in the banking industry. Stakeholders such as shareholders, debtholders and regulators criticise traditional cash and equity-based compensation for encouraging bank executives’ excessive risk-taking and short-termism, which resulted in the failure of risk management in high-profile banks during the global financial crisis. Paying compensation in the form of hybrid bank capital securities may align the interests of executives with those of stakeholders and help banks regain their reputation for prudence after years of aggressive risk-taking. Additionally, banks are desperately seeking to raise capital in order to bolster balance sheets damaged by the ongoing credit crisis. Tapping their own senior employees with large incentive compensation packages may be a viable additional source of capital that is politically acceptable in times of large-scale bailouts of the financial sector and economically wise, as it aligns the interests of the executives with the need for a stable financial system.

Relevance:

100.00%

Publisher:

Abstract:

Gaining public acceptance is one of the main issues with large-scale low-carbon projects such as hydropower development. The World Commission on Dams has recommended that, to gain public acceptance, public involvement is necessary in the decision-making process (WCD, 2000). Because international financial institutions are financially significant actors in the planning and implementation of large-scale hydropower projects in developing countries, the paper examines the ways in which they may influence public involvement. Using the case study of the Nam Theun 2 Hydropower Project in Laos, the paper analyses how public involvement facilitated by the Asian Development Bank had a bearing on procedural and distributional justice. The paper analyses the extent of public participation and the assessment of the full social and environmental costs of the project in the cost-benefit analysis conducted during the project appraisal stage. It is argued that while efforts were made to involve the public, several factors influenced procedural and distributional justice: the late contribution of the Asian Development Bank in the project appraisal stage, and the treatment of non-market values and the discount rate used to calculate the full social and environmental costs.
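The abstract refers to a cost-benefit analysis in which the discount rate and non-market values determine how fully social and environmental costs are reflected. As a generic illustration only (all cash flows and discount rates below are invented, not drawn from the Nam Theun 2 appraisal), the sensitivity of a project's net present value to the discount rate can be shown as follows.

def net_present_value(net_benefits, discount_rate):
    """Discount a stream of annual net benefits (benefits minus costs) to year 0."""
    return sum(b / (1.0 + discount_rate) ** t for t, b in enumerate(net_benefits))

# Hypothetical 30-year project: heavy construction and social/environmental costs
# up front, revenues later. All numbers are illustrative only.
cash_flow = [-500.0, -300.0] + [60.0] * 28

for rate in (0.03, 0.08, 0.12):
    print(f"discount rate {rate:.0%}: NPV = {net_present_value(cash_flow, rate):8.1f}")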

Relevance:

100.00%

Publisher:

Abstract:

Climate simulations by 16 atmospheric general circulation models (AGCMs) are compared on an aqua-planet, a water-covered Earth with prescribed sea surface temperature varying only in latitude. The idealised configuration is designed to expose differences in the circulation simulated by different models. Basic features of the aqua-planet climate are characterised by comparison with Earth. The models display a wide range of behaviour. The balanced component of the tropospheric mean flow, and mid-latitude eddy covariances subject to budget constraints, vary relatively little among the models. In contrast, differences in damping in the dynamical core strongly influence transient eddy amplitudes. Historical uncertainty in modelled lower stratospheric temperatures persists in the Aqua-Planet Experiment (APE). Aspects of the circulation generated more directly by interactions between the resolved fluid dynamics and parameterized moist processes vary greatly. The tropical Hadley circulation forms either a single or a double inter-tropical convergence zone (ITCZ) at the equator, with large variations in mean precipitation. The equatorial wave spectrum shows a wide range of precipitation intensity and propagation characteristics. Kelvin mode-like eastward propagation with remarkably constant phase speed dominates in most models. Westward propagation, less dispersive than the equatorial Rossby modes, dominates in a few models or occurs within an eastward propagating envelope in others. The mean structure of the ITCZ is related to precipitation variability, consistent with previous studies. The aqua-planet global energy balance is unknown, but the models produce a surprisingly large range of top-of-atmosphere global net flux, dominated by differences in shortwave reflection by clouds. A number of newly developed models, not optimised for Earth climate, contribute to this. Possible reasons for differences in the optimised models are discussed. The aqua-planet configuration is intended as one component of an experimental hierarchy used to evaluate AGCMs. This comparison suggests that the range of model behaviour could be better understood and reduced in conjunction with Earth climate simulations. Controlled experimentation is required to explore individual model behaviour and investigate convergence of the aqua-planet climate with increasing resolution.
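The prescribed, zonally symmetric sea surface temperature is the defining boundary condition of the experiment, although the abstract does not reproduce it. A widely used aqua-planet control profile peaks at 27 °C at the equator and is held at 0 °C poleward of 60°; the exact functional form below should be treated as an illustrative assumption rather than this paper's precise setup.

import math

def control_sst_celsius(lat_deg):
    """Zonally symmetric aqua-planet SST: 27 C at the equator, 0 C poleward of 60 deg.
    Illustrative 'control' profile; treat the exact form as an assumption."""
    if abs(lat_deg) >= 60.0:
        return 0.0
    lat = math.radians(lat_deg)
    return 27.0 * (1.0 - math.sin(1.5 * lat) ** 2)

for lat in (0, 15, 30, 45, 60, 75):
    print(f"{lat:3d} deg latitude: {control_sst_celsius(lat):5.1f} C")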

Relevance:

100.00%

Publisher:

Abstract:

Classifiers generally tend to overfit if the training data contain noise or missing values. Ensemble learning methods are often used to improve a classifier's classification accuracy. Most ensemble learning approaches aim to improve the classification accuracy of decision trees. However, alternative classifiers to decision trees exist. The recently developed Random Prism ensemble learner for classification aims to improve an alternative classification rule induction approach, the Prism family of algorithms, which addresses some of the limitations of decision trees. However, Random Prism suffers, like any ensemble learner, from high computational overhead due to the replication of the data and the induction of multiple base classifiers. Hence even modest-sized datasets may pose a computational challenge to ensemble learners such as Random Prism. Parallelism is often used to scale up algorithms to deal with large datasets. This paper investigates parallelisation for Random Prism, implements a prototype and evaluates it empirically using a Hadoop computing cluster.
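Random Prism itself and its Hadoop implementation are not reproduced here. As a minimal illustration of the data-parallel pattern the paper exploits (train many base classifiers on bootstrap replicates in parallel, then combine by voting), the sketch below uses scikit-learn decision trees and local worker processes purely as stand-ins for Prism base learners and Hadoop map tasks.

import numpy as np
from multiprocessing import Pool
from sklearn.tree import DecisionTreeClassifier   # stand-in for a Prism base learner
from sklearn.datasets import make_classification

def train_base_classifier(args):
    """Train one base classifier on a bootstrap replicate of the training data."""
    X, y, seed = args
    rng = np.random.default_rng(seed)
    idx = rng.integers(0, len(X), size=len(X))     # sample with replacement
    return DecisionTreeClassifier(max_depth=3, random_state=seed).fit(X[idx], y[idx])

def majority_vote(classifiers, X):
    votes = np.stack([clf.predict(X) for clf in classifiers])
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)

if __name__ == "__main__":
    X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
    # Each worker gets the data plus a distinct seed; on a cluster this map step
    # would be distributed (e.g. as Hadoop map tasks) rather than local processes.
    with Pool(4) as pool:
        ensemble = pool.map(train_base_classifier, [(X, y, s) for s in range(20)])
    accuracy = (majority_vote(ensemble, X) == y).mean()
    print(f"training accuracy of the 20-member ensemble: {accuracy:.3f}")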

Relevance:

100.00%

Publisher:

Abstract:

Monthly averaged surface erythemal solar irradiance (UV-Ery) at local noon from 1960 to 2100 has been derived using radiative transfer calculations and projections of ozone, temperature and cloud change from 14 chemistry-climate models (CCMs), as part of the CCMVal-2 activity of SPARC. Our calculations show the influence of ozone depletion and recovery on erythemal irradiance. In addition, we investigate UV-Ery changes caused by climate change due to increasing greenhouse gas concentrations; the latter include the effects of both stratospheric ozone and cloud changes. The derived estimates provide a global picture of the likely changes in erythemal irradiance during the 21st century. Uncertainties arise from the assumed scenarios, different parameterizations – particularly of cloud effects on UV-Ery – and the spread in the CCM projections. The calculations suggest that, relative to 1980, annual-mean UV-Ery in the 2090s will be on average 12% lower at high latitudes in both hemispheres, 3% lower at mid-latitudes, and marginally higher (1%) in the tropics. The largest reduction (16%) is projected for Antarctica in October. Cloud effects are responsible for 2–3% of the reduction in UV-Ery at high latitudes, but they slightly moderate it at mid-latitudes (1%). The year of return of erythemal irradiance to the values of certain milestones (1965 and 1980) depends largely on the return of column ozone to the corresponding levels and is associated with large uncertainties, mainly due to the spread of the model projections. The inclusion of cloud effects in the calculations has only a small effect on the return years. At mid and high latitudes, changes in clouds and in stratospheric ozone transport caused by greenhouse-gas-driven changes in the global circulation will sustain the erythemal irradiance at levels below those of 1965, despite the removal of ozone-depleting substances.

Relevance:

100.00%

Publisher:

Abstract:

Vicine and convicine are anti-nutritional compounds that accumulate in the cotyledons of faba beans. When humans with a deficiency in glucose-6-phosphate dehydrogenase activity consume beans with high levels of these compounds, a condition called favism can result. When faba beans are used in animal feeds, there can be adverse effects on performance. These concerns have resulted in increasing interest within plant breeding in developing low-vicine and low-convicine faba bean germplasm. In order to facilitate this objective, we developed a rapid and robust screening method for vicine and convicine, capable of distinguishing between faba beans that are either high (wild type) or low in vicine and convicine. In the absence of reliable commercial reference materials, we report an adaptation of a previously published method in which a biochemical assay and spectral data were used to confirm the identity of our analytes, vicine and convicine. This method could be readily adopted in other facilities and opens the way to the efficient exploitation of diverse germplasm in regions where faba beans play a significant role in human nutrition. We screened a collection of germplasm of interest to a collaborative plant breeding programme being developed between the National Institute for Agricultural Botany in the UK and L'Institut Nationale d'Agronomie de Tunisie in Tunisia. We report the results obtained and discuss the prospects for developing molecular markers for the low vicine and convicine trait.

Relevance:

100.00%

Publisher:

Abstract:

Wheat gluten proteins, the gliadins and glutenins, are of great importance in determining the unique biomechanical properties of wheat. Studies have therefore been carried out to determine their pathways and mechanisms of synthesis, folding, and deposition in protein bodies. In the present work, a set of transgenic wheat lines with strongly suppressed levels of γ-gliadins and/or of all groups of gliadins has been studied using light and fluorescence microscopy combined with immunodetection using antibodies specific for γ-gliadins and HMW glutenin subunits. These lines represent unique material for studying the formation and fusion of protein bodies (PBs) in developing wheat seeds. Higher amounts of HMW subunits were present in most of the transgenic lines, but only the lines with suppression of all gliadins showed differences in the formation and fusion of the protein bodies. Large, rounded protein bodies were found in the wild-type lines and in the transgenic lines with reduced levels of γ-gliadins, while the lines with all gliadins down-regulated had protein bodies of irregular shape and irregular formation. The size and number of inclusions, which have been reported to contain triticins, were also greater in the protein bodies of the lines with all gliadins down-regulated. The changes in protein composition and PB morphology reported in the transgenic lines with all gliadins down-regulated did not result in marked changes in the total protein content or in instability of the different fractions.

Relevance:

100.00%

Publisher:

Abstract:

Radiative forcing and climate sensitivity have been widely used as concepts to understand climate change. This work performs climate change experiments with an intermediate general circulation model (IGCM) to examine the robustness of the radiative forcing concept for carbon dioxide and solar constant changes. This IGCM has been developed specifically as a computationally fast model, but one that allows an interaction between physical processes and large-scale dynamics; the model allows many long integrations to be performed relatively quickly. It employs a fast and accurate radiative transfer scheme, as well as simple convection and surface schemes and a slab ocean, to model the effects of climate change mechanisms on atmospheric temperatures and dynamics with a reasonable degree of complexity. The climatology of the IGCM run at T-21 resolution with 22 levels is compared to European Centre for Medium-Range Weather Forecasts reanalysis data. The response of the model to changes in carbon dioxide and solar output is examined when these changes are applied globally and when constrained geographically (e.g. over land only). The CO2 experiments have a roughly 17% higher climate sensitivity than the solar experiments. It is also found that a forcing at high latitudes causes a 40% higher climate sensitivity than a forcing applied only at low latitudes. It is found that, despite differences in the model feedbacks, climate sensitivity is roughly constant over a range of distributions of CO2 and solar forcings. Hence, in the IGCM at least, the radiative forcing concept is capable of predicting global surface temperature changes to within 30%, for the perturbations described here. It is concluded that radiative forcing remains a useful tool for assessing the natural and anthropogenic impact of climate change mechanisms on surface temperature.
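The radiative forcing concept being examined is the standard linear relation between global-mean surface temperature response, forcing and the climate sensitivity parameter; written explicitly (a textbook relation, not an equation quoted from this abstract):

\Delta T_s \approx \lambda \, F

where F is the radiative forcing in W m^{-2} and \lambda is the climate sensitivity parameter in K (W m^{-2})^{-1}. The reported ~17% and ~40% differences in climate sensitivity between forcing agents and forcing locations are differences in \lambda, and the statement that the concept predicts global surface temperature change to within 30% means that applying a single \lambda across these forcing distributions introduces at most roughly that error.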

Relevance:

100.00%

Publisher:

Abstract:

We develop an orthogonal forward selection (OFS) approach to construct radial basis function (RBF) network classifiers for two-class problems. Our approach integrates several concepts in probabilistic modelling, including cross validation, mutual information and Bayesian hyperparameter fitting. At each stage of the OFS procedure, one model term is selected by maximising the leave-one-out mutual information (LOOMI) between the classifier's predicted class labels and the true class labels. We derive the formula of LOOMI within the OFS framework so that the LOOMI can be evaluated efficiently for model term selection. Furthermore, a Bayesian procedure of hyperparameter fitting is integrated into each stage of the OFS to infer the l2-norm based local regularisation parameter from the data. Since each forward stage is effectively the fitting of a one-variable model, this task is very fast. The classifier construction procedure terminates automatically, without the need for an additional stopping criterion, and yields very sparse RBF classifiers with excellent classification generalisation performance, which is particularly useful for noisy data sets with highly overlapping class distributions. A number of benchmark examples are employed to demonstrate the effectiveness of our proposed approach.
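The LOOMI criterion and the Bayesian local regularisation step are specific to the paper and are not reproduced here. The sketch below illustrates only the outer loop the abstract describes, greedy forward selection of RBF centres with automatic termination, using a plain leave-one-out accuracy score instead of LOOMI, a logistic-regression output layer as a stand-in for the paper's linear-in-the-parameters model, and an assumed Gaussian kernel width.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score, LeaveOneOut
from sklearn.datasets import make_moons

def rbf_features(X, centres, width=0.5):
    """Gaussian RBF design matrix: one column per selected centre."""
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

# A tiny two-class problem, kept small so the leave-one-out loop stays cheap.
X, y = make_moons(n_samples=40, noise=0.2, random_state=0)
selected, remaining, best_score = [], list(range(len(X))), 0.0

# Greedy forward selection: at each stage add the candidate centre that most
# improves a leave-one-out score, and stop when no candidate helps
# (a crude surrogate for the paper's automatic termination).
while remaining:
    stage_scores = []
    for c in remaining:
        Phi = rbf_features(X, X[selected + [c]])
        s = cross_val_score(LogisticRegression(), Phi, y, cv=LeaveOneOut()).mean()
        stage_scores.append((s, c))
    s, c = max(stage_scores)
    if s <= best_score:
        break
    best_score = s
    selected.append(c)
    remaining.remove(c)

print(f"{len(selected)} centres selected, leave-one-out accuracy {best_score:.3f}")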