863 results for automatically generated meta classifiers with large levels
Abstract:
This paper proposes hybrid capital securities as a significant part of senior bank executive incentive compensation in light of Basel III, a new global regulatory standard on bank capital adequacy and liquidity agreed by the members of the Basel Committee on Banking Supervision. The committee developed Basel III in response to the deficiencies in financial regulation brought about by the global financial crisis. Basel III strengthens bank capital requirements and introduces new regulatory requirements on bank liquidity and bank leverage. The hybrid bank capital securities we propose for bank executives' compensation are preferred shares and subordinated debt that the June 2004 Basel II regulatory framework recognised as other admissible forms of capital. The past two decades have witnessed a dramatic increase in performance-related pay in the banking industry. Stakeholders such as shareholders, debtholders and regulators criticise traditional cash and equity-based compensation for encouraging bank executives' excessive risk-taking and short-termism, which resulted in the failure of risk management in high-profile banks during the global financial crisis. Paying compensation in the form of hybrid bank capital securities may align the interests of executives with those of stakeholders and help banks regain their reputation for prudence after years of aggressive risk-taking. Additionally, banks are desperately seeking to raise capital in order to bolster balance sheets damaged by the ongoing credit crisis. Tapping their own senior employees with large incentive compensation packages may be a viable additional source of capital that is politically acceptable in times of large-scale bailouts of the financial sector and economically wise, as it aligns the interests of the executives with the need for a stable financial system.
Abstract:
Gaining public acceptance is one of the main issues with large-scale low-carbon projects such as hydropower development. The World Commission on Dams has recommended that, to gain public acceptance, public involvement is necessary in the decision-making process (WCD, 2000). Since international financial institutions are financially significant actors in the planning and implementation of large-scale hydropower projects in developing country contexts, the paper examines the ways in which public involvement may be influenced by them. Using the case study of the Nam Theun 2 Hydropower Project in Laos, the paper analyses how public involvement facilitated by the Asian Development Bank had a bearing on procedural and distributional justice. The paper analyses the extent of public participation and the assessment of the full social and environmental costs of the project in the cost-benefit analysis conducted during the project appraisal stage. It is argued that while efforts were made to involve the public, several factors influenced procedural and distributional justice: the late contribution of the Asian Development Bank in the project appraisal stage, and the issue of non-market values and the discount rate used to calculate the full social and environmental costs.
Abstract:
Climate simulations by 16 atmospheric general circulation models (AGCMs) are compared on an aqua-planet, a water-covered Earth with prescribed sea surface temperature varying only in latitude. The idealised configuration is designed to expose differences in the circulation simulated by different models. Basic features of the aqua-planet climate are characterised by comparison with Earth. The models display a wide range of behaviour. The balanced component of the tropospheric mean flow, and mid-latitude eddy covariances subject to budget constraints, vary relatively little among the models. In contrast, differences in damping in the dynamical core strongly influence transient eddy amplitudes. Historical uncertainty in modelled lower stratospheric temperatures persists in the Aqua-Planet Experiment (APE). Aspects of the circulation generated more directly by interactions between the resolved fluid dynamics and parameterized moist processes vary greatly. The tropical Hadley circulation forms either a single or double inter-tropical convergence zone (ITCZ) at the equator, with large variations in mean precipitation. The equatorial wave spectrum shows a wide range of precipitation intensity and propagation characteristics. Kelvin mode-like eastward propagation with remarkably constant phase speed dominates in most models. Westward propagation, less dispersive than the equatorial Rossby modes, dominates in a few models or occurs within an eastward-propagating envelope in others. The mean structure of the ITCZ is related to precipitation variability, consistent with previous studies. The aqua-planet global energy balance is unknown, but the models produce a surprisingly large range of top-of-atmosphere global net flux, dominated by differences in shortwave reflection by clouds. A number of newly developed models, not optimised for Earth climate, contribute to this. Possible reasons for differences in the optimised models are discussed. The aqua-planet configuration is intended as one component of an experimental hierarchy used to evaluate AGCMs. This comparison does suggest that the range of model behaviour could be better understood and reduced in conjunction with Earth climate simulations. Controlled experimentation is required to explore individual model behaviour and investigate convergence of the aqua-planet climate with increasing resolution.
Abstract:
Generally, classifiers tend to overfit if there is noise in the training data or if there are missing values. Ensemble learning methods are often used to improve a classifier's classification accuracy. Most ensemble learning approaches aim to improve the classification accuracy of decision trees. However, alternative classifiers to decision trees exist. The recently developed Random Prism ensemble learner for classification aims to improve an alternative classification rule induction approach, the Prism family of algorithms, which addresses some of the limitations of decision trees. However, like any ensemble learner, Random Prism suffers from a high computational overhead due to replication of the data and the induction of multiple base classifiers. Hence even modest-sized datasets may pose a computational challenge to ensemble learners such as Random Prism. Parallelism is often used to scale up algorithms to deal with large datasets. This paper investigates parallelisation for Random Prism, implements a prototype and evaluates it empirically using a Hadoop computing cluster.
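To illustrate the general pattern being parallelised (this is not the paper's Random Prism prototype or its Hadoop implementation), the sketch below trains independent base classifiers on bootstrap samples in parallel worker processes and combines them by majority vote. The decision tree is only a stand-in for a Prism-style base learner, and the local process pool stands in for the cluster.

```python
# Illustrative sketch only: Random Prism uses Prism-family base learners and the
# paper's prototype runs on a Hadoop cluster; here a scikit-learn decision tree
# stands in for the base learner and a local process pool stands in for the cluster.
from concurrent.futures import ProcessPoolExecutor

import numpy as np
from sklearn.tree import DecisionTreeClassifier


def _train_one(args):
    X, y, seed = args
    rng = np.random.default_rng(seed)
    idx = rng.integers(0, len(X), size=len(X))  # bootstrap sample (with replacement)
    return DecisionTreeClassifier(random_state=seed).fit(X[idx], y[idx])


def fit_parallel_ensemble(X, y, n_estimators=10, n_workers=4):
    """Train the base classifiers independently, one job per bootstrap sample."""
    jobs = [(X, y, seed) for seed in range(n_estimators)]
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        return list(pool.map(_train_one, jobs))


def predict_majority(models, X):
    """Combine base classifiers by majority vote (labels assumed to be 0..k-1 integers)."""
    votes = np.stack([m.predict(X) for m in models])  # shape: (n_estimators, n_samples)
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)
```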
Abstract:
Monthly averaged surface erythemal solar irradiance (UV-Ery) for local noon from 1960 to 2100 has been derived using radiative transfer calculations and projections of ozone, temperature and cloud change from 14 chemistry-climate models (CCMs), as part of the CCMVal-2 activity of SPARC. Our calculations show the influence of ozone depletion and recovery on erythemal irradiance. In addition, we investigate UV-Ery changes caused by climate change due to increasing greenhouse gas concentrations; the latter include the effects of both stratospheric ozone and cloud changes. The derived estimates provide a global picture of the likely changes in erythemal irradiance during the 21st century. Uncertainties arise from the assumed scenarios, different parameterizations (particularly of cloud effects on UV-Ery) and the spread in the CCM projections. The calculations suggest that, relative to 1980, annual-mean UV-Ery in the 2090s will be on average 12% lower at high latitudes in both hemispheres, 3% lower at mid-latitudes, and marginally higher (1%) in the tropics. The largest reduction (16%) is projected for Antarctica in October. Cloud effects are responsible for 2–3% of the reduction in UV-Ery at high latitudes, but they slightly moderate it at mid-latitudes (1%). The year of return of erythemal irradiance to the values of certain milestones (1965 and 1980) depends largely on the return of column ozone to the corresponding levels and is associated with large uncertainties, mainly due to the spread of the model projections. The inclusion of cloud effects in the calculations has only a small effect on the return years. At mid and high latitudes, changes in clouds and in the transport of stratospheric ozone by greenhouse-gas-driven changes in the global circulation will sustain erythemal irradiance at levels below those of 1965, despite the removal of ozone-depleting substances.
Abstract:
Vicine and convicine are anti-nutritional compounds that accumulate in the cotyledons of faba beans. Consumption of beans with high levels of these compounds can cause a condition called favism in individuals harbouring a deficiency in the activity of their glucose-6-phosphate dehydrogenase, and when faba beans are used in animal feeds there can be adverse effects on performance. These concerns have resulted in increasing interest within plant breeding in developing faba bean germplasm with low levels of vicine and convicine. To facilitate this objective, we developed a rapid and robust screening method for vicine and convicine, capable of distinguishing between faba beans that are either high (wild type) or low in vicine and convicine. In the absence of reliable commercial reference materials, we report an adaptation of a previously published method in which a biochemical assay and spectral data were used to confirm the identity of our analytes, vicine and convicine. This method could be readily adopted in other facilities and open the way to the efficient exploitation of diverse germplasm in regions where faba beans play a significant role in human nutrition. We screened a collection of germplasm of interest to a collaborative plant breeding programme being developed between the National Institute for Agricultural Botany in the UK and L'Institut Nationale d'Agronomie de Tunisie in Tunisia. We report the results obtained and discuss the prospects for developing molecular markers for the low vicine and convicine trait.
Abstract:
Wheat gluten proteins, the gliadins and glutenins, are of great importance in determining the unique biomechanical properties of wheat. Studies have therefore been carried out to determine their pathways and mechanisms of synthesis, folding, and deposition in protein bodies. In the present work, a set of transgenic wheat lines with strongly suppressed levels of γ-gliadins and/or of all groups of gliadins has been studied, using light and fluorescence microscopy combined with immunodetection using antibodies specific for γ-gliadins and HMW glutenin subunits. These lines represent a unique material for studying the formation and fusion of protein bodies (PBs) in developing wheat seeds. Higher amounts of HMW subunits were present in most of the transgenic lines, but only the lines with suppression of all gliadins showed differences in the formation and fusion of the protein bodies. Large rounded protein bodies were found in the wild-type lines and in the transgenic lines with reduced levels of γ-gliadins, whereas the lines with all gliadins down-regulated had protein bodies of irregular shape and irregular formation. The size and number of inclusions, which have been reported to contain triticins, were also higher in the protein bodies of the lines with all gliadins down-regulated. The changes in protein composition and PB morphology reported in the transgenic lines with all gliadins down-regulated did not result in marked changes in the total protein content or in instability of the different fractions.
Abstract:
Radiative forcing and climate sensitivity have been widely used as concepts to understand climate change. This work performs climate change experiments with an intermediate general circulation model (IGCM) to examine the robustness of the radiative forcing concept for carbon dioxide and solar constant changes. This IGCM has been specifically developed as a computationally fast model, but one that allows an interaction between physical processes and large-scale dynamics; the model allows many long integrations to be performed relatively quickly. It employs a fast and accurate radiative transfer scheme, as well as simple convection and surface schemes and a slab ocean, to model the effects of climate change mechanisms on atmospheric temperatures and dynamics with a reasonable degree of complexity. The climatology of the IGCM run at T21 resolution with 22 levels is compared to European Centre for Medium-Range Weather Forecasts reanalysis data. The response of the model to changes in carbon dioxide and solar output is examined when these changes are applied globally and when constrained geographically (e.g. over land only). The CO2 experiments have a roughly 17% higher climate sensitivity than the solar experiments. It is also found that a forcing at high latitudes causes a 40% higher climate sensitivity than a forcing applied only at low latitudes. It is found that, despite differences in the model feedbacks, climate sensitivity is roughly constant over a range of distributions of CO2 and solar forcings. Hence, in the IGCM at least, the radiative forcing concept is capable of predicting global surface temperature changes to within 30% for the perturbations described here. It is concluded that radiative forcing remains a useful tool for assessing the natural and anthropogenic impact of climate change mechanisms on surface temperature.
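For reference, the radiative forcing concept tested in such experiments is usually written in the standard linear form (textbook notation, not a formula quoted from the abstract):

\[
\Delta T_s \approx \lambda \, \Delta F ,
\]

where \( \Delta F \) is the radiative forcing (W m\(^{-2}\)), \( \lambda \) is the climate sensitivity parameter (K (W m\(^{-2}\))\(^{-1}\)) and \( \Delta T_s \) is the equilibrium change in global-mean surface temperature. The experiments in effect ask how constant \( \lambda \) remains across CO2 and solar forcings with different spatial distributions.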
Abstract:
We develop an orthogonal forward selection (OFS) approach to construct radial basis function (RBF) network classifiers for two-class problems. Our approach integrates several concepts in probabilistic modelling, including cross-validation, mutual information and Bayesian hyperparameter fitting. At each stage of the OFS procedure, one model term is selected by maximising the leave-one-out mutual information (LOOMI) between the classifier's predicted class labels and the true class labels. We derive the formula of LOOMI within the OFS framework so that the LOOMI can be evaluated efficiently for model term selection. Furthermore, a Bayesian procedure of hyperparameter fitting is integrated into each stage of the OFS to infer the l2-norm based local regularisation parameter from the data. Since each forward stage effectively fits a one-variable model, this task is very fast. The classifier construction procedure terminates automatically, without the need for an additional stopping criterion, to yield very sparse RBF classifiers with excellent classification generalisation performance, which is particularly useful for noisy data sets with highly overlapping class distributions. A number of benchmark examples are employed to demonstrate the effectiveness of our proposed approach.
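For orientation, the quantity maximised at each forward-selection step is the mutual information between predicted and true class labels, evaluated in a leave-one-out fashion; in its generic (non-leave-one-out) form this is the standard definition below (the paper's efficient LOOMI formula is not reproduced here):

\[
I(\hat{y}; y) = \sum_{\hat{y}} \sum_{y} p(\hat{y}, y) \, \log \frac{p(\hat{y}, y)}{p(\hat{y}) \, p(y)} ,
\]

where \( p(\hat{y}, y) \) is the joint distribution of predicted and true labels over the training data and \( p(\hat{y}) \), \( p(y) \) are its marginals.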
Abstract:
The Weather Research and Forecasting model was applied to analyze variations in the planetary boundary layer (PBL) structure over Southeast England, including central and suburban London. The parameterizations and predictive skills of two nonlocal mixing PBL schemes, YSU and ACM2, and two local mixing PBL schemes, MYJ and MYNN2, were evaluated over a variety of stability conditions, with model predictions at a 3 km grid spacing. The PBL height predictions, which are critical for scaling turbulence and diffusion in meteorological and air quality models, show significant intra-scheme variance (> 20%), and the reasons are presented. ACM2 diagnoses the PBL height thermodynamically using the bulk Richardson number method, which leads to good agreement with the lidar data under both unstable and stable conditions. The modeled vertical profiles in the PBL, such as wind speed, turbulent kinetic energy (TKE), and heat flux, exhibit large spreads across the PBL schemes. The TKE predicted by MYJ was found to be too small and to show much less diurnal variation than the observations over London. MYNN2 produces better TKE predictions at low levels than MYJ, but its turbulent length scale increases with height in the upper part of the strongly convective PBL, where it should decrease. The local PBL schemes considerably underestimate the entrainment heat fluxes in convective cases. The nonlocal PBL schemes exhibit stronger mixing in the mean wind fields under convective conditions than the local PBL schemes and agree better with large-eddy simulation (LES) studies.
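As context for the bulk Richardson number method mentioned above, a commonly used form of the diagnostic is (generic textbook form; ACM2's exact implementation and critical value are not given in the abstract):

\[
\mathrm{Ri}_b(z) = \frac{g \, z \, \bigl[ \theta_v(z) - \theta_{v,s} \bigr]}{\theta_{v,s} \, \bigl[ u(z)^2 + v(z)^2 \bigr]} ,
\]

where \( \theta_v \) is the virtual potential temperature, \( \theta_{v,s} \) its near-surface value and \( u, v \) the horizontal wind components; the PBL height is taken as the lowest level \( z \) at which \( \mathrm{Ri}_b \) exceeds a critical value (often around 0.25).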
Abstract:
Organisations typically define and execute their selected strategy by developing and managing a portfolio of projects. The governance of this portfolio has proved to be a major challenge, particularly for large organisations. Executives and managers face even greater pressures when the nature of the strategic landscape is uncertain. This paper explores approaches for dealing with different levels of certainty in business IT projects and provides a contingent governance framework. Historically, business IT projects have relied on a structured sequential approach, also referred to as a waterfall method. There is a distinction between the development stages of a solution and the management stages of a project that delivers the solution, although these are often integrated in a business IT systems project. Prior research has demonstrated that the level of certainty varies between development projects: there can be uncertainty about what needs to be developed and also about how this solution should be developed. The move to agile development and management reflects a greater level of uncertainty, often on both dimensions, and this has led to the adoption of more iterative approaches. What has been less well researched is the impact of uncertainty on the governance of the change portfolio and the corresponding implications for business executives. This paper poses this research question and proposes a governance framework to address these aspects. The governance framework has been reviewed in the context of a major organisation, anonymised here as FinOrg. Findings are reported in this paper with a focus on the need to apply different approaches; in particular, the governance of uncertain business change is contrasted with the management approach for defined IT projects. Practical outputs from the paper include a consideration of some innovative approaches that can be used by executives. The paper also investigates the role of the business change portfolio group in evaluating and executing the appropriate level of governance. These results lead to recommendations for executives and to proposals for further research.
Abstract:
Autism spectrum conditions (autism) affect ~1% of the population and are characterized by deficits in social communication. Oxytocin has been widely reported to affect social-communicative function and its neural underpinnings. Here we report the first evidence that intranasal oxytocin administration improves a core problem that individuals with autism have in using eye contact appropriately in real-world social settings. A randomized, double-blind, placebo-controlled, within-subjects design is used to examine how intranasal administration of 24 IU of oxytocin affects gaze behavior for 32 adult males with autism and 34 controls in a real-time interaction with a researcher. This interactive paradigm bypasses many of the limitations encountered with conventional static or computer-based stimuli. Eye movements are recorded using eye tracking, providing an objective measurement of looking patterns. The measure is shown to be sensitive to the reduced eye contact commonly reported in autism, with the autism group spending less time looking at the eye region of the face than controls. Oxytocin administration selectively enhanced gaze to the eyes in both the autism and control groups (transformed mean eye-fixation difference per second = 0.082; 95% CI: 0.025–0.14; P = 0.006). Within the autism group, oxytocin has the greatest effect on fixation duration in individuals with impaired levels of eye contact at baseline (Cohen's d = 0.86). These findings demonstrate that the potential benefits of oxytocin in autism extend to a real-time interaction, providing evidence of a therapeutic effect on a key aspect of social communication.
Abstract:
Plants produce volatile organic compounds (VOCs) in response to herbivore attack, and these VOCs can be used by parasitoids of the herbivore as host-location cues. We investigated the behavioural responses of the parasitoid Cotesia vestalis to VOCs from a plant–herbivore complex consisting of cabbage plants (Brassica oleracea) and the parasitoid's host caterpillar, Plutella xylostella. A Y-tube olfactometer was used to compare the parasitoids' responses to VOCs produced as a result of different levels of attack by the caterpillar and equivalent levels of mechanical damage. Headspace VOC production by these plant treatments was examined using gas chromatography–mass spectrometry. Cotesia vestalis was able to exploit quantitative and qualitative differences in volatile emissions from the plant–herbivore complex produced as a result of different numbers of herbivores feeding. Cotesia vestalis showed a preference for plants with more herbivores and herbivore damage, but did not distinguish between different levels of mechanical damage. Volatile profiles of plants with different levels of herbivores/herbivore damage could also be separated by canonical discriminant analyses. The analyses revealed a number of compounds whose emission increased significantly with herbivore load; these VOCs may be particularly good indicators of herbivore number, as the parasitoid processes cues from its external environment.
Abstract:
Liquid–vapour homogenisation temperatures of fluid inclusions in stalagmites are used for quantitative temperature reconstructions in paleoclimate research. Specifically for this application, we have developed a novel heating/cooling stage that can be operated with large stalagmite sections of up to 17 × 35 mm², to simplify and improve the chronological reconstruction of paleotemperature time series. The stage is designed for use with an oil immersion objective and a high-NA condenser front lens to obtain high-resolution images for bubble-radius measurements. The temperature accuracy of the stage is better than ±0.1 °C, with a precision (reproducibility) of ±0.02 °C.
Abstract:
Explaining the diversity of languages across the world is one of the central aims of typological, historical, and evolutionary linguistics. We consider the effect of language contact (the number of non-native speakers a language has) on the way languages change and evolve. By analysing hundreds of languages within and across language families, regions, and text types, we show that languages with greater levels of contact typically employ fewer word forms to encode the same information content (a property we refer to as lexical diversity). Based on three types of statistical analyses, we demonstrate that this variation can in part be explained by the impact of non-native speakers on information encoding strategies. Finally, we argue that languages are information encoding systems shaped by the varying needs of their speakers. Language evolution and change should be modeled as the co-evolution of multiple intertwined adaptive systems: on the one hand, the structure of human societies and human learning capabilities, and on the other, the structure of language.
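As a minimal illustration of the kind of measure involved (the paper's own operationalisation may differ), lexical diversity in this sense can be approximated by counting the distinct word forms observed in fixed-size windows of running text:

```python
# Rough proxy for lexical diversity: the mean number of distinct word forms
# (types) per fixed-size window of tokens. This is an illustrative measure,
# not necessarily the one used in the study.
from statistics import mean


def types_per_window(tokens, window=1000):
    """Mean count of distinct word forms per `window` tokens of running text."""
    windows = [tokens[i:i + window]
               for i in range(0, len(tokens) - window + 1, window)]
    if not windows:                      # text shorter than one window
        return len(set(tokens))
    return mean(len(set(w)) for w in windows)
```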