988 results for 789


Relevância:

10.00%

Publicador:

Resumo:

The intellectual project of using whiteness as an explicit tool of analysis is not one that has taken root in Britain. However, there are a number of empirical studies that investigate the racialization of white identities. In this article, I look at some empirical sociological fieldwork carried out on white identities in Britain since the early 1990s and identify the key themes arising. These themes are (in)visibility, norms and values, cultural capital and integration, contingent hierarchies and Empire in the present. In Britain, a pertinent distinction is between rural and urban settings for the enactment of white identities vis-à-vis those of minorities, and there is an exploration of some of the contingency that draws the boundary between ‘white’ and ‘Other’ in different places. Areas of commonality and distinctiveness are noted in terms of the American work. In the last section, I argue that there are a number of issues to resolve around continuing such studies, including linking the micro-level to the macro-level analysis, and expanding to international comparative work.

Relevância:

10.00%

Publicador:

Resumo:

With the reform of spectrum policy and the development of cognitive radio, secondary users will be allowed to access spectrum licensed to primary users. Spectrum auctions can facilitate this secondary spectrum access in a market-driven way. To design an efficient auction framework, we first study the supply and demand pressures and the competitive equilibrium of the secondary spectrum market, taking spectrum reusability into account. According to the traditional economic view, competition among participants in a well-designed auction should lead to the competitive equilibrium. We then propose a discriminatory price spectrum double auction framework for this market. In this framework, rational participants compete with each other through their bid prices, and their profits are guaranteed to be non-negative. A near-optimal heuristic algorithm is also proposed to solve the auction clearing problem of the proposed framework efficiently. Experimental results verify the efficiency of the proposed auction clearing algorithm and demonstrate that, under the proposed auction framework, competition among secondary users and primary users leads to the competitive equilibrium over the auction iterations. Copyright © 2011 John Wiley & Sons, Ltd.
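To make the clearing step concrete, here is a minimal sketch of a discriminatory-price double-auction clearing rule under simplifying assumptions that are not taken from the paper (one channel per seller and per buyer, no spectrum reuse, illustrative function and variable names): sellers are sorted by ascending ask, buyers by descending bid, and pairs are matched while the bid is at least the ask, each side trading at its own quoted price so that neither profit is negative.

```python
# Minimal sketch of discriminatory-price double-auction clearing.
# Assumptions (not from the paper): one channel per seller, one per buyer,
# no spectrum reuse; all names are illustrative.

def clear_double_auction(asks, bids):
    """asks: list of (primary_user_id, ask_price)
       bids: list of (secondary_user_id, bid_price)
       Returns a list of (seller, buyer, seller_price, buyer_price)."""
    asks_sorted = sorted(asks, key=lambda a: a[1])                # cheapest sellers first
    bids_sorted = sorted(bids, key=lambda b: b[1], reverse=True)  # highest bidders first

    matches = []
    for (seller, ask), (buyer, bid) in zip(asks_sorted, bids_sorted):
        if bid < ask:   # no further profitable trades exist
            break
        # One simple discriminatory rule: each side trades at its own quoted
        # price (seller receives its ask, buyer pays its bid), so both profits
        # are non-negative; the bid-ask spread is retained by the auctioneer.
        matches.append((seller, buyer, ask, bid))
    return matches

# Example with made-up prices
print(clear_double_auction(
    asks=[("PU1", 3.0), ("PU2", 5.0)],
    bids=[("SU1", 6.0), ("SU2", 4.0), ("SU3", 2.0)],
))
# -> [('PU1', 'SU1', 3.0, 6.0)]
```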

Relevância:

10.00%

Publicador:

Resumo:

Purpose – This paper aims to focus on developing critical understanding in human resource management (HRM) students at Aston Business School, UK. The paper reveals that innovative teaching methods encourage deep approaches to study, an indicator that students are reaching their own understanding of material and ideas. This improves student employability and satisfies employer needs.
Design/methodology/approach – Student responses to two second-year business modules, matched for high student approval ratings, were collected through focus group discussion. One module was taught using enquiry-based learning (EBL) and the story method, whilst the other used traditional teaching methods. Transcripts were analysed and compared using the structure of the ASSIST measure.
Findings – Critical understanding and transformative learning can be developed through the innovative teaching methods of EBL and the story method.
Research limitations/implications – The limitation is that this is a single case study comparing and contrasting two business modules. The implication is that the study should be replicated and developed in different learning settings, so that there are multiple data sets to confirm the research findings.
Practical implications – Future curriculum development, especially in higher education (HE), still needs to encourage students and lecturers to understand more about the nature of knowledge and how to learn. The application of EBL and the story method is described in a module case study – “Strategy for Future Leaders”.
Originality/value – This is a systematic and comparative study to improve understanding of how students and lecturers learn and of the context in which the learning takes place.

Relevância:

10.00%

Publicador:

Resumo:

An approach is developed for extracting knowledge from the information arriving at the knowledge base input and for distributing the new knowledge over the knowledge subsets already present in the knowledge base. It is also necessary to transform the knowledge into parameters (data) of the model used for subsequent decision-making on the given subset. The decision-making is assumed to be carried out with the apparatus of fuzzy sets.
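As a loose illustration of the final step only (decision-making with the apparatus of fuzzy sets), the sketch below uses hypothetical names and made-up numbers: alternatives are scored by fuzzy membership values derived from model parameters, and the classic max-min rule picks the alternative whose worst-satisfied criterion is best.

```python
# Hypothetical sketch: fuzzy-set based choice among alternatives.
# Membership values in [0, 1] express how well an alternative satisfies
# each criterion; taking the minimum is a standard fuzzy "and".

def fuzzy_decision(memberships):
    """memberships: dict alternative -> dict criterion -> membership in [0, 1].
       Returns the alternative whose worst criterion is best satisfied."""
    return max(memberships, key=lambda alt: min(memberships[alt].values()))

# Example with made-up parameters extracted from a knowledge base
alternatives = {
    "plan_A": {"cost": 0.8, "risk": 0.6, "benefit": 0.7},
    "plan_B": {"cost": 0.9, "risk": 0.3, "benefit": 0.9},
}
print(fuzzy_decision(alternatives))  # -> "plan_A" (its worst score, 0.6, beats plan_B's 0.3)
```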

Relevância:

10.00%

Publicador:

Resumo:

The year so far has been a slow start for many businesses, but at least we have not seen the collapse of as many businesses as we were seeing around two years ago. We are, however, still well and truly in the midst of a global recession. Interest rates are still at an all-time low, UK house prices seem to be showing little sign of increase (except in London, where everyone still seems to want to live!) and for the ardent shopper there are bargains to be had everywhere. It seems strange that prices on the high street do not seem to have increased in over ten years. Mobile phones, DVD players, even furniture seem to be cheaper than they used to be. Whilst much of this is down to cheaper manufacturing, the rest could probably be explained by competition within the marketplace. Does this mean that quality has suffered too? We now live in a world where, if a television is not working, it is thrown away and replaced. There was a time when you would take it to some odd-looking man that your father knew who could fix it for you. (I remember our local television fix-it man, with his thick-rimmed bifocal spectacles and a poor comb-over; he had cardboard boxes full of resistors and electrical wires on the floor of his front room, which smelt of soldering irons!) Is this consumerism at an extreme, or has this move to disposability made us a better society?

Before you think these are just ramblings, there is a point to this. According to the latest global figures of contact lens sales, the vast majority of contact lenses fitted around the world are daily, fortnightly or monthly disposable hydrogel lenses. Certainly in the UK over 90% of lenses are disposable (with daily disposables being the most popular, having a market share of over 50%). This begs the question – is this a good thing? Maybe more importantly, do our patients benefit?

I think it is worth reminding ourselves why we went down the disposability route with contact lenses in the first place, and unlike electrical goods it was not just so we did not have to take them for repair! There are the obvious advantages of overcoming problems of breakage and tearing of lenses and of lens deterioration with age. The lenses are less likely to be contaminated, and disinfection is either easier or not required at all (in the case of daily disposable lenses). Probably the landmark paper in the field was the work more commonly known as the ‘Gothenburg Study’. The paper, entitled ‘Strategies for minimizing the Ocular Effects of Extended Contact Lens Wear’, published in the American Journal of Optometry in 1987 (volume 64, pages 781–789) by Holden, B.A., Swarbrick, H.A., Sweeney, D.F., Ho, A., Efron, N., Vannas, A. and Nilsson, K.T., suggested that contact lens induced ocular effects were minimised by:
• more frequently removed contact lenses;
• more regularly replaced contact lenses;
• a lens that was more mobile on the eye (to allow better removal of debris);
• better flow of oxygen through the lens.

All of these issues seem to be solved by disposability, except the oxygen issue, which has been solved with the advent of silicone hydrogel materials. Newer issues have arisen, and most can be solved in practice by the eye care practitioner. The emphasis now seems to be on making lenses more comfortable. The problem of contact lens related dry eye symptoms seems to be ever present, and maybe this would explain why in the UK we have a fairly constant contact lens wearing population of just over three million, yet every year we have over a million dropouts! That means we must be attracting a million new wearers every year (well done to the marketing departments!), but we are also losing a million wearers every year. We certainly are not losing them all to the refractive surgery clinics. We know that almost anyone can now wear a contact lens, and we know that some lenses will solve problems of sharper vision, some will aid comfort, and some will be useful for patients with dry eyes. So if we still have so many dropouts then we must be doing something wrong! I think the take-home message has to be ‘must try harder’!

I must end with an apology for two errors in my editorial of issue 1 earlier this year. Firstly, there was a typo in the first sentence; I meant to state that it was 40 years, not 30 years, since the first commercial soft lens was available in the UK. The second error was one that I was unaware of until colleagues Geoff Wilson (Birmingham, UK) and Tim Bowden (London, UK) wrote to me to explain that soft lenses were actually available in the UK before 1971 (please see their ‘Letters to the Editor’ in this issue). I am grateful to both of them for correcting the mistake.

Relevância:

10.00%

Publicador:

Resumo:

The author sums up briefly the main aspects and problems of the pricing of derived products. The theory of derivative pricing uses the redundancy among products on the market to arrive at relative product prices. But this can be done only on a complete market, so only with a complete market does it become possible to omit the concept of utility functions from the theory and the practice built upon it; for that reason the principle of risk-neutral pricing is misleading. To put it another way, the theory of derived products is capable of freeing itself from the concept of utility functions only at the price of imposing restrictions on the market structure that do not hold in reality. It is essential to emphasize this both in market practice and in teaching.
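For reference, the risk-neutral pricing principle referred to above can be written as follows: in a complete, arbitrage-free market the price of a derivative paying X at time T is its discounted expectation under the (then unique) risk-neutral measure Q, with no utility function appearing; the abstract's point is that this utility-free form is available only under the completeness assumption.

```latex
% Risk-neutral pricing in a complete, arbitrage-free market
P_0 = e^{-rT}\,\mathbb{E}^{\mathbb{Q}}\!\left[X\right]
```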

Relevância:

10.00%

Publicador:

Resumo:

A distance-based inconsistency indicator, defined by the third author for the consistency-driven pairwise comparisons method, is extended to the incomplete case. The corresponding optimization problem is transformed into an equivalent linear programming problem. The results can be applied in the process of filling in the matrix, as the decision maker gets automatic feedback: as soon as a serious error occurs among the matrix elements, even due to a misprint, a significant increase in the inconsistency index is reported. High inconsistency can thus be signalled not only at the end of the process of filling in the matrix but also during the completion process. Numerical examples are also provided.
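For orientation, here is a minimal sketch of the distance-based (Koczkodaj-type) inconsistency index for a complete pairwise comparison matrix; the incomplete case and the linear programming reformulation treated in the paper are not reproduced here, and the names are illustrative.

```python
import numpy as np

def distance_based_inconsistency(A):
    """Distance-based inconsistency of a complete pairwise comparison matrix A
    (positive and reciprocal: A[j, i] == 1 / A[i, j]).
    For every triad (i, j, k) the local inconsistency is
    min(|1 - A[i,j]*A[j,k]/A[i,k]|, |1 - A[i,k]/(A[i,j]*A[j,k])|);
    the global index is the maximum over all triads."""
    n = A.shape[0]
    worst = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            for k in range(j + 1, n):
                ratio = A[i, j] * A[j, k] / A[i, k]
                local = min(abs(1 - ratio), abs(1 - 1 / ratio))
                worst = max(worst, local)
    return worst

# A consistent 3x3 matrix gives 0; a misprinted entry (e.g. 9 instead of 1/9)
# makes the index jump, which is the kind of automatic feedback described above.
A = np.array([[1,   2,   4],
              [1/2, 1,   2],
              [1/4, 1/2, 1]], dtype=float)
print(distance_based_inconsistency(A))  # -> 0.0
```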

Relevância:

10.00%

Publicador:

Resumo:

Vol. 26, Issue 88, 8 pages

Relevância:

10.00%

Publicador:

Resumo:

Approaches to quantify organic carbon accumulation on a global scale generally do not consider the small-scale variability of sedimentary and oceanographic boundary conditions along continental margins. In this study, we present a new approach to regionalize the total organic carbon (TOC) content in surface sediments (<5 cm sediment depth). It is based on a compilation of more than 5500 single measurements from various sources. The global TOC distribution was determined by applying a combined qualitative and quantitative geostatistical method. Overall, 33 benthic TOC-based provinces were defined and used to map the global distribution pattern of the TOC content in surface sediments at a 1° x 1° grid resolution. Regional dependencies of data points within each single province are expressed by modeled semi-variograms. Measured and estimated TOC values show good correlation, emphasizing the reasonable applicability of the method. The accumulation of organic carbon in marine surface sediments is a key parameter in the control of mineralization processes and the material exchange between the sediment and the ocean water. Our approach will help to improve global budgets of nutrient and carbon cycles.
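As a rough illustration of the geostatistical ingredient only, the sketch below (illustrative names, synthetic data, simple isotropic distances rather than great-circle distances) computes an empirical semivariogram from scattered measurements; modeled semi-variograms of this kind are what express the regional dependence of data points within each province.

```python
import numpy as np

def empirical_semivariogram(coords, values, bin_edges):
    """coords: (n, 2) array of sample positions; values: (n,) measured values;
    bin_edges: distance-bin boundaries. Returns gamma(h) per bin: half the mean
    squared difference of all pairs whose separation falls in the bin."""
    n = len(values)
    # pairwise separation distances and squared value differences
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(n, k=1)          # count each pair once
    d, sq = d[iu], sq[iu]
    gamma = []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (d >= lo) & (d < hi)
        gamma.append(0.5 * sq[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

# Synthetic example: 50 random stations with made-up TOC-like values
rng = np.random.default_rng(0)
coords = rng.uniform(0, 10, size=(50, 2))
toc = np.sin(coords[:, 0]) + 0.1 * rng.standard_normal(50)
print(empirical_semivariogram(coords, toc, np.linspace(0, 5, 6)))
```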

Relevância:

10.00%

Publicador:

Resumo:

The exponential growth of studies on the biological response to ocean acidification over the last few decades has generated a large amount of data. To facilitate data comparison, a data compilation hosted at the data publisher PANGAEA was initiated in 2008 and is updated on a regular basis (doi:10.1594/PANGAEA.149999). By January 2015, a total of 581 data sets (over 4,000,000 data points) from 539 papers had been archived. Here we present the developments of this data compilation in the five years since its first description by Nisumaa et al. (2010). Most of the study sites from which data have been archived are still in the Northern Hemisphere, and the number of archived data sets from studies in the Southern Hemisphere and polar oceans is still relatively low. Data from 60 studies that investigated the response of a mix of organisms or natural communities were all added after 2010, indicating a welcome shift from the study of individual organisms to communities and ecosystems. The initial imbalance of considerably more data archived on calcification and primary production than on other processes has improved. There is also a clear tendency towards more data archived from multifactorial studies after 2010. For easier and more effective access to ocean acidification data, the ocean acidification community is strongly encouraged to contribute to the data archiving effort, to help develop standard vocabularies describing the variables and to define best practices for archiving ocean acidification data.

Relevância:

10.00%

Publicador:

Resumo:

In this work we tested two different ways of obtaining a C60 monolayer on La,Sr manganite (LSMO): by desorbing C60 from a 5 nm film grown on an LSMO substrate, and by growing, again on LSMO, six C60 samples with different nominal thicknesses. The desorbed sample was analysed by STS and STM measurements, while the samples grown at different thicknesses were measured by non-contact AFM. What emerged in both cases is that the C60 molecules do not interact with the LSMO substrate. In the first case, almost all of the C60 present on the sample was desorbed, and the manganite surface is only partially covered by C60 molecules. In the second case, the C60 grows by forming islands that come to cover the LSMO surface only for films with a nominal thickness above 30 nm.

Relevância:

10.00%

Publicador:

Resumo:

Density Functional Theory (DFT) and its time-dependent version (TDDFT) are widely used tools for simulating and calculating the static and dynamic properties of systems of interacting electrons. The accuracy of the method rests on a series of approximations to the exchange-correlation effects among the electrons, described by a functional of the charge density alone. In this thesis the reliability of the Mixed Localization Potential (MLP) functional is tested; it is a weighted average between the Single Orbital Approximation (SOA) and a reference potential, for example the Local Density Approximation (LDA). The results show better simulation capabilities than LDA for static systems (also addressing a limitation of LDA known in the literature as fractional dissociation) and progress for dynamic systems in which charge currents develop. The degree of localization of the system, understood as the ability of an electron to keep other electrons away from itself, is described by the Electron Localization Function (ELF). Its role as a guide in the construction and optimization of the MLP functional is studied.
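As a rough sketch of the mixing idea described above (not the thesis's actual implementation), the snippet below combines two potentials on a real-space grid through a pointwise weight; using an ELF-derived quantity as that weight is only one possible, hypothetical choice, and all names and numbers are illustrative.

```python
import numpy as np

def mixed_localization_potential(v_soa, v_ref, weight):
    """Pointwise weighted average of two potentials on a grid.
    v_soa, v_ref, weight: arrays of the same shape, with weight in [0, 1]
    (for instance a weight derived from the Electron Localization Function)."""
    return weight * v_soa + (1.0 - weight) * v_ref

# Toy 1D example with made-up potentials and a constant weight
x = np.linspace(-5, 5, 101)
v_soa = -1.0 / (np.abs(x) + 1.0)   # placeholder "SOA-like" potential
v_lda = -0.8 / (np.abs(x) + 1.0)   # placeholder "LDA-like" reference potential
v_mlp = mixed_localization_potential(v_soa, v_lda, weight=0.6)
```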