971 results for Harder
Abstract:
The “case for real estate” in the mixed-asset portfolio is a topic of continuing interest to practitioners and academics. The argument is typically made by comparing efficient frontiers of portfolios that include real estate with those that exclude it. However, most investors will have held inefficient portfolios. Thus, when analysing real estate’s place in the mixed-asset portfolio, it seems illogical to do so by comparing the difference in risk-adjusted performance between efficient portfolios, which few if any investors would have held. The approach adopted here, therefore, is to compare the risk-adjusted performance of a number of mixed-asset portfolios without real estate (which may or may not be efficient) with a very large number of mixed-asset portfolios that include real estate (which again may or may not be efficient), to see the proportion of the time there is an increase in risk-adjusted performance, significant or otherwise, using appraisal-based and de-smoothed annual data from 1952-2003. So, to the question of how often the addition of private real estate increases risk-adjusted performance compared with mixed-asset portfolios without real estate, the answer is: almost all the time. However, significant increases are harder to find. Additionally, a significant increase in risk-adjusted performance can come from either reductions in portfolio risk or increases in return, depending on the investor’s initial portfolio structure. In other words, simply adding real estate to a mixed-asset portfolio is not enough to ensure significant increases in performance, as the results depend on the percentage added and the proper reallocation of the initial portfolio mix in the expanded portfolio.
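The comparison strategy described above can be sketched in a few lines: draw many random (possibly inefficient) portfolios with and without real estate and count how often risk-adjusted performance improves. The asset classes, return statistics, and the choice of the Sharpe ratio as the performance measure below are illustrative assumptions, not the paper's data or method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic annual returns (illustrative only): stocks, bonds, cash, real estate.
n_years = 50
returns = rng.multivariate_normal(
    mean=[0.10, 0.06, 0.03, 0.08],
    cov=np.diag([0.18, 0.08, 0.01, 0.10]) ** 2,
    size=n_years,
)

def sharpe(weights, rets, rf=0.03):
    """Risk-adjusted performance of a fixed-weight portfolio."""
    port = rets @ weights
    return (port.mean() - rf) / port.std(ddof=1)

def random_weights(n, rng):
    """Random long-only weights summing to one (not necessarily efficient)."""
    w = rng.random(n)
    return w / w.sum()

# Compare many portfolios without real estate against many that include it.
n_trials = 10_000
improved = 0
for _ in range(n_trials):
    w_no_re = np.append(random_weights(3, rng), 0.0)  # real estate excluded
    w_re = random_weights(4, rng)                     # real estate included
    if sharpe(w_re, returns) > sharpe(w_no_re, returns):
        improved += 1

print(f"fraction of trials with higher Sharpe ratio: {improved / n_trials:.2f}")
```

A test of statistical significance (e.g. on the difference in Sharpe ratios) would be needed, as in the paper, to separate "almost always higher" from "significantly higher".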
Abstract:
This chapter looks at books which will support the very young through their earliest childhood and first few years of schooling. Learning to read can be hard, but it is a lot harder if you never encounter the sort of literature that will engage you and motivate you to see the whole learning process as worthwhile. The chapter considers ways to share great texts with young children and how to select good books to share.
Abstract:
Background: High levels of multidimensional perfectionism may be dysfunctional in their own right and can also impact on the maintenance and treatment of Axis I psychiatric disorders. Aims: This paper sought to describe the behavioural expressions and imagery associated with perfectionism in a non-clinical sample. Method: Participants (n=59) completed a newly developed questionnaire to assess behavioural expressions of perfectionism, and an adapted interview to assess perfectionism-related intrusive mental images. Results: The study found that those high in perfectionism took longer to complete tasks, engaged in more checking and safety behaviour whilst carrying out tasks, and had greater trouble actually completing tasks compared to those low in perfectionism. In addition, those with higher levels of perfectionism experienced intrusive mental imagery that was more distressing, harder to dismiss, and had a greater impact on behaviour than the imagery reported by those with lower levels of perfectionism. Conclusions: This research provides an initial exploration of the specific behaviours and intrusive mental imagery associated with perfectionism. The new behavioural measure of perfectionism could prove useful clinically in the assessment of change; however, these findings are preliminary and warrant replication in a clinical sample in order to examine their treatment implications.
Abstract:
A discrete element model is used to study shear rupture of sea ice under convergent wind stresses. The model includes compressive, tensile, and shear rupture of viscous elastic joints connecting floes that move under the action of the wind stresses. The adopted shear rupture is governed by Coulomb’s criterion. The ice pack is a 400 km square domain consisting of floes of 4 km size. In the standard case, with tensile strength 10 times smaller than the compressive strength, the failure regime under uniaxial compression is mainly shear rupture, with the most probable scenario corresponding to that with the minimum failure work. The orientation of cracks delineating the formed aggregates is bimodal, with peaks around the angles given by wing crack theory, determining diamond-shaped blocks. The ice block (floe aggregate) size decreases as the wind stress gradient increases, since the elastic strain energy grows faster, leading to a higher speed of crack propagation. As the tensile strength grows, shear rupture becomes harder to attain and compressive failure becomes equally important, leading to elongation of blocks perpendicular to the compression direction; the blocks also grow larger. In the standard case, as the wind stress confinement ratio increases, the failure mode changes at a confinement ratio within 0.2–0.4, which corresponds to the analytical critical confinement ratio of 0.32. Below this value, the cracks are bimodal, delineating diamond-shaped aggregates, while above this value failure becomes isotropic and is determined by small-scale stress anomalies due to irregularities in floe shape.
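Coulomb's criterion and the bimodal crack orientation can be sketched as follows. The failure test and the wing-crack angle relation tan(2θ) = 1/μ (with θ measured from the compression axis and μ the internal-friction coefficient) are standard results; the numerical value of μ below is an illustrative assumption, not the paper's parameter.

```python
import math

def coulomb_shear_failure(tau, sigma_n, cohesion, mu):
    """True if the shear stress on a joint exceeds the Coulomb envelope:
    |tau| >= c + mu * sigma_n  (compressive normal stress taken positive)."""
    return abs(tau) >= cohesion + mu * sigma_n

def preferred_crack_angle(mu):
    """Preferred crack orientation (radians) from the compression axis,
    from tan(2*theta) = 1/mu; the +/- pair gives the bimodal pattern."""
    return 0.5 * math.atan(1.0 / mu)

mu = 0.7  # illustrative internal-friction coefficient
theta = math.degrees(preferred_crack_angle(mu))
print(f"preferred crack orientation: +/-{theta:.1f} deg from compression axis")
```

The two symmetric orientations ±θ intersecting give the diamond-shaped aggregates mentioned in the abstract.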
Abstract:
This article considers the BBC External Service's East German wing, which broadcast from 1949 onwards, focusing on the continuities in broadcast techniques between wartime anti-Nazi programming and slots such as 'Two Comrades' and 'The Bewildered Newspaper Reader', both of which replicated pre-1945 formats. The role of the Foreign Office, both as cold warrior in the 1940s and as a force for détente in the 1970s, is included. The article also investigates the response from German listeners to the BBC's external service broadcasting in the 1950s and 1960s. The BBC paid special attention to its German listeners, preserving a large number of original letters at the Written Archive at Caversham as well as conducting regular listener surveys. These surveys considered whether Britain's democratic agenda was getting across in the late 1940s and 1950s, but the author also considers to what extent German listeners were pressing for a harder stance in the Cold War or were urging caution on the great powers deciding their fate.
Abstract:
Contemporary acquisition theorizing has placed a considerable amount of attention on interfaces, points at which different linguistic modules interact. The claim is that vulnerable interfaces cause particular difficulties in L1, bilingual and adult L2 acquisition (e.g. Platzack, 2001; Montrul, 2004; Müller and Hulk, 2001; Sorace, 2000, 2003, 2004, 2005). Accordingly, it is possible that deficits at the syntax–pragmatics interface cause what appears to be particular non-target-like syntactic behavior in L2 performance. This syntax-before-discourse hypothesis is examined in the present study by analyzing null vs. overt subject pronoun distribution in the L2 Spanish of English L1 learners. As ultimately determined by L2 knowledge of the Overt Pronoun Constraint (OPC) (Montalbetti, 1984), the data indicate that L2 learners at the intermediate and advanced levels reset the Null Subject Parameter (NSP), but only advanced learners have acquired a more or less target-like null/overt subject distribution. Against the predictions of Sorace (2004) and in line with Montrul and Rodríguez-Louro (2006), the data indicate an overuse of both overt and null subject pronouns. As a result, this behavior cannot stem from L1 interference alone, suggesting that interface-conditioned properties are simply more complex and therefore harder to acquire. Furthermore, the data from the advanced learners demonstrate that the syntax–pragmatics interface is not a predetermined locus for fossilization (contra e.g. Valenzuela, 2006).
Abstract:
For children with developmental dyslexia, the already challenging task of learning to read is made harder by difficulties with phonological processing and perceptual distortions. As a result, these children may be less motivated to practise their literacy skills. This is problematic in that literacy can only be gained through constant and continued exposure to reading scenarios, and children who are unmotivated to practise are unlikely to develop into fluent readers. Children are active in choosing the books they read and it is therefore important to understand how the typography in those books influences their choice. Research with typically developing children has shown that they have clear opinions about the typography in their reading materials and that these opinions are likely to influence their motivation to read particular books. However, it cannot be assumed that children with reading difficulties read and respond to texts in the same way as children who do not struggle. Through case studies of three children with reading difficulties, preferences for the typography in their reading books are examined. Looking at elements of typesetting such as spacing and size shows that this group of children is aware of differences in typography and that they have preferences for how their reading books are typeset. These children showed a preference for books that resembled those that their peers were reading rather than those that would, by typographic convention, be considered easier to read. This study is part of ongoing research into the development of alternative materials for teaching literacy skills to children with dyslexia.
Abstract:
The role of Distribution Network Operators (DNOs) is becoming more difficult as electric vehicles and electric heating penetrate the network, increasing demand. As a result, it becomes harder for the distribution network's infrastructure to remain within its operating constraints. Energy storage is a potential alternative to conventional network reinforcement, such as upgrading cables and transformers. The research presented in this paper shows that, due to the volatile nature of the LV network, the control approach used for energy storage has a significant impact on performance. This paper presents and compares control methodologies for energy storage where the objective is to achieve the greatest possible peak demand reduction across the day from a pre-specified storage device. The results show the benefits and detriments of specific types of control on a storage device connected to a single phase of an LV network, using aggregated demand profiles based on real smart meter data from individual homes. The research demonstrates an important relationship between how predictable an aggregation is and the control methodology best suited to achieving the objective.
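One simple member of the family of control methodologies compared in such studies is a set-point (threshold) controller: discharge the battery whenever demand exceeds a threshold, recharge in the troughs. The sketch below is an illustrative assumption, not the paper's algorithm; the function name, half-hourly time step, and all parameter values are made up.

```python
def peak_shave(demand, threshold, p_max, e_max, e0=0.0):
    """Net demand (kW) after storage, for a half-hourly demand profile.

    threshold: target set-point (kW); p_max: power rating (kW);
    e_max: energy capacity (kWh); e0: initial stored energy (kWh).
    """
    e = e0
    dt = 0.5  # hours per step
    shaved = []
    for d in demand:
        if d > threshold:  # discharge to clip the peak
            p = min(d - threshold, p_max, e / dt)
            e -= p * dt
            shaved.append(d - p)
        else:              # recharge in the trough
            p = min(threshold - d, p_max, (e_max - e) / dt)
            e += p * dt
            shaved.append(d + p)
    return shaved

profile = [2, 2, 3, 6, 9, 7, 3, 2]  # kW, illustrative single-phase demand
out = peak_shave(profile, threshold=5, p_max=3, e_max=4, e0=4)
print(out)  # peak reduced from 9 kW to 6 kW
```

The weakness this exposes is the one the paper studies: with a volatile, hard-to-predict aggregation, a fixed threshold either exhausts the store before the true peak or leaves capacity unused, so the best control depends on how predictable the demand is.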
Abstract:
Neutral cues that predict emotional events (emotional harbingers) acquire emotional properties and attract attention. Given the importance of emotional harbingers for future survival, it is desirable to flexibly learn new facts about emotional harbingers when needed. However, recent research revealed that it is harder to learn new associations for emotional harbingers than cues that predict non-emotional events (neutral harbingers). In the current study, we addressed whether this impaired association learning for emotional harbingers is altered by one’s awareness of the contingencies between cues and emotional outcomes. Across 3 studies, we found that one’s awareness of the contingencies determines subsequent association learning of emotional harbingers. Emotional harbingers produced worse association learning than neutral harbingers when people were not aware of the contingencies between cues and emotional outcomes, but produced better association learning when people were aware of the contingencies. These results suggest that emotional harbingers do not always suffer from impaired association learning and can show facilitated learning depending on one’s contingency awareness.
Abstract:
Solar Stormwatch was the first space weather citizen science project, the aim of which was to identify and track coronal mass ejections (CMEs) observed by the Heliospheric Imagers aboard the STEREO satellites. The project has now been running for approximately 4 years, with input from >16000 citizen scientists, resulting in a dataset of >38000 time-elongation profiles of CME trajectories, observed over 18 pre-selected position angles. We present our method for reducing this data set into a CME catalogue. The resulting catalogue consists of 144 CMEs over the period January 2007 to February 2010, of which 110 were observed by STEREO-A and 77 by STEREO-B. For each CME, the time-elongation profiles generated by the citizen scientists are averaged into a consensus profile along each position angle over which the event was tracked. We consider this catalogue to be unique, being at present the only citizen-science-generated CME catalogue, tracking CMEs over an elongation range from 4 degrees out to a maximum of approximately 70 degrees. Using single-spacecraft fitting techniques, we estimate the speed, direction, solar source region and latitudinal width of each CME. This shows that, at present, the Solar Stormwatch catalogue (which covers only solar minimum years) contains almost exclusively slow CMEs, with a mean speed of approximately 350 km s−1. The full catalogue is available for public access at www.met.reading.ac.uk/spate/stormwatch. This includes, for each event, the unprocessed time-elongation profiles generated by Solar Stormwatch, the consensus time-elongation profiles and a set of summary plots, as well as the estimated CME properties.
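One common single-spacecraft fitting technique is the fixed-phi approximation, which treats the CME as a point travelling radially at a fixed angle phi from the observer-Sun line, so an elongation angle eps maps to a radial distance r = d sin(eps)/sin(eps + phi). The sketch below fits a constant speed to a time-elongation profile; the profile values and the assumed angle are illustrative, not taken from the Solar Stormwatch catalogue.

```python
import math

AU_KM = 1.496e8  # astronomical unit in km

def fixed_phi_distance(elongation_deg, phi_deg, d_au=1.0):
    """Fixed-phi approximation: radial distance (AU) of a point-like CME
    at elongation eps, propagating at angle phi from the observer-Sun line,
    for an observer at d_au from the Sun."""
    eps = math.radians(elongation_deg)
    phi = math.radians(phi_deg)
    return d_au * math.sin(eps) / math.sin(eps + phi)

# Illustrative time-elongation profile (hours, degrees) and assumed angle.
times = [10, 20, 30, 40]
elongations = [8, 16, 25, 35]
phi = 60

r = [fixed_phi_distance(e, phi) for e in elongations]

# Constant-speed estimate from the least-squares slope of r against t.
n = len(times)
t_mean = sum(times) / n
r_mean = sum(r) / n
slope = (sum((t - t_mean) * (ri - r_mean) for t, ri in zip(times, r))
         / sum((t - t_mean) ** 2 for t in times))  # AU per hour
speed_kms = slope * AU_KM / 3600.0
print(f"estimated speed: {speed_kms:.0f} km/s")
```

In practice phi itself is a fit parameter, chosen to make the converted profile most consistent with constant speed, which is how direction as well as speed is recovered from one spacecraft.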
Abstract:
This paper presents a comparison of various estimates of the open solar flux, deduced from measurements of the interplanetary magnetic field, from the aa geomagnetic index, and from photospheric magnetic field observations. The first two of these estimates are made using the Ulysses discovery that the radial heliospheric field is approximately independent of heliographic latitude; the third makes use of the potential-field source surface method to map the total flux through the photosphere to the open flux at the top of the corona. The uncertainties associated with using the Ulysses result are 5%, but the effects of the assumptions of the potential-field source surface method are harder to evaluate. Nevertheless, the three methods give similar results for the last three solar cycles, when the data sets overlap. In 11-year running means, all three methods reveal that 1987 marked a significant peak in the long-term variation of the open solar flux. This peak is close to the solar minimum between sunspot cycles 21 and 22, and consequently the mean open flux (averaged from minimum to minimum) is similar for these two cycles. However, this similarity between cycles 21 and 22 in no way implies that the open flux is constant. The long-term variation shows that these cycles are fundamentally different, in that the average open flux was rising during cycle 21 (from consistently lower values in cycle 20 and toward the peak in 1987) but was falling during cycle 22 (toward consistently lower values in cycle 23). The estimates from the geomagnetic aa index are unique in that they extend from 1842 onwards (using the Helsinki extension). This variation gives strong anticorrelations, with very high statistical significance levels, with cosmic ray fluxes and with the abundances of the cosmogenic isotopes that they produce.
Thus observations of photospheric magnetic fields, of cosmic ray fluxes, and of cosmogenic isotope abundances all support the long-term drifts in open solar flux reported by Lockwood et al. [1999a, 1999b].
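The step from the Ulysses result to an open-flux estimate can be sketched as follows: if the unsigned radial field |Br| is independent of latitude, the unsigned flux through a heliocentric sphere is 4*pi*r^2*<|Br|>, and the open flux of one polarity is half of that. The example |Br| value below is illustrative, not a measurement from the paper.

```python
import math

AU_M = 1.496e11  # astronomical unit in metres

def open_solar_flux(br_nt, r_au=1.0):
    """Open solar flux (Wb) from the mean unsigned radial IMF component
    <|Br|> (nT) at heliocentric distance r_au, using the Ulysses result
    that |Br| is independent of latitude:
        F_s = 0.5 * 4*pi*r^2 * <|Br|>
    (the factor 0.5 because each polarity carries half the unsigned flux)."""
    r = r_au * AU_M
    return 0.5 * 4.0 * math.pi * r**2 * br_nt * 1e-9

# Illustrative near-Earth value: <|Br|> ~ 2.5 nT gives roughly 3.5e14 Wb.
print(f"{open_solar_flux(2.5):.2e} Wb")
```

The order of magnitude, a few times 10^14 Wb, is what makes the comparison with photospheric and aa-index estimates in the paper possible on a common scale.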
Abstract:
The quantification of uncertainty is an increasingly popular topic, with clear importance for climate change policy. However, uncertainty assessments are open to a range of interpretations, each of which may lead to a different policy recommendation. In the EQUIP project researchers from the UK climate modelling, statistical modelling, and impacts communities worked together on ‘end-to-end’ uncertainty assessments of climate change and its impacts. Here, we use an experiment in peer review amongst project members to assess variation in the assessment of uncertainties between EQUIP researchers. We find overall agreement on key sources of uncertainty but a large variation in the assessment of the methods used for uncertainty assessment. Results show that communication aimed at specialists makes the methods used harder to assess. There is also evidence of individual bias, which is partially attributable to disciplinary backgrounds. However, varying views on the methods used to quantify uncertainty did not preclude consensus on the consequential results produced using those methods. Based on our analysis, we make recommendations for developing and presenting statements on climate and its impacts. These include the use of a common uncertainty reporting format in order to make assumptions clear; presentation of results in terms of processes and trade-offs rather than only numerical ranges; and reporting multiple assessments of uncertainty in order to elucidate a more complete picture of impacts and their uncertainties. This in turn implies research should be done by teams of people with a range of backgrounds and time for interaction and discussion, with fewer but more comprehensive outputs in which the range of opinions is recorded.
Abstract:
Any reduction in global mean near-surface temperature due to a future decline in solar activity is likely to be a small fraction of projected anthropogenic warming. However, variability in ultraviolet solar irradiance is linked to modulation of the Arctic and North Atlantic Oscillations, suggesting the potential for larger regional surface climate effects. Here, we explore possible impacts through two experiments designed to bracket uncertainty in ultraviolet irradiance in a scenario in which future solar activity decreases to Maunder Minimum-like conditions by 2050. Both experiments show regional structure in the wintertime response, resembling the North Atlantic Oscillation, with enhanced relative cooling over northern Eurasia and the eastern United States. For a high-end decline in solar ultraviolet irradiance, the impact on winter northern European surface temperatures over the late twenty-first century could be a significant fraction of the difference in climate change between plausible AR5 scenarios of greenhouse gas concentrations.
Abstract:
The Solar Stormwatch team reviews progress and prospects for this highly effective citizen-science project focused on the Sun.
Abstract:
The aim of this study was to evaluate the effects of inulin as a fat replacer on short dough biscuits and their corresponding doughs. A control formulation, with no replacement, and four formulations in which 10, 20, 30, and 40 % of shortening was replaced by inulin were studied. In the dough, shortening was observed surrounding flour components. At higher fat replacement levels, flour was more available for hydration, leading to significantly (P<0.05) harder doughs: from 2.76 (0.12) N for the 10 % fat-replaced formulation to 5.81 (1.56) N for the 30 % fat-replaced one. Biscuit structure was more continuous than dough structure. A continuous fat layer coated the matrix surface, where starch granules were embedded. In general, weight loss during baking and water activity decreased significantly (P<0.05) as fat replacement increased. Biscuit dimensions and aeration decreased when fat replacement increased, e.g., width gain was +1.20 mm in 10 % fat-replaced biscuits and only +0.32 mm in 40 % fat-replaced ones. Panelists found biscuits with 20 % fat replacement slightly harder than control biscuits. It can be concluded that shortening may be partially replaced, up to 20 %, with inulin. These low-fat biscuits are similar to the control biscuits, and they can have additional health benefits derived from the presence of inulin.