976 results for Continuity


Relevance:

10.00%

Publisher:

Abstract:

The capability of a feature model of immediate memory (Nairne, 1990; Neath, 2000) to predict and account for a relationship between absolute and proportion scoring of immediate serial recall when memory load is varied (the list-length effect, LLE) is examined. The model correctly predicts the novel finding of an LLE in immediate serial order memory similar to that observed with free recall and previously assumed to be attributable to the long-term memory component of that procedure (Glanzer, 1972). The usefulness of formal models as predictive tools and the continuity between short-term serial order and longer term item memory are considered.

Relevance:

10.00%

Publisher:

Abstract:

The purpose of this study is to analyse current data continuity mechanisms employed by the target group of businesses and to identify any inadequacies in the mechanisms as a whole. The questionnaire responses indicate that 47% of respondents perceive backup methodologies as important, with a total of 70% of respondents having some backup methodology already in place. Businesses in Moulton Park perceive the loss of data to have a significant effect upon their business’ ability to function. Only 14% of respondents indicated that loss of data on computer systems would not affect their business at all, with 54% of respondents indicating that there would be a “major effect” (or greater) on their ability to operate. Respondents that had experienced data loss were more likely to have backup methodologies in place (53%) than respondents that had not experienced data loss (18%). Although the small number of respondents clearly affected the quality and conclusiveness of the results returned, the level of backup methodologies in place appears to be proportional to company size. Further investigation into the subject is recommended in order to validate the information gleaned from the small number of respondents.

Relevance:

10.00%

Publisher:

Abstract:

Background: Postnatal depression (PND) is associated with poor cognitive functioning in infancy and the early school years; long-term effects on academic outcome are not known. Method: Children of postnatally depressed (N = 50) and non-depressed mothers (N = 39), studied from infancy, were followed up at 16 years. We examined the effects on General Certificate of Secondary Education (GCSE) exam performance of maternal depression (postnatal and subsequent) and IQ, child sex and earlier cognitive development, and mother–child interactions, using structural equation modelling (SEM). Results: Boys, but not girls, of PND mothers had poorer GCSE results than control children. This was principally accounted for by effects on early child cognitive functioning, which showed strong continuity from infancy. PND had continuing negative effects on maternal interactions through childhood, and these also contributed to poorer GCSE performance. Neither chronic, nor recent, exposure to maternal depression had significant effects. Conclusions: The adverse effects of PND on male infants’ cognitive functioning may persist through development. Continuing difficulties in mother–child interactions are also important, suggesting that both early intervention and continuing monitoring of mothers with PND may be warranted.

Relevance:

10.00%

Publisher:

Abstract:

The rapid expansion of the TMT sector in the late 1990s and more recent growing regulatory and corporate focus on business continuity and security have raised the profile of data centres. Data centres offer a unique blend of occupational, physical and technological characteristics compared to conventional real estate assets. Limited trading and heterogeneity of data centres also cause higher levels of appraisal uncertainty. In practice, the application of conventional discounted cash flow approaches requires information about a wide range of inputs that is difficult to derive from limited market signals or estimate analytically. This paper proposes an approach that uses pricing signals from similar traded cash flows. Based upon ‘the law of one price’, the method draws upon the premise that two identical future cash flows must have the same value now. Given the difficulties of estimating exit values, an alternative is that the expected cash flows of a data centre are analysed over the life cycle of the building, with corporate bond yields used to provide a proxy for the appropriate discount rates for lease income. Since liabilities are quite diverse, a number of proxies are suggested as discount and capitalisation rates, including index-linked, fixed-interest and zero-coupon bonds. Although there are rarely assets that have identical cash flows and some approximation is necessary, the level of appraiser subjectivity is dramatically reduced.
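
To make the like-for-like discounting concrete, the following sketch values a hypothetical lease income stream by discounting each year's cash flow at the yield of a corporate bond of matching maturity. All figures (lease amounts, yields, horizon) are illustrative assumptions, not values from the paper.

```python
# Illustrative sketch of the like-for-like discounting described above: each
# year's lease income is discounted at the yield of a corporate bond with a
# matching maturity ("law of one price" logic). All figures are invented for
# illustration and are not taken from the paper.

def present_value(cash_flows, yields):
    """Discount each period's cash flow at the matching-maturity bond yield."""
    pv = 0.0
    for t, (cf, y) in enumerate(zip(cash_flows, yields), start=1):
        pv += cf / (1.0 + y) ** t
    return pv

# Hypothetical 5-year lease income stream and matching corporate bond yields.
lease_income = [100_000, 100_000, 102_500, 102_500, 105_000]
bond_yields = [0.045, 0.047, 0.049, 0.051, 0.052]

value = present_value(lease_income, bond_yields)
print(f"Appraised value of the lease income stream: {value:,.0f}")
```

Because each period's discount rate is read off a traded instrument rather than estimated, the appraiser's subjective input is confined to the cash-flow forecast itself.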

Relevance:

10.00%

Publisher:

Abstract:

This paper presents a reading of current UK Government policy on recreational access to the countryside of England, in terms of its citizenship and rights agenda. Given the continuity of traditional forms of land tenure and occupation, it is argued that the policy is less a recognition of the changing needs of a transitory society than a revisionist manifesto for resisting external influence and change. This is particularly so in terms of recreation, where the underlying organisation of the physical environment has been appropriated to reproduce a reflection of the social order which increasingly discriminates between culturally legitimate and illegitimate uses of rural space.

Relevance:

10.00%

Publisher:

Abstract:

This article argues that the ethical force of Trinidadian Sam Selvon’s creative writings comes from the particular configuration of living together that he is interested in, both in his Trinidadian novels and his London ones. It reads examples of this living together alongside and in difference that emerges through his focus on the relations between neighbours, friends and lovers, rather than the kinship relations of family. It argues that his works thereby map horizontal zones of attachment and possible solidarities across groupings that reconfigure vertically inscribed genealogical paradigms of belonging to place and each other based on models of historical continuity and inheritance.

Relevance:

10.00%

Publisher:

Abstract:

The characteristics of the boundary layer separating a turbulence region from an irrotational (or non-turbulent) flow region are investigated using rapid distortion theory (RDT). The turbulence region is approximated as homogeneous and isotropic far away from the bounding turbulent/non-turbulent (T/NT) interface, which is assumed to remain approximately flat. Inviscid effects resulting from the continuity of the normal velocity and pressure at the interface, in addition to viscous effects resulting from the continuity of the tangential velocity and shear stress, are taken into account by considering a sudden insertion of the T/NT interface, in the absence of mean shear. Profiles of the velocity variances, turbulent kinetic energy (TKE), viscous dissipation rate ($\varepsilon$), turbulence length scales, and pressure statistics are derived, showing an excellent agreement with results from direct numerical simulations (DNS). Interestingly, the normalized inviscid flow statistics at the T/NT interface do not depend on the form of the assumed TKE spectrum. Outside the turbulent region, where the flow is irrotational (except inside a thin viscous boundary layer), $\varepsilon$ decays as $z^{-6}$, where $z$ is the distance from the T/NT interface. The mean pressure distribution is calculated using RDT, and exhibits a decrease towards the turbulence region due to the associated velocity fluctuations, consistent with the generation of a mean entrainment velocity. The vorticity variance and $\varepsilon$ display large maxima at the T/NT interface due to the inviscid discontinuities of the tangential velocity variances existing there, and these maxima are quantitatively related to the thickness $\delta$ of the viscous boundary layer (VBL). For an equilibrium VBL, the RDT analysis suggests that $\delta \sim \eta$ (where $\eta$ is the Kolmogorov microscale), which is consistent with the scaling law identified in a very recent DNS study for shear-free T/NT interfaces.

Relevance:

10.00%

Publisher:

Abstract:

Pardo, Patie, and Savov derived, under mild conditions, a Wiener-Hopf type factorization for the exponential functional of proper Lévy processes. In this paper, we extend this factorization by relaxing a finite moment assumption as well as by considering the exponential functional for killed Lévy processes. As a by-product, we derive some interesting fine distributional properties enjoyed by a large class of this random variable, such as the absolute continuity of its distribution and the smoothness, boundedness or complete monotonicity of its density. This type of result is then used to derive similar properties for the law of maxima and first passage times of some stable Lévy processes. Thus, for example, we show that for any stable process with $\rho\in(0,\frac{1}{\alpha}-1]$, where $\rho\in[0,1]$ is the positivity parameter and $\alpha$ is the stable index, the first passage time has a bounded and non-increasing density on $\mathbb{R}_+$. We also generate many instances of integral or power series representations for the law of the exponential functional of Lévy processes with one- or two-sided jumps. The proof of our main results requires different devices from those developed by Pardo, Patie, and Savov. It relies in particular on a generalization of a transform recently introduced by Chazal et al., together with some extensions of Wiener-Hopf techniques to killed Lévy processes. The factorizations developed here also allow for further applications, which we only indicate here.
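
As a purely numerical illustration of the object studied here (and not of the paper's analytical factorization), the sketch below approximates the exponential functional $\int_0^{\infty} e^{-\xi_t}\,dt$ by Monte Carlo for the simplest Lévy process with positive drift, a Brownian motion with drift; the drift and volatility values are arbitrary.

```python
# Monte Carlo illustration (not the paper's analytical method) of the
# exponential functional I = \int_0^infty exp(-xi_t) dt for a simple Levy
# process xi_t = mu*t + sigma*B_t with positive drift, so that I is finite
# almost surely. Parameter values are arbitrary.
import numpy as np

rng = np.random.default_rng(0)

def exponential_functional(mu=1.0, sigma=0.5, T=30.0, dt=1e-3, n_paths=2_000):
    """Approximate I by a truncated Riemann sum over [0, T] for simulated paths."""
    n_steps = int(T / dt)
    samples = np.empty(n_paths)
    for i in range(n_paths):
        increments = mu * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_steps)
        xi = np.cumsum(increments)             # path of the Levy process
        samples[i] = np.sum(np.exp(-xi)) * dt  # truncated Riemann sum for I
    return samples

samples = exponential_functional()
print("estimated mean of I:", samples.mean())
# Sanity check: for xi_t = mu*t + sigma*B_t with mu > sigma**2/2,
# E[I] = 1/(mu - sigma**2/2), i.e. about 1.14 for the defaults above.
```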

Relevance:

10.00%

Publisher:

Abstract:

The distinction between learning to perform on an instrument or voice and learning music in a wider sense is one that is made in many countries, and is especially pertinent in England in the context of recent policy developments. This article argues that, whilst this distinction has come to represent curricula based on the opposing paradigms of behaviourist and constructivist approaches to learning, this opposition does not necessarily extend to the pedagogy through which the curricula are taught. A case study of the National Curriculum in England highlights the characteristics of a curriculum based on constructivist principles, along with the impact this has when taught in a behaviourist way. It is argued that conceiving the curriculum in terms of musical competencies and pedagogy in terms of musical understanding would provide a basis for greater continuity and higher quality in the music education experienced by young people.

Relevance:

10.00%

Publisher:

Abstract:

We consider the Dirichlet boundary-value problem for the Helmholtz equation, $\Delta u + \kappa^2 u = 0$, with $\mathrm{Im}\,\kappa > 0$, in an arbitrary bounded or unbounded open set $\Omega \subset \mathbb{R}^n$. Assuming continuity of the solution up to the boundary and a bound on growth at infinity, namely that $|u(x)| < C\exp(\delta|x|)$ for some $C > 0$ and $\delta \leq \mathrm{Im}\,\kappa$, we prove that the homogeneous problem has only the trivial solution. With this result we prove uniqueness results for direct and inverse problems of scattering by a bounded or infinite obstacle.

Relevance:

10.00%

Publisher:

Abstract:

Clinical pathway is an approach to standardise care processes to support the implementation of clinical guidelines and protocols. It is designed to support the management of treatment processes including clinical and non-clinical activities, resources and also financial aspects. It provides detailed guidance for each stage in the management of a patient with the aim of improving the continuity and coordination of care across different disciplines and sectors. However, in the practical treatment process, the lack of knowledge sharing and information accuracy of paper-based clinical pathways burdens health-care staff with a large amount of paperwork. This often results in medical errors, inefficient treatment processes and thus poor-quality medical services. This paper first presents a theoretical underpinning and a co-design research methodology for integrated pathway management by drawing input from organisational semiotics. An approach to integrated clinical pathway management is then proposed, which aims to embed pathway knowledge into treatment processes and existing hospital information systems. The capability of this approach has been demonstrated through a case study in one of the largest hospitals in China. The outcome reveals that medical quality can be improved significantly by classified clinical pathway knowledge and seamless integration with hospital information systems.
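
As a hypothetical sketch of what embedding pathway knowledge into an information system could look like in code, the example below represents a pathway as structured stages and activities that software can query; all class names, fields and the example condition are invented for illustration and are not taken from the paper's design.

```python
# Hypothetical sketch (not the paper's actual design) of how pathway knowledge
# might be represented as structured data so it can be embedded in a hospital
# information system rather than kept on paper. Names and fields are invented.
from dataclasses import dataclass, field

@dataclass
class Activity:
    name: str
    discipline: str          # e.g. nursing, surgery, pharmacy
    clinical: bool = True    # False for administrative/financial steps

@dataclass
class PathwayStage:
    day: int
    activities: list[Activity] = field(default_factory=list)

@dataclass
class ClinicalPathway:
    condition: str
    stages: list[PathwayStage] = field(default_factory=list)

    def activities_for_day(self, day: int) -> list[Activity]:
        """Return the planned activities for a given day of the pathway."""
        return [a for s in self.stages if s.day == day for a in s.activities]

# Illustrative pathway for an invented condition.
pathway = ClinicalPathway(
    condition="elective hernia repair",
    stages=[
        PathwayStage(day=1, activities=[
            Activity("pre-operative assessment", "nursing"),
            Activity("admission paperwork", "administration", clinical=False),
        ]),
        PathwayStage(day=2, activities=[Activity("surgery", "surgery")]),
    ],
)
print([a.name for a in pathway.activities_for_day(1)])
```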

Relevance:

10.00%

Publisher:

Abstract:

Despite the recent expansion in studies of medieval women, uncertainty surrounds their married lives due to the social and legal constraints that existed at that time. Here it is argued that feet of fines provide a lens, albeit partial, on the activities of married women who were effectively managing the disposal and inheritance of their landed estates. At the same time the importance to the purchaser of ensuring the lawful acquisition of the property is also observed. As a result, greater insights into married women and their property in the fourteenth and fifteenth centuries are obtained.

Relevance:

10.00%

Publisher:

Abstract:

Global NDVI data are routinely derived from the AVHRR, SPOT-VGT, and MODIS/Terra earth observation records for a range of applications from terrestrial vegetation monitoring to climate change modeling. This has led to a substantial interest in the harmonization of multisensor records. Most evaluations of the internal consistency and continuity of global multisensor NDVI products have focused on time-series harmonization in the spectral domain, often neglecting the spatial domain. We fill this void by applying variogram modeling (a) to evaluate the differences in spatial variability between 8-km AVHRR, 1-km SPOT-VGT, and 1-km, 500-m, and 250-m MODIS NDVI products over eight EOS (Earth Observing System) validation sites, and (b) to characterize the decay of spatial variability as a function of pixel size (i.e. data regularization) for spatially aggregated Landsat ETM+ NDVI products and a real multisensor dataset. First, we demonstrate that the conjunctive analysis of two variogram properties – the sill and the mean length scale metric – provides a robust assessment of the differences in spatial variability between multiscale NDVI products that are due to spatial (nominal pixel size, point spread function, and view angle) and non-spatial (sensor calibration, cloud clearing, atmospheric corrections, and length of multi-day compositing period) factors. Next, we show that as the nominal pixel size increases, the decay of spatial information content follows a logarithmic relationship, with a stronger fit for the spatially aggregated NDVI products (R² = 0.9321) than for the native-resolution AVHRR, SPOT-VGT, and MODIS NDVI products (R² = 0.5064). This relationship serves as a reference for evaluating the differences in spatial variability and length scales in multiscale datasets at native or aggregated spatial resolutions. The outcomes of this study suggest that multisensor NDVI records cannot be integrated into a long-term data record without proper consideration of all factors affecting their spatial consistency. Hence, we propose an approach for selecting the spatial resolution at which differences in spatial variability between NDVI products from multiple sensors are minimized. This approach provides practical guidance for the harmonization of long-term multisensor datasets.
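
The sill and mean-length-scale analysis can be illustrated generically: the sketch below computes an empirical semivariogram for a gridded field and extracts both properties. It is a simplified illustration on a synthetic field, not the authors' processing chain, and the 1 − 1/e threshold used for the length scale is an assumption.

```python
# Generic illustration (not the authors' exact procedure) of estimating the two
# variogram properties discussed above -- the sill and a mean length scale --
# from a gridded NDVI image. The synthetic field below stands in for real data.
import numpy as np

def empirical_variogram(z, max_lag):
    """Semivariance at integer pixel lags, averaged over row and column directions."""
    gammas = []
    for h in range(1, max_lag + 1):
        d_rows = z[h:, :] - z[:-h, :]
        d_cols = z[:, h:] - z[:, :-h]
        gammas.append(0.5 * np.mean(np.concatenate([d_rows.ravel()**2,
                                                     d_cols.ravel()**2])))
    return np.array(gammas)

def sill_and_length_scale(z, max_lag=50):
    gamma = empirical_variogram(z, max_lag)
    sill = z.var()                                # variogram plateau ~ field variance
    # Lag at which the variogram first reaches (1 - 1/e) of the sill,
    # a simple proxy for the mean length scale of spatial variability.
    reached = np.nonzero(gamma >= (1 - np.e**-1) * sill)[0]
    length_scale = (reached[0] + 1) if reached.size else max_lag
    return sill, length_scale

# Synthetic "NDVI" field: smoothed noise, so it has a finite correlation length.
rng = np.random.default_rng(1)
noise = rng.standard_normal((256, 256))
kernel = np.ones(9) / 9.0
smooth = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 0, noise)
smooth = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, smooth)

sill, length = sill_and_length_scale(smooth)
print(f"sill ~ {sill:.4f}, mean length scale ~ {length} pixels")
```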

Relevance:

10.00%

Publisher:

Abstract:

Historic geomagnetic activity observations have been used to reveal centennial variations in the open solar flux and the near-Earth heliospheric conditions (the interplanetary magnetic field and the solar wind speed). The various methods are in very good agreement for the past 135 years when there were sufficient reliable magnetic observatories in operation to eliminate problems due to site-specific errors and calibration drifts. This review underlines the physical principles that allow these reconstructions to be made, as well as the details of the various algorithms employed and the results obtained. Discussion is included of: the importance of the averaging timescale; the key differences between “range” and “interdiurnal variability” geomagnetic data; the need to distinguish source field sector structure from heliospherically-imposed field structure; the importance of ensuring that regressions used are statistically robust; and uncertainty analysis. The reconstructions are exceedingly useful as they provide calibration between the in-situ spacecraft measurements from the past five decades and the millennial records of heliospheric behaviour deduced from measured abundances of cosmogenic radionuclides found in terrestrial reservoirs. Continuity of open solar flux, using sunspot number to quantify the emergence rate, is the basis of a number of models that have been very successful in reproducing the variation derived from geomagnetic activity. These models allow us to extend the reconstructions back to before the development of the magnetometer and to cover the Maunder minimum. Allied to the radionuclide data, the models are revealing much about how the Sun and heliosphere behaved outside of grand solar maxima and are providing a means of predicting how solar activity is likely to evolve now that the recent grand maximum (that had prevailed throughout the space age) has come to an end.
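
The continuity modelling of open solar flux mentioned above can be sketched in a few lines: the flux obeys a balance between an emergence term scaled from sunspot number and a loss term. The sketch below is illustrative only; the emergence coefficient, loss timescale and synthetic sunspot series are assumptions, not values from the reconstructions reviewed here.

```python
# Minimal sketch of the kind of open solar flux continuity model described above:
#   dF/dt = S(t) - F / tau
# with the source S(t) scaled from sunspot number R(t) and a constant loss
# timescale tau. The coefficients and sunspot series are illustrative only.
import numpy as np

def integrate_open_flux(sunspot_number, dt_years=1/12, k=0.02, tau_years=3.0, F0=3.0):
    """Forward-Euler integration of dF/dt = k*R(t) - F/tau (flux in arbitrary units)."""
    F = np.empty(len(sunspot_number))
    F[0] = F0
    for i in range(1, len(sunspot_number)):
        dF = k * sunspot_number[i - 1] - F[i - 1] / tau_years
        F[i] = F[i - 1] + dF * dt_years
    return F

# Synthetic ~11-year sunspot cycle, monthly resolution, three cycles.
t = np.arange(0, 33, 1/12)
R = 80.0 * (1 + np.sin(2 * np.pi * t / 11.0 - np.pi / 2)) / 2

F = integrate_open_flux(R)
print(f"open flux range over the run: {F.min():.2f} to {F.max():.2f} (arbitrary units)")
```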

Relevance:

10.00%

Publisher:

Abstract:

Aim Earth observation (EO) products are a valuable alternative to spectral vegetation indices. We discuss the availability of EO products for analysing patterns in macroecology, particularly related to vegetation, on a range of spatial and temporal scales. Location Global. Methods We discuss four groups of EO products: land cover/cover change, vegetation structure and ecosystem productivity, fire detection, and digital elevation models. We address important practical issues arising from their use, such as assumptions underlying product generation, product accuracy and product transferability between spatial scales. We investigate the potential of EO products for analysing terrestrial ecosystems. Results Land cover, productivity and fire products are generated from long-term data using standardized algorithms to improve reliability in detecting change of land surfaces. Their global coverage renders them useful for macroecology. Their spatial resolution (e.g. GLOBCOVER vegetation, 300 m; MODIS vegetation and fire, ≥ 500 m; ASTER digital elevation, 30 m) can be a limiting factor. Canopy structure and productivity products are based on physical approaches and thus are independent of biome-specific calibrations. Active fire locations are provided in near-real time, while burnt area products show actual area burnt by fire. EO products can be assimilated into ecosystem models, and their validation information can be employed to calculate uncertainties during subsequent modelling. Main conclusions Owing to their global coverage and long-term continuity, EO end products can significantly advance the field of macroecology. EO products allow analyses of spatial biodiversity, seasonal dynamics of biomass and productivity, and consequences of disturbances on regional to global scales. Remaining drawbacks include limited interoperability between products from different sensors and accuracy issues due to differences between the assumptions and models underlying the generation of different EO products. Our review explains the nature of EO products and how they relate to particular ecological variables across scales to encourage their wider use in ecological applications.