12 results for "Adolescence in conflict with the law"
in Aston University Research Archive
Abstract:
It is an old adage that "you cannot manage what you cannot measure", yet pharmaceutical managers annually commit 30 per cent of turnover to the promotion of their products without measuring the effect of their investment. This unsatisfactory state of affairs has persisted for over 20 years and, judging by comments at the recent Sales Force Effectiveness conference, seems set to continue.
Abstract:
The recent White Paper, 'Modern Local Government: In Touch with the People', summarised Labour's project to modernise local government and to renew local democracy. Through the mediating concepts of accountability, responsiveness and representation, it is argued that the modernisation project will renew local authorities' political authority and legitimacy. However, a critical review of the White Paper and other Government publications which discuss the modernisation of local government suggests that there are discrepancies between the claims to improve democratic local government and the role of councils in the provision of nationally decided and funded welfare services.
Abstract:
Attitudes towards the environment can be manifest in two broad categories, namely anthropocentric and ecocentric. The former regards nature as of value only insofar as it is useful to humanity, whereas the latter assigns intrinsic value to natural entities. Industrial society can be characterised as being dominated by anthropocentrism, which leads to the assumption that a majority of people hold anthropocentric values. However, research shows the most widely held values are ecocentric, which implies that many people's actions are at variance with their values. Furthermore, policy relating to environmental issues is predominantly anthropocentric, which implies it is failing to take account of the values of the majority. Research among experts involved in policy formulation has shown that their values, often ecocentric, are excluded from the policy process. The genetic modification of food can be categorised as anthropocentric, which implies that the technique is in conflict with widely held ecocentric values. This thesis examines data collected from interviews with individuals who have an influence on the debate surrounding the introduction of genetically modified foods, and can be considered 'experts'. Each interviewee is categorised according to whether their values and actions are ecocentric or anthropocentric, and the linkages between the two and the arguments used to justify their positions are explored. Particular emphasis is placed on interviewees who have ecocentric values but act professionally in an anthropocentric way. Finally, common themes are drawn out, and the features the arguments used by the interviewees have in common are outlined.
Abstract:
The goal of this project was to investigate the neural correlates of reading impairment in dyslexia as hypothesised by the main theories – the phonological deficit, visual magnocellular deficit and cerebellar deficit theories – with emphasis on individual differences. This research took a novel approach by: 1) contrasting the predictions in one sample of participants with dyslexia (DPs); 2) using a multiple-case study (and between-group comparisons) to investigate differences in BOLD between each DP and the controls (CPs); 3) demonstrating a possible relationship between reading impairment and its hypothesised neural correlates by using fMRI and a reading task. The multiple-case study revealed that the neural correlates of reading in dyslexia are not in all cases in agreement with the predictions of a single theory. The results show striking individual differences: even where the neural correlates of reading in two DPs are consistent with the same theory, the areas can differ. A DP can exhibit under-engagement in an area in word but not in pseudoword reading, and vice versa, demonstrating that underactivation in that area cannot be interpreted as a ‘developmental lesion’. Additional analyses revealed complex results. Within-group analyses between behavioural measures and BOLD showed correlations in the predicted regions, correlations in areas outside the ROIs, and a lack of correlations in some predicted areas. Comparisons of subgroups which differed on the Orthography Composite supported the MDT, but only for Words. The results suggest that phonological scores are not a sufficient predictor of the under-engagement of phonological areas during reading. DPs and CPs exhibited correlations between the Purdue Pegboard Composite and BOLD in cerebellar areas only for Pseudowords. Future research into reading in dyslexia should use a more holistic approach, involving genetic and environmental factors, gene-by-environment interaction, and comorbidity with other disorders. It is argued that multidisciplinary research within the multiple-deficit model holds significant promise here.
Abstract:
This thesis describes the Generative Topographic Mapping (GTM) --- a non-linear latent variable model, intended for modelling continuous, intrinsically low-dimensional probability distributions, embedded in high-dimensional spaces. It can be seen as a non-linear form of principal component analysis or factor analysis. It also provides a principled alternative to the self-organizing map --- a widely established neural network model for unsupervised learning --- resolving many of its associated theoretical problems. An important, potential application of the GTM is visualization of high-dimensional data. Since the GTM is non-linear, the relationship between data and its visual representation may be far from trivial, but a better understanding of this relationship can be gained by computing the so-called magnification factor. In essence, the magnification factor relates the distances between data points, as they appear when visualized, to the actual distances between those data points. There are two principal limitations of the basic GTM model. The computational effort required will grow exponentially with the intrinsic dimensionality of the density model. However, if the intended application is visualization, this will typically not be a problem. The other limitation is the inherent structure of the GTM, which makes it most suitable for modelling moderately curved probability distributions of approximately rectangular shape. When the target distribution is very different to that, the aim of maintaining an `interpretable' structure, suitable for visualizing data, may come in conflict with the aim of providing a good density model. The fact that the GTM is a probabilistic model means that results from probability theory and statistics can be used to address problems such as model complexity. Furthermore, this framework provides solid ground for extending the GTM to wider contexts than that of this thesis.
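The magnification-factor idea above can be illustrated with a minimal numerical sketch. This is not the GTM implementation itself: the one-dimensional embedding `y(x)` below is purely hypothetical, chosen so that the local stretching of latent distances in data space is easy to see. In general the magnification factor at a latent point is sqrt(det(JᵀJ)) for the Jacobian J of the mapping, which in one latent dimension reduces to the norm of dy/dx.

```python
import numpy as np

# Toy illustration: a non-linear map y(x) embeds a 1-D latent space
# in a 2-D data space; the magnification factor is |dy/dx|, i.e.
# sqrt(det(J^T J)) specialised to one latent dimension.

def y(x):
    """Hypothetical non-linear embedding of the latent variable x."""
    return np.array([x, np.sin(x)])

def magnification(x, h=1e-6):
    """Numerical magnification factor |dy/dx| at latent point x."""
    J = (y(x + h) - y(x - h)) / (2 * h)   # finite-difference Jacobian
    return np.sqrt(J @ J)                 # Euclidean norm of dy/dx

# Where the embedded curve is steep, latent distances are stretched
# in data space; where it is flat, they are preserved.
print(magnification(0.0))        # ~ sqrt(1 + cos(0)^2) = sqrt(2)
print(magnification(np.pi / 2))  # ~ 1 (sin is flat at its peak)
```

Visualization tools built on such a model can overlay this factor on the latent grid to warn the viewer where apparent cluster separations are artefacts of local stretching.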
Abstract:
PURPOSE. The purposes of the present study were to assess the effect of a sympathetic inhibitory pharmacologic agent, timolol maleate, on the magnitude of nearwork-induced transient myopia (NITM) and its decay in different refractive groups for an extended near task duration and to determine the proportion of the young adult population manifesting effective sympathetic access under naturalistic closed-loop viewing conditions. METHODS. Ten subjects with emmetropia and 10 with myopia were tested. They read binocularly for 1 hour at a distance of 35 to 40 cm. NITM was calculated as the difference between the distance refractive state measured immediately after the reading task and that measured before the task. All subjects received timolol maleate to block the sympathetic nervous system and betaxolol as a control agent in independent test sessions separated by at least 3 days. Forty minutes after drug instillation, the NITM measurement procedure was repeated. RESULTS. Initial NITM magnitude was larger in subjects with myopia than in subjects with emmetropia before and after timolol instillation. Furthermore, NITM magnitude in subjects with sympathetic access was increased after timolol instillation. In contrast, with the control agent betaxolol, there was no increase. NITM decay duration to baseline was increased after timolol instillation in the subjects with myopia only. Only 15% of the subjects (n = 3 subjects with myopia) demonstrated effective and significant access to sympathetic facility. CONCLUSIONS. Subjects with myopia demonstrated an increase in decay duration with timolol, thus suggesting impaired sympathetic inhibition of accommodation. This may be a precursor for myopia progression in some persons.
Abstract:
Models at runtime can be defined as abstract representations of a system, including its structure and behaviour, which exist in tandem with the given system during the actual execution time of that system. Furthermore, these models should be causally connected to the system being modelled, offering a reflective capability. Significant advances have been made in recent years in applying this concept, most notably in adaptive systems. In this paper we argue that a similar approach can also be used to support the dynamic generation of software artefacts at execution time. An important area where this is relevant is the generation of software mediators to tackle the crucial problem of interoperability in distributed systems. We refer to this approach as emergent middleware, representing a fundamentally new approach to resolving interoperability problems in the complex distributed systems of today. In this context, the runtime models are used to capture meta-information about the underlying networked systems that need to interoperate, including their interfaces and additional knowledge about their associated behaviour. This is supplemented by ontological information to enable semantic reasoning. This paper focuses on this novel use of models at runtime, examining in detail the nature of such runtime models coupled with consideration of the supportive algorithms and tools that extract this knowledge and use it to synthesise the appropriate emergent middleware.
Abstract:
We propose two algorithms involving the relaxation of either the given Dirichlet data or the prescribed Neumann data on the over-specified boundary, in the case of the alternating iterative algorithm of Kozlov et al. (1991) applied to Cauchy problems for the modified Helmholtz equation. A convergence proof of these relaxation methods is given, along with a stopping criterion. The numerical results obtained using these procedures, in conjunction with the boundary element method (BEM), show the numerical stability, convergence, consistency and computational efficiency of the proposed methods.
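The structure of an alternating iteration with relaxation can be sketched on a one-dimensional toy problem. This is only an illustration of the iteration pattern, not the paper's BEM-based procedure: the equation u'' - k²u = 0 on [0, 1] is over-specified at x = 0 (both u(0) and u'(0) given) and unspecified at x = 1, each mixed problem is solved in closed form via u = A cosh(kx) + B sinh(kx), and the Dirichlet guess at x = 1 is updated with a relaxation factor ω. All parameter values are illustrative.

```python
import numpy as np

# Toy 1-D alternating iteration with relaxation for the Cauchy
# problem of the modified Helmholtz equation u'' - k^2 u = 0.
k, f0, g0 = 1.0, 1.0, 1.0            # wavenumber; Cauchy data u(0), u'(0)
ch, sh = np.cosh(k), np.sinh(k)
true_u1 = f0 * ch + (g0 / k) * sh    # exact (unknown) trace u(1)

def iterate(v, omega):
    """One alternating sweep, relaxing the Dirichlet guess v ~ u(1)."""
    # Step 1: use the Neumann datum u'(0) = g0 and the guess u(1) = v.
    B1 = g0 / k
    A1 = (v - B1 * sh) / ch
    du1 = k * (A1 * sh + B1 * ch)    # resulting Neumann trace u'(1)
    # Step 2: use the Dirichlet datum u(0) = f0 and u'(1) from step 1.
    A2 = f0
    B2 = (du1 / k - A2 * sh) / ch
    v_new = A2 * ch + B2 * sh        # updated Dirichlet trace u(1)
    return omega * v_new + (1 - omega) * v   # relaxation step

v = 0.0                              # initial guess for u(1)
for _ in range(60):
    v = iterate(v, omega=0.8)
print(abs(v - true_u1))              # shrinks geometrically to ~0
```

In this toy the unrelaxed iteration contracts with rate tanh²(k) < 1, and ω trades per-sweep progress against damping; in the boundary element setting each "solve" is a discretised mixed boundary value problem rather than a closed-form formula, and the stopping criterion replaces the fixed sweep count used here.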
Abstract:
Loss of central vision caused by age-related macular degeneration (AMD) is a problem affecting increasingly large numbers of people within the ageing population. AMD is the leading cause of blindness in the developed world, with estimates of over 600,000 people affected in the UK. Central vision loss can be devastating for the sufferer, with vision loss impacting on the ability to carry out daily activities. In particular, inability to read is linked to higher rates of depression in AMD sufferers compared to age-matched controls. Methods to improve reading ability in the presence of central vision loss will help maintain independence and quality of life for those affected. Various attempts to improve reading with central vision loss have been made. Most textual manipulations, including font size, have led to only modest gains in reading speed. Previous experimental work and theoretical arguments on the spatial integrative properties of the peripheral retina suggest that ‘visual crowding’ may be a major factor contributing to inefficient reading. Crowding refers to the phenomenon in which juxtaposed targets viewed eccentrically may be difficult to identify. Manipulating the text spacing of reading material may be a simple method that reduces crowding and benefits reading ability in macular disease patients. In this thesis the effect of textual manipulation on reading speed was investigated, firstly for normally sighted observers using eccentric viewing, and secondly for observers with central vision loss. Test stimuli mimicked normal reading conditions by using whole sentences that required normal saccadic eye movements and observer comprehension. Preliminary measures on normally-sighted observers (n = 2) used forced-choice procedures in conjunction with the method of constant stimuli.
Psychometric functions relating the proportion of correct responses to exposure time were determined for text size, font type (Lucida Sans and Times New Roman) and text spacing, with threshold exposure time (75% correct responses) used as a measure of reading performance. The results of these initial measures were used to derive an appropriate search space, in terms of text spacing, for assessing reading performance in AMD patients. The main clinical measures were completed on a group of macular disease sufferers (n = 24). Firstly, high and low contrast reading acuity and critical print size were measured using modified MNREAD test charts, and secondly, the effect of word and line spacing was investigated using a new test, designed specifically for this study, called the Equal Readability Passages (ERP) test. The results from normally-sighted observers were in close agreement with those from the group of macular disease sufferers. Results show that: (i) optimum reading performance was achieved when using both double line and double word spacing; (ii) the effect of line spacing was greater than the effect of word spacing; (iii) a text size of approximately 0.85° is sufficiently large for reading at 5° eccentricity. In conclusion, the results suggest that crowding is detrimental to reading with peripheral vision, and its effects can be minimized with a modest increase in text spacing.
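The notion of a 75%-correct threshold exposure time can be made concrete with a small sketch. The abstract does not specify the fitting procedure used, so a logistic psychometric function with a 50% guessing floor (as in a two-alternative forced-choice task) is assumed here purely for illustration; the midpoint and slope values are hypothetical.

```python
import numpy as np

# Sketch: in a 2AFC task, performance rises from 50% (chance) towards
# 100% with exposure time t.  A logistic shape is assumed here.

def psychometric(t, t_mid, slope):
    """Proportion correct at exposure time t (floor at 0.5 for 2AFC)."""
    return 0.5 + 0.5 / (1.0 + np.exp(-slope * (t - t_mid)))

def threshold(p, t_mid, slope):
    """Exposure time yielding proportion correct p (analytic inverse)."""
    q = (p - 0.5) / 0.5              # rescale to the (0, 1) range
    return t_mid + np.log(q / (1 - q)) / slope

# With a midpoint of 200 ms, the 75% threshold falls exactly at the
# midpoint, since 75% is halfway between the 50% floor and 100%.
t75 = threshold(0.75, t_mid=200.0, slope=0.02)
print(t75)   # 200.0
```

In practice the function's parameters would be fitted to the observed proportions at each exposure time (the method of constant stimuli mentioned above) before inverting for the 75% point.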
Abstract:
Substantial altimetry datasets collected by different satellites have only become available during the past five years, but the future will bring a variety of new altimetry missions, both parallel and consecutive in time. The characteristics of each produced dataset vary with the different orbital heights and inclinations of the spacecraft, as well as with the technical properties of the radar instrument. An integral analysis of datasets with different properties offers advantages both in terms of data quantity and data quality. This thesis is concerned with the development of the means for such integral analysis, in particular for dynamic solutions in which precise orbits for the satellites are computed simultaneously. The first half of the thesis discusses the theory and numerical implementation of dynamic multi-satellite altimetry analysis. The most important aspect of this analysis is the application of dual satellite altimetry crossover points as a bi-directional tracking data type in simultaneous orbit solutions. The central problem is that the spatial and temporal distributions of the crossovers are in conflict with the time-organised nature of traditional solution methods. Their application to the adjustment of the orbits of both satellites involved in a dual crossover therefore requires several fundamental changes of the classical least-squares prediction/correction methods. The second part of the thesis applies the developed numerical techniques to the problems of precise orbit computation and gravity field adjustment, using the altimetry datasets of ERS-1 and TOPEX/Poseidon. Although the two datasets can be considered less compatible than those of planned future satellite missions, the obtained results adequately illustrate the merits of a simultaneous solution technique.
In particular, the geographically correlated orbit error is partially observable from a dataset consisting of crossover differences between two sufficiently different altimetry datasets, while being unobservable from the analysis of altimetry data of either satellite individually. This error signal, which has a substantial gravity-induced component, can be employed advantageously in simultaneous solutions for the two satellites in which the harmonic coefficients of the gravity field model are also estimated.
Abstract:
The objective was to study beta-amyloid (Abeta) deposition in dementia with Lewy bodies (DLB) with Alzheimer's disease (AD) pathology (DLB/AD). The size frequency distributions of the Abeta deposits were studied and fitted by log-normal and power-law models. The patients were ten clinically and pathologically diagnosed DLB/AD cases. Size distributions had a single peak and were positively skewed, similar to those described in AD and Down's syndrome. Size distributions had smaller means in DLB/AD than in AD. Log-normal and power-law models were fitted to the size distributions of the classic and diffuse deposits, respectively. Size distributions of Abeta deposits were similar in DLB/AD and AD. Size distributions of the diffuse deposits were fitted by a power-law model, suggesting that aggregation/disaggregation of Abeta was the predominant factor, whereas the classic deposits were fitted by a log-normal distribution, suggesting that surface diffusion was important in the pathogenesis of the classic deposits.
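The kind of model comparison described above can be sketched as a maximum-likelihood fit of both candidate distributions to a sample of deposit sizes. The abstract does not detail the fitting method, so the sketch below uses standard closed-form estimators (log-moment estimates for the log-normal; the continuous power-law estimator with x_min taken as the sample minimum) on synthetic data standing in for measured Abeta deposit sizes.

```python
import numpy as np

# Sketch: compare log-normal and power-law fits to deposit sizes by
# total log-likelihood.  Sizes here are synthetic (log-normal), so the
# log-normal model should attain the higher likelihood.
rng = np.random.default_rng(0)
sizes = rng.lognormal(mean=3.0, sigma=0.5, size=500)

def loglik_lognormal(x):
    """ML log-normal fit via moments of log(x); total log-likelihood."""
    logx = np.log(x)
    mu, sigma = logx.mean(), logx.std()
    return np.sum(-np.log(x * sigma * np.sqrt(2 * np.pi))
                  - (logx - mu) ** 2 / (2 * sigma ** 2))

def loglik_powerlaw(x):
    """Continuous power-law MLE with x_min = min(x)."""
    xmin = x.min()
    alpha = 1.0 + len(x) / np.sum(np.log(x / xmin))
    return np.sum(np.log((alpha - 1) / xmin) - alpha * np.log(x / xmin))

print(loglik_lognormal(sizes) > loglik_powerlaw(sizes))   # True
```

A single-peaked, positively skewed histogram favours the log-normal form, while a monotonically decaying histogram favours the power law; comparing the two likelihoods (or information criteria built from them) formalises the classification of classic versus diffuse deposits reported above.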
Abstract:
This piece argued that the accepted orthodoxy, namely the requirement that each individual piece of property be separately segregated for a valid trust to exist, is unsupported by the case law, and that there is nothing wrong in principle or theory with a trust over unsegregated property.