6 results for Statistical Information on Recidivism (SIR)

in Aston University Research Archive


Relevance:

100.00%

Abstract:

Over recent years, evidence has been accumulating in favour of the importance of long-term information as a variable which can affect the success of short-term recall. Lexicality, word frequency, imagery and meaning have all been shown to augment short-term recall performance. Two competing theories as to the causes of this long-term memory influence are outlined and tested in this thesis. The first approach is the order-encoding account, which ascribes the effect to the usage of resources at encoding, hypothesising that word lists which require less effort to process will benefit from increased levels of order encoding, in turn enhancing recall success. The alternative view, trace redintegration theory, suggests that order is automatically encoded phonologically, and that long-term information can only influence the interpretation of the resultant memory trace. The free recall experiments reported here attempted to determine the importance of order encoding as a facilitatory framework and to determine the locus of the effects of long-term information in free recall. Experiments 1 and 2 examined the effects of word frequency and semantic categorisation over a filled delay, and Experiments 3 and 4 did the same for immediate recall. Free recall was improved by both long-term factors tested. Order information was not used over a short filled delay, but was evident in immediate recall. Furthermore, it was found that both long-term factors increased the amount of order information retained. Experiment 5 induced an order-encoding effect over a filled delay, leaving a picture of short-term processes that are closely associated with long-term processes and that fit conceptions of short-term memory as part of language processing better than either the order-encoding or the retrieval-based models. Experiments 6 and 7 aimed to determine to what extent phonological processes were responsible for the pattern of results observed. Articulatory suppression affected the encoding of order information, whereas speech rate had no direct influence, suggesting that ease of lexical access is the most important factor in the influence of long-term memory on immediate recall tasks. The evidence presented in this thesis does not offer complete support for either the retrieval-based account or the order-encoding account of long-term influence. Instead, the evidence sits best with models based upon language processing. The path urged for future research is to find ways in which this diffuse model can be better specified, so that it can take account of the versatility of the human brain.

Relevance:

100.00%

Abstract:

Using prescription analyses and questionnaires, the way drug information was used by general medical practitioners during the drug adoption process was studied. Three new drugs were considered: an innovation and two 'me-too' products. The innovation was accepted by general practitioners via a contagion process, with information passing among doctors. The 'me-too' preparations were accepted more slowly and by a process which did not include the contagion effect. 'Industrial' information, such as direct mail, was used more at the 'awareness' stage of the adoption process, while 'professional' sources of information, such as articles in medical journals, were used more to evaluate a new product. It was shown that 'industrial' information was preferred by older, single-practice doctors who did not specialise, had a first degree only and did not dispense their own prescriptions. Doctors were divided into early- and late-prescribers using the date they first prescribed the innovatory drug. Their approach to drug information sources was further studied, and it was shown that early-prescribers issued slightly more prescriptions per month, had larger list sizes, read fewer journals and generally rated industrial sources of information more highly than late-prescribers. The prescribing habits of three consultant rheumatologists were analysed and compared with those of the general practitioners in the community which they served. Very little association was noted, and the influence of the consultant on the prescribing habits of general practitioners was concluded to be low. The consultant's influence was suggested to comprise two components, active and passive, the active component being the more influential. Journal advertising and advertisement placement were studied for one of the 'me-too' drugs. It was concluded that advertisement placement should be based on the reading patterns of general practitioners and not on ad-hoc data gathered by representatives, as was then the practice. A model was proposed relating the 'time to prescribe' a new drug to the variables suggested throughout this work. Four of these variables were shown to be significant: the list size, the medical age of the prescriber, the number of new preparations prescribed in a given time and the number of partners in the practice.
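The abstract names the four significant variables but not the model's functional form; the sketch below shows how such a linear 'time to prescribe' model could be fitted in Python. All data values, units and coefficients here are hypothetical illustrations, not the thesis's results.

```python
# Hypothetical sketch: a linear model relating "time to prescribe" a new
# drug to the four variables the thesis found significant. The data are
# invented for illustration only.
import numpy as np

# Columns: list size, medical age (years since qualification),
# new preparations prescribed in a given time, number of partners.
X = np.array([
    [2100, 12, 4, 3],
    [3400, 25, 2, 1],
    [1800,  8, 6, 5],
    [2900, 18, 3, 2],
    [2500, 30, 1, 1],
    [3100, 15, 5, 4],
], dtype=float)
y = np.array([14.0, 30.0, 8.0, 21.0, 36.0, 11.0])  # weeks until first prescription

# Ordinary least squares with an intercept term.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

print("intercept:", coef[0])
print("slopes (list size, medical age, new preps, partners):", coef[1:])
```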

Relevance:

100.00%

Abstract:

Nowadays, road safety and traffic congestion are major concerns worldwide, which is why research on vehicular communication is vital. In static scenarios, vehicles behave much as nodes in an office network, transmitting without moving and with no defined position. This paper analyses the impact of context information on existing popular rate adaptation algorithms. The simulations, performed in MATLAB over the IEEE 802.11p wireless standard, cover both static and mobile cases: in the static scenario vehicles do not move and have no defined positions, while in the mobile case vehicles move with uniformly selected speeds and randomized positions. Network performance is analysed with and without context information. The results show that in the mobile case, when context information is used, system performance improves for all three rate adaptation algorithms. This can be explained by range checking: when many vehicles are out of communication range, fewer vehicles contend for network resources, improving network performance. © 2013 IEEE.
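To make the range-checking idea concrete, here is a minimal sketch in Python (the authors worked in MATLAB): a sender uses position context to exclude out-of-range vehicles before channel contention. The range value and positions are assumed for illustration and are not taken from the paper.

```python
# Illustrative sketch of range checking: before contending for the
# channel, a sender ignores vehicles whose positions (known from
# context information) fall outside communication range.
import math
import random

COMM_RANGE_M = 300.0  # assumed communication range, not the paper's value

def in_range(a, b, limit=COMM_RANGE_M):
    """Euclidean range check between two (x, y) positions in metres."""
    return math.dist(a, b) <= limit

random.seed(1)
# Mobile case: vehicles at randomized positions along a 1 km stretch.
vehicles = [(random.uniform(0, 1000), random.uniform(-10, 10)) for _ in range(20)]
sender = vehicles[0]

# With range checking, only in-range vehicles contend for the channel,
# so fewer contenders share the medium and performance improves.
contenders = [v for v in vehicles[1:] if in_range(sender, v)]
print(f"{len(contenders)} of {len(vehicles) - 1} vehicles contend for the channel")
```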

Relevance:

100.00%

Abstract:

A broad-based approach has been used to assess the impact of discharges to rivers from surface water sewers, with the primary objective of determining whether such discharges have a measurable impact on water quality. Three parameters, each reflecting the effects of intermittent pollution, were included in a fieldwork programme of biological and chemical sampling and analysis which covered 47 sewer outfall sites. These parameters were the numbers and types of benthic macroinvertebrates upstream and downstream of the outfalls, the concentrations of metals in sediments, and the concentrations of metals in algae upstream and downstream of the outfalls. Information on the sewered catchments was collected from Local Authorities and by observation at the time of sampling, and included catchment areas, land uses, evidence of connection to the foul system, and receiving water quality classification. The methods used for site selection, sampling, laboratory analysis and data analysis are fully described, and the survey results presented. Statistical and graphical analysis of the biological data, with the aid of BMWP scores, showed that there was a small but persistent fall in water quality downstream of the studied outfalls. Further analysis including the catchment information indicated that initial water quality, sewered catchment size, receiving stream size, and catchment land use were important factors in determining the impact. Finally, the survey results were used to produce guidelines for the estimation of surface water sewer discharge impacts from knowledge of the catchment characteristics, so that planning authorities can consider water quality when new drainage systems are designed.
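For readers unfamiliar with BMWP scoring, the upstream/downstream comparison reduces to summing fixed per-family scores over the macroinvertebrate taxa found at each site. The sketch below illustrates this in Python; the family scores follow the standard BMWP tables as best I recall them and should be treated as indicative.

```python
# Minimal sketch of BMWP scoring: each family present contributes a
# fixed score, and the site score is the sum. Scores below are the
# standard BMWP values as I recall them; treat them as illustrative.
BMWP_SCORES = {
    "Heptageniidae": 10,  # pollution-sensitive mayflies
    "Gammaridae": 6,
    "Baetidae": 4,
    "Asellidae": 3,
    "Chironomidae": 2,    # pollution-tolerant midges
    "Oligochaeta": 1,
}

def bmwp_score(families_present):
    """Sum the scores of all recognised families found at a site."""
    return sum(BMWP_SCORES.get(f, 0) for f in families_present)

upstream = ["Heptageniidae", "Gammaridae", "Baetidae", "Chironomidae"]
downstream = ["Gammaridae", "Asellidae", "Chironomidae", "Oligochaeta"]

print("upstream BMWP:", bmwp_score(upstream))      # 22
print("downstream BMWP:", bmwp_score(downstream))  # 12
# A lower downstream score indicates a fall in water quality below the outfall.
```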

Relevance:

100.00%

Abstract:

Networking encompasses a variety of tasks related to the communication of information on networks; it has a substantial economic and societal impact on a broad range of areas including transportation systems, wired and wireless communications and a range of Internet applications. As transportation and communication networks become increasingly complex, the ever-increasing demand for congestion control, higher traffic capacity, quality of service, robustness and reduced energy consumption requires new tools and methods to meet these conflicting requirements. The new methodology should serve for gaining a better understanding of the properties of networking systems at the macroscopic level, as well as for the development of new principled optimization and management algorithms at the microscopic level. Methods of statistical physics seem best placed to provide new approaches, as they have been developed specifically to deal with nonlinear large-scale systems. This review aims to present an overview of tools and methods that have been developed within the statistical physics community and that can be readily applied to address the emerging problems in networking. These include diffusion processes, methods from disordered systems and polymer physics, and probabilistic inference, which have direct relevance to network routing, file and frequency distribution, the exploration of network structures and vulnerability, and various other practical networking applications. © 2013 IOP Publishing Ltd.
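As one concrete example of the diffusion processes the review connects to network exploration and routing, a simple random walk on a graph has a stationary distribution proportional to node degree. The Python sketch below demonstrates this on a made-up graph; it is illustrative only and not drawn from the review.

```python
# Illustrative sketch of a diffusion process on a network: a random walk
# whose long-run visit frequencies are proportional to node degree.
import random
from collections import Counter

graph = {
    "A": ["B", "C"],
    "B": ["A", "C", "D"],
    "C": ["A", "B", "D"],
    "D": ["B", "C", "E"],
    "E": ["D"],
}

random.seed(0)
node = "A"
visits = Counter()
for _ in range(100_000):
    node = random.choice(graph[node])  # step to a uniformly random neighbour
    visits[node] += 1

total = sum(visits.values())
for n in sorted(graph):
    print(f"{n}: visit frequency {visits[n] / total:.3f}, degree {len(graph[n])}")
# Frequencies converge to degree / (2 * number_of_edges), the walk's
# stationary distribution.
```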

Relevance:

100.00%

Abstract:

Data fluctuation in multiple measurements of Laser-Induced Breakdown Spectroscopy (LIBS) greatly affects the accuracy of quantitative analysis. A new LIBS quantitative analysis method based on the Robust Least Squares Support Vector Machine (RLS-SVM) regression model is proposed. The usual way to enhance analysis accuracy is to improve the quality and consistency of the emission signal, such as by averaging the spectral signals or applying spectrum standardization over a number of laser shots. The proposed method focuses instead on enhancing the robustness of the quantitative analysis regression model. The RLS-SVM regression model originates from the Weighted Least Squares Support Vector Machine (WLS-SVM) but has an improved segmented weighting function and residual error calculation based on the statistical distribution of the measured spectral data. Through the improved segmented weighting function, information from spectral data within the normal distribution is retained in the regression model, while information from outliers is suppressed or removed. Experiments analysing copper concentration in 16 certified standard brass samples were carried out. The average relative standard deviation obtained from the RLS-SVM model was 3.06% and the root mean square error was 1.537%. The experimental results showed that the proposed method achieved better prediction accuracy and better modeling robustness than quantitative analysis methods based on Partial Least Squares (PLS) regression, the standard Support Vector Machine (SVM) and WLS-SVM. It was also demonstrated that the improved weighting function had better comprehensive performance in model robustness and convergence speed, compared with four known weighting functions.
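The core reweighting loop behind WLS-SVM-style robustness can be sketched compactly. The Python example below substitutes plain weighted least squares for the full LS-SVM machinery and uses the commonly cited segmented weighting constants (c1 = 2.5, c2 = 3.0); it illustrates the principle of down-weighting outliers by residual statistics, not the paper's improved weighting function.

```python
# Minimal sketch of robust reweighting: fit a model, score each point's
# residual against a robust estimate of the residual scale, down-weight
# or effectively remove outliers via a segmented weighting function,
# then refit. Weighted least squares stands in for LS-SVM here.
import numpy as np

rng = np.random.default_rng(42)
x = np.linspace(0, 10, 40)
y = 2.0 * x + 1.0 + rng.normal(0, 0.3, x.size)
y[[5, 20]] += 8.0  # inject two outlier "shots"

A = np.column_stack([np.ones_like(x), x])

def wls(A, y, w):
    """Weighted least squares via the normal equations."""
    W = np.diag(w)
    return np.linalg.solve(A.T @ W @ A, A.T @ W @ y)

coef = wls(A, y, np.ones_like(y))  # initial unweighted fit
for _ in range(3):                 # a few reweighting passes
    r = y - A @ coef
    s = 1.483 * np.median(np.abs(r - np.median(r)))  # robust scale (MAD)
    z = np.abs(r) / s
    # Segmented weighting: keep normal points, taper suspect ones,
    # effectively remove gross outliers (c1, c2 as commonly used in WLS-SVM).
    c1, c2 = 2.5, 3.0
    w = np.where(z <= c1, 1.0,
        np.where(z <= c2, (c2 - z) / (c2 - c1), 1e-8))
    coef = wls(A, y, w)

print("robust fit: intercept %.3f, slope %.3f" % (coef[0], coef[1]))
```

The segmented function keeps full weight for residuals within the bulk of the distribution, tapers weights linearly in the suspect band, and assigns near-zero weight beyond it, which is the behaviour the abstract describes for retaining normally distributed spectral data while restraining outliers.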