878 results for "sets of words"


Abstract:

Summary: There are substantial variations in the way that applicants are selected for social work programmes in the UK and across the world. This article begins by reviewing the literature in this field, revealing debates about how effective and reliable the methods of assessment used during admissions processes are. It then describes a cross-sectional survey of new social work applicants (n = 203) at two programme providers, reporting their demographic characteristics and their experiences of the admissions process.
Findings: A number of themes emerged from two sets of findings. There were variations in demographic characteristics, particularly in terms of gender and religion. The study was particularly interested in how students viewed the admissions process: most were satisfied with it, though there were some differences in views about the methods used. The article concludes by describing changes to the admissions system that were partly informed by the study. The article acknowledges the expected bias in the methodology, given that only successful applicants were surveyed, not those who were unsuccessful.
Applications: The authors discuss the study findings in the context of national and international literature and suggest that more rigorous attention should be paid to such evaluations to enable this important area of education and workforce development to be better understood.

Abstract:

The complete sequence of the 46,267 bp genome of the lytic bacteriophage tf, specific to Pseudomonas putida PpG1, has been determined. The phage genome has two sets of convergently transcribed genes and 186 bp long direct terminal repeats. The overall genomic architecture of the tf phage is similar to that of the previously described Pseudomonas aeruginosa phages PaP3, LUZ24 and phiMR299-2, and 39 out of the 72 products of predicted tf open reading frames have orthologs in these phages. Accordingly, tf was classified as belonging to the LUZ24-like bacteriophage group. However, taking into account the very low homology levels between tf DNA and that of the other phages, tf should be considered an evolutionarily divergent member of the group. Two distinguishing features not reported for other members of the group were found in the tf genome. Firstly, a unique end structure - a blunt right end and a 4-nucleotide 3'-protruding left end - was observed. Secondly, 14 single-strand interruptions (nicks) were found in the top strand of the tf DNA. All nicks were mapped within a consensus sequence 5'-TACT/RTGMC-3'. Two nicks were analyzed in detail and were shown to be present in more than 90% of the phage population. Although localized nicks were previously found only in the DNA of T5-like and phiKMV-like phages, it seems increasingly likely that this enigmatic structural feature is common to various other bacteriophages.
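
For readers unfamiliar with IUPAC ambiguity codes, the consensus 5'-TACT/RTGMC-3' (R = A or G; M = A or C; "/" marks the nick) expands directly into a searchable pattern. The sketch below is an illustration with an invented toy sequence, not code from the study.

```python
import re

# IUPAC expansion of the nick consensus 5'-TACT/RTGMC-3':
# R = A or G, M = A or C. The lookahead keeps the reported
# match position at the nick site, between TACT and RTGMC.
NICK_MOTIF = re.compile(r"TACT(?=[AG]TG[AC]C)")

def nick_positions(top_strand: str):
    """Return 0-based positions of the nick in the top strand."""
    return [m.end() for m in NICK_MOTIF.finditer(top_strand.upper())]

# Invented toy sequence for demonstration only.
seq = "GGTACTATGCCTTTACTGTGACAA"
print(nick_positions(seq))  # -> [6, 17]
```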

Abstract:

In this paper we seek to explain variations in levels of deprivation between EU countries. The starting-point of our analysis is the finding that the relationship between income and life-style deprivation varies across countries. Given our understanding of the manner in which the income-deprivation mismatch may arise from the limitations of current income as a measure of command over resources, the pattern of variation seems to be consistent with our expectations of the variable degree to which welfare-state regimes achieve 'decommodification' and smooth income flows. This line of reasoning suggests that cross-national differences in deprivation might, in significant part, be due not only to variation in household and individual characteristics that are associated with disadvantage but also to the differential impact of such variables across countries and indeed welfare regimes. To test this hypothesis, we have taken advantage of the ECHP (European Community Household Panel) comparative data set in order to pursue a strategy of substituting variable names for country/welfare regime names. We operated with two broad categories of variables, tapping, respectively, needs and resources. Although both sets of factors contribute independently to our ability to predict deprivation, it is the resource factors that are crucial in reducing country effects. The extent of cross-national heterogeneity depends on taking into account the social class and long-term unemployment situation of the household reference person. The impact of the structural socio-economic variables that we label 'resource factors' varies across countries in a manner that is broadly consistent with welfare regime theory and is the key factor in explaining cross-country differences in deprivation. As a consequence, European homogeneity is a great deal more evident among the advantaged than among the disadvantaged.

Abstract:

This paper introduces a logical model of inductive generalization, and specifically of the machine learning task of inductive concept learning (ICL). We argue that some inductive processes, like ICL, can be seen as a form of defeasible reasoning. We define a consequence relation characterizing which hypotheses can be induced from given sets of examples, and study its properties, showing that they correspond to a rather well-behaved non-monotonic logic. We also show that, with the addition of a preference relation on inductive theories, we can characterize the inductive bias of ICL algorithms. The second part of the paper shows how this logical characterization of inductive generalization can be integrated with another form of non-monotonic reasoning (argumentation) to define a model of multiagent ICL. This integration allows two or more agents to learn, in a consistent way, both from induction and from the arguments used in the communication between them. We show that the inductive theories achieved by multiagent induction plus argumentation are sound, i.e. they are precisely the same as the inductive theories built by a single agent with all the data.
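
As a rough illustration of the kind of consequence relation described (the representation below, a simple conjunctive-concept language, is an invented stand-in for the paper's formalism): a hypothesis h is an inductive consequence of a set of examples E when it covers every positive example and no negative one.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Example:
    attrs: tuple      # attribute-value pairs, e.g. (("colour", "red"),)
    positive: bool

def covers(hypothesis: frozenset, example: Example) -> bool:
    # A conjunctive hypothesis covers an example when all of its
    # attribute-value constraints hold in the example.
    return hypothesis <= frozenset(example.attrs)

def induces(examples, hypothesis) -> bool:
    # The inductive consequence relation E |~ h: h covers every
    # positive example and excludes every negative one.
    return all(covers(hypothesis, e) == e.positive for e in examples)

E = [
    Example((("colour", "red"), ("size", "big")), True),
    Example((("colour", "red"), ("size", "small")), True),
    Example((("colour", "blue"), ("size", "big")), False),
]
h = frozenset({("colour", "red")})
print(induces(E, h))  # True: "red" covers both positives, excludes the negative
```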

Abstract:

Data obtained with any research tool must be reproducible, a concept referred to as reliability. Three techniques are often used to evaluate reliability of tools using continuous data in aging research: intraclass correlation coefficients (ICC), Pearson correlations, and paired t tests. These are often construed as equivalent when applied to reliability. This is not correct, and may lead researchers to select instruments based on statistics that may not reflect actual reliability. The purpose of this paper is to compare the reliability estimates produced by these three techniques and determine the preferable technique. A hypothetical dataset was produced to evaluate the reliability estimates obtained with ICC, Pearson correlations, and paired t tests in three different situations. For each situation two sets of 20 observations were created to simulate an intrarater or inter-rater paradigm, based on 20 participants with two observations per participant. Situations were designed to demonstrate good agreement, systematic bias, or substantial random measurement error. In the situation demonstrating good agreement, all three techniques supported the conclusion that the data were reliable. In the situation demonstrating systematic bias, the ICC and t test suggested the data were not reliable, whereas the Pearson correlation suggested high reliability despite the systematic discrepancy. In the situation representing substantial random measurement error where low reliability was expected, the ICC and Pearson coefficient accurately illustrated this. The t test suggested the data were reliable. The ICC is the preferred technique to measure reliability. Although there are some limitations associated with the use of this technique, they can be overcome.
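
A minimal sketch of the systematic-bias situation, on invented data rather than the paper's hypothetical dataset: a constant offset between the two sets of observations leaves the Pearson correlation near 1, while ICC(2,1) drops and the paired t test detects the shift.

```python
import numpy as np
from scipy import stats

# 20 participants rated twice; the second set of observations
# carries a constant +10 systematic bias (invented data).
rng = np.random.default_rng(42)
score1 = rng.normal(50, 10, 20)
score2 = score1 + 10 + rng.normal(0, 1, 20)

def icc_2_1(x, y):
    """ICC(2,1): two-way random effects, absolute agreement, single
    measures (Shrout & Fleiss), from the subjects-by-raters ANOVA."""
    data = np.column_stack([x, y])
    n, k = data.shape
    grand = data.mean()
    msr = k * np.sum((data.mean(axis=1) - grand) ** 2) / (n - 1)  # subjects
    msc = n * np.sum((data.mean(axis=0) - grand) ** 2) / (k - 1)  # raters
    ss_total = np.sum((data - grand) ** 2)
    mse = (ss_total - msr * (n - 1) - msc * (k - 1)) / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

print(f"Pearson r = {stats.pearsonr(score1, score2)[0]:.3f}")  # ~1: looks 'reliable'
print(f"ICC(2,1)  = {icc_2_1(score1, score2):.3f}")            # noticeably lower
print(f"paired t p = {stats.ttest_rel(score1, score2).pvalue:.2e}")  # flags the bias
```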

Abstract:

Older individuals often suffer from multiple co-morbidities and are particularly vulnerable to potentially inappropriate prescribing (PIP). One method of defining instances of PIP is to use validated, evidence-based, explicit criteria. Two sets of criteria have gained international recognition: the Screening Tool of Older Persons' potentially inappropriate Prescriptions (STOPP) and Beers' criteria.
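
To make "explicit criteria" concrete: such criteria are fixed, rule-like statements that can in principle be checked mechanically against a patient record, as in the sketch below. The single rule shown is an invented placeholder and quotes neither STOPP nor Beers.

```python
from dataclasses import dataclass

@dataclass
class Patient:
    age: int
    medications: set
    conditions: set

def flag_pip(patient):
    """Check a patient record against explicit criteria. The rule below
    is an invented placeholder, not an actual STOPP or Beers item."""
    flags = []
    if (patient.age >= 65 and "drug_X" in patient.medications
            and "condition_Y" in patient.conditions):
        flags.append("drug_X is potentially inappropriate with condition_Y")
    return flags

print(flag_pip(Patient(age=78, medications={"drug_X"}, conditions={"condition_Y"})))
```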

Abstract:

A finite element model of a single cell was created and used to investigate the effects of ageing on the biophysical stimuli generated within a cell. Major cellular components were incorporated in the model: the membrane, cytoplasm, nucleus, microtubules, actin filaments, intermediate filaments, nuclear lamina, and chromatin. The model used multiple sets of tensegrity structures. Viscoelastic properties were assigned to the continuum components. To corroborate the model, a simulation of Atomic Force Microscopy (AFM) indentation was performed; the simulated force/indentation response fell within the range of experimental results.

Ageing was simulated by both increasing membrane stiffness (thereby modelling membrane peroxidation with age) and decreasing the density of cytoskeletal elements (thereby modelling reduced actin density with age). Comparing normal and aged cells under indentation predicts that aged cells have a smaller membrane area subjected to high strain than young cells, but the difference, surprisingly, is very small and would not be measurable experimentally. Ageing is predicted to have a more significant effect on strain deep in the nucleus. These results show that computation of biophysical stimuli within cells is achievable with single-cell computational models whose force/displacement behaviour is within experimentally observed ranges. The models suggest only small, though possibly physiologically significant, differences in internal biophysical stimuli between normal and aged cells.
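
For context on the corroboration step, a common analytical reference for AFM force/indentation data is the Hertz model for a spherical tip on an elastic half-space. The sketch below uses that model with invented parameter values; it is a first-order check, not the paper's viscoelastic finite element model.

```python
import numpy as np

# Hertz contact model for a rigid spherical AFM tip on an elastic
# half-space: F = (4/3) * E / (1 - v^2) * sqrt(R) * d^(3/2).
# All parameter values below are invented for illustration.
E_cell = 1e3        # effective Young's modulus of the cell, Pa (assumed)
nu = 0.49           # Poisson's ratio, near-incompressible (assumed)
R_tip = 2.5e-6      # AFM tip radius, m (assumed)

depth = np.linspace(0, 1e-6, 50)                             # indentation, m
force = (4 / 3) * E_cell / (1 - nu**2) * np.sqrt(R_tip) * depth**1.5

print(f"force at 1 um indentation: {force[-1] * 1e9:.2f} nN")
```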

Abstract:

Advances in the diagnosis and treatment of cancer have resulted in longer survival, meaning that cancer patients are now living with what may be termed a chronic-type condition. As a result, the needs of patients living with a cancer diagnosis have changed, placing a greater emphasis on survivorship, which in turn affects quality of life and sleep patterns. Evidence suggests that counselling and complementary therapies have a positive impact not only on the cancer patient’s quality of life but also on family members and friends.

The aim of this study was to determine whether there is an improvement in clients’ quality of life and sleep patterns after availing of the counselling and complementary therapy services offered by a local cancer charity.

All clients availing of the counselling or complementary therapies offered by the charity were invited to participate in a service evaluation. The regulations relating to research involving human participants, as outlined by the “Research Governance Framework” at a local university, were adhered to. A seven-part questionnaire was used to evaluate the services.

Access to anonymous data from the cancer patients, their families and carers was granted by the Research and Development Officer within Action Cancer.
A total of 507 participants completed the initial questionnaires immediately before therapy and 255 participants completed the questionnaires immediately after therapy; the total matched sample was 230. When counselling and complementary therapies were considered together (therapeutic services), there were statistically significant improvements in quality of life and sleep patterns between the two sets of data. However, this was not the case when counselling or complementary therapy was considered alone.
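
The abstract does not name the statistical test applied to the matched pre/post scores; the sketch below illustrates one standard choice for paired questionnaire data, a Wilcoxon signed-rank test, on invented scores.

```python
from scipy import stats

# Invented matched pre/post quality-of-life scores (not the charity's
# data); each position is one client's score before and after therapy.
qol_before = [3, 4, 2, 5, 3, 4, 2, 3, 4, 3]
qol_after  = [4, 5, 3, 5, 4, 5, 3, 4, 5, 4]

# Wilcoxon signed-rank test on the paired differences.
res = stats.wilcoxon(qol_before, qol_after)
print(f"W = {res.statistic}, p = {res.pvalue:.4f}")
```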

Some of the findings closely reflect the literature and, on the whole, they support the use of therapeutic services as having a positive effect on cancer patients’ quality of life and sleep patterns.

Abstract:

In most previous research on distributional semantics, Vector Space Models (VSMs) of words are built either from topical information (e.g., documents in which a word is present), or from syntactic/semantic types of words (e.g., dependency parse links of a word in sentences), but not both. In this paper, we explore the utility of combining these two representations to build VSMs for the task of semantic composition of adjective-noun phrases. Through extensive experiments on benchmark datasets, we find that even though a type-based VSM is effective for semantic composition, it is often outperformed by a VSM built using a combination of topic- and type-based statistics. We also introduce a new evaluation task wherein we predict the composed vector representation of a phrase from the brain activity of a human subject reading that phrase. We exploit a large syntactically parsed corpus of 16 billion tokens to build our VSMs, with vectors for both phrases and words, and make them publicly available.
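
A schematic sketch of the combination idea, with invented toy vectors and a simple additive composition function standing in for the paper's models: topic- and type-based vectors for each word are normalised, concatenated, and then composed for an adjective-noun phrase.

```python
import numpy as np

rng = np.random.default_rng(0)

def combined_vsm(word, topic_space, type_space):
    """Concatenate L2-normalised topic- and type-based vectors."""
    t1 = topic_space[word] / np.linalg.norm(topic_space[word])
    t2 = type_space[word] / np.linalg.norm(type_space[word])
    return np.concatenate([t1, t2])

# Invented toy spaces: 5 topic dimensions (word-document statistics),
# 4 type dimensions (word-dependency statistics).
topic_space = {w: rng.random(5) for w in ("red", "car")}
type_space = {w: rng.random(4) for w in ("red", "car")}

adj = combined_vsm("red", topic_space, type_space)
noun = combined_vsm("car", topic_space, type_space)
phrase = adj + noun   # simple additive composition of "red car"
print(phrase.round(3))
```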

Abstract:

The design of hot-rolled steel portal frames can be sensitive to serviceability deflection limits. In such cases, in order to reduce frame deflections, practitioners increase the size of the eaves haunch and/or the sizes of the steel sections used for the column and rafter members of the frame. This paper investigates the effect of such deflection limits using a real-coded niching genetic algorithm (RC-NGA) that optimizes frame weight, taking into account both ultimate and serviceability limit states. The results show that the proposed GA is efficient and reliable. Two different sets of serviceability deflection limits are then considered: the deflection limits recommended by the Steel Construction Institute (SCI), which are based on control of differential deflections, and other deflection limits based on suggestions by industry. Parametric studies are carried out on frames with spans ranging from 15 m to 50 m and column heights from 5 m to 10 m. It is demonstrated that, for a 50 m span frame, use of the SCI-recommended deflection limits can lead to frame weights around twice those of designs without these limits.
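
A minimal sketch of how such a weight-versus-constraints trade-off is typically encoded in a GA fitness function, assuming a penalty formulation; the function and the analyse() callback are invented stand-ins, not the paper's RC-NGA implementation.

```python
# A candidate design is scored by frame weight plus a large penalty for
# ultimate (ULS) or serviceability (SLS) limit-state violations, so the
# GA minimises weight among feasible frames.
def fitness(candidate, analyse, deflection_limit, penalty=1e6):
    weight, utilisation, deflection = analyse(candidate)
    violation = max(0.0, utilisation - 1.0)               # ULS: unity check
    violation += max(0.0, deflection - deflection_limit)  # SLS: deflection
    return weight + penalty * violation                   # to be minimised

# Invented stand-in returning (weight in kg, ULS utilisation, deflection in mm).
toy_analyse = lambda c: (5000.0 + 10 * sum(c), 0.95, 42.0)
print(fitness([10.0, 20.0], toy_analyse, deflection_limit=50.0))
```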

Abstract:

Large data sets of radiocarbon dates are becoming a more common feature of archaeological research. The sheer numbers of radiocarbon dates produced, however, raise issues of representation and interpretation. This paper presents a methodology which both reduces the visible impact of dating fluctuations and takes into consideration the influence of the underlying radiocarbon calibration curve. By doing so, it may be possible to distinguish between periods of human activity in early medieval Ireland and the statistical tails produced by radiocarbon calibration.

Abstract:

This paper evaluates the potential of gabions as roadside safety barriers. Gabions have the capacity to blend into the natural landscape, suggesting that they could be used as a safety barrier for low-volume roads in scenic environments. In fact, gabions have already been used for this purpose in Nepal, but the impact response was not evaluated. This paper reports on numerical and experimental investigations performed on a new gabion barrier prototype. To assess the potential use as a roadside barrier, the optimal gabion unit size and mass were investigated using multibody analysis, and four sets of 1:4 scaled crash tests were carried out to study the local vehicle-barrier interaction. The barrier prototype was then finalised and subjected to a TB31 crash test according to the European EN 1317 standard for N1 safety barriers. The test resulted in a failure due to rollover of the vehicle and tearing of the gabion mesh, yielding a large working width. It was found that although the system potentially has the necessary mass to contain a vehicle, the barrier front face does not have the necessary stiffness and strength to contain the gabion stone filling and hence redirect the vehicle. In the EN 1317 test, the gabion barrier acted as a ramp for the impacting vehicle, causing rollover.

Abstract:

As a comparatively new PKM with over-constrained kinematic chains, the Exechon has attracted extensive attention from the research community. In contrast to its well-studied kinematics, the stiffness characteristics of the Exechon remain a challenge to analyse due to the structural complexity. In order to achieve a thorough understanding of the stiffness characteristics of the Exechon PKM, this paper proposes an analytical kinetostatic model based on the substructure synthesis technique. The whole PKM system is decomposed into a moving platform subsystem, three limb subsystems and a fixed base subsystem, which are connected to each other sequentially through the corresponding joints. Each limb body is modelled as a spatial beam with a uniform cross-section constrained by two sets of lumped springs. The equilibrium equation of each individual limb assemblage is derived through a finite element formulation and combined with that of the moving platform, derived with the Newtonian method, to construct the governing kinetostatic equations of the system after introducing the deformation compatibility conditions between the moving platform and the limbs. By extracting the 6 x 6 block matrix from the inversion of the governing compliance matrix, the stiffness of the moving platform is formulated. The stiffness of the Exechon PKM at a typical configuration, as well as throughout the workspace, is computed quickly with a piece-by-piece partition algorithm. The numerical simulations reveal a strong position-dependency of the PKM's stiffness, which, owing to structural features, is symmetric about a work plane. Finally, the effects of design variables such as structural, dimensional and stiffness parameters on system rigidity are investigated, with the purpose of providing useful information for the structural optimization and performance enhancement of the Exechon PKM. It is worth mentioning that the proposed stiffness modelling methodology can also be applied to other over-constrained PKMs and, with minor revisions, can efficiently evaluate global rigidity over the workspace.
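
A numerical sketch of the final extraction step, under the assumption (for illustration) that the six platform degrees of freedom occupy the leading rows and columns; the random symmetric positive definite matrix stands in for the real assembled compliance matrix.

```python
import numpy as np

# Invert the assembled governing compliance matrix and read off the
# 6 x 6 block that maps platform wrenches to platform twists.
rng = np.random.default_rng(1)
n_dof = 30                              # assumed total system DOFs
A = rng.random((n_dof, n_dof))
C = A @ A.T + n_dof * np.eye(n_dof)     # SPD stand-in for the compliance matrix

K_full = np.linalg.inv(C)
K_platform = K_full[:6, :6]             # 6 x 6 platform stiffness block
print(K_platform.shape, np.allclose(K_platform, K_platform.T))
```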

Abstract:

Mathematical modelling has become an essential tool in the design of modern catalytic systems. Emissions legislation is becoming increasingly stringent, and so mathematical models of aftertreatment systems must become more accurate in order to provide confidence that a catalyst will convert pollutants over the required range of conditions. 
Automotive catalytic converter models contain several sub-models that represent processes such as mass and heat transfer, and the rates at which the reactions proceed on the surface of the precious metal. Of these sub-models, the prediction of the surface reaction rates is by far the most challenging, due to the complexity of the reaction system and the large number of gas species involved. The reaction rate sub-model uses global reaction kinetics to describe the surface reaction rate of the gas species and is based on the Langmuir-Hinshelwood equation as further developed by Voltz et al. [1]. The reactions are modelled using the pre-exponential factors and activation energies of the Arrhenius equations, together with the inhibition terms.
The reaction kinetic parameters of aftertreatment models are found from experimental data, where a measured light-off curve is compared against a predicted curve produced by a mathematical model. The kinetic parameters are usually tuned manually to minimize the error between the measured and predicted data. This process is typically long and laborious, and it is prone to misinterpretation because of the large number of parameters and the risk of multiple sets of parameters giving acceptable fits. Moreover, the number of coefficients grows greatly with the number of reactions, so the task of manually tuning the coefficients is becoming increasingly challenging.
In the presented work, the authors have developed and implemented a multi-objective genetic algorithm to automatically optimize reaction parameters in AxiSuite®, [2] a commercial aftertreatment model. The genetic algorithm was developed and expanded from the code presented by Michalewicz et al. [3] and was linked to AxiSuite using the Simulink add-on for Matlab. 
The default kinetic values stored within the AxiSuite model were used to generate a series of light-off curves under rich conditions for a number of gas species, including CO, NO, C3H8 and C3H6. These light-off curves were used to generate an objective function. 
This objective function provided a measure of fit for candidate kinetic parameters. The multi-objective genetic algorithm was then used to search within specified limits to minimize the objective function. In total, the pre-exponential factors and activation energies of ten reactions were optimized simultaneously.
The results reported here demonstrate that, given accurate experimental data, the optimization algorithm is successful and robust in defining the correct kinetic parameters of a global kinetic model describing aftertreatment processes.
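
A heavily simplified, hypothetical sketch of the tuning problem, using a toy one-reaction light-off model and a bare-bones real-coded GA rather than AxiSuite or the authors' multi-objective algorithm. Note that the Arrhenius parameters compensate for each other, which is exactly why several parameter sets can give acceptable fits.

```python
import numpy as np

R = 8.314                          # gas constant, J/(mol K)
T = np.linspace(400, 800, 50)      # temperature sweep, K

def conversion(A, Ea, T, tau=0.1):
    """Toy first-order conversion in an isothermal reactor:
    X = 1 - exp(-k * tau), with k = A * exp(-Ea / (R * T))."""
    return 1.0 - np.exp(-A * np.exp(-Ea / (R * T)) * tau)

# Synthetic 'measured' light-off curve from known parameters.
measured = conversion(1e6, 8e4, T)

def objective(params):
    A, Ea = params
    return np.sum((conversion(A, Ea, T) - measured) ** 2)

# Minimal real-coded GA: truncation selection + Gaussian mutation
# within fixed bounds on (A, Ea).
rng = np.random.default_rng(0)
lo, hi = np.array([1e4, 5e4]), np.array([1e8, 1.5e5])
pop = rng.uniform(lo, hi, size=(40, 2))
for _ in range(200):
    scores = np.array([objective(p) for p in pop])
    parents = pop[np.argsort(scores)[:10]]
    children = np.repeat(parents, 4, axis=0)
    children += rng.normal(0, 0.02, children.shape) * (hi - lo)
    pop = np.clip(children, lo, hi)

best = min(pop, key=objective)
print(f"best A = {best[0]:.3g}, Ea = {best[1]:.3g} J/mol")
```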

Abstract:

Microbial interactions depend on a range of biotic and environmental variables, and are both dynamic and unpredictable. For some purposes, and under defined conditions, it is nevertheless imperative to evaluate the inhibitory efficacy of microbes, such as those with potential as biocontrol agents. We selected six phylogenetically diverse microbes to determine their ability to inhibit the ascomycete Fusarium coeruleum, a soil-dwelling pathogen of potato tubers that causes the storage disease dry rot. Interaction assays, in which colony development was quantified for both the fungal pathogen and the potential control agents, were therefore carried out on solid media. The key parameters that contributed to, and were indicative of, inhibitory efficacy were identified as: (i) the fungal growth-rate prior to contact with the biocontrol agent; (ii) the fungal growth-rate if/once contact with the biocontrol agent was established (i.e. in the zone of mixed culture); and (iii) the ultimate distance traveled by the fungal mycelium. It was clear that there was no correlation between zones of fungal inhibition and the overall reduction in the extent of fungal colony development. An inhibition coefficient was devised which incorporated the potential contributions of distal inhibition of fungal growth-rate, prevention of mycelium development in the vicinity of the biocontrol agent, and ability to inhibit plant-pathogen growth-rate in the zone of mixed culture (in a ratio of 2:2:1). The values derived were 84.2 for Bacillus subtilis (QST 713), 74.0 for Bacillus sp. (JC12GB42), 30.7 for Pichia anomala (J121), 19.3 for Pantoea agglomerans (JC12GB34), 13.9 for Pantoea sp. (S09:T:12), and −21.9 (a negative value, indicating promotion of fungal growth) for bacterial strain JC12GB54. This inhibition coefficient, with a theoretical maximum of 100, was consistent with the extent of F. coeruleum colony development (i.e. area, in cm²) and with assays of these biocontrol agents carried out previously against Fusarium spp. and other fungi. These findings are discussed in relation to the dynamics and inherent complexity of natural ecosystems, and the need to adapt models for use under specific sets of conditions.
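
The 2:2:1 weighting itself is simple arithmetic. A minimal sketch follows, assuming (since the abstract does not give the scaling) that each parameter has already been converted to a 0-100 sub-score, with negative values denoting growth promotion.

```python
def inhibition_coefficient(distal_inhibition, proximal_prevention,
                           mixed_zone_inhibition):
    """Combine the three parameters in the stated 2:2:1 ratio. A
    theoretical maximum of 100 follows if each sub-score is scaled to
    0-100 (an assumption; the abstract does not give the scaling).
    Negative sub-scores (growth promotion) pull the coefficient down."""
    return (2 * distal_inhibition + 2 * proximal_prevention
            + mixed_zone_inhibition) / 5

# Invented sub-scores for illustration only.
print(inhibition_coefficient(90, 85, 70))  # -> 84.0
```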