884 results for high rate


Relevance: 60.00%

Abstract:

We inferred phylogeny among the three major lineages of the Acari (mites) from the small subunit rRNA gene. Our phylogeny indicates that the Opilioacariformes is the sister-group to the Ixodida+Holothyrida, not the Ixodida+Mesostigmata+Holothyrida, as previously thought. Support for this relationship increased when sites with the highest rates of nucleotide substitution, and thus the greatest potential for saturation with nucleotide substitutions, were removed. Indeed, the increase in support (and resolution) was despite a 70% reduction in the number of parsimony-informative sites from 408 to 115. This shows that rather than 'noisy' sites having no impact on resolution of deep branches, 'noisy' sites have the potential to obscure phylogenetic relationships. The arrangement, Ixodida+Holothyrida+Opilioacariformes, however, may be an artefact of long-branch attraction since relative-rate tests showed that the Mesostigmata have significantly faster rates of nucleotide substitution than other parasitiform mites. Thus, the fast rates of nucleotide substitution of the Mesostigmata might have caused the Mesostigmata to be attracted to the outgroup in our trees. We tested the hypothesis that the high rate of nucleotide substitution in some mites was related to their short generation times. The Acari species that have high nucleotide substitution rates usually have short generation times; these mites also tend to be more active and thus have higher metabolic rates than other mites. Therefore, more than one factor may affect the rate of nucleotide substitution in these mites.
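
The site-stripping step described above can be sketched in a few lines; this is a generic illustration that assumes per-site substitution rates have already been estimated (e.g. from a rates-across-sites analysis), not the authors' actual pipeline, and all names and the toy data are hypothetical.

    # Minimal sketch: drop the fastest-evolving ("noisy") alignment columns before
    # re-running the phylogenetic analysis. Per-site rates are assumed to come from
    # an earlier rates-across-sites estimate; all names and data here are illustrative.
    import numpy as np

    def strip_noisy_sites(sequences, site_rates, keep_fraction=0.7):
        """Keep only the slowest-evolving fraction of columns in an alignment."""
        rates = np.asarray(site_rates, dtype=float)
        cutoff = np.quantile(rates, keep_fraction)   # rate above which a column is dropped
        keep = np.flatnonzero(rates <= cutoff)
        return {name: "".join(seq[i] for i in keep) for name, seq in sequences.items()}

    # Toy example: three taxa, five columns, the last two columns evolving fastest.
    aln = {"Opilioacariformes": "ACGTA", "Ixodida": "ACGCC", "Holothyrida": "ACGAG"}
    filtered = strip_noisy_sites(aln, site_rates=[0.1, 0.2, 0.3, 2.5, 3.0], keep_fraction=0.6)
    # filtered retains only the three slow columns, e.g. {"Opilioacariformes": "ACG", ...}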

Relevance: 60.00%

Abstract:

Separate treatment of dewatering liquor from anaerobic sludge digestion significantly reduces the nitrogen load of the main stream and improves overall nitrogen elimination. Such ammonium-rich wastewater is particularly suited to treatment by high-rate processes, which achieve rapid elimination of nitrogen with a minimal COD requirement. Processes in which ammonium is oxidised to nitrite only (nitritation), followed by denitritation with carbon addition, can achieve this. Nitrogen removal by nitritation/denitritation was optimised using a novel SBR operation with continuous dewatering liquor addition. Efficient and robust nitrogen elimination was obtained at a total hydraulic retention time of 1 day via the nitrite pathway. Around 85-90% nitrogen removal was achieved at an ammonium loading rate of 1.2 g NH4+-N m⁻³ d⁻¹. Ethanol was used as electron donor for denitritation at a ratio of 2.2 g COD g⁻¹ N removed. Conventional nitritation/denitritation with rapid addition of the dewatering liquor at the beginning of the cycle often resulted in considerable nitric oxide (NO) accumulation during the anoxic phase, possibly leading to unstable denitritation. Some NO production was still observed in the novel continuous mode, but denitritation was never seriously affected. Thus, process stability can be increased, and the high specific reaction rates as well as the continuous feeding result in a decreased reactor size for full-scale operation. (c) 2006 Elsevier Ltd. All rights reserved.
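
As a rough worked example of the carbon dosing implied by the reported ratio (only the 2.2 g COD per g N figure comes from the abstract; the ethanol COD equivalent follows from its oxidation stoichiometry, and the daily nitrogen load below is hypothetical):

    # Back-of-the-envelope ethanol dosing for denitritation.
    COD_PER_G_N = 2.2          # g COD required per g N removed (reported ratio)
    COD_PER_G_ETHANOL = 2.09   # g COD per g ethanol: C2H5OH + 3 O2 -> 2 CO2 + 3 H2O (96/46)

    def ethanol_demand_g(n_removed_g):
        """Grams of ethanol needed to denitrify n_removed_g grams of nitrogen."""
        return n_removed_g * COD_PER_G_N / COD_PER_G_ETHANOL

    print(round(ethanol_demand_g(1000)))   # ~1053 g ethanol per kg N removed (hypothetical load)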

Relevance: 60.00%

Abstract:

We report the successful RAFT-mediated emulsion polymerization of styrene using a non-ionic surfactant (Brij98), the highly reactive 1-phenylethyl phenyldithioacetate (PEPDTA) RAFT agent, and the water-soluble initiator ammonium persulfate (APS). The molar ratio of RAFT agent to APS was identical in all experiments. Most of the monomer was contained within the micelles, analogous to microemulsion or miniemulsion systems but without the need for shear, sonication, cosurfactant, or a hydrophobe. The number-average molecular weight increased with conversion and the polydispersity index was below 1.2. This ideal 'living' behavior was only found when molecular weights of 9000 and below were targeted. It was postulated that transportation of RAFT agent from the monomer-swollen micelles to the growing particles was fast on the polymerization timescale, and most if not all the RAFT agent is consumed within the first 10% conversion. In addition, it was postulated that the high nucleation rate from the high rate of exit (of the R radical from the RAFT agent) and high entry rate from water-phase radicals (high APS concentration) reduced the effects of 'superswelling', and therefore a similar molar ratio of RAFT agent to monomer was maintained in all growing particles. The high polydispersity indexes found when targeting molecular weights greater than 9000 were postulated to be due to the lower nucleation rate from the lower weight fractions of both APS and RAFT agent. In these cases, 'superswelling' played a dominant role, leading to a heterogeneous distribution of RAFT to monomer ratios among the particles nucleated at different times.
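
The 'living' behavior described (number-average molecular weight rising with conversion at a polydispersity below 1.2) is usually checked against the standard expression for the theoretical Mn of a RAFT polymerization; the sketch below is a generic illustration with assumed concentrations, not data from the paper, and it neglects the small fraction of initiator-derived chains.

    # Theoretical Mn of an ideal RAFT polymerization (initiator-derived chains neglected):
    #   Mn = ([M]0 / [RAFT]0) * conversion * MW_monomer + MW_RAFT
    MW_STYRENE = 104.15   # g/mol
    MW_PEPDTA = 272.4     # g/mol, 1-phenylethyl phenyldithioacetate (approximate)

    def theoretical_mn(monomer_conc, raft_conc, conversion,
                       mw_monomer=MW_STYRENE, mw_raft=MW_PEPDTA):
        return (monomer_conc / raft_conc) * conversion * mw_monomer + mw_raft

    # Targeting Mn of roughly 9000 at full conversion needs [M]0/[RAFT]0 of about 84:
    print(round(theoretical_mn(monomer_conc=84.0, raft_conc=1.0, conversion=1.0)))  # ~9021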

Relevance: 60.00%

Abstract:

SQL (Structured Query Language) is one of the essential topics in foundation database courses in higher education. Due to its apparently simple syntax, learning to use the full power of SQL can be very difficult. In this paper, we introduce SQLator, a web-based interactive tool for learning SQL. SQLator's key function is the evaluate function, which allows a user to evaluate the correctness of his/her query formulation. The evaluate engine is based on complex heuristic algorithms. The tool also provides instructors with the facility to create and populate database schemas with an associated pool of SQL queries. Currently it hosts two databases with a combined pool of more than 300 queries, divided into three categories according to query complexity. The SQLator user can perform unlimited executions and evaluations of query formulations and/or view the solutions. The SQLator evaluate function has a high rate of success in evaluating the user's statement as correct (or incorrect) with respect to the question. In this paper we present the basic architecture and functions of SQLator. We further discuss the value of SQLator as an educational technology and report on educational outcomes based on studies conducted at the School of Information Technology and Electrical Engineering, The University of Queensland.
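
The abstract does not spell out the heuristic evaluation algorithm; a common approach to automated SQL grading, shown here purely as an illustrative sketch (the result-set comparison strategy, the schema and the file name are assumptions, not SQLator's actual engine), is to run the student's statement and the instructor's reference solution against the same sample database and compare their result sets:

    # Illustrative sketch only: grade a student's SELECT statement by comparing its
    # result set with that of a reference solution on the same sample database.
    # This is a common heuristic, not a description of SQLator's evaluate engine.
    import sqlite3
    from collections import Counter

    def results_match(db_path, student_sql, reference_sql, order_matters=False):
        with sqlite3.connect(db_path) as conn:
            try:
                student = conn.execute(student_sql).fetchall()
            except sqlite3.Error:
                return False                              # statement does not even run
            reference = conn.execute(reference_sql).fetchall()
        if order_matters:                                 # e.g. when the question requires ORDER BY
            return student == reference
        return Counter(student) == Counter(reference)     # same rows, order ignored

    # Hypothetical usage:
    # results_match("university.db",
    #               "SELECT name FROM student WHERE gpa > 3.5",
    #               "SELECT name FROM student WHERE gpa > 3.5")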

Relevance: 60.00%

Abstract:

Evidence supporting the efficacy of physical activity promotion in primary care settings has come from evaluations of patient-level changes in physical activity, with little focus on the issue of general practitioner (GP) uptake. The 'GP Strategy' of 10,000 Steps Rockhampton provided an opportunity to explore this issue in the context of a multi-strategy, community-based physical activity intervention project. The 'GP Strategy' was developed in partnership with the Capricornia Division of General Practice. It aimed to: 1) increase GP awareness of the 10,000 Steps project, 2) upskill GPs in brief physical activity counselling techniques, and 3) provide GPs with evidence-based physical activity counselling materials and pedometers. The evaluation, which was guided by the RE-AIM evaluation framework, used a pre-post design, including a mailed GP survey and collection of process data. Survey response rates were 67% (n=44/66; baseline) and 70% (n=37/53; 14-month follow-up). GP awareness of 10,000 Steps Rockhampton increased from 46% to 97%. Twenty-one of 23 practices were visited by 10,000 Steps staff and accepted 10,000 Steps posters, brochures, and pedometers. At follow-up, 78% had displayed the poster, 81% were using the brochures, and 70% had loaned pedometers to patients. Despite the very high rate of uptake and use of 10,000 Steps materials, there was no change in the percentage of patients counselled, and relatively few pedometers had been loaned to patients. The results of this trial indicate that it will take more effort to change GP physical activity counselling behaviour, and provide only modest support for use of pedometers in the busy general practice setting. Acknowledgement: This project is supported by a grant from Health Promotion Queensland.

Relevance: 60.00%

Abstract:

Background Medicines reconciliation (identifying and maintaining an accurate list of a patient's current medications) should be undertaken at all transitions of care and be available to all patients. Objective A self-completion web survey was conducted among chief pharmacists (or equivalent) to evaluate medicines reconciliation levels in secondary care mental health organisations. Setting The survey was sent to secondary care mental health organisations in England, Scotland, Northern Ireland and Wales. Method The survey was launched via Bristol Online Surveys. Quantitative data were analysed using descriptive statistics and qualitative data were collected through respondents' free-text answers to specific questions. Main outcome measures To investigate how medicines reconciliation is delivered, incorporate a clear description of the role of pharmacy staff, and identify areas of concern. Results Forty-two surveys (52 % response rate) were completed. Thirty-seven (88.1 %) organisations have a formal policy for medicines reconciliation with defined steps. Results show that the pharmacy team (pharmacists and pharmacy technicians) are the main professionals involved in medicines reconciliation, with a high rate of doctors also involved. Training procedures frequently include an induction by pharmacy for doctors, whilst the pharmacy team are generally trained by another member of pharmacy. Mental health organisations estimate that nearly 80 % of medicines reconciliation is carried out within 24 h of admission. A full medicines reconciliation is not carried out on patient transfer between mental health wards; instead, quicker and less exhaustive variations are implemented. In 71.4 % of organisations, pharmacy staff are estimated to conduct daily medicines reconciliation for acute admission wards (Monday to Friday). However, only 38 % of organisations report that pharmacy staff reconcile patients' medication for other teams that admit from primary care. Conclusion Most mental health organisations appear to be complying with NICE guidance on medicines reconciliation for their acute admission wards. However, medicines reconciliation is conducted less frequently on other units that admit from primary care and is rarely completed on transfer when the medication differs significantly from that on admission. Formal training and competency assessments on medicines reconciliation should be considered, as current training varies and adherence to best practice is questionable.

Relevance: 60.00%

Abstract:

Recent discussion of the knowledge-based economy draws increasing attention to the role that the creation and management of knowledge plays in economic development. Development of human capital, the principal mechanism for knowledge creation and management, becomes a central issue for policy-makers and practitioners at the regional, as well as national, level. Facing competition both within and across nations, regional policy-makers view human capital development as a key to strengthening the positions of their economies in the global market. Against this background, the aim of this study is to go some way towards answering the question of whether, and how, investment in education and vocational training at the regional level provides these territorial units with comparative advantages. The study reviews the economics and economic geography literature on economic growth (Chapter 2). In the growth-model literature, human capital has gained increased recognition as a key production factor along with physical capital and labour. Although they leave technical progress as an exogenous factor, neoclassical Solow-Swan models have improved their estimates through the inclusion of human capital. In contrast, endogenous growth models place investment in research at centre stage in accounting for technical progress. As a result, they often focus upon research workers, who embody high-order human capital, as a key variable in their framework. A central question is how human capital facilitates economic growth: is it the level of its stock or its accumulation that influences the rate of growth? In addition, these economic models are criticised in the economic geography literature for their failure to consider spatial aspects of economic development, and particularly for their lack of attention to tacit knowledge and the urban environments that facilitate the exchange of such knowledge. Our empirical analysis of European regions (Chapter 3) shows that investment by individuals in human capital formation has distinct patterns. Those regions with a higher level of investment in tertiary education tend to have a larger concentration of information and communication technology (ICT) sectors (including provision of ICT services and manufacture of ICT devices and equipment) and research functions. Not surprisingly, regions with major metropolitan areas where higher education institutions are located show a high enrolment rate for tertiary education, suggesting a possible link to the demand from high-order corporate functions located there. Furthermore, the rate of human capital development (at the level of vocational upper secondary education) appears to have a significant association with the level of entrepreneurship in emerging industries such as ICT-related services and ICT manufacturing, whereas no such association is found with traditional manufacturing industries. In general, a high level of investment by individuals in tertiary education is found in those regions that accommodate high-tech industries and high-order corporate functions such as research and development (R&D). These functions are supported through the urban infrastructure and public science base, facilitating exchange of tacit knowledge. They also enjoy a low unemployment rate. However, the existing stock of human and physical capital in those regions with a high level of urban infrastructure does not lead to a high rate of economic growth.
Our empirical analysis demonstrates that the rate of economic growth is determined by the accumulation of human and physical capital, not by the level of their existing stocks. We found no significant effects of scale that would favour those regions with a larger stock of human capital. The primary policy implication of our study is that, in order to facilitate economic growth, education and training need to supply human capital at a faster pace than simply replenishing it as it disappears from the labour market. Given the significant impact of high-order human capital (such as business R&D staff in our case study), as well as the increasingly fast pace of technological change that makes human capital obsolete, a concerted effort needs to be made to facilitate its continuous development.
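
For background, the 'inclusion of human capital' in the neoclassical framework discussed above is usually formalised along the lines of the augmented Solow model of Mankiw, Romer and Weil; the form below is given as general context, not as the specification estimated in this study:

    Y_t = K_t^{\alpha} H_t^{\beta} (A_t L_t)^{1-\alpha-\beta}, \qquad 0 < \alpha + \beta < 1,

where K is physical capital, H is human capital, L is labour and A is labour-augmenting technology. Growth regressions derived from this form distinguish the accumulation rates of K and H from their initial stocks, which is precisely the stock-versus-accumulation question the empirical analysis addresses.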

Relevance: 60.00%

Abstract:

Developmental learning disabilities such as dyslexia and dyscalculia have a high rate of co-occurrence in pediatric populations, suggesting that they share underlying cognitive and neurophysiological mechanisms. Dyslexia and other developmental disorders with a strong heritable component have been associated with reduced sensitivity to coherent motion stimuli, an index of visual temporal processing on a millisecond time-scale. Here we examined whether deficits in sensitivity to visual motion are evident in children who have poor mathematics skills relative to other children of the same age. We obtained psychophysical thresholds for visual coherent motion and a control task from two groups of children who differed in their performance on a test of mathematics achievement. Children with math skills in the lowest 10% in their cohort were less sensitive than age-matched controls to coherent motion, but they had statistically equivalent thresholds to controls on a coherent form control measure. Children with mathematics difficulties therefore tend to present a similar pattern of visual processing deficit to those that have been reported previously in other developmental disorders. We speculate that reduced sensitivity to temporally defined stimuli such as coherent motion represents a common processing deficit apparent across a range of commonly co-occurring developmental disorders.

Relevance: 60.00%

Abstract:

The suitability of a new plastic supporting medium for biofiltration was tested over a three-year period. Tests were carried out on the stability, surface properties, mechanical strength, and dimensions of the medium. There was no evidence to suggest that the medium was deficient in any of these respects. The specific surface (320 m² m⁻³) and the voidage (94%) of the new medium are unlike any other used in biofiltration, and a pilot plant containing two filters was built to observe its effects on ecology and performance. Performance was estimated by chemical analysis, and ecology was studied by film examination and fauna counts. A system of removable sampling baskets was designed to enable samples to be obtained from two intermediate depths of the filter. One of the major operating problems of percolating filters is excessive accumulation of film. The amount of film is influenced by hydraulic and organic load, and each filter was run at a different loading. One was operated at 1.2 m³ m⁻³ day⁻¹ (BOD load 0.24 kg m⁻³ day⁻¹), judged at the time to be the lowest filtration rate to offer advantages over conventional media. The other filter was operated at more than twice this loading (2.4 m³ m⁻³ day⁻¹, BOD load 0.55 kg m⁻³ day⁻¹), giving roughly 2.5× and 6× the conventional loadings recommended for a Royal Commission effluent. The amount of film in each filter was normally low (0.05-3 kg m⁻³ as volatile solids) and did not affect efficiency. The evidence collected during the study indicated that the ecology of the filters was normal when compared with data from the literature relating to filters with mineral media. There were indications that full ecological stability was yet to be reached and that this was affecting the efficiency of the filters. The lower-rate filter produced an average 87% BOD removal, giving a consistent Royal Commission effluent during the summer months. The higher-rate filter produced a mean 83% BOD removal but at no stage a consistent Royal Commission effluent. From the data on ecology and performance, the filters resembled conventional filters rather than high-rate filters.
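
As a consistency check on the loadings quoted above (the implied influent strengths are inferred here, not stated in the abstract): volumetric BOD load = hydraulic load × influent BOD concentration, so 1.2 m³ m⁻³ day⁻¹ × 0.20 kg BOD m⁻³ ≈ 0.24 kg BOD m⁻³ day⁻¹ and 2.4 m³ m⁻³ day⁻¹ × 0.23 kg BOD m⁻³ ≈ 0.55 kg BOD m⁻³ day⁻¹, i.e. a settled-sewage strength of roughly 200-230 mg BOD/l.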

Relevance: 60.00%

Abstract:

The first demonstration of a hollow core photonic bandgap fiber suitable for high-rate data transmission at 2 µm is presented. Using a custom-built Thulium-doped fiber amplifier, error-free 8 Gbit/s transmission in an optically amplified data channel at 2008 nm is reported for the first time.

Relevance: 60.00%

Abstract:

The first demonstration of a hollow core photonic bandgap fiber (HC-PBGF) suitable for high-rate data transmission in the 2 μm waveband is presented. The fiber has a record low loss for this wavelength region (4.5 dB/km at 1980 nm) and a >150 nm wide surface-mode-free transmission window at the center of the bandgap. Detailed analysis of the optical modes and their propagation along the fiber, carried out using a time-of-flight technique in conjunction with spatially and spectrally resolved (S²) imaging, provides clear evidence that the HC-PBGF can be operated as quasi-single mode even though it supports up to four mode groups. Through the use of a custom-built Thulium-doped fiber amplifier with gain bandwidth closely matched to the fiber's low loss window, error-free 8 Gbit/s transmission in an optically amplified data channel at 2008 nm over 290 m of 19-cell HC-PBGF is reported. © 2013 Optical Society of America.
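
As a simple worked figure implied by the numbers above (not stated explicitly in the abstract), the reported attenuation over the 290 m transmission span corresponds to a fibre loss of only about 4.5 dB/km × 0.29 km ≈ 1.3 dB.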

Relevance: 60.00%

Abstract:

After a proliferation of logistics e-Marketplaces during the dot-com boom of 1998-2000, there has been a high rate of failure, and the survivors are developing much more slowly than expected. This is the case in the aviation industry, where a large number of B2B e-Marketplaces emerged in the late 1990s as aviation companies focused their strategies on electronic B2B. However, the current use of e-Marketplaces in the industry is low and many of them have ceased trading. The traditional e-Marketplace model has been characterised by poor quality portals and a lack of technical standards. Such an approach is unsustainable in today's competitive scenario. Improvements in website quality attributes may strongly contribute to simplifying users' interaction with website functionality and speed up communication with all supply chain partners. In this context, it appears critical to develop models for the evaluation of e-Marketplace websites. This chapter, after discussing the development of e-Marketplaces in the transport and logistics service industry and their application in the aviation industry, proposes a multi-criteria model for assessing different types of aeronautic B2B e-Marketplaces.

Relevance: 60.00%

Abstract:

Macrophages play important roles in the clearance of dying and dead cells. Typically, and perhaps simplistically, they are viewed as the professional phagocytes of apoptotic cells. Clearance by macrophages of cells undergoing apoptosis is a non-phlogistic phenomenon which is often associated with actively anti-inflammatory phagocyte responses. By contrast, macrophage responses to necrotic cells, including secondarily necrotic cells derived from uncleared apoptotic cells, are perceived as proinflammatory. Indeed, persistence of apoptotic cells as a result of defective apoptotic-cell clearance has been found to be associated with the pathogenesis of autoimmune disease. Here we review the mechanisms by which macrophages interact with, and respond to, apoptotic cells. We suggest that macrophages are especially important in clearing cells at sites of histologically visible, high-rate apoptosis and that, otherwise, apoptotic cells are removed largely by non-macrophage neighbours. We challenge the view that necrotic cells, including persistent apoptotic cells are, of necessity, proinflammatory and immunostimulatory and suggest that, under appropriate circumstances, persistent apoptotic cells can provide a prolonged anti-inflammatory stimulus.

Relevance: 60.00%

Abstract:

Research on aphasia has struggled to identify apraxia of speech (AoS) as an independent deficit affecting a processing level separate from phonological assembly and motor implementation. This is because AoS is characterized by both phonological and phonetic errors and, therefore, can be interpreted as a combination of deficits at the phonological and the motoric level rather than as an independent impairment. We apply novel psycholinguistic analyses to the perceptually phonological errors made by 24 Italian aphasic patients. We show that only patients with a relatively high rate (>10%) of phonetic errors make sound errors which simplify the phonology of the target. Moreover, simplifications are strongly associated with other variables indicative of articulatory difficulties, such as a predominance of errors on consonants rather than vowels, but not with other measures, such as the rate of words reproduced correctly or the rate of lexical errors. These results indicate that sound errors cannot arise at a single phonological level because they are different in different patients. Instead, the different patterns: (1) provide evidence for separate impairments and the existence of a level of articulatory planning/programming intermediate between phonological selection and motor implementation; (2) validate AoS as an independent impairment at this level, characterized by phonetic errors and phonological simplifications; (3) support the claim that linguistic principles of complexity have an articulatory basis, since they only apply in patients with associated articulatory difficulties.

Relevance: 60.00%

Abstract:

In community college nursing programs, the high rate of attrition was a major concern to faculty and administrators. Since first-semester attrition could lead to permanent loss of students and low retention in nursing programs, it was important to identify at-risk students early and develop proactive approaches to assist them to be successful. The goal of nursing programs was to graduate students who were eligible to take the National Council Licensure Examination for Registered Nurses (NCLEX-RN). This was especially important during a time of critical shortage in the nursing workforce. This study took place at a large, multi-campus community college, and used Tinto's (1975) Student Integration Model of persistence as the framework. A correlational study was conducted to determine whether the independent variables (past academic achievement, English proficiency, achievement tendency, weekly hours of employment and financial resources) could discriminate between the two grade groups, pass and not pass. Establishing the relationship between the selected variables and successful course completion might be used to reduce attrition and improve retention. Three research instruments were used to collect data: a Demographic Information form developed by the researcher was used to obtain academic data, the research questionnaire Measure of Achieving Tendency measured achievement motivation, and the Test of Adult Basic Education (TABE), Form 8, Level A, Tests 1, 4, and 5 measured the level of English proficiency. The Department of Nursing academic policy, requiring a minimum course grade of "C" or better, was used to determine the final course outcome. A stepwise discriminant analysis procedure indicated that college language level and pre-semester grade point average were significant predictors of final course outcome. Based on the findings of the study, recommendations focused on assessing students' English proficiency prior to admission into the nursing program, an intensive remediation plan in language comprehension for at-risk students, and the selection of alternate textbooks and readings that more closely matched the English proficiency level of the students. A pilot study should be conducted to investigate the benefit of raising the admission grade point average.
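
The stepwise discriminant analysis reported above would have been run in a statistics package of the time; a minimal modern equivalent, shown only as an illustrative sketch (the column names, file name and library choice are hypothetical, not the study's actual data or software), is:

    # Illustrative sketch: classify pass / not-pass from admission predictors with a
    # linear discriminant analysis, analogous in spirit to the stepwise procedure
    # reported in the study. The CSV file and column names are hypothetical.
    import pandas as pd
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    df = pd.read_csv("first_semester_students.csv")
    predictors = ["college_language_level", "presemester_gpa",
                  "achievement_tendency", "weekly_work_hours", "financial_resources"]
    X, y = df[predictors], df["passed_course"]       # y: 1 = grade of C or better, 0 = below C

    lda = LinearDiscriminantAnalysis()
    print(cross_val_score(lda, X, y, cv=5).mean())   # cross-validated classification accuracy
    lda.fit(X, y)
    print(dict(zip(predictors, lda.coef_[0])))       # discriminant-function coefficients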