41 results for Multiple-scale processing


Relevance: 30.00%

Abstract:

The study examined the effect of the range of a confidence scale on consumer knowledge calibration, specifically whether a restricted-range scale (25%-100%) leads to differences in calibration compared with a full-range scale (0%-100%) for multiple-choice questions. A quasi-experimental study using student participants (N = 434) was employed. Data were collected from two samples: in the first sample (N = 167) a full-range confidence scale was used, and in the second sample (N = 267) a restricted-range scale was used. No differences were found between the two scales in knowledge calibration. Results from studies of knowledge calibration employing restricted-range and full-range confidence scales are thus comparable. © Psychological Reports 2014.
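For readers unfamiliar with calibration scoring, here is a minimal sketch of one common over/underconfidence measure, mean confidence minus proportion correct (the data and the choice of measure are illustrative assumptions, not taken from the study):

```python
import numpy as np

def calibration_bias(confidences, correct):
    """Mean confidence minus proportion correct: >0 means overconfidence.

    confidences: ratings on a 0-1 scale (a 25%-100% scale maps to 0.25-1.0);
    correct: 1 if the multiple-choice answer was right, else 0.
    """
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    return confidences.mean() - correct.mean()

# Hypothetical respondent: four questions answered at 70-100% confidence.
print(calibration_bias([0.7, 0.9, 1.0, 0.8], [1, 0, 1, 1]))  # 0.10 -> slightly overconfident
```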

Relevance: 30.00%

Abstract:

Defining 'effectiveness' in the context of community mental health teams (CMHTs) has become increasingly difficult under the current pattern of provision required in National Health Service mental health services in England. The aim of this study was to establish the characteristics of effective multi-professional team working in adult CMHTs and to develop a new measure of CMHT effectiveness. The study was conducted between May and November 2010 and comprised two stages. Stage 1 used a formative evaluative approach based on the Productivity Measurement and Enhancement System to develop the scale with multiple stakeholder groups over a series of qualitative workshops held in various locations across England. Stage 2 analysed responses from a cross-sectional survey of 1,500 members of 135 CMHTs from 11 Mental Health Trusts in England to determine the scale's psychometric properties. Based on an analysis of its structural validity and reliability, the resulting 20-item scale demonstrated good psychometric properties and captured one overall latent factor of CMHT effectiveness comprising seven dimensions: improved service user well-being, creative problem-solving, continuous care, inter-team working, respect between professionals, engagement with carers, and therapeutic relationships with service users. The scale will be of significant value to CMHTs and healthcare commissioners, both nationally and internationally, for monitoring, evaluating and improving team functioning in practice.
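The Stage 2 reliability analysis is standard psychometrics; as a hedged illustration, internal consistency for a multi-item scale of this kind is typically summarized with Cronbach's alpha, sketched below with invented responses (the item scores are hypothetical, not the survey data):

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) matrix of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical responses from 5 team members to 4 of the 20 items (1-5 Likert).
scores = np.array([[4, 5, 4, 4],
                   [3, 3, 4, 3],
                   [5, 5, 5, 4],
                   [2, 3, 2, 3],
                   [4, 4, 4, 5]])
print(round(cronbach_alpha(scores), 2))
```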

Relevance: 30.00%

Abstract:

GraphChi is the first reported disk-based graph engine that can handle billion-scale graphs efficiently on a single PC. It can execute several advanced data mining, graph mining and machine learning algorithms on very large graphs. Using the novel technique of parallel sliding windows (PSW) to load subgraphs from disk into memory for vertex and edge updates, it achieves data processing performance close to, and sometimes better than, that of mainstream distributed graph engines. The GraphChi authors noted, however, that memory is not utilized effectively for large datasets, which leads to suboptimal computational performance. In this paper, motivated by the concepts of 'pin' from TurboGraph and 'ghost' from GraphLab, we propose a new memory utilization mode for GraphChi, called Part-in-memory mode, to improve the performance of GraphChi algorithms. The main idea is to pin a fixed part of the data in memory during the whole computing process. Part-in-memory mode was implemented with only about 40 additional lines of code in the original GraphChi engine. Extensive experiments were performed with large real datasets (including a Twitter graph with 1.4 billion edges). The preliminary results show that the Part-in-memory approach to memory management reduces GraphChi's running time by up to 60% for the PageRank algorithm. Interestingly, a larger portion of data pinned in memory does not always lead to better performance when the whole dataset cannot fit in memory: there exists an optimal portion of data to keep in memory for the best computational performance.
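The abstract does not give the engine's code; the toy Python sketch below only illustrates the Part-in-memory idea under stated assumptions: a fixed fraction of vertex values stays resident across iterations, while the rest is re-materialized each pass, standing in for data re-read from disk.

```python
# Toy, runnable illustration of the Part-in-memory idea (not GraphChi's
# actual C++ engine). Rank values for vertices below `pinned_limit` stay
# in a resident dict for the whole run; the rest are rebuilt each pass,
# mimicking values that must be re-read from disk.

def pagerank_part_in_memory(edges, num_vertices, pin_fraction=0.5,
                            damping=0.85, iterations=20):
    pinned_limit = int(num_vertices * pin_fraction)
    pinned = {v: 1.0 / num_vertices for v in range(pinned_limit)}     # resident
    spilled = {v: 1.0 / num_vertices for v in range(pinned_limit, num_vertices)}

    out_degree = {}
    for src, _ in edges:
        out_degree[src] = out_degree.get(src, 0) + 1

    for _ in range(iterations):
        ranks = {**pinned, **spilled}            # spilled part: the "disk" reload
        new = {v: (1 - damping) / num_vertices for v in range(num_vertices)}
        for src, dst in edges:
            new[dst] += damping * ranks[src] / out_degree[src]
        pinned = {v: new[v] for v in range(pinned_limit)}
        spilled = {v: new[v] for v in range(pinned_limit, num_vertices)}
    return {**pinned, **spilled}

# Tiny 4-vertex graph; with pin_fraction=0.5, vertices 0-1 are "pinned".
edges = [(0, 1), (1, 2), (2, 0), (3, 0)]
print(pagerank_part_in_memory(edges, num_vertices=4))
```

The paper's finding that more pinning is not always better corresponds here to the trade-off, absent from this toy, between resident data and the memory PSW needs for the active window.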

Relevance: 30.00%

Abstract:

A novel framework for modelling biomolecular systems at multiple scales in space and time simultaneously is described. The atomistic molecular dynamics representation is smoothly connected with a statistical continuum hydrodynamics description. The system behaves correctly in the limits of pure molecular dynamics and pure hydrodynamics, as well as in the intermediate regimes where the atoms move partly as atomistic particles while simultaneously following the hydrodynamic flows. The corresponding contributions are controlled by a parameter defined as an arbitrary function of space and time, allowing an effective separation of the atomistic 'core' and the continuum 'environment'. To fill the scale gap between the atomistic and continuum representations, our special-purpose computer for molecular dynamics, MDGRAPE-4, as well as GPU-based computing, were used in developing the framework. These hardware developments also support interactive molecular dynamics simulations that allow intervention in the modelling through force-feedback devices.
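The abstract does not give the paper's equations; as a hedged illustration, one common way to write such a blended dynamics, with the abstract's space-and-time-dependent control parameter s, is:

```latex
% Illustrative form only; the functional details are an assumption,
% not the paper's published equations.
\[
  m_i \ddot{\mathbf{r}}_i
    = s(\mathbf{r}_i, t)\,\mathbf{F}_i^{\mathrm{MD}}
    + \bigl(1 - s(\mathbf{r}_i, t)\bigr)\,
      \gamma\bigl(\mathbf{u}(\mathbf{r}_i, t) - \dot{\mathbf{r}}_i\bigr)
\]
```

Here s = 1 recovers pure molecular dynamics, s = 0 makes particles relax toward the continuum velocity field u, and intermediate values give the mixed regime described above; gamma is a coupling (friction) coefficient.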

Relevance: 30.00%

Abstract:

Melt processing is a critical step in the manufacture of polymer articles, and is even more critical when dealing with inhomogeneous polymer-clay nanocomposite systems. The chemical composition, and in particular the clay type and its organic modification, also makes a major contribution to the final properties, in particular the thermal and long-term oxidative stability of the resulting polymer nanocomposites. Proper selection and tuning of the process variables should, in principle, lead to improved characteristics of the fabricated product. With multiphase systems containing inorganic nanoclays, however, this is not straightforward, and it is often the case that the process conditions are initially chosen to improve one or more desired properties at the expense of others. This study assesses the influence of organo-modified clays and of the processing parameters (extrusion temperature and screw speed) on the rheological and morphological characteristics of polymer nanocomposites, as well as on their melt and thermo-oxidative stability. Nanocomposites (PPNCs) based on PP, maleated PP and organically modified clays were prepared in different co-rotating twin-screw extruders ranging from laboratory to semi-industrial scale. Results show that the amount of surfactant present in similar organo-modified clays affects the thermo-oxidative stability of the extruded PPNCs differently, and that changes in processing conditions affect the clay morphology too. By choosing an appropriately tuned set of process variables for the extrusion process, it should be feasible to selectively fabricate polymer-clay nanocomposites with the desired mechanical and thermo-oxidative characteristics. © 2013 Elsevier Ltd. All rights reserved.

Relevance: 30.00%

Abstract:

The seminal multiple-view stereo benchmark evaluations from Middlebury and by Strecha et al. have played a major role in propelling the development of multi-view stereopsis methodology. Although seminal, these benchmark datasets are limited in scope, with few reference scenes. Here, we take these works a step further by proposing a new multi-view stereo dataset that is an order of magnitude larger in number of scenes and significantly more diverse. Specifically, we propose a dataset containing 80 scenes of large variability. Each scene consists of 49 or 64 accurate camera positions and reference structured-light scans, all acquired by a 6-axis industrial robot. For this dataset we propose an extension of the evaluation protocol from the Middlebury evaluation, reflecting the more complex geometry of some of our scenes. The proposed dataset is used to evaluate the state-of-the-art multi-view stereo algorithms of Tola et al., Campbell et al. and Furukawa et al. We thereby demonstrate the usability of the dataset and gain insight into the workings and challenges of multi-view stereopsis. Through these experiments we empirically validate some of the central hypotheses of multi-view stereopsis, and determine and reaffirm some of its central challenges.
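Middlebury-style protocols of the kind the authors extend are commonly summarized by two point-cloud metrics, accuracy and completeness; a minimal brute-force sketch follows (toy data; the paper's actual protocol differs in detail, for example in outlier and visibility handling):

```python
import numpy as np

def accuracy_completeness(reconstruction, reference):
    """Median nearest-neighbour distances in both directions.

    accuracy: distance from each reconstructed point to the reference scan;
    completeness: distance from each reference point to the reconstruction.
    Medians are used as outlier-robust summaries.
    """
    def nn_dist(src, dst):
        # Brute-force nearest neighbour; fine for a toy example.
        d = np.linalg.norm(src[:, None, :] - dst[None, :, :], axis=2)
        return d.min(axis=1)

    accuracy = np.median(nn_dist(reconstruction, reference))
    completeness = np.median(nn_dist(reference, reconstruction))
    return accuracy, completeness

# Toy point clouds standing in for a reconstruction and a structured-light scan.
rec = np.array([[0.0, 0.0, 0.0], [1.01, 0.0, 0.0], [2.0, 0.02, 0.0]])
ref = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [2.0, 0.0, 0.0], [3.0, 0.0, 0.0]])
print(accuracy_completeness(rec, ref))
```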

Relevance: 30.00%

Abstract:

In our recent work in different bioreactors up to 2.5 L in scale, we have successfully cultured hMSCs using the minimum agitator speed required for complete microcarrier suspension, N_JS. We have also reported a scaleable protocol for detaching hMSCs from microcarriers in spinner flasks, using cells from two donors. The essence of the protocol is a short period of intense agitation in the presence of enzymes such that the cells are detached; once detachment is achieved, the cells are smaller than the Kolmogorov scale of turbulence and hence not damaged. Here, the same approach has been effective for culture at N_JS and in situ detachment in 15 mL ambr™ bioreactors, 100 mL spinner flasks and 250 mL DASGIP bioreactors. In these experiments, cells from four different donors were used along with two types of microcarrier with and without surface coatings (two types), four different enzymes and three different growth media (with and without serum): a total of 22 different combinations. In all cases, the cells were shown after detachment to retain their desired quality attributes and were able to proliferate. This agitation strategy for culture and harvest therefore offers a sound basis for a wide range of scales of operation.
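The Kolmogorov-scale argument can be made concrete with a short worked example (the dissipation rate below is an assumed, illustrative value, not the paper's measurement):

```python
# Detached cells are only safe from turbulent damage if they are smaller
# than the Kolmogorov eddy size, eta = (nu^3 / eps)**0.25.

def kolmogorov_length(nu, epsilon):
    """nu: kinematic viscosity (m^2/s); epsilon: specific power input (W/kg)."""
    return (nu**3 / epsilon) ** 0.25

nu = 1e-6        # water-like culture medium, m^2/s
epsilon = 0.1    # assumed local energy dissipation during intense agitation, W/kg
eta = kolmogorov_length(nu, epsilon)
print(f"eta = {eta * 1e6:.0f} micrometres")
# ~56 um: comfortably above the roughly 15-20 um size of a detached cell,
# so the cells sit below the smallest turbulent eddies and escape damage.
```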

Relevance: 30.00%

Abstract:

The production of recombinant therapeutic proteins is an active area of research in drug development. These bio-therapeutic drugs target nearly 150 disease states and promise to bring better treatments to patients. However, if new bio-therapeutics are to be made more accessible and affordable, improvements in production performance and optimization of processes are necessary. A major challenge lies in controlling the effect of process conditions on the production of intact functional proteins. To achieve this, improved tools are needed for bio-processing. For example, implementation of process modeling and high-throughput technologies can be used to achieve quality by design, leading to improvements in productivity. Commercially, the most sought-after targets are secreted proteins, due to the ease of handling in downstream procedures. This chapter outlines different approaches for the production and optimization of secreted proteins in the host Pichia pastoris. © 2012 Springer Science+Business Media, LLC.

Relevance: 30.00%

Abstract:

Background: In 2008, the Anticholinergic Cognitive Burden (ACB) scale was generated through a combination of laboratory data, literature review, and expert opinion. This scale identified an increased risk of mortality and worsening cognitive function in multiple populations, including 13,000 older adults in the United Kingdom. We present an updated scale based on new information and new medications available on the market. Methods: We conducted a systematic review for publications recognizing medications with adverse cognitive effects due to anticholinergic properties and found no new medications since 2008. Therefore we identified medications from a review of newly approved medications since 2008 and medications identified through the clinical experience of the authors. To be included in the updated ACB scale, medications must have met the following criteria: ACB score of 1: evidence from in vitro data that the medication has antagonist activity at muscarinic receptors; ACB score of 2: evidence from literature, prescriber's information, or expert opinion of clinical anticholinergic effect; ACB score of 3: evidence from literature, prescriber's information, or expert opinion of the medication causing delirium. Results: The reviewer panel included two geriatric pharmacists, one geriatric psychiatrist, one geriatrician, and one hospitalist. Twenty-three medications were eligible for review and possible inclusion in the updated ACB scale. Of these, seven medications were excluded due to a lack of evidence for anticholinergic activity. Of the remaining 16 medications, ten had laboratory evidence of anticholinergic activity and were added to the ACB list with a score of one. One medication was added with a score of two. Five medications were included in the ACB scale with a score of three. Conclusions: The revised ACB scale provides an update of medications with anticholinergic effects that may increase the risk of cognitive impairment. Future updates will be conducted routinely to maintain an applicable library of medications for use in clinical and research environments.
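In clinical use, the scale is applied by summing the per-medication scores across a patient's medication list; a minimal sketch follows (the drug-to-score excerpt below is illustrative, so consult the published scale rather than this mapping):

```python
# Hypothetical excerpt of the ACB mapping, for illustration only.
ACB_SCORES = {
    "diphenhydramine": 3,
    "oxybutynin": 3,
    "ranitidine": 1,
    "metoprolol": 1,
}

def total_acb(medications):
    """Sum ACB scores (1-3) across a medication list; unknown drugs score 0."""
    return sum(ACB_SCORES.get(m.lower(), 0) for m in medications)

meds = ["Diphenhydramine", "Ranitidine"]
print(total_acb(meds))  # 4 -- totals of 3+ are often flagged as clinically relevant
```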

Relevance: 30.00%

Abstract:

Removal of dissolved salts and toxic chemicals from water, especially at levels of a few parts per million (ppm), is one of the most difficult problems in water treatment. Several methods are used for water purification; the choice of method depends mainly on the level of feed-water salinity, the source of energy, and the type of contaminants present. Distillation is an age-old method that can remove all types of dissolved impurities from contaminated water. In multiple effect distillation (MED), the latent heat of steam is recycled several times to produce many units of distilled water from one unit of primary steam input. This is already used in large-capacity plants for treating sea water. The challenge lies in designing a system for small-scale operations that can treat a few cubic meters of water per day, especially suitable for rural communities where the available water is brackish. A small-scale MED unit with an extendable number of effects has been designed and analyzed for optimum yield in terms of total distillate produced. © 2010 Elsevier B.V.
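The yield argument behind MED can be sketched with back-of-the-envelope arithmetic (the per-effect efficiency is an assumed illustrative figure, not the paper's design value):

```python
# Each effect re-condenses vapour to boil the next, so one unit of primary
# steam yields roughly one unit of distillate per effect, minus losses.

def med_daily_yield(steam_kg_per_day, n_effects, per_effect_efficiency=0.85):
    """Total distillate (kg/day) for an n-effect MED unit."""
    total = 0.0
    vapour = steam_kg_per_day
    for _ in range(n_effects):
        vapour *= per_effect_efficiency   # latent heat recycled, with losses
        total += vapour
    return total

# 500 kg/day of primary steam through 4 effects -> ~1,354 kg, i.e. ~1.4 m^3/day,
# which is the "few cubic meters per day" scale targeted for rural communities.
print(round(med_daily_yield(500, 4)))
```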

Relevance: 30.00%

Abstract:

For the treatment and monitoring of Parkinson's disease (PD) to be scientific, a key requirement is that measurement of disease stages and severity is quantitative, reliable, and repeatable. The last 50 years of PD research have been dominated by qualitative, subjective ratings obtained by human interpretation of the presentation of disease signs and symptoms at clinical visits. More recently, "wearable," sensor-based, quantitative, objective, and easy-to-use systems have been developed for quantifying PD signs for large numbers of participants over extended durations. This technology has the potential to significantly improve clinical diagnosis and management in PD, as well as the conduct of clinical studies. However, the large-scale, high-dimensional character of the data captured by these wearable sensors requires sophisticated signal processing and machine-learning algorithms to transform them into scientifically and clinically meaningful information. Such algorithms that "learn" from data have shown remarkable success in making accurate predictions for complex problems in which human skill has been required to date, but they are challenging to evaluate and apply without a basic understanding of the underlying logic on which they are based. This article contains a nontechnical tutorial review of relevant machine-learning algorithms, also describing their limitations and how these can be overcome. It discusses the implications of this technology and a practical road map for realizing its full potential in PD research and practice. © 2016 International Parkinson and Movement Disorder Society.
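As a hedged illustration of the kind of pipeline such a review surveys (entirely invented data and feature names, not drawn from any PD study):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# 200 hypothetical recordings x 3 features (e.g. tremor power, gait
# variability, a bradykinesia index), with labels standing in for
# clinician severity ratings.
X = rng.normal(size=(200, 3))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
# Cross-validation gives the honest out-of-sample estimate that rigorous
# evaluation of such "learning" algorithms requires.
print(cross_val_score(clf, X, y, cv=5).mean())
```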