837 results for "Entropy of a sampling design"
Abstract:
This paper highlights the key role played by solubility in influencing gelation and demonstrates that many facets of the gelation process depend on this vital parameter. In particular, we relate the thermal stability (T-gel) and minimum gelation concentration (MGC) values of small-molecule gels to the solubility and cooperative self-assembly of the gelator building blocks. By employing a van't Hoff analysis of solubility data, determined from simple NMR measurements, we are able to generate T-calc values that reflect the calculated temperature for complete solubilization of the networked gelator. The concentration dependence of T-calc allows the previously difficult-to-rationalize "plateau-region" thermal stability values to be elucidated in terms of gelator molecular design. This is demonstrated for a family of four gelators with lysine units attached to each end of an aliphatic diamine, with different peripheral protecting groups (Z or Boc) in different locations on the periphery of the molecule. By tuning the peripheral protecting groups of the gelators, the solubility of the system is modified, which in turn controls the saturation point of the system and hence the concentration at which network formation takes place. We report that the critical concentration (C-crit) of gelator incorporated into the solid-phase, sample-spanning network within the gel is invariant of gelator structural design. However, because some systems have higher solubilities, they are less effective gelators and require higher total concentrations to achieve gelation, hence shedding light on the role of the MGC parameter in gelation. Furthermore, gelator structural design also modulates the degree of cooperative self-assembly through solubility effects, as determined by applying a cooperative binding model to NMR data. Finally, the effect of gelator chemical design on the spatial organization of the networked gelator was probed by small-angle neutron and X-ray scattering (SANS/SAXS) on the native gel, and a tentative self-assembly model is proposed.
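As a rough illustration of the van't Hoff treatment described above (not the paper's data or code), the sketch below assumes the NMR-derived solubility c(T) obeys ln c = -ΔH/(RT) + ΔS/R, so that T-calc for a given total gelator concentration is ΔH/(ΔS - R ln c); all numerical values are hypothetical placeholders.

```python
# Minimal van't Hoff sketch: fit ln(c) vs 1/T to NMR-derived solubility data,
# then predict T_calc for a chosen total gelator concentration.
# All values below are hypothetical placeholders, not data from the paper.
import numpy as np

T = np.array([288.0, 298.0, 308.0, 318.0, 328.0])        # temperature, K
c = np.array([0.0021, 0.0038, 0.0065, 0.0107, 0.0170])   # solubility, mol/L

R = 8.314  # gas constant, J mol^-1 K^-1

# van't Hoff: ln(c) = -dH/(R*T) + dS/R, i.e. linear in 1/T
slope, intercept = np.polyfit(1.0 / T, np.log(c), 1)
dH = -slope * R          # dissolution enthalpy, J/mol
dS = intercept * R       # dissolution entropy, J/(mol K)

def t_calc(total_conc):
    """Temperature (K) at which a sample of the given total concentration
    (mol/L) would be fully solubilised according to the fit."""
    return dH / (dS - R * np.log(total_conc))

print(f"dH = {dH / 1000:.1f} kJ/mol, dS = {dS:.1f} J/(mol K)")
print(f"T_calc at 0.02 mol/L = {t_calc(0.02):.1f} K")
```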
Abstract:
This paper studies how a design concept was interactionally produced in the talk-in-interaction between an architect and client representatives. The empirical analysis was informed by ethnomethodology and conversation analysis to observe structures and patterns of talk that accomplished actions and practices of design. Some differences were observed between the properties of the design concept and those of the design ideas that were considered during these conversations. The design concept was observed to be significant for assessing why some moves in a design space were considered better than others. The importance of the design concept to these interactions raised more general questions about what a design concept is and how it can be described as an object type. With reference to studies of science, technology and society, these concerns are provisionally engaged with, and further study of the object properties of design concepts is suggested.
Abstract:
The paper is an investigation of the exchange of ideas and information between an architect and building users in the early stages of the building design process, before the design brief or any drawings have been produced. The purpose of the research is to gain insight into the type of information users exchange with architects in early design conversations and to better understand the influence that the format of design interactions and interactional behaviours have on the exchange of information. We report an empirical study of pre-briefing conversations in which the overwhelming majority of the exchanges were about the functional or structural attributes of space; discussions that touched on the phenomenological, perceptual and symbolic meanings of space were rare. We explore the contextual features of meetings, the conversational strategies taken by the architect to prompt the users for information, and the influence these had on the information provided. Recommendations are made on the format and structure of pre-briefing conversations and on designers' strategies for raising the level of information provided by the user beyond the functional or structural attributes of space.
Abstract:
The Lifetime Homes (LTH) concept, initiated in 1989 by the Helen Hamlyn Trust and subsequently promoted by the Joseph Rowntree Foundation, emerged at a point when there was growing awareness of the decline of both private and public sector housing quality, especially in relation to floorspace standards (Karn & Sheridan, 1994). LTH were intended to offset the concerns of, first, the house-buying public about the appearance and affordability of homes suitable for successive generations; second, the private house-building industry about the cost and marketability of incorporating 'inclusive' design features; and third, Registered Social Landlords (RSLs), who had to balance cost constraints with addressing the needs of a growing number of households with older and/or disabled people. Approved Document Part M of the building regulations was extended in 1999 from public buildings to private dwellings, and currently requires that all new housing meet minimal 'visitability' criteria. Although the signs are that Part M will be incrementally extended to incorporate LTH principles, the paper argues that in their existing form they are insufficient to act as a key component of the government's 'new agenda for British housing'. This paper therefore explores how they might usefully be expanded from an approach largely based on compromise to one that inspires innovative, flexible and inclusive house forms, which also challenge design conventions.
Abstract:
This paper introduces an international collaboration between the EU and Asia in education, training and research in the field of the sustainable built environment, which aims to develop a network for the exchange of practical and intellectual knowledge and training between Chinese and European universities in the field of sustainable building design and construction. The projects, funded by the European Commission Asia-Link programme, the UK Foreign & Commonwealth Office, the British Council and the UK Engineering and Physical Sciences Research Council (EPSRC), are introduced. These projects have had a significant impact on promoting sustainable development in the built environment in China. The aim of this paper is to share these experiences with those who are interested in, and searching for ways of, collaborating with China in education and research.
Abstract:
Background and purpose: The paper reports a study of secondary school teachers in the Gucha district of Kenya, examining their perceptions of their own effectiveness, the structure of these self-perceptions, variations in self-perceived effectiveness, and the relationship between self-perceptions of effectiveness and the examination performance of their students. Design and methods: Data were based on questionnaires completed by 109 English and mathematics teachers from a random sample of 30 schools in the Gucha district of Kenya. Pupil examination results were also collected from the schools. Results: Three dimensions of self-perceived effectiveness emerged from a factor analysis: pedagogic process, personal and affective aspects of teaching, and effectiveness with regard to pupil performance. Teachers tended to rate themselves relatively highly on the first two, process-oriented, dimensions but less highly on the third, outcome-oriented, dimension. Self-ratings for pupil outcomes correlated with pupil examination performance at school level. Conclusions: The results show that these teachers can have a sense of themselves as competent classroom performers and educational professionals without necessarily having a strong sense of efficacy with regard to pupil outcomes.
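Purely as an illustration of the analysis pattern described (factor-analysing questionnaire items into a few dimensions and then correlating a factor score with examination results), the sketch below runs scikit-learn's FactorAnalysis on synthetic responses; none of the numbers relate to the study's data.

```python
# Illustrative only: factor analysis of synthetic questionnaire responses,
# followed by a correlation between factor scores and (synthetic) exam results.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
n_teachers, n_items, n_factors = 109, 12, 3

# Synthetic item responses generated from 3 latent dimensions plus noise.
latent = rng.normal(size=(n_teachers, n_factors))
loadings = rng.normal(scale=0.8, size=(n_factors, n_items))
items = latent @ loadings + rng.normal(scale=0.5, size=(n_teachers, n_items))

fa = FactorAnalysis(n_components=n_factors, random_state=0)
scores = fa.fit_transform(items)          # per-teacher factor scores

# Synthetic exam outcome loosely tied to one latent dimension.
exam = 0.6 * latent[:, 2] + rng.normal(scale=0.8, size=n_teachers)

r = max(abs(np.corrcoef(scores[:, k], exam)[0, 1]) for k in range(n_factors))
print(f"strongest |correlation| between a factor score and exam results: {r:.2f}")
```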
Abstract:
The development and performance of a three-stage tubular model of the large human intestine is outlined. Each stage comprises a membrane fermenter in which the flow of an aqueous polyethylene glycol solution on the outside of the tubular membrane is used to control the removal of water and metabolites (principally short-chain fatty acids) from, and thus the pH of, the flowing contents on the fermenter side. The three-stage system gave a fair representation of conditions in the human gut. Numbers of the main bacterial groups were consistently higher than in an existing three-chemostat gut model system, suggesting the advantage of the new design in providing an environment for bacterial growth that represents the actual colonic microflora. Concentrations of short-chain fatty acids and pH levels throughout the system were similar to those associated with corresponding sections of the human colon. The model was able to achieve considerable water transfer across the membrane, although the values were not as high as those in the colon. The model thus goes some way towards a realistic simulation of the colon, although it makes no pretence to simulate the pulsating nature of the real flow. The flow conditions in each section are characterized by low Reynolds numbers; mixing due to Taylor dispersion is significant, and the implications of Taylor mixing and biofilm development for the stability of the system, that is, its ability to operate without washout, are briefly analysed and discussed. It is concluded that both phenomena are important for stabilizing the model and the human colon.
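To illustrate the flow-regime argument in the abstract, the sketch below computes a tube-flow Reynolds number and a Taylor-Aris axial dispersion coefficient; the geometry, velocity and fluid properties are assumed values, not the model's actual specification.

```python
# Back-of-envelope check of the flow regime: laminar tube flow where
# Taylor(-Aris) dispersion dominates axial mixing.
# All numerical values are assumptions, not the model's specification.
rho = 1000.0    # density, kg/m^3 (water-like contents)
mu = 5.0e-3     # dynamic viscosity, Pa.s (assumed viscous digesta)
d = 0.02        # internal diameter of the tubular membrane, m (assumed)
U = 1.0e-4      # mean axial velocity, m/s (assumed)
D = 1.0e-9      # molecular diffusivity of a short-chain fatty acid, m^2/s

Re = rho * U * d / mu                        # Reynolds number
a = d / 2.0
D_eff = D + (a ** 2 * U ** 2) / (48.0 * D)   # Taylor-Aris dispersion coefficient

print(f"Re = {Re:.2f}  (low, laminar)")
print(f"Taylor-Aris D_eff = {D_eff:.2e} m^2/s  vs molecular D = {D:.1e} m^2/s")
```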
Abstract:
Background: Osteoarthritis (OA) of the knee is the most prevalent joint disorder. Previous studies suggest that bromelain, a pineapple extract, may be a safer alternative/adjunctive treatment for knee OA than current conventional treatment. Aim: To assess the efficacy of bromelain in treating OA of the knee. Design: Randomized, double-blind, placebo-controlled trial. Methods: Subjects (n=47) with a confirmed diagnosis of moderate to severe knee OA were randomized to 12 weeks of bromelain 800 mg/day or placebo, with a 4-week follow-up. Knee symptoms (pain, stiffness and function) and quality of life were reported monthly in the WOMAC and SF36 questionnaires, respectively. Adverse events were also recorded. The primary outcome measure was the change in total WOMAC score from baseline to the end of treatment at week 12. Longitudinal models were used to evaluate outcome. Results: Thirty-one patients completed the trial (14 bromelain, 17 placebo). No statistically significant differences were observed between groups for the primary outcome (coefficient 11.16, p=0.27, 95% CI -8.86 to 31.18), nor for the WOMAC subscales or SF36. Both treatment groups showed clinically relevant improvement in the WOMAC disability subscale only. Adverse events were generally mild in nature. Discussion: This study suggests that bromelain is not efficacious as an adjunctive treatment for moderate to severe OA, but its limitations support the need for a follow-up study.
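As a sketch of the kind of longitudinal model mentioned (not the trial's actual analysis or data), the code below fits a random-intercept mixed model to synthetic repeated WOMAC-style scores using statsmodels.

```python
# Illustrative longitudinal analysis: random-intercept mixed model on
# synthetic repeated WOMAC-style scores. Not the trial's data or model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
rows = []
for pid in range(31):                                # 31 completers
    group = "bromelain" if pid < 14 else "placebo"
    baseline = rng.normal(50, 10)                    # synthetic baseline score
    for week in (0, 4, 8, 12):
        womac = baseline - rng.normal(3, 2) * week / 12 + rng.normal(0, 5)
        rows.append({"pid": pid, "group": group, "week": week, "womac": womac})
df = pd.DataFrame(rows)

# Random intercept per patient; fixed effects for time, group and interaction.
model = smf.mixedlm("womac ~ week * group", df, groups=df["pid"]).fit()
print(model.summary())
```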
Abstract:
OBJECTIVES: To determine the cost-effectiveness of influenza vaccination in people aged 65-74 years in the absence of co-morbidity. DESIGN: Primary research: randomised controlled trial. SETTING: Primary care. PARTICIPANTS: People without risk factors for influenza or contraindications to vaccination were identified from 20 general practitioner (GP) practices in Liverpool in September 1999 and invited to participate in the study. There were 5875/9727 (60.4%) people aged 65-74 years identified as potentially eligible and, of these, 729 (12%) were randomised. INTERVENTION: Participants were randomised to receive either influenza vaccine or placebo (ratio 3:1), with all individuals receiving pneumococcal vaccine unless administered in the previous 10 years. Of the 729 people randomised, 552 received vaccine and 177 received placebo; 726 individuals were administered pneumococcal vaccine. MAIN OUTCOME MEASURES AND METHODOLOGY OF ECONOMIC EVALUATION: GP attendance with influenza-like illness (ILI) or pneumonia (primary outcome measure); or any respiratory symptoms; hospitalisation with a respiratory illness; death; participant self-reported ILI; quality of life (QoL) measures at 2, 4 and 6 months post-study vaccination; adverse reactions 3 days after vaccination. A cost-effectiveness analysis was undertaken to identify the incremental cost associated with the avoidance of episodes of influenza in the vaccination population and an impact model was used to extrapolate the cost-effectiveness results obtained from the trial to assess their generalisability throughout the NHS. RESULTS: In England and Wales, weekly consultations for influenza and ILI remained at baseline levels (less than 50 per 100,000 population) until week 50/1999 and then increased rapidly, peaking during week 2/2000 with a rate of 231/100,000. This rate fell within the range of 'higher than expected seasonal activity' of 200-400/100,000. Rates then quickly declined, returning to baseline levels by week 5/2000. The predominant circulating strain during this period was influenza A (H3N2). Five (0.9%) people in the vaccine group were diagnosed by their GP with an ILI compared to two (1.1%) in the placebo group [relative risk (RR), 0.8; 95% confidence interval (CI) = 0.16 to 4.1]. No participants were diagnosed with pneumonia by their GP and there were no hospitalisations for respiratory illness in either group. Significantly fewer vaccinated individuals self-reported a single ILI (4.6% vs 8.9%, RR, 0.51; 95% CI for RR, 0.28 to 0.96). There was no significant difference in any of the QoL measurements over time between the two groups. Reported systemic side-effects showed no significant differences between groups. Local side-effects occurred with a significantly increased incidence in the vaccine group (11.3% vs 5.1%, p = 0.02). Each GP consultation avoided by vaccination was estimated from trial data to generate a net NHS cost of 174 pounds. CONCLUSIONS: No difference was seen between groups for the primary outcome measure, although the trial was underpowered to demonstrate a true difference. Vaccination had no significant effect on any of the QoL measures used, although vaccinated individuals were less likely to self-report ILI. The analysis did not suggest that influenza vaccination in healthy people aged 65-74 years would lead to lower NHS costs. 
Future research should look at ways to maximise vaccine uptake in people at greatest risk from influenza and also the level of vaccine protection afforded to people from different age and socio-economic populations.
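The trial-based figure of £174 per consultation avoided comes from the study itself; the sketch below only illustrates the generic incremental cost-effectiveness arithmetic behind such a number, using made-up per-participant costs and consultation rates.

```python
# Generic incremental cost-effectiveness arithmetic: extra NHS cost of
# vaccination divided by GP consultations avoided. Inputs are made up.
def cost_per_consultation_avoided(cost_vacc, cost_placebo,
                                  consults_vacc, consults_placebo):
    """Incremental cost per consultation avoided (per participant)."""
    delta_cost = cost_vacc - cost_placebo
    consults_avoided = consults_placebo - consults_vacc
    if consults_avoided <= 0:
        raise ValueError("vaccination did not reduce consultations")
    return delta_cost / consults_avoided

# Hypothetical per-participant values: GBP costs and consultation rates.
icer = cost_per_consultation_avoided(cost_vacc=10.0, cost_placebo=2.0,
                                     consults_vacc=0.02, consults_placebo=0.06)
print(f"net NHS cost per GP consultation avoided (illustrative): £{icer:.0f}")
```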
Abstract:
Objective: To compare the frequency of nail biting in 4 settings (interventions) designed to elicit the functions of nail biting, and to compare the results with a self-report questionnaire about the functions of nail biting. Design: Randomised allocation of participants to order of conditions. Setting: University psychology department. Subjects: Forty undergraduates who reported biting their nails. Interventions: Left alone (boredom), solving maths problems (frustration), reprimanded for nail biting (contingent attention), continuous conversation (non-contingent attention). Main outcome measures: Number of times the undergraduates bit their nails. Results: Nail biting occurred most often in two conditions, boredom and frustration. Conclusion: Nail biting in young adults occurs as a result of boredom or working on difficult problems, which may reflect a particular emotional state. It occurs least often when people are engaged in social interaction or when they are reprimanded for the behaviour.
Abstract:
Objective: To explore whether patients relearning to walk after acquired brain injury and showing cognitive-motor interference were aware of divided attention difficulty, and whether their perceptions concurred with those of treating staff. Design: Patients and neurophysiotherapists (from rehabilitation and disabled wards) completed questionnaires. Factor analyses were applied to responses. Correlations between responses, clinical measures and experimental decrements were examined. Results: Patient/staff responses showed some agreement; staff reported higher levels of perceived difficulty; responses conformed to two factors. One factor (staff and patients alike) reflected expectations about functional/motor status and did not correlate with decrements. The other factor (patients) correlated significantly with dual-task motor decrement, suggesting some genuine awareness of difficulty (cognitive performance prioritized over motor control). The other factor (staff) correlated significantly with cognitive decrement (gait prioritized over sustained attention). Conclusions: Despite some inaccurate estimation of susceptibility, patients and staff do exhibit awareness of divided attention difficulty, but with a limited degree of concurrence. In fact, our results suggest that patients and staff may be sensitive to different aspects of the deficit. Rather than 'Who knows best?', it is a question of 'Who knows what?'.
Abstract:
The sampling of a certain solid angle is a fundamental operation in realistic image synthesis, where the rendering equation describing light propagation in closed domains is solved. Monte Carlo methods for solving the rendering equation use sampling of the solid angle subtended by the unit hemisphere or unit sphere in order to perform the numerical integration of the rendering equation. In this work we consider the problem of generating uniformly distributed random samples over the hemisphere and sphere. Our aim is to construct and study a parallel sampling scheme for the hemisphere and sphere. First we apply the symmetry property to partition the hemisphere and sphere: the domain of solid angle subtended by a hemisphere is divided into a number of equal sub-domains, each representing the solid angle subtended by an orthogonal spherical triangle with fixed vertices and computable parameters. We then introduce two new algorithms for sampling orthogonal spherical triangles. Both algorithms are based on a transformation of the unit square. Similarly to Arvo's algorithm for sampling an arbitrary spherical triangle, the suggested algorithms accommodate stratified sampling. We derive the necessary transformations for the algorithms. The first sampling algorithm generates a sample by mapping the unit square onto the orthogonal spherical triangle. The second algorithm directly computes the unit radius vector of a sampling point inside the orthogonal spherical triangle. The sampling of the total hemisphere and sphere is performed in parallel for all sub-domains simultaneously by using the symmetry property of the partitioning. The applicability of the corresponding parallel sampling scheme to Monte Carlo and quasi-Monte Carlo solution of the rendering equation is discussed.
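The paper's two orthogonal-spherical-triangle algorithms are not given in the abstract, so the sketch below only illustrates the general scheme: partition the hemisphere by symmetry into equal sub-domains (here, four octants, each a tri-rectangular spherical triangle) and generate a uniformly distributed direction in each by mapping the unit square onto the solid angle, which also allows stratified and parallel sampling.

```python
# Illustrative sketch (not the paper's algorithms): uniform directions over
# the hemisphere, generated per octant by mapping the unit square onto the
# corresponding solid angle. Each octant is a tri-rectangular spherical
# triangle, so the same (u, v) pair can serve all sub-domains in parallel.
import math
import random

def sample_octant(u, v, octant):
    """Map (u, v) in [0,1)^2 to a unit vector uniformly distributed over the
    solid angle of one hemisphere octant (octant = 0, 1, 2 or 3)."""
    z = u                                   # uniform cos(theta) -> uniform solid angle
    r = math.sqrt(max(0.0, 1.0 - z * z))
    phi = (octant + v) * (math.pi / 2.0)    # one quarter turn per octant
    return (r * math.cos(phi), r * math.sin(phi), z)

# One sample per octant, all generated from the same unit-square point.
u, v = random.random(), random.random()
for k in range(4):
    x, y, z = sample_octant(u, v, k)
    print(f"octant {k}: ({x:+.3f}, {y:+.3f}, {z:+.3f})")
```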
Abstract:
Current limitations of piezoelectric and electrostatic transducers are discussed. A force-feedback electrostatic transducer capable of operating at bandwidths up to 20 kHz is described. Advantages of the proposed design are a linearised operation, which simplifies the feedback control aspects, and robustness of the performance characteristics to environmental perturbations. Applications in nanotechnology, optical sciences and acoustics are discussed.
Abstract:
This paper derives an efficient algorithm for constructing sparse kernel density (SKD) estimates. The algorithm first selects a very small subset of significant kernels using an orthogonal forward regression (OFR) procedure based on the D-optimality experimental design criterion. The weights of the resulting sparse kernel model are then calculated using a modified multiplicative nonnegative quadratic programming algorithm. Unlike most SKD estimators, the proposed D-optimality regression approach is an unsupervised construction algorithm and does not require an empirical desired response for the kernel selection task. The strength of the D-optimality OFR lies in the fact that the algorithm automatically selects a small subset of the most significant kernels related to the largest eigenvalues of the kernel design matrix, which accounts for most of the energy of the kernel training data, and this also guarantees the most accurate kernel weight estimate. The proposed method is also computationally attractive in comparison with many existing SKD construction algorithms. Extensive numerical investigation demonstrates the ability of this regression-based approach to efficiently construct a very sparse kernel density estimate with excellent test accuracy, and our results show that the proposed method compares favourably with other existing sparse methods, in terms of test accuracy, model sparsity and complexity, for constructing kernel density estimates.
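A simplified sketch of the two-stage idea (an illustrative variant, not the authors' exact algorithm): greedy orthogonal forward selection of kernel centres using a D-optimality-style criterion, followed by a multiplicative non-negative update of the kernel weights fitted against the full Parzen estimate; data and parameters are synthetic.

```python
# Simplified two-stage sparse kernel density sketch (illustrative variant,
# not the paper's exact algorithm).
import numpy as np

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2.0, 0.5, 200), rng.normal(1.5, 1.0, 200)])
sigma, n_kernels = 0.4, 8

# Candidate design matrix: column j is a Gaussian kernel centred at x[j].
K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / sigma) ** 2)

# Stage 1: orthogonal forward selection. Picking the candidate with the
# largest orthogonalised energy greedily maximises the determinant
# (D-optimality) of the selected design.
residual, selected = K.copy(), []
for _ in range(n_kernels):
    energy = np.sum(residual ** 2, axis=0)
    energy[selected] = -np.inf
    j = int(np.argmax(energy))
    selected.append(j)
    q = residual[:, j] / np.linalg.norm(residual[:, j])
    residual -= np.outer(q, q @ residual)        # Gram-Schmidt deflation

# Stage 2: multiplicative non-negative update of the weights, fitting the
# sparse model to the full Parzen estimate evaluated at the data points.
A = K[:, selected]
parzen = K.mean(axis=1)
w = np.full(n_kernels, 1.0 / n_kernels)
for _ in range(500):
    w *= (A.T @ parzen) / (A.T @ (A @ w) + 1e-12)
w /= w.sum()                                     # convex combination of kernels

print("selected centres:", np.sort(x[selected]).round(2))
print("weights:", np.round(w, 3))
```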
Abstract:
This paper reports on the design and manufacture of an ultra-wide (5-30 µm) infrared edge filter for use in FTIR studies of the low-frequency vibrational modes of metallo-proteins. We present details of the spectral design and manufacture of such a filter, which meets the demanding bandwidth and transparency requirements of the application, together with spectra that illustrate the new data made possible by such a filter. A design model of the filter and the materials used in its construction has been developed that is capable of accurately predicting spectral performance both at 300 K and at the reduced operating temperature of 200 K. This design model is based on the optical and semiconductor properties of a multilayer filter containing PbTe (IV-VI) layer material in combination with the dielectric dispersion of ZnSe (II-VI) deposited on a CdTe (II-VI) substrate, together with the use of BaF2 (II-VII) as an antireflection layer. Comparisons between the computed spectral performance of the model and spectral measurements from manufactured coatings over a wavelength range of 4-30 µm and a temperature range of 300-200 K are presented. Finally, we present the results of FTIR measurements of Photosystem II showing the improvement in the signal-to-noise ratio of the measurement due to using the filter, together with a light-induced FTIR difference spectrum of Photosystem II.
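As a generic illustration of the kind of multilayer calculation such a design model involves (not the authors' dispersion- and temperature-dependent model), the sketch below uses the standard characteristic-matrix method at normal incidence with constant placeholder refractive indices roughly representative of PbTe, ZnSe and a CdTe substrate.

```python
# Generic characteristic-matrix (transfer-matrix) sketch for the normal-
# incidence transmittance of a thin-film stack on a semi-infinite substrate.
# Constant, lossless placeholder indices; no dispersion or temperature
# dependence, unlike the design model described in the paper.
import numpy as np

def stack_transmittance(wavelength_um, layers, n_in=1.0, n_sub=2.67):
    """layers: list of (refractive_index, thickness_um); n_sub ~ CdTe."""
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        delta = 2.0 * np.pi * n * d / wavelength_um          # phase thickness
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    B, C = M @ np.array([1.0, n_sub])
    return 4.0 * n_in * n_sub / abs(n_in * B + C) ** 2       # transmittance

# Six quarter-wave PbTe/ZnSe pairs (placeholder n = 5.6 and 2.4) centred at
# 3.5 um, giving a crude long-wave-pass edge below the 5-30 um band.
lam0 = 3.5
layers = []
for _ in range(6):
    layers += [(5.6, lam0 / (4 * 5.6)), (2.4, lam0 / (4 * 2.4))]

for lam in (3.0, 3.5, 5.0, 10.0, 20.0, 30.0):
    print(f"{lam:5.1f} um  T = {stack_transmittance(lam, layers):.3f}")
```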