833 results for collaborative reading
Abstract:
Background: Insulin sensitivity (Si) is improved by weight loss and exercise, but the effects of the replacement of saturated fatty acids (SFAs) with monounsaturated fatty acids (MUFAs) or carbohydrates of high glycemic index (HGI) or low glycemic index (LGI) are uncertain. Objective: We conducted a dietary intervention trial to study these effects in participants at risk of developing metabolic syndrome. Design: We conducted a 5-center, parallel-design, randomized controlled trial [RISCK (Reading, Imperial, Surrey, Cambridge, and King's)]. The primary and secondary outcomes were changes in Si (measured by using an intravenous glucose tolerance test) and cardiovascular risk factors. Measurements were made after 4 wk of a high-SFA and HGI (HS/HGI) diet and after a 24-wk intervention with HS/HGI (reference), high-MUFA and HGI (HM/HGI), HM and LGI (HM/LGI), low-fat and HGI (LF/HGI), and LF and LGI (LF/LGI) diets. Results: We analyzed data for 548 of 720 participants who were randomly assigned to treatment. The median Si was 2.7 × 10⁻⁴ mL · μU⁻¹ · min⁻¹ (interquartile range: 2.0, 4.2 × 10⁻⁴ mL · μU⁻¹ · min⁻¹), and unadjusted mean percentage changes (95% CIs) after 24 wk of treatment (P = 0.13) were as follows: for the HS/HGI group, −4% (−12.7%, 5.3%); for the HM/HGI group, 2.1% (−5.8%, 10.7%); for the HM/LGI group, −3.5% (−10.6%, 4.3%); for the LF/HGI group, −8.6% (−15.4%, −1.1%); and for the LF/LGI group, 9.9% (2.4%, 18.0%). Total cholesterol (TC), LDL cholesterol, and apolipoprotein B concentrations decreased with SFA reduction. Decreases in TC and LDL-cholesterol concentrations were greater with LGI. Fat reduction lowered HDL cholesterol and apolipoprotein A1 and B concentrations. Conclusions: This study did not support the hypothesis that isoenergetic replacement of SFAs with MUFAs or carbohydrates has a favorable effect on Si. Lowering GI enhanced the reductions in TC and LDL-cholesterol concentrations, with tentative evidence of improvements in Si in the LF-treatment group. This trial was registered at clinicaltrials.gov as ISRCTN29111298.
Abstract:
The study examined: (a) the role of phonological, grammatical, and rapid automatized naming (RAN) skills in reading and spelling development; and (b) the component processes of early narrative writing skills. Fifty-seven Turkish-speaking children were followed from Grade 1 to Grade 2. RAN was the most powerful longitudinal predictor of reading speed, and its effect was evident even when previous reading skills were taken into account. Broadly, the phonological and grammatical skills made reliable contributions to spelling performance, but their effects were completely mediated by previous spelling skills. Different aspects of the narrative writing skills were related to different processing skills. While handwriting speed predicted writing fluency, spelling accuracy predicted spelling error rate. Vocabulary and working memory were the only reliable longitudinal predictors of the quality of composition content. The overall model, however, failed to explain any reliable variance in the structural quality of the compositions.
Abstract:
We describe a compositional framework, together with its supporting toolset, for hardware/software co-design. Our framework is an integration of a formal approach within a traditional design flow. The formal approach is based on Interval Temporal Logic and its executable subset, Tempura. Refinement is the key element of our framework: from a single formal specification of the system it derives the software and hardware parts of the implementation, while preserving all properties of the system specification. During refinement, simulation is used to choose the appropriate refinement rules, which are applied automatically in the HOL system. The framework is illustrated with two case studies. The work presented is part of a UK collaborative research project between the Software Technology Research Laboratory at De Montfort University and the Oxford University Computing Laboratory.
Abstract:
We consider problems of splitting and connectivity augmentation in hypergraphs. In a hypergraph G = (V + s, E), to split off two edges su and sv is to replace them with a single edge uv. We are interested in doing this in such a way as to preserve a defined level of connectivity in V. The splitting technique is often used as a way of adding new edges to a graph or hypergraph so as to augment its connectivity to some prescribed level. We begin by providing a short history of work done in this area. Several preliminary results are then given in a general form so that they may be used to tackle several problems. We then analyse the hypergraphs G = (V + s, E) for which there is no split preserving the local edge connectivity present in V. We provide two structural theorems, one of which implies a slight extension of Mader's classical splitting theorem. We also provide a characterisation of the hypergraphs for which there is no such "good" split, and a splitting result concerned with a specialisation of the local-connectivity function. We then use our splitting results to provide an upper bound on the smallest number of size-two edges that must be added to a given hypergraph to ensure that the resulting hypergraph satisfies λ(x, y) ≥ r(x, y) for all x, y in V, where r is an integer-valued, symmetric requirement function on V × V. This is the so-called "local edge-connectivity augmentation problem" for hypergraphs. We also provide an extension of a theorem of Szigeti about augmenting to satisfy a requirement r, but using hyperedges. Next, in a result born of collaborative work with Zoltán Király from Budapest, we show that the local-connectivity augmentation problem is NP-complete for hypergraphs. Lastly, we concern ourselves with an augmentation problem that includes a locational constraint.
The premise is that we are given a hypergraph H = (V, E) with a bipartition P = {P1, P2} of V and asked to augment it with size-two edges, so that the result is k-edge-connected and has no new edge contained in either Pi. We consider the splitting technique and describe the obstacles that prevent us from forming "good" splits. From this we deduce results about which hypergraphs have a complete Pk-split. This leads to a minimax result on the optimal number of edges required, and a polynomial algorithm that provides an optimal augmentation.
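The splitting-off operation and the local edge connectivity λ(x, y) it is meant to preserve can be illustrated for the ordinary-graph case (size-two edges only). The following is a minimal Python sketch, not the thesis's actual machinery: the function names are invented for illustration, hyperedges are not handled, and λ is computed as a unit-capacity maximum flow.

```python
from collections import defaultdict, deque

def split(edges, s, u, v):
    """Split off at s: replace edges {s,u} and {s,v} with a single edge {u,v}."""
    edges = list(edges)
    edges.remove(frozenset({s, u}))
    edges.remove(frozenset({s, v}))
    edges.append(frozenset({u, v}))
    return edges

def local_edge_connectivity(edges, x, y):
    """lambda(x, y): the maximum number of edge-disjoint x-y paths, computed
    as a unit-capacity max flow (Edmonds-Karp). Size-two edges only; true
    hyperedges would need a different flow model."""
    cap = defaultdict(int)              # residual capacities on directed arcs
    adj = defaultdict(set)
    for e in edges:
        a, b = tuple(e)
        cap[(a, b)] += 1
        cap[(b, a)] += 1
        adj[a].add(b)
        adj[b].add(a)
    flow = 0
    while True:
        parent, queue = {x: None}, deque([x])
        while queue and y not in parent:        # BFS for an augmenting path
            a = queue.popleft()
            for b in adj[a]:
                if b not in parent and cap[(a, b)] > 0:
                    parent[b] = a
                    queue.append(b)
        if y not in parent:
            return flow
        b = y                                   # augment along the path found
        while parent[b] is not None:
            a = parent[b]
            cap[(a, b)] -= 1
            cap[(b, a)] += 1
            b = a
        flow += 1
```

For instance, with edges {s,u}, {s,v}, {u,w}, {v,w}, splitting off the pair su, sv at s yields {u,v}, {u,w}, {v,w} and leaves λ(u, v) = 2 unchanged, a "good" split in the abstract's sense.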
Abstract:
Collected papers of the University of Reading Stenton Symposium, 2008.
Abstract:
This essay explores how The Truman Show, Peter Weir’s film about a television show, deserves more sustained analysis than it has received since its release in 1998. I will argue that The Truman Show problematizes the binary oppositions of cinema/television, disruption/stability, reality/simulation and outside/inside that structure it. The Truman Show proposes that binary oppositions such as outside/inside exist in a mutually implicating relationship. This deconstructionist strategy not only questions the film’s critical position, but also enables a reflection on the very status of film analysis itself.
Abstract:
Typeface design: collaborative work commissioned by Adobe Inc. Published but unreleased. The Adobe Devanagari typefaces were commissioned from Tiro Typeworks and collaboratively designed by Tim Holloway, Fiona Ross and John Hudson, beginning in 2005. The types were officially released in 2009. The design brief was to produce a typeface for modern business communications in Hindi and other languages, to be legible both in print and on screen. Adobe Devanagari was designed to be highly readable in a range of situations including quite small sizes in spreadsheets and in continuous text setting, as well as at display sizes, where the full character of the typeface reveals itself. The construction of the letters is based on traditional penmanship but possesses less stroke contrast than many Devanagari types, in order to maintain strong, legible forms at smaller sizes. To achieve a dynamic, fluid style the design features a rounded treatment of distinguishing terminals and stroke reversals, open counters that also aid legibility at smaller sizes, and delicately flaring strokes. Together, these details reveal an original hand and provide a contemporary approach that is clean, clear and comfortable to read whether in short or long passages of text. This new approach to a traditional script is intended to counter the dominance of rigid, staccato-like effects of straight verticals and horizontals in earlier types and many existing fonts. OpenType Layout features in the fonts provide both automated and discretionary access to an extensive glyph set, enabling sophisticated typography. Many conjuncts preferred in classical literary texts and particularly in some North Indian languages are included; these literary conjuncts may be substituted by specially designed alternative linear forms and fitted half forms. The length of the ikars—ि and ी—varies automatically according to adjacent letter or conjunct width. Regional variants of characters and numerals (e.g. Marathi forms) are included as alternates. Careful attention has been given to the placements of all vowel signs and modifiers. The fonts include both proportional and tabular numerals in Indian and European styles. Extensive kerning covers several thousand possible combinations of half forms and full forms to anticipate arbitrary conjuncts in foreign loan words.
Abstract:
This paper describes the design and implementation of an agent-based network for the support of collaborative switching tasks within the control room environment of the National Grid Company plc. This work includes aspects from several research disciplines, including operational analysis, human-computer interaction, finite state modelling techniques, intelligent agents and computer-supported co-operative work. Aspects of these procedures have been used in the analysis of collaborative tasks to produce distributed local models for all involved users. These models have been used as the basis for the production of local finite state automata. These automata have then been embedded within an agent network together with behavioural information extracted from the task and user analysis phase. The resulting support system is capable of task and communication management within the transmission despatch environment.