13 results for methodic-didactic considerations on the work with TV news
in CentAUR: Central Archive, University of Reading - UK
Abstract:
Purpose – The main aim of this paper is to present the results of a study examining managers' attitudes towards the deployment and use of information and communications technology (ICT) in their organisations. The study comes at a time when ICT is being recognised as a major enabler of innovation and new business models, with the potential for major impact on western economies and jobs. Design/methodology/approach – A questionnaire was specially designed to collect data relating to three research questions, and also included a number of open-ended questions. A total of 181 managers from a wide range of industries across a number of countries participated in the electronic survey. The quantitative responses were analysed in SPSS: exploratory factor analysis with Varimax rotation was used to identify underlying dimensions, and ANOVA was used to compare responses across groups. Findings – The survey showed that many of the respondents appeared equipped to work “any place, any time”. However, it also highlighted the challenges managers face in working in a connected operation, and the data suggested that many managers were less than confident about their companies' policies and practices in relation to information management. Originality/value – A next step from this exploratory research could be the development of a model exploring the impact of ICT on management and organisational performance in terms of the personal characteristics of the manager, the role performed, the context and the ICT provision. Further research could also examine in more detail the differences between management levels.
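The abstract names the two quantitative techniques but not the code behind them. A minimal sketch of this kind of analysis pipeline follows; the data file, item columns, factor count and grouping variable are illustrative assumptions, not details taken from the paper:

```python
# Sketch: exploratory factor analysis with Varimax rotation, followed by a
# one-way ANOVA comparing a derived factor score across respondent groups.
import pandas as pd
from factor_analyzer import FactorAnalyzer   # pip install factor_analyzer
from scipy.stats import f_oneway

responses = pd.read_csv("survey.csv")        # hypothetical Likert-item data
items = responses.filter(like="Q")           # assume item columns Q1, Q2, ...

# Extract factors and rotate with Varimax, as described in the abstract.
fa = FactorAnalyzer(n_factors=3, rotation="varimax")  # factor count assumed
fa.fit(items)
print(fa.loadings_)                          # rotated factor loadings

# Compare mean scores on the first factor across groups (e.g. management level).
responses["score"] = fa.transform(items)[:, 0]
groups = [g["score"].values for _, g in responses.groupby("mgmt_level")]
f_stat, p_value = f_oneway(*groups)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.3f}")
```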
Abstract:
The p-nitrophenyl phosphomonoesterase (pNPPase) assay is commonly used to measure cell-wall-associated and extracellular phosphatase activity of soil fungi. pNPPases are usually assayed in the context of fungal nutrition, where inorganic P supply might be enhanced by the mineralisation of monoester organic P sources in the soil. The importance of the assay to the P nutrition of soil fungi is considered on the basis of the evidence currently available, including the consistency of the methodological approach. The nature of organic P in the soil and the relevance of the assay to some specific soil substrates are discussed, particularly the chemistry and bioavailability of myo-inositol hexakisphosphate and the lower inositol phosphates. The evidence for the long-term stability of pNPPases is examined in the light of their persistence in soils, and the role of persistent extracellular fungal pNPPases in the soil P cycle is discussed. Conclusions from pNPPase-based studies must rest on an appreciation of the constraints of the assay and the complex chemistry of organic P and pNPPase in the soil.
Abstract:
The effect of poultry species (broiler or turkey) and genotype (Wrolstad or BUT T8 turkeys and Ross 308 or Cobb 500 broilers) on the efficiency with which dietary long-chain n-3 PUFA were incorporated into poultry meat was determined. Broilers and turkeys of both genotypes were fed one of six diets varying in FA composition (two replicates per genotype × diet combination). Diets contained 50 g/kg added oil, which was either blended vegetable oil (control), or partially replaced with linseed oil (20 or 40 g/kg diet), fish oil (20 or 40 g/kg diet), or a mixture of the two (20 g linseed oil and 20 g fish oil/kg diet). Feeds and samples of skinless breast and thigh meat were analyzed for FA. Wrolstad dark meat was slightly more responsive than BUT T8 (P = 0.046) to increased dietary 18:3 concentrations (slopes of 0.570 and 0.465, respectively). The Ross 308 was also slightly more responsive than the Cobb 500 (P = 0.002) in this parameter (slopes of 0.557 and 0.449). There were no other significant differences between the genotypes. There was some evidence (based on the estimates of the slopes and their associated standard errors) that white turkey meat was more responsive than white chicken meat to 20:5 (slopes of 0.504 and 0.289 for turkeys and broilers, respectively). There was no relationship between dietary 18:3 n-3 content and meat 20:5 and 22:6 contents. If birds do convert 18:3 to higher FA, these acids are not then deposited in the edible tissues.
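The genotype comparisons above are slope contrasts from dose-response regressions of meat FA content on dietary FA content. One common way to obtain such a comparison is an interaction model, sketched below with statsmodels; the column names and data file are illustrative assumptions, as the abstract does not state the exact model used:

```python
# Sketch: test whether the dose-response slope differs between genotypes
# by regressing meat FA content on dietary FA content with an interaction.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("meat_fa.csv")  # hypothetical: diet_18_3, meat_18_3, genotype

# The diet_18_3:genotype term tests for a difference in slopes; its p-value
# corresponds to comparisons such as the P = 0.046 reported for dark meat.
model = smf.ols("meat_18_3 ~ diet_18_3 * genotype", data=df).fit()
print(model.summary())
```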
Abstract:
Reliably representing both horizontal cloud inhomogeneity and vertical cloud overlap is fundamentally important for the radiation budget of a general circulation model. Here, we build on the work of Part One of this two-part paper by applying a pair of parameterisations that account for horizontal inhomogeneity and vertical overlap to global re-analysis data. These are applied both together and separately in an attempt to quantify the effects of poor representation of the two components on the radiation budget. Horizontal inhomogeneity is accounted for using the “Tripleclouds” scheme, which uses two regions of cloud in each layer of a gridbox as opposed to one; vertical overlap is accounted for using “exponential-random” overlap, which aligns vertically continuous cloud according to a decorrelation height. These are applied to a sample of scenes from a year of ERA-40 data. The largest radiative effect of horizontal inhomogeneity is found to be in areas of marine stratocumulus; the effect of vertical overlap is found to be fairly uniform, but with larger individual short-wave and long-wave effects in areas of deep, tropical convection. The combined effect of the two parameterisations is found to reduce the magnitude of the net top-of-atmosphere cloud radiative forcing (CRF) by 2.25 W m⁻², with shifts of up to 10 W m⁻² in areas of marine stratocumulus. The effects of the uncertainty in our parameterisations on the radiation budget are also investigated. It is found that the uncertainty in the impact of horizontal inhomogeneity is of order ±60%, while the uncertainty in the impact of vertical overlap is much smaller. This suggests an insensitivity of the radiation budget to the exact nature of the global decorrelation height distribution derived in Part One.
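The exponential-random overlap rule mentioned above is usually formulated as a blend of maximum and random overlap for vertically continuous cloud, weighted by an overlap parameter that decays exponentially with layer separation. A minimal sketch of that standard formulation follows; the decorrelation height value and layer inputs are illustrative assumptions, not values from the paper:

```python
# Sketch: combined cloud cover of two vertically continuous cloud layers
# under "exponential-random" overlap (blend of maximum and random overlap).
import math

def combined_cover(c1, c2, dz, z_decorr=2000.0):
    """Combined cover of two layers with cloud fractions c1, c2 (0-1),
    separated by dz metres; z_decorr is the decorrelation height."""
    alpha = math.exp(-dz / z_decorr)        # overlap parameter: 1 = maximum, 0 = random
    c_max = max(c1, c2)                     # maximum-overlap limit
    c_rand = c1 + c2 - c1 * c2              # random-overlap limit
    return alpha * c_max + (1.0 - alpha) * c_rand

# Closely spaced layers overlap nearly maximally; distant ones nearly randomly.
print(combined_cover(0.4, 0.5, dz=200.0))   # close to max(c1, c2) = 0.5
print(combined_cover(0.4, 0.5, dz=8000.0))  # close to 0.4 + 0.5 - 0.2 = 0.7
```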
Abstract:
Variations in the Atlantic Meridional Overturning Circulation (MOC) exert an important influence on climate, particularly on decadal time scales. Simulation of the MOC in coupled climate models is compromised, to a degree that is unknown, by their lack of fidelity in resolving some of the key processes involved. There is an overarching need to increase the resolution and fidelity of climate models, but also to assess how increases in resolution influence the simulation of key phenomena such as the MOC. In this study we investigate the impact of significantly increasing the (ocean and atmosphere) resolution of a coupled climate model on the simulation of MOC variability by comparing high- and low-resolution versions of the same model. In both versions, decadal variability of the MOC is closely linked to density anomalies that propagate from the Labrador Sea southward along the deep western boundary. We demonstrate that the MOC adjustment proceeds more rapidly in the higher resolution model due to the increased speed of western boundary waves. However, the response of Atlantic sea surface temperatures (SSTs) to MOC variations is relatively robust - in pattern if not in magnitude - across the two resolutions. The MOC also excites a coupled ocean-atmosphere response in the tropical Atlantic in both model versions. In the higher resolution model, but not the lower resolution model, there is evidence of a significant response in the extratropical atmosphere over the North Atlantic 6 years after a maximum in the MOC. In both models there is evidence of a weak negative feedback on deep density anomalies in the Labrador Sea, and hence on the MOC (with a time scale of approximately ten years). Our results highlight the need for further work to understand the decadal variability of the MOC and its simulation in climate models.
Abstract:
This paper raises methodological issues about the challenges and dilemmas of inclusive research practices, reflecting on the work of an advisory group carrying out research on using Information and Communication Technology (ICT) to enhance community participation. The interests of three parties can be identified – the commissioning agent, the researchers and the researched – and these interplayed throughout the course of the research, determining its outcomes. Given the relationship between inclusive research and advocacy, there were particular gains in enabling the voice of the advisory group to shape the way in which the research was conducted and to disseminate the findings both within the organisation and beyond. However, the fragility of these new structures meant that organisational changes were required for the research to be truly empowering. The experience of the group suggests the need for their involvement at all stages of the research process.
Abstract:
This paper presents an approximate closed-form sample size formula for determining non-inferiority in active-control trials with binary data. We use the odds ratio as the measure of the relative treatment effect, derive the sample size formula based on the score test and compare it with a second, well-known formula based on the Wald test. Both closed-form formulae are compared with simulations based on the likelihood ratio test. Within the range of parameter values investigated, the score test closed-form formula is reasonably accurate when non-inferiority margins are based on odds ratios of about 0.5 or above and when the magnitude of the odds ratio under the alternative hypothesis lies between about 1 and 2.5. The accuracy generally decreases as the odds ratio under the alternative hypothesis moves upwards from 1. As the non-inferiority margin odds ratio decreases from 0.5, the score test closed-form formula increasingly overestimates the sample size irrespective of the magnitude of the odds ratio under the alternative hypothesis. The Wald test closed-form formula is also reasonably accurate in the cases where the score test closed-form formula works well. Outside these scenarios, the Wald test closed-form formula can either underestimate or overestimate the sample size, depending on the magnitude of the non-inferiority margin odds ratio and the odds ratio under the alternative hypothesis. Although neither approximation is accurate for all cases, both approaches lead to satisfactory sample size calculation for non-inferiority trials with binary data where the odds ratio is the parameter of interest.
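The abstract does not reproduce either formula, but the Wald-type version on the log odds-ratio scale is standard and gives a feel for the calculation. A minimal sketch follows; this is the textbook Wald formula, not necessarily the exact expression derived in the paper, and the example margin, event rates and error levels are illustrative assumptions:

```python
# Sketch: per-arm sample size for a non-inferiority test on the log
# odds-ratio scale, using the standard Wald-type approximation.
import math
from scipy.stats import norm

def n_per_arm(p_ctrl, p_trt, or_margin, alpha=0.025, power=0.9):
    """p_ctrl, p_trt: anticipated event rates; or_margin: non-inferiority
    margin expressed as an odds ratio (< 1 here, as in the abstract)."""
    or_alt = (p_trt / (1 - p_trt)) / (p_ctrl / (1 - p_ctrl))  # OR under H1
    z_a, z_b = norm.ppf(1 - alpha), norm.ppf(power)
    # The variance of the log odds-ratio estimate is approximately the sum
    # of 1/(n p (1-p)) terms across the two arms.
    var_terms = 1 / (p_ctrl * (1 - p_ctrl)) + 1 / (p_trt * (1 - p_trt))
    delta = math.log(or_alt) - math.log(or_margin)
    return math.ceil((z_a + z_b) ** 2 * var_terms / delta ** 2)

# Example: 60% response in both arms (OR under H1 = 1), margin OR = 0.5.
print(n_per_arm(p_ctrl=0.6, p_trt=0.6, or_margin=0.5))  # about 183 per arm
```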
Abstract:
Background: There is general agreement across all interested parties that a process of working together is the best way to determine which school or educational setting is right for an individual child with autism spectrum disorder. In the UK, families and local authorities both desire a constructive working relationship and see this as the best means by which to reach an agreement to determine where a child should be educated. However, it has been shown in published work (Batten and colleagues, Make schools make sense. Autism and education: the reality for families today; London: The National Autistic Society, 2006) that a constructive working relationship is not always achieved. Purpose: This small-scale study aims to explore the views of both parents and local authorities, focussing on how both parties perceive and experience the process of determining educational provision for children with autism spectrum disorders (ASD) within an English context. Sample, design and method: Parental opinion was gathered through a questionnaire with closed and open responses. The questionnaire was distributed via two national charities, two local charities and 16 specialist schools, which offered it to parents of children with ASD, resulting in an opportunity sample of 738 returned surveys. The views of local authority personnel from five local authorities were gathered through semi-structured interviews. Data analyses included quantitative analysis of the closed-response questionnaire items, and theme-based qualitative analysis of the open responses and interviews with local authority personnel. Results: In the majority of cases, parents in the survey obtained their first-choice placement for their child. Despite this positive outcome, survey data indicated that parents found the process bureaucratic, stressful and time consuming. Parents tended to perceive alternative placement suggestions as financially motivated rather than in the best interests of the child. Interviews with local authority personnel showed an awareness of these concerns and of the complex considerations involved in determining what is best for an individual child. Conclusions: This small-scale study highlights the need for more effective communication between parents of children with ASD and local authority personnel at all stages of the process.
Abstract:
This essay traces the development of Otto Neurath’s ideas that led to the publication of one of the first series of children’s books produced by the Isotype Institute in the late 1940s, the Visual History of Mankind. Described in its publicity material as ‘new in content’ and ‘new in method’, it embodied much of Otto Neurath’s thinking about visual education, and also coincided with other educational ideas in the UK in the 1930s and 1940s. It exemplified the Isotype Institute’s approach: teamwork, thinking about the needs of younger readers, clear explanation, and accessible content. Further, drawing on correspondence, notes and drawings from the Otto and Marie Neurath Isotype Collection at the University of Reading, the essay presents insights into the making of the books and the people involved, the costs of production and their influence on design decisions, and how the books were received by teachers and children.
Abstract:
We characterize the essential spectra of Toeplitz operators $T_a$ on weighted Bergman spaces with matrix-valued symbols; in particular we deal with two classes of symbols, the Douglas algebra $C + H^\infty$ and the Zhu class $Q := L^\infty \cap \mathrm{VMO}_\partial$. In addition, for symbols in $C + H^\infty$, we derive a formula for the index of $T_a$ in terms of its symbol $a$ in the scalar-valued case, while in the matrix-valued case we indicate that the standard reduction to the scalar-valued case fails to work analogously to the Hardy space case. Mathematics subject classification (2010): 47B35,
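For orientation, the classical Hardy-space prototype of such an index formula (due to Douglas) expresses the index through the winding number of the symbol near the boundary; the paper derives a weighted-Bergman analogue, which the abstract does not reproduce. A statement of the classical version, included here only as background:

```latex
% Classical Hardy-space prototype: for a symbol $a \in C + H^\infty$ whose
% harmonic extension $\tilde{a}$ is bounded away from zero near the
% boundary, $T_a$ is Fredholm and
\operatorname{ind} T_a \;=\; -\lim_{r \to 1^-} \operatorname{wind}(a_r),
\qquad a_r(e^{i\theta}) := \tilde{a}(r e^{i\theta}),
% where $\operatorname{wind}$ denotes the winding number about the origin.
```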
Abstract:
We have optimised the atmospheric radiation algorithm of the FAMOUS climate model on several hardware platforms. The optimisation involved translating the Fortran code to C and restructuring the algorithm around the computation of a single air column. Instead of the existing MPI-based domain decomposition, we used a task queue and a thread pool to schedule the computation of individual columns on the available processors. Finally, four air columns are packed together in a single data structure and computed simultaneously using Single Instruction Multiple Data operations. The modified algorithm runs more than 50 times faster on the CELL’s Synergistic Processing Elements than on its main PowerPC processing element. On Intel-compatible processors, the new radiation code runs 4 times faster. On the tested graphics processor, using OpenCL, we find a speed-up of more than 2.5 times as compared to the original code on the main CPU. Because the radiation code takes more than 60% of the total CPU time, FAMOUS executes more than twice as fast. Our version of the algorithm returns bit-wise identical results, which demonstrates the robustness of our approach. We estimate that this project required around two and a half man-years of work.
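The restructuring described here, replacing domain decomposition with per-column tasks drawn from a queue by a pool of workers, is easy to sketch in outline. The fragment below illustrates the scheduling idea only, in Python rather than the paper's C, with hypothetical names and sizes throughout; the real code packs four columns per task for hand-written SIMD, which the vectorised numpy operation merely approximates:

```python
# Sketch: schedule independent air-column radiation computations on a
# thread pool instead of an MPI domain decomposition.
from concurrent.futures import ThreadPoolExecutor
import numpy as np

N_COLUMNS, N_LEVELS, PACK = 1024, 38, 4   # illustrative sizes; PACK mimics
                                          # the paper's 4-column SIMD packing

def radiate_packed(columns):
    """Compute radiative fluxes for a pack of four columns at once; this
    placeholder stands in for the actual radiation physics."""
    return columns * 0.5                  # placeholder computation

atmosphere = np.random.rand(N_COLUMNS, N_LEVELS)
packs = [atmosphere[i:i + PACK] for i in range(0, N_COLUMNS, PACK)]

# The executor's internal work queue plays the role of the task queue:
# idle workers pull the next pack of columns as soon as they finish one.
with ThreadPoolExecutor(max_workers=8) as pool:
    fluxes = list(pool.map(radiate_packed, packs))
```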
Abstract:
Let $\lambda_1,\dots,\lambda_n$ be real numbers in $(0,1)$ and $p_1,\dots,p_n$ be points in $\mathbb{R}^d$. Consider the collection of maps $f_j : \mathbb{R}^d \to \mathbb{R}^d$ given by $f_j(x) = \lambda_j x + (1-\lambda_j) p_j$. It is a well known result that there exists a unique nonempty compact set $\Lambda \subset \mathbb{R}^d$ satisfying $\Lambda = \bigcup_{j=1}^{n} f_j(\Lambda)$. Each $x \in \Lambda$ has at least one coding, that is a sequence $(\epsilon_i)_{i=1}^{\infty} \in \{1,\dots,n\}^{\mathbb{N}}$ that satisfies $\lim_{N\to\infty} f_{\epsilon_1} \cdots f_{\epsilon_N}(0) = x$. We study the size and complexity of the set of codings of a generic $x \in \Lambda$ when $\Lambda$ has positive Lebesgue measure. In particular, we show that under certain natural conditions almost every $x \in \Lambda$ has a continuum of codings. We also show that almost every $x \in \Lambda$ has a universal coding. Our work makes no assumptions on the existence of holes in $\Lambda$ and improves upon existing results when it is assumed $\Lambda$ contains no holes.
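A one-dimensional example makes these objects concrete; this is a standard special case, not taken from the paper:

```latex
% Take d = 1, n = 2, a common contraction ratio \lambda \in (1/2, 1), and
%   f_1(x) = \lambda x, \qquad f_2(x) = \lambda x + (1 - \lambda).
% Then \Lambda = [0,1], which has positive Lebesgue measure, and the coding
% (\epsilon_i) of a point x unrolls to
x \;=\; (1-\lambda) \sum_{i=1}^{\infty} \lambda^{\,i-1}\,
        \mathbf{1}\{\epsilon_i = 2\},
% i.e. an expansion of x in the non-integer base 1/\lambda. Because the two
% images f_1([0,1]) and f_2([0,1]) overlap, a typical x admits many such
% expansions; results of this flavour (Sidorov) show that almost every
% x \in [0,1] has a continuum of codings.
```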