87 results for Time Based Media
Abstract:
Objective: To test the feasibility of an evidence-based clinical literature search service to help answer general practitioners' (GPs') clinical questions. Design: Two search services supplied GPs who submitted questions with the best available empirical evidence to answer them. The GPs provided feedback on the value of the service, and concordance of answers from the two search services was assessed. Setting: Two literature search services (Queensland and Victoria), operating for nine months from February 1999. Main outcome measures: Use of the service; time taken to locate answers; availability of evidence; value of the service to GPs; and consistency of answers from the two services. Results: 58 GPs asked 160 questions (29 asked one, 11 asked five or more). The questions concerned treatment (65%), aetiology (17%), prognosis (13%), and diagnosis (5%). Answering a question took a mean of 3 hours 32 minutes of personnel time (95% CI, 2.67-3.97 hours); nine questions took longer than 10 hours each to answer, the longest taking 23 hours 30 minutes. Evidence of suitable quality to provide a sound answer was available for 126 (79%) questions. Feedback data for 84 (53%) questions, provided by 42 GPs, showed that they appreciated the service and that asking the questions changed clinical care. There were many minor differences between the answers from the two centres, and substantial differences in the evidence found for 4/14 questions. However, the conclusions reached were largely similar, with no or only minor differences for all questions. Conclusions: It is feasible to provide a literature search service, but further assessment is needed to establish its cost effectiveness.
Abstract:
General practitioners wanting to practise evidence-based medicine (EBM) are constrained by time factors and the great diversity of clinical problems they deal with. They need experience in knowing what questions to ask, in locating and evaluating the evidence, and in applying it. Conventional searching for the best evidence can be achieved in daily general practice. Sometimes the search can be performed during the consultation, but more often it can be done later and the patient can return for the result. Case-based journal clubs provide a supportive environment for GPs to work together to find the best evidence at regular meetings. An evidence-based literature search service is being piloted to enhance decision-making for individual patients. A central facility provides the search and interprets the evidence in relation to individual cases. A request form and a results format make the service akin to pathology testing or imaging. Using EBM in general practice appears feasible. Major difficulties still exist before it can be practised by all GPs, but it has the potential to change the way doctors update their knowledge.
Abstract:
Incremental parsing has long been recognized as a technique of great utility in the construction of language-based editors, and correspondingly, the area currently enjoys a mature theory. Unfortunately, many practical considerations have been largely overlooked in previously published algorithms. Many user requirements for an editing system necessarily impact on the design of its incremental parser, but most approaches focus only on one: response time. This paper details an incremental parser based on LR parsing techniques and designed for use in a modeless syntax recognition editor. The nature of this editor places significant demands on the structure and quality of the document representation it uses, and hence, on the parser. The strategy presented here is novel in that both the parser and the representation it constructs are tolerant of the inevitable and frequent syntax errors that arise during editing. This is achieved by a method that differs from conventional error repair techniques, and that is more appropriate for use in an interactive context. Furthermore, the parser aims to minimize disturbance to this representation, not only to ensure other system components can operate incrementally, but also to avoid unfortunate consequences for certain user-oriented services. The algorithm is augmented with a limited form of predictive tree-building, and a technique is presented for the determination of valid symbols for menu-based insertion. Copyright (C) 2001 John Wiley & Sons, Ltd.
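The abstract describes, at a high level, an LR-based incremental parser that reuses as much of the previous parse tree as possible after each edit. The paper's actual algorithm is not reproduced here; the toy sketch below illustrates only the underlying reuse idea, using a memoizing parser for the made-up grammar expr := NUM ('+' NUM)*, in which an in-place token replacement invalidates just the memoized subtrees it touches.

```python
# Toy sketch of subtree reuse across edits (not the paper's LR algorithm):
# a memoizing parser for  expr := NUM ('+' NUM)*  whose memo table survives
# an in-place token replacement; only entries spanning the edit are dropped.

class MemoParser:
    def __init__(self, tokens):
        self.tokens = list(tokens)
        self.memo = {}  # start index -> (node, end index)

    def edit(self, i, new_token):
        """Replace the token at index i, invalidating only affected entries."""
        self.tokens[i] = new_token
        self.memo = {s: (node, end) for s, (node, end) in self.memo.items()
                     if not (s <= i < end)}

    def term(self, pos):
        if pos in self.memo:                       # reuse an unaffected subtree
            return self.memo[pos]
        node, end = ("NUM", self.tokens[pos], pos), pos + 1
        self.memo[pos] = (node, end)
        return node, end

    def expr(self):
        node, pos = self.term(0)
        parts = [node]
        while pos < len(self.tokens) and self.tokens[pos] == "+":
            node, pos = self.term(pos + 1)
            parts.append(node)
        return ("SUM", parts)

p = MemoParser(["1", "+", "2", "+", "3"])
p.expr()            # first full parse fills the memo
p.edit(2, "9")      # only the subtree over token 2 is invalidated
print(p.expr())     # terms at tokens 0 and 4 come straight from the memo
```

A real incremental LR parser must additionally handle insertions and deletions that shift token positions, and (as the abstract stresses) tolerate syntax errors mid-edit; this sketch shows only the invalidate-and-reuse bookkeeping.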
Abstract:
A new method is presented to determine an accurate eigendecomposition of difficult low-temperature unimolecular master equation problems. Based on a generalisation of the Nesbet method, the new method is capable of achieving complete spectral resolution of the master equation matrix with relative accuracy in the eigenvectors. The method is applied to a test case of the decomposition of ethane at 300 K from a microcanonical initial population, with energy transfer modelled by both Ergodic Collision Theory and the exponential-down model. It is demonstrated that quadruple precision (16-byte) arithmetic is required irrespective of the eigensolution method used. (C) 2001 Elsevier Science B.V. All rights reserved.
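As background to the precision claim: IEEE-754 quadruple precision (binary128, 16 bytes) carries a 113-bit significand, roughly 34 decimal digits, versus 53 bits for double precision. A minimal sketch of how one might probe such precision requirements in Python, using mpmath's arbitrary-precision eigensolver rather than the authors' generalised Nesbet solver, and a hypothetical 2x2 test matrix of my own choosing:

```python
# Background sketch only: probing precision requirements with mpmath's
# arbitrary-precision eigensolver, not the paper's generalised Nesbet method.
# The 2x2 test matrix with widely separated rate scales is hypothetical.
import mpmath as mp

rows = [[-1e-18, 1e-2],
        [1e-18, -1e-2]]   # exact eigenvalues: 0 and -(1e-18 + 1e-2)

for prec in (53, 113):    # double precision vs IEEE binary128 significand
    mp.mp.prec = prec
    E, ER = mp.eig(mp.matrix(rows))
    print(prec, [mp.nstr(ev, 10) for ev in E])
```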
Abstract:
In this study we present a novel automated strategy for predicting infarct evolution, based on MR diffusion and perfusion images acquired in the acute stage of stroke. The validity of this methodology was tested on novel patient data including data acquired from an independent stroke clinic. Regions-of-interest (ROIs) defining the initial diffusion lesion and tissue with abnormal hemodynamic function as defined by the mean transit time (MTT) abnormality were automatically extracted from DWI/PI maps. Quantitative measures of cerebral blood flow (CBF) and volume (CBV) along with ratio measures defined relative to the contralateral hemisphere (r(a)CBF and r(a)CBV) were calculated for the MTT ROIs. A parametric normal classifier algorithm incorporating these measures was used to predict infarct growth. The mean r(a)CBF and r(a)CBV values for eventually infarcted MTT tissue were 0.70 +/-0.19 and 1.20 +/-0.36. For recovered tissue the mean values were 0.99 +/-0.25 and 1.87 +/-0.71, respectively. There was a significant difference between these two regions for both measures (P
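The abstract reports class-conditional means and standard deviations for r(a)CBF and r(a)CBV in eventually infarcted versus recovered MTT tissue, which is enough to sketch a two-class Gaussian ("parametric normal") classifier. The sketch below assumes feature independence and equal class priors, details the abstract does not specify:

```python
# Sketch of a two-class parametric normal classifier built from the
# abstract's reported means/SDs; feature independence and equal class
# priors are assumptions, not details taken from the paper.
import math

STATS = {                      # class -> {feature: (mean, sd)}
    "infarcted": {"raCBF": (0.70, 0.19), "raCBV": (1.20, 0.36)},
    "recovered": {"raCBF": (0.99, 0.25), "raCBV": (1.87, 0.71)},
}

def log_pdf(x, mean, sd):
    """Log of the univariate normal density."""
    return -0.5 * ((x - mean) / sd) ** 2 - math.log(sd * math.sqrt(2 * math.pi))

def classify(raCBF, raCBV):
    scores = {
        cls: log_pdf(raCBF, *p["raCBF"]) + log_pdf(raCBV, *p["raCBV"])
        for cls, p in STATS.items()
    }
    return max(scores, key=scores.get)

print(classify(0.72, 1.25))   # near the infarcted means -> "infarcted"
print(classify(1.00, 1.90))   # near the recovered means -> "recovered"
```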
Abstract:
Objective: To determine the risk of conductive hearing loss in preterm infants with bronchopulmonary dysplasia (BPD) and preterm controls. Methodology: The study population consisted of 78 infants with BPD of 26-33 weeks gestation and 78 controls of similar gestational age matched for broad-based birthweight categories. Auditory brainstem response (ABR) audiometry was performed shortly before hospital discharge. Visual reinforcement orientation audiometry (VROA) and impedance audiometry were performed at 8-12 months corrected for prematurity. Infants with persistent audiological abnormalities were referred for evaluation to paediatric ENT surgeons. Results: Infants with BPD had a significantly higher rate of ABR abnormalities (BPD: 22%, controls: 9%; P = 0.028). On VROA and impedance audiometry, the infants with BPD also had a higher rate of persistent abnormalities. Following ENT assessment, 22.1% of infants with BPD and 7.7% of controls had persistent conductive dysfunction requiring myringotomy and grommet tube insertion (P = 0.03). Most of these infants had normal ABR audiometry at hospital discharge. Conclusions: Preterm infants with BPD are at high risk of persistent conductive hearing loss late in the first year of life compared to controls. ABR audiometry conducted at the time of hospital discharge does not accurately predict later conductive hearing problems. Infants with BPD should have routine audiological evaluation toward the end of the first year of life.
Abstract:
Penalizing line management for the occurrence of lost time injuries has in some cases had unintended negative consequences, which are discussed. An alternative system is suggested that penalizes line management for accidents where the combination of the probability of recurrence and the maximum reasonable consequences of such a recurrence exceeds an agreed limit. A reward is given for prompt, effective control of the risk to below the agreed risk limit. The reward is smaller than the penalty. High-risk accidents require independent investigation by a safety officer using analytical techniques. Two case examples are given to illustrate the system. Continuous safety improvement is driven by a planned reduction in the agreed risk limit over time and by rewards for proactive risk assessment and control.
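Read as a decision rule, the scheme compares a risk score (probability of recurrence combined with maximum reasonable consequence) to an agreed limit, penalizing when the limit is exceeded and rewarding prompt control back below it, with the reward deliberately smaller than the penalty. A minimal sketch in which all numbers are hypothetical:

```python
# Hypothetical sketch of the described penalty/reward rule; the limit,
# penalty, and reward values are illustrative, not taken from the paper.
RISK_LIMIT = 10.0
PENALTY = -100        # applied when post-accident risk exceeds the limit
REWARD = 40           # deliberately smaller than the penalty

def risk_score(p_recurrence, max_consequence):
    """Combine probability of recurrence with maximum reasonable consequence."""
    return p_recurrence * max_consequence

def outcome(p_before, consequence, p_after_controls):
    if risk_score(p_before, consequence) <= RISK_LIMIT:
        return 0                                  # below the agreed limit
    if risk_score(p_after_controls, consequence) <= RISK_LIMIT:
        return PENALTY + REWARD                   # penalized, partly offset
    return PENALTY                                # risk not brought under control

print(outcome(0.5, 50, 0.1))   # high-risk accident, promptly controlled: -60
```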
Abstract:
The detection of Neisseria gonorrhoeae by the polymerase chain reaction (PCR) is now recognized as a sensitive and specific method of diagnosing infection by the organism. In this study, 152 urine specimens were examined for N. gonorrhoeae by a real-time PCR method using the LightCycler platform, and results were compared to an in-house PCR assay using an ELISA-based detection method. N. gonorrhoeae DNA was detected in 29 (19%) specimens by LightCycler PCR (LC-PCR) and in 31 (20%) specimens by the in-house PCR method. The LightCycler assay proved to be specific and 94% sensitive when compared to the in-house PCR method. These features, combined with the rapid turn-around time for results, make the LC-PCR particularly suitable for the detection of N. gonorrhoeae in a routine clinical laboratory. (C) 2002 Elsevier Science Inc. All rights reserved.
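For the arithmetic behind the 94% figure: treating the in-house PCR as the reference standard, sensitivity is the fraction of its 31 positives that LC-PCR also detected, assuming the 29 LC-PCR positives all fall within the 31 (the abstract does not state this explicitly):

```python
# Arithmetic behind the reported sensitivity, using the in-house PCR as the
# reference standard; assumes the 29 LC-PCR positives are a subset of the
# 31 in-house positives.
true_pos, reference_pos = 29, 31
sensitivity = true_pos / reference_pos
print(f"sensitivity = {true_pos}/{reference_pos} = {sensitivity:.1%}")  # 93.5%, reported as 94%
```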
Abstract:
Despite the social and (increasingly) commercial significance of sport and sporting bodies worldwide, they remain under-represented in the mainstream management literature. One of the more recent and dramatic examples of the global sports-media nexus is the 'Super League saga' in Australia. This paper recounts the tale of the Super League saga, providing a holistic analysis of the events and competitive issues arising by drawing on literatures concerning the economic nature and value of sports leagues, the resource-based view of the firm, and the nature of psychological contracts in changing environments. The analysis confirms the general monopolistic tendencies of professional sports leagues in an increasingly global industry driven by the sports-media nexus, in accord with a number of comparable cases internationally. The particular conditions of the Australian marketplace that exacerbate this tendency beyond, for example, that found in the USA, and differences in the outcomes of battles between rival leagues are also considered. The Super League saga portrays the importance of effective management of resources key to the production of the 'rugby league product', including, among others, the often overlooked importance of careful management of local resources for the success of global strategies and, where human resources are key, the importance of psychological contracting. The holistic analysis of the Super League saga in Australia affords lessons that extend well beyond the realm of sports.
Abstract:
Previous studies have shown a significant effect of insulin administration on serum dehydroepiandrosterone sulfate (DHEA-S) concentration and its metabolic rate, with evidence for the effect in men, but not in women. This could lead to differences in the sources of variation in serum DHEA-S between men and women and in its covariation with insulin concentration. This study aimed to test whether these hypotheses were supported in a sample of healthy adult twins. Serum DHEA-S (n=2287) and plasma insulin (n=2436) were measured in samples from adult male and female twins recruited through the Australian Twin Registry. Models of genetic and environmental sources of variation and covariation were tested against the data. DHEA-S showed substantial genetic effects in both men and women after adjustment for covariates, including sex, age, body mass index, and time since the last meal. There was no significant phenotypic or genetic correlation between DHEA-S and insulin in either men or women. Despite the experimental evidence for insulin infusion producing a reduction in serum DHEA-S and some effect of meals on the observed DHEA-S concentration, there were no associations between insulin and DHEA-S at the population level. Variations in DHEA-S are due to age, sex, obesity, and substantial polygenic genetic influences.
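As background on how twin designs separate "genetic and environmental sources of variation": the classical Falconer approximations below relate variance shares to the within-pair trait correlations of monozygotic (r_MZ) and dizygotic (r_DZ) twins. They are shown only as orientation; this study fitted formal variance-component models to the raw data rather than using these shortcut formulas.

```latex
% Classical Falconer approximations for twin data (background only; not
% this study's fitted estimates). r_MZ, r_DZ: within-pair correlations.
h^2 = 2\,(r_{MZ} - r_{DZ})   % additive genetic share of variance
c^2 = 2\,r_{DZ} - r_{MZ}     % shared-environment share
e^2 = 1 - r_{MZ}             % unique-environment share
```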
Abstract:
The majority of the world's population now resides in urban environments, and information on the internal composition and dynamics of these environments is essential to enable preservation of certain standards of living. Remotely sensed data, especially the global coverage of moderate spatial resolution satellites such as Landsat, the Indian Resource Satellite and Systeme Pour l'Observation de la Terre (SPOT), offer a highly useful data source for mapping the composition of these cities and examining their changes over time. The utility and range of applications for remotely sensed data in urban environments could be improved with a more appropriate conceptual model relating urban environments to the sampling resolutions of imaging sensors and processing routines. Hence, the aim of this work was to take the Vegetation-Impervious surface-Soil (VIS) model of urban composition and match it with the most appropriate image processing methodology to deliver information on VIS composition for urban environments. Several approaches were evaluated for mapping the urban composition of Brisbane city (south-east Queensland, Australia) using Landsat 5 Thematic Mapper data and 1:5000 aerial photographs. The methods evaluated were: image classification; interpretation of aerial photographs; and constrained linear mixture analysis. Over 900 reference sample points on four transects were extracted from the aerial photographs and used as a basis to check the output of the classification and mixture analysis. Distinctive zonations of VIS related to urban composition were found in the per-pixel classification and aggregated air-photo interpretation; however, significant spectral confusion also resulted between classes. In contrast, the VIS fraction images produced from the mixture analysis enabled distinctive densities of commercial, industrial and residential zones within the city to be clearly defined, based on their relative amounts of vegetation cover. The soil fraction image served as an index for areas being (re)developed. The logical match of a low (L)-resolution spectral mixture analysis approach with the moderate spatial resolution image data ensured that the processing model matched the spectrally heterogeneous nature of the urban environments at the scale of Landsat Thematic Mapper data.
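Constrained linear spectral mixture analysis models each pixel's spectrum as a non-negative, sum-to-one combination of endmember spectra (here vegetation, impervious surface, and soil). A minimal sketch using the standard trick of appending a weighted sum-to-one row and solving with non-negative least squares; the endmember and pixel values below are invented for illustration, not taken from the study:

```python
# Sketch of fully constrained linear unmixing (non-negativity + sum-to-one)
# via NNLS with an appended sum-to-one row; endmember spectra and the pixel
# are hypothetical, not values from the study.
import numpy as np
from scipy.optimize import nnls

# Columns: vegetation, impervious, soil; rows: spectral bands (hypothetical).
E = np.array([[0.05, 0.20, 0.15],
              [0.45, 0.25, 0.30],
              [0.30, 0.35, 0.45],
              [0.10, 0.30, 0.40]])
pixel = np.array([0.12, 0.33, 0.36, 0.27])   # observed reflectances

w = 100.0  # weight that softly enforces the sum-to-one constraint
A = np.vstack([E, w * np.ones(3)])
b = np.append(pixel, w)
fractions, _ = nnls(A, b)
print(dict(zip(["vegetation", "impervious", "soil"], fractions.round(3))))
```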
Abstract:
Program compilation can be formally defined as a sequence of equivalence-preserving transformations, or refinements, from high-level language programs to assembler code. Recent models also incorporate timing properties, but the resulting formalisms are intimidatingly complex. Here we take advantage of a new, simple model of real-time refinement, based on predicate transformer semantics, to present a straightforward compilation formalism that incorporates real-time constraints. (C) 2002 Elsevier Science B.V. All rights reserved.
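For context, predicate transformer semantics assigns to each statement S a function wp(S, ·) mapping a desired postcondition to the weakest precondition that guarantees it. The classical untimed rules are standard background; the paper's real-time model extends this style of semantics with timing constraints, the details of which are not given in the abstract:

```latex
% Classical weakest-precondition rules (untimed background; the paper's
% model layers real-time constraints on top of semantics in this style):
wp(x := e,\; Q) \;=\; Q[e/x]
wp(S_1 ; S_2,\; Q) \;=\; wp(S_1,\; wp(S_2,\; Q))
wp(\textbf{if}\ b\ \textbf{then}\ S_1\ \textbf{else}\ S_2,\; Q)
   \;=\; (b \Rightarrow wp(S_1, Q)) \wedge (\neg b \Rightarrow wp(S_2, Q))
```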
Abstract:
Animal-based theories of Pavlovian conditioning propose that patterning discriminations are solved using unique cues or immediate configuring. Recent studies with humans, however, provided evidence that in positive and negative patterning two different rules are utilized. The present experiment was designed to provide further support for this proposal by tracking the time course of the allocation of cognitive resources. One group was trained on a positive patterning schedule (A-, B-, AB+) and a second on a negative patterning schedule (A+, B+, AB-). Electrodermal responses and secondary task probe reaction times were measured. In negative patterning, reaction times were slower during reinforced stimuli than during non-reinforced stimuli at both probe positions, while there were no differences in positive patterning. These results support the assumption that negative patterning is solved using a rule that is more complex and requires more resources than does the rule employed to solve positive patterning. (C) 2001 Elsevier Science (USA).
Abstract:
We introduce a model of computation based on read-only memory (ROM), which allows us to compare the space-efficiency of reversible, error-free classical computation with reversible, error-free quantum computation. We show that a ROM-based quantum computer with one writable qubit is universal, whilst two writable bits are required for a universal classical ROM-based computer. We also comment on the time-efficiency advantages of quantum computation within this model.
Abstract:
The study of viral-based processes is hampered by (a) their complex, transient nature, (b) the instability of products, and (c) the lack of accurate diagnostic assays. Here, we describe the use of real-time quantitative polymerase chain reaction to characterize baculoviral infection. Baculovirus DNA content doubles every 1.7 h from 6 h post-infection until replication is halted at the onset of budding. No dynamic equilibrium exists between replication and release, and the kinetics are independent of the cell density at the time of infection. No more than 16% of the intracellular virus copies bud from the cell. (C) 2002 John Wiley & Sons, Inc. Biotechnol Bioeng 77: 476-480, 2002; DOI 10.1002/bit.10126.
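The reported kinetics imply a simple exponential model for intracellular viral DNA between 6 h post-infection and the onset of budding. A minimal sketch of that reading, where the copy number at 6 h (n0) is a hypothetical placeholder:

```python
# Exponential-growth reading of the reported kinetics: DNA content doubles
# every 1.7 h from 6 h post-infection. n0 (copies at 6 h) is hypothetical.
def dna_copies(t_hours, n0=1.0, t_start=6.0, doubling=1.7):
    """Relative intracellular baculovirus DNA copies at t_hours post-infection."""
    if t_hours <= t_start:
        return n0
    return n0 * 2 ** ((t_hours - t_start) / doubling)

print(dna_copies(12.8))   # ~16x the 6 h level after four doubling times
```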