925 results for citation
Abstract:
Recombinant baculoviruses have established themselves as a favoured technology for the high-level expression of recombinant proteins. The construction of recombinant viruses, however, is a time-consuming step that restricts consideration of the technology for high-throughput developments. Here we use a targeted gene knockout technology to inactivate an essential viral gene that lies adjacent to the locus used for recombination. Viral DNA prepared from the knockout fails to initiate an infection unless rescued by recombination with a baculovirus transfer vector. The modified viral DNA allows 100% recombinant virus formation, obviates the need for further virus purification and offers an efficient means of mass parallel recombinant virus formation.
Abstract:
Coronary artery disease is one of the most common heart pathologies. Restriction of blood flow to the heart by atherosclerotic lesions, leading to angina pectoris and myocardial infarction (MI), damages the heart, resulting in impaired cardiac function. Damaged myocardium is replaced by scar tissue, since surviving cardiomyocytes are unable to proliferate to replace lost heart tissue. Although narrowing of the coronary arteries can be treated successfully using coronary revascularisation procedures, re-occlusion of the treated vessels remains a significant clinical problem. Cell cycle control mechanisms are key both to the impaired cardiac repair by surviving cardiomyocytes and to the re-narrowing of treated vessels through maladaptive proliferation of vascular smooth muscle cells. Strategies targeting the cell cycle machinery in the heart and vasculature therefore offer promise both for improving cardiac repair following MI and for preventing restenosis and bypass graft failure after revascularisation procedures.
Abstract:
Stochastic Diffusion Search is an efficient probabilistic best-fit search technique capable of transformation-invariant pattern matching. Although inherently parallel in operation, it is difficult to implement efficiently in hardware because it requires full inter-agent connectivity. This paper describes a lattice implementation which, while qualitatively retaining the properties of the original algorithm, restricts connectivity, enabling simpler implementation on parallel hardware. Diffusion times are examined for different network topologies, ranging from ordered lattices through small-world networks to random graphs.
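To illustrate the restricted-connectivity idea, the sketch below (illustrative only, not the paper's implementation; the toy search string, agent count and neighbourhood size k are assumptions) runs Stochastic Diffusion Search in which an inactive agent may only poll its ring-lattice neighbours during the diffusion phase.

```python
# A minimal sketch of Stochastic Diffusion Search with lattice-restricted
# communication: during diffusion, an inactive agent polls a random ring-lattice
# neighbour rather than any agent in the population.
import random

SEARCH_SPACE = "xxxhexlxxloxhellohxxelxlo"   # toy corpus (illustrative)
MODEL = "hello"                              # pattern to locate


def partial_test(hypothesis: int) -> bool:
    """Test one randomly chosen micro-feature of the hypothesis."""
    i = random.randrange(len(MODEL))
    j = hypothesis + i
    return j < len(SEARCH_SPACE) and SEARCH_SPACE[j] == MODEL[i]


def sds_lattice(n_agents: int = 50, k: int = 2, iterations: int = 200) -> int:
    """Run SDS where agent a may only poll agents a±1..a±k on a ring lattice."""
    hyps = [random.randrange(len(SEARCH_SPACE)) for _ in range(n_agents)]
    for _ in range(iterations):
        # Test phase: each agent partially evaluates its hypothesis.
        active = [partial_test(h) for h in hyps]
        # Diffusion phase with restricted connectivity.
        for a in range(n_agents):
            if not active[a]:
                offset = random.choice([d for d in range(-k, k + 1) if d])
                neighbour = (a + offset) % n_agents
                if active[neighbour]:
                    hyps[a] = hyps[neighbour]          # copy neighbour's hypothesis
                else:
                    hyps[a] = random.randrange(len(SEARCH_SPACE))  # re-seed
    # Return the most supported hypothesis (the emergent agent cluster).
    return max(set(hyps), key=hyps.count)


if __name__ == "__main__":
    print("best match starts at index", sds_lattice())
```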
Abstract:
One of the most pervasive concepts underlying computational models of information processing in the brain is linear input integration of rate-coded univariate information by neurons. After a suitable learning process, this results in neuronal structures that statically represent knowledge as a vector of real-valued synaptic weights. Although this general framework has contributed to the many successes of connectionism, in this paper we argue that for all but the most basic cognitive processes a more complex, multivariate, dynamic neural coding mechanism is required: knowledge should not be spatially bound to a particular neuron or group of neurons. We conclude the paper with a discussion of a simple experiment that illustrates dynamic knowledge representation in a spiking neuron connectionist system.
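As a hedged illustration of what a spiking, rather than rate-coded, unit looks like (this is not the authors' experiment, and the parameter values below are textbook-style assumptions), a minimal leaky integrate-and-fire neuron in Python:

```python
# A minimal leaky integrate-and-fire neuron (illustrative only). Information is
# carried in spike timing rather than a static rate-coded activation, which is
# the kind of dynamic code the paper argues for.
import numpy as np


def lif_spike_times(input_current, dt=1e-3, tau=20e-3, v_rest=-70e-3,
                    v_thresh=-54e-3, v_reset=-80e-3, r_m=10e6):
    """Return spike times (s) of a LIF neuron driven by input_current (A)."""
    v = v_rest
    spikes = []
    for step, i_in in enumerate(input_current):
        # Membrane potential decays towards rest while integrating the input.
        v += (-(v - v_rest) + r_m * i_in) * dt / tau
        if v >= v_thresh:          # threshold crossing emits a spike
            spikes.append(step * dt)
            v = v_reset            # reset after firing
    return spikes


# A constant 2 nA input for 200 ms produces a regular spike train.
current = np.full(200, 2e-9)
print(lif_spike_times(current))
```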
Abstract:
In the earth sciences, data are commonly cast on complex grids in order to model irregular domains such as coastlines, or to evenly distribute grid points over the globe. It is common for a scientist to wish to re-cast such data onto a grid that is more amenable to manipulation, visualization, or comparison with other data sources. The complexity of the grids presents a significant technical difficulty to the regridding process. In particular, the regridding of complex grids may suffer from severe performance issues, in the worst case scaling with the product of the sizes of the source and destination grids. We present a mechanism for the fast regridding of such datasets, based upon the construction of a spatial index that allows fast searching of the source grid. We discover that the most efficient spatial index under test (in terms of memory usage and query time) is a simple look-up table. A kd-tree implementation was found to be faster to build and to give similar query performance at the expense of a larger memory footprint. Using our approach, we demonstrate that regridding of complex data may proceed at speeds sufficient to permit regridding on-the-fly in an interactive visualization application, or in a Web Map Service implementation. For large datasets with complex grids the new mechanism is shown to significantly outperform algorithms used in many scientific visualization packages.
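A minimal sketch of the indexing idea, assuming 2-D arrays of source cell-centre coordinates and using a kd-tree (scipy's cKDTree) as the spatial index; the paper's preferred look-up table could sit behind the same interface, and a production version would use great-circle rather than planar distance:

```python
# Sketch of index-based nearest-neighbour regridding: build a kd-tree over the
# source grid's cell-centre coordinates once, then map each destination point
# to its nearest source cell.
import numpy as np
from scipy.spatial import cKDTree


def regrid_nearest(src_lons, src_lats, src_data, dst_lons, dst_lats):
    """Nearest-neighbour regrid from a complex source grid to target points.

    src_lons/src_lats/src_data : 2-D arrays describing the source grid
    dst_lons/dst_lats           : 1-D or 2-D arrays of destination coordinates
    """
    # Build the spatial index once; it is reused for every destination point.
    src_points = np.column_stack([src_lons.ravel(), src_lats.ravel()])
    tree = cKDTree(src_points)

    dst_points = np.column_stack([np.ravel(dst_lons), np.ravel(dst_lats)])
    _, nearest = tree.query(dst_points)          # index of nearest source cell
    return src_data.ravel()[nearest].reshape(np.shape(dst_lons))
```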
Abstract:
This conference was an unusual and interesting event. Celebrating 25 years of Construction Management and Economics provides us with an opportunity to reflect on the research that has been reported over the years, to consider where we are now, and to think about the future of academic research in this area. Hence the sub-title of this conference: “past, present and future”.

Looking through these papers, some things are clear. First, the range of topics considered interesting has expanded hugely since the journal was first published. Second, the research methods are also more diverse. Third, the involvement of wider groups of stakeholders is evident. There is a danger that this might lead to dilution of the field. But my instinct has always been to argue against the notion that Construction Management and Economics represents a discipline, as such. Granted, there are plenty of university departments around the world that would justify the idea of a discipline. But the vast majority of academic departments who contribute to the life of this journal carry different names. Indeed, the range and breadth of methodological approaches to the research reported in Construction Management and Economics indicates that there are several different academic disciplines being brought to bear on the construction sector. Some papers are based on economics, some on psychology and others on operational research, sociology, law, statistics, information technology, and so on. This is why I maintain that construction management is not an academic discipline, but a field of study to which a range of academic disciplines are applied. This may be why it is so interesting to be involved in this journal.

The problems to which the papers are applied develop and grow. But the broad topics of the earliest papers in the journal are still relevant today. What has changed a lot is our interpretation of the problems that confront the construction sector all over the world, and the methodological approaches to resolving them. There is a constant difficulty in dealing with topics as inherently practical as these. While the demands of the academic world are driven by the need for the rigorous application of sound methods, the demands of the practical world are quite different. It can be difficult to meet the needs of both sets of stakeholders at the same time. However, increasing numbers of postgraduate courses in our area result in larger numbers of practitioners with a deeper appreciation of what research is all about, and how to interpret and apply the lessons from research. It also seems that there are contributions coming not just from construction-related university departments, but also from departments with identifiable methodological traditions of their own.

I like to think that our authors can publish in journals beyond the construction-related areas, to disseminate their theoretical insights into other disciplines, and to contribute to the strength of this journal by citing our articles in more mono-disciplinary journals. This would contribute to the future of the journal in a very strong and developmental way. The greatest danger we face is excessive self-citation, i.e. referring only to sources within the CM&E literature or, worse, referring only to other articles in the same journal. The only way to ensure a strong and influential position for journals and university departments like ours is to be sure that our work is informing other academic disciplines.
This is what I would see as the future, our logical next step. If, as a community of researchers, we are not producing papers that challenge and inform the fundamentals of research methods and analytical processes, then no matter how practically relevant our output is to the industry, it will remain derivative and secondary, based on the methodological insights of others. The balancing act between methodological rigour and practical relevance is a difficult one, but not, of course, a balance that has to be struck in every single paper.
Abstract:
This study considers the strength of the Northern Hemisphere Holton-Tan effect (HTE) in terms of the phase alignment of the quasi-biennial oscillation (QBO) with respect to the annual cycle. Using the ERA-40 Reanalysis, it is found that the early winter (Nov–Dec) and late winter (Feb–Mar) relation between QBO phase and the strength of the stratospheric polar vortex is optimized for subsets of the 44-year record that are chosen on the basis of the seasonality of QBO phase transitions at the 30 hPa level. The timing of phase transitions serves as a proxy for changes in the vertical structure of the QBO over the whole depth of the tropical stratosphere. The statistical significance of the Nov–Dec (Feb–Mar) HTE is greatest when 30 hPa QBO phase transitions occur 9–14 (4–9) months prior to the January of the NH winter in question. This suggests that there exists for both early and late winter a vertical structure of tropical stratospheric winds that is most effective at influencing the interannual variability of the polar vortex, and that an early (late) winter HTE is associated with an early (late) progression of QBO phase towards that structure. It is also shown that the seasonality of QBO phase transitions at 30 hPa varies on a decadal timescale, with transitions during the first half of the calendar year being relatively more common during the first half of the tropical radiosonde wind record. Combining these two results suggests that decadal changes in HTE strength could result from the changing seasonality of QBO phase transitions. Citation: Anstey, J. A., and T. G. Shepherd (2008), Response of the northern stratospheric polar vortex to the seasonal alignment of QBO phase transitions, Geophys. Res. Lett., 35, L22810, doi:10.1029/2008GL035721.
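A sketch, in Python, of how winters might be binned by the lag between the most recent 30 hPa QBO phase transition and January; the monthly wind series u30 and the sign-change definition of a phase transition are assumptions for illustration, not the paper's exact methodology:

```python
# Illustrative sketch (not the paper's analysis code) of classifying NH winters
# by the lag between the most recent 30 hPa QBO phase transition and January.
# `u30` is assumed to be a monthly-mean equatorial zonal wind series at 30 hPa.
import numpy as np


def transition_months(u30):
    """Indices of months where the 30 hPa zonal wind changes sign."""
    sign = np.sign(u30)
    return np.where(sign[1:] * sign[:-1] < 0)[0] + 1


def lag_before_january(u30, january_indices):
    """For each January, months elapsed since the previous phase transition."""
    trans = transition_months(u30)
    lags = []
    for jan in january_indices:
        earlier = trans[trans <= jan]
        lags.append(jan - earlier[-1] if earlier.size else np.nan)
    return np.array(lags)


# Winters whose lag falls in, e.g., the 9-14 month window could then be
# composited to test the strength of the early-winter Holton-Tan relationship.
```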
Abstract:
This article examines utopian gestures and inaugural desires in two films which became symbolic of the Brazilian Film Revival in the late 1990s: Central Station (1998) and Midnight (1999). Both revolve around the idea of an overcrowded or empty centre in a country trapped between past and future, in which the motif of the zero stands for both the announcement and the negation of utopia. The analysis draws parallels between them and new wave films which also elaborate on the idea of the zero, with examples drawn from Italian neo-realism, the Brazilian Cinema Novo and the New German Cinema. In Central Station, the ‘point zero’, or the core of the homeland, is retrieved in the archaic backlands, where political issues are resolved in the private sphere and the social drama turns into family melodrama. Midnight, in its turn, recycles Glauber Rocha’s utopian prophecies in the new millennium’s hour zero, when the earthly paradise represented by the sea is re-encountered by the middle-class character, but not by the poor migrant. In both cases, public injustice is compensated for by the heroes’ personal achievements, but these do not refer to the real nation, its history or its society. Their utopian breadth, based on nostalgia, citation and genre techniques, is of a virtual kind, attuned to cinema only.
Abstract:
Point-and-click interactions using a mouse are an integral part of computer use for current desktop systems. Compared with younger users, though, older adults experience greater difficulties performing cursor positioning tasks, and this can limit their ability to use a computer easily and effectively. Target expansion is a technique for improving pointing performance in which the target dynamically grows as the cursor approaches. This has the advantage that targets conserve screen real estate in their unexpanded state, yet can still provide the benefits of a larger area to click on. This paper presents two studies of target expansion with older and younger participants, involving multidirectional point-select tasks with a computer mouse. Study 1 compares static versus expanding targets, and Study 2 compares static targets with three alternative techniques for expansion. Results show that expansion can improve movement times by up to 14% and reduce error rates by up to 50%. Additionally, expanding targets are beneficial even when the expansion happens late in the movement, i.e. after the cursor has reached the expanded target area or even after it has reached the original target area. Participants’ subjective feedback on target expansion is generally favorable, which lends further support to the technique.
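One possible expansion rule, sketched below with illustrative parameters (the trigger point and expansion factor are assumptions, not the study's settings): the target keeps its base width until the cursor has covered a set fraction of the movement, then grows towards a maximum expansion factor.

```python
# Sketch of a late-expansion rule: the target stays at its base size until the
# cursor has covered most of the distance to it, then grows linearly up to an
# expansion factor (parameters are illustrative).
def target_width(base_width, cursor_pos, target_pos, start_pos,
                 expansion=2.0, trigger=0.75):
    """Return the displayed target width for the current cursor position."""
    total = abs(target_pos - start_pos)
    covered = abs(cursor_pos - start_pos)
    progress = min(covered / total, 1.0) if total else 1.0
    if progress < trigger:                    # far from the target: unexpanded
        return base_width
    # Grow linearly from 1x at the trigger point to `expansion` at the target.
    grown = 1.0 + (expansion - 1.0) * (progress - trigger) / (1.0 - trigger)
    return base_width * grown


# Example: a 20 px target measured 90% of the way through a 300 px movement.
print(target_width(20, cursor_pos=270, target_pos=300, start_pos=0))  # 32.0
```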
Abstract:
A growing segment of Chinese women are willing to spend a high percentage of their income on fashion-related products; however, there appears to be concern over the quality of Chinese fashion magazines. This concern centres on two major issues: i) fashion magazine design, and ii) the pictorial and textual distribution of content. This paper investigates how human factors (i.e. social norms and individual differences) influence fashion magazine design/format preferences, and examines the difference in readership patterns between British and Chinese women. Our study identifies significant differences between UK and Chinese readerships, which have an impact on magazine viewing patterns and content preferences.
Abstract:
Traditionally, the formal scientific output in most fields of natural science has been limited to peer-reviewed academic journal publications, with less attention paid to the chain of intermediate data results and their associated metadata, including provenance. In effect, this has constrained the representation and verification of data provenance to the confines of the related publications. Detailed knowledge of a dataset’s provenance is essential to establish the pedigree of the data for its effective re-use, and to avoid redundant re-enactment of the experiment or computation involved. Determining the authenticity and quality of open-access data is increasingly important, especially considering the growing volumes of datasets appearing in the public domain. To address these issues, we present an approach that combines the Digital Object Identifier (DOI) – a widely adopted citation technique – with existing, widely adopted climate science data standards to formally publish the detailed provenance of a climate research dataset as an associated scientific workflow. This is integrated with linked-data-compliant data re-use standards (e.g. OAI-ORE) to enable a seamless link between a publication and the complete lineage trail of the corresponding dataset, including the dataset itself.
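A sketch of what the publication-to-provenance link could look like as an OAI-ORE style aggregation in RDF; all DOIs and URIs below are placeholders, and the vocabulary choices are assumptions rather than the project's actual schema:

```python
# Sketch of an OAI-ORE style aggregation: the DOI-identified dataset aggregates
# the workflow that produced it, and the publication references the dataset.
from rdflib import Graph, Namespace, URIRef
from rdflib.namespace import DCTERMS

ORE = Namespace("http://www.openarchives.org/ore/terms/")

g = Graph()
g.bind("ore", ORE)
g.bind("dcterms", DCTERMS)

dataset = URIRef("https://doi.org/10.5555/example-dataset")   # placeholder DOI
workflow = URIRef("https://example.org/workflows/run-42")     # placeholder URI
paper = URIRef("https://doi.org/10.5555/example-paper")       # placeholder DOI

g.add((dataset, ORE.aggregates, workflow))       # dataset aggregates its workflow
g.add((dataset, DCTERMS.provenance, workflow))   # explicit provenance link
g.add((paper, DCTERMS.references, dataset))      # the publication cites the DOI

print(g.serialize(format="turtle"))
```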
Abstract:
Spatial memory is important for locating objects in hierarchical data structures, such as desktop folders. There are, however, some contradictions in the literature concerning the effectiveness of 3D user interfaces when compared to their 2D counterparts. This paper uses a task-based approach to investigate the effectiveness of adding a third dimension to specific user tasks, i.e. the impact of depth on navigation in a 3D file manager. Results highlight issues and benefits of using 3D interfaces for visual and verbal tasks, and suggest a possible correlation between aptitude scores achieved on the Guilford-Zimmerman Orientation Survey and electroencephalography-measured brainwave activity as participants search for targets of variable perceptual salience in 2D and 3D environments.
Abstract:
Older adult computer users often lose track of the mouse cursor and resort to methods such as shaking the mouse or searching the entire screen to find the cursor again. This paper therefore describes how a standard optical mouse was modified to include a touch sensor, activated by releasing and then touching the mouse, which automatically centers the mouse cursor on the screen, potentially making it easier to find a ‘lost’ cursor. Six older adult computer users and six younger computer users were asked to compare the touch-sensitive mouse with cursor centering against two alternative techniques for locating the mouse cursor: manually shaking the mouse and using the Windows sonar facility. The time taken to click on a target after a distractor task was recorded, and results show that centering the cursor was the fastest technique, with a 35% improvement over shaking the mouse. Five out of six older participants ranked the touch-sensitive mouse with cursor centering as the easiest to use.
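The cursor-centering response on its own is simple to sketch. In the sketch below, pyautogui stands in for whatever cursor-positioning API the prototype used, and on_mouse_touched is a hypothetical callback representing the hardware touch-sensor event, not part of the paper's system.

```python
# Sketch of the cursor-centering response only; the paper's prototype wires this
# to a hardware touch sensor, represented here by a hypothetical callback.
import pyautogui


def center_cursor():
    """Move the mouse cursor to the centre of the primary screen."""
    width, height = pyautogui.size()
    pyautogui.moveTo(width // 2, height // 2)


def on_mouse_touched():
    """Hypothetical handler fired when the user touches the modified mouse."""
    center_cursor()
```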