983 results for Virtual library
Abstract:
Aim: Simulation forms an increasingly vital component of clinical skills development in a wide range of professional disciplines. Simulation of clinical techniques and equipment is designed to better prepare students for placement by providing an opportunity to learn technical skills in a “safe” academic environment. In radiotherapy training over the last decade or so, this has predominantly comprised treatment planning software and small ancillary equipment such as mould room apparatus. Recent virtual reality developments have dramatically changed this approach. Innovative new simulation applications, together with file processing and interrogation software, have helped to fill the gaps and provide a streamlined virtual workflow solution. This paper outlines the innovations that have enabled this, along with an evaluation of their impact on students and educators. Method: Virtual reality software and workflow applications have been developed to enable the following steps of radiation therapy to be simulated in an academic environment: CT scanning using a 3D virtual CT scanner simulation; batch CT duplication; treatment planning; 3D plan evaluation using a virtual linear accelerator; quantitative plan assessment; patient setup with lasers; and image-guided radiotherapy software. Results: Evaluation of the impact of the virtual reality workflow system highlighted substantial time savings for academic staff as well as positive feedback from students relating to preparation for clinical placements. Students valued practice in the “safe” environment and the opportunity to understand the clinical workflow ahead of clinical department experience. Conclusion: Simulation of most of the radiation therapy workflow and tasks is feasible using a raft of virtual reality simulation applications and supporting software. Benefits of this approach include time savings, the embedding of a case-study-based approach, increased student confidence, and optimal use of the clinical environment. Ongoing work seeks to determine the impact of simulation on clinical skills.
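Of the workflow steps listed, batch CT duplication is the most amenable to a short illustration. The sketch below is not the authors' software; it shows, using the real pydicom library, how a master CT series might be copied and re-identified so each student plans on an independent dataset. The directory layout, tag choices, and naming scheme are assumptions for illustration only.

```python
import os
import pydicom
from pydicom.uid import generate_uid

def duplicate_ct_series(src_dir, dst_dir, student_id):
    """Copy a CT series, re-identifying it so each student gets an
    independent 'patient' to plan on (tag choices are illustrative)."""
    os.makedirs(dst_dir, exist_ok=True)
    series_uid = generate_uid()
    for name in sorted(os.listdir(src_dir)):
        ds = pydicom.dcmread(os.path.join(src_dir, name))
        ds.PatientName = f"TRAINING^{student_id}"
        ds.PatientID = student_id
        ds.SeriesInstanceUID = series_uid   # one new series per copy
        ds.SOPInstanceUID = generate_uid()  # unique per image
        ds.save_as(os.path.join(dst_dir, name))

# Hypothetical folders: one master scan fanned out to three students.
for sid in ["s001", "s002", "s003"]:
    duplicate_ct_series("ct_master", f"ct_{sid}", sid)
```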
Abstract:
In the past few years, the virtual machine (VM) placement problem has been studied intensively and many algorithms for it have been proposed. However, those algorithms have not been widely used in today's cloud data centers because they do not consider the cost of migrating from the current VM placement to the new optimal one. As a result, the gain from optimizing VM placement may be outweighed by the migration cost incurred in reaching the new placement. To address this issue, this paper presents a penalty-based genetic algorithm (GA) for the VM placement problem that considers the migration cost in addition to the energy consumption of the new VM placement and its total inter-VM traffic flow. The GA has been implemented and evaluated by experiments, and the results show that it outperforms two well-known algorithms for the VM placement problem.
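The abstract does not spell out the paper's cost model; the following minimal Python sketch shows how a penalty-based fitness for VM placement might fold a migration penalty into the objective. The weights, the fixed per-move cost, and the energy/traffic callables are illustrative assumptions, not the paper's formulation.

```python
import random

def migration_cost(current, candidate, cost_per_move=1.0):
    """Penalty term: count the VMs whose host changes between placements."""
    return cost_per_move * sum(c != n for c, n in zip(current, candidate))

def fitness(candidate, current, energy, traffic,
            w_energy=1.0, w_traffic=1.0, w_migration=1.0):
    """Lower is better: cost of the new placement plus the migration penalty."""
    return (w_energy * energy(candidate)
            + w_traffic * traffic(candidate)
            + w_migration * migration_cost(current, candidate))

def mutate(placement, num_hosts, rate=0.05):
    """GA mutation: reassign each VM to a random host with small probability."""
    return [random.randrange(num_hosts) if random.random() < rate else h
            for h in placement]

# Toy usage: 6 VMs on 3 hosts; energy is proxied by the number of active
# hosts and traffic is ignored, purely to show how the pieces connect.
current = [0, 0, 1, 1, 2, 2]
candidate = mutate(current, num_hosts=3)
score = fitness(candidate, current,
                energy=lambda p: len(set(p)),
                traffic=lambda p: 0)
```

A GA would then minimise this fitness over placements produced by selection, crossover, and mutation; the migration term steers the search toward optima reachable with few moves.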
Abstract:
Although live VM migration has been studied intensively, the problem of live migration of multiple interdependent VMs has hardly been investigated. The central question in migrating multiple interdependent VMs is how to schedule the migrations, since the schedule directly affects the total migration time and the total downtime of those VMs. Aiming to minimize both the total migration time and the total downtime simultaneously, this paper presents a Strength Pareto Evolutionary Algorithm 2 (SPEA2) for the multi-VM migration scheduling problem. The SPEA2 has been evaluated by experiments, and the results show that it can generate a set of VM migration schedules with a shorter total migration time and a shorter total downtime than an existing genetic algorithm, namely the Random Key Genetic Algorithm (RKGA). This paper also studies the scalability of the SPEA2.
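SPEA2 proper adds archive truncation and density-based fitness assignment; the Python sketch below (with an assumed schedule representation and assumed migration_time/downtime callables) shows only the bi-objective core such an approach rests on: scoring a schedule on both objectives and filtering a population down to its Pareto-optimal members.

```python
def objectives(schedule, migration_time, downtime):
    """Evaluate a migration schedule on the two objectives to be minimised."""
    return (sum(migration_time(m) for m in schedule),
            sum(downtime(m) for m in schedule))

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimisation)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(population, evaluate):
    """Keep the schedules not dominated by any other candidate."""
    scored = [(s, evaluate(s)) for s in population]
    return [s for s, f in scored
            if not any(dominates(g, f) for _, g in scored if g is not f)]
```

The algorithm's output is thus a set of trade-off schedules rather than a single answer, leaving the operator to pick between, say, a faster overall migration and less service downtime.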
Abstract:
By the time students reach the middle years they have experienced many chance activities based on dice. Common among these are rolling one die to explore the relationship of frequency and theoretical probability, and rolling two dice and summing the outcomes to consider their probabilities. Although dice may be considered overused by some, the advantage they offer is a familiar context within which to explore much more complex concepts. If the basic chance mechanism of the device is understood, it is possible to enter quickly into an arena of more complex concepts. This is what happened with a two-hour activity engaged in by four classes of Grade 6 students in the same school. The activity targeted the concepts of variation and expectation. The teachers held extended discussions with their classes on variation and expectation at the beginning of the activity, with students contributing examples of the two concepts from their own experience. These notions are quite sophisticated for Grade 6, but the underlying concepts describe phenomena that students encounter every day. For example, time varies continuously; sporting results vary from game to game; the maximum temperature varies from day to day. However, there is an expectation about tomorrow’s maximum temperature based on the expert advice from the weather bureau. There may also be an expectation about a sporting result based on the participants’ previous results. It is this juxtaposition that makes life interesting. Variation hence describes the differences we see in phenomena around us. In a scenario displaying variation, expectation describes the effort to characterise or summarise the variation and perhaps make a prediction about the message arising from the scenario. The explicit purpose of the activity described here was to use the familiar scenario of rolling a die to expose these two concepts. Because the students had previously experienced rolling physical dice, they knew instinctively about the variation that occurs across many rolls and about the theoretical expectation that each side should “come up” one-sixth of the time. They had observed the concepts in action but had not consolidated the underlying terminology to describe them. As the two concepts are so fundamental to understanding statistics, we felt it would be useful to begin building them in the familiar environment of rolling a die. Because hand-held dice limit the explorations students can undertake, the classes used the software TinkerPlots (Konold & Miller, 2011) to simulate rolling a die multiple times.
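TinkerPlots is interactive rather than scripted, but the same simulation is a few lines in any language. A small Python sketch (roll counts and seed are arbitrary choices) makes the variation-versus-expectation contrast concrete: observed proportions wander around the theoretical one-sixth, and wander less as the number of rolls grows.

```python
import random
from collections import Counter

def roll_die(n, seed=None):
    """Simulate n rolls of a fair six-sided die and tally the faces."""
    rng = random.Random(seed)
    return Counter(rng.randint(1, 6) for _ in range(n))

# Expectation says each face should appear 1/6 ≈ 0.167 of the time;
# variation is what the observed proportions actually do around that.
for n in (60, 600, 6000):
    counts = roll_die(n, seed=1)
    props = {face: round(counts[face] / n, 3) for face in range(1, 7)}
    print(n, props)
```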
Abstract:
The aim of this study was to investigate the molecular basis of the human IgE-allergen interaction by screening a phage-displayed peptide library with an allergen-specific monoclonal antibody (mAb) that mimics human IgE. A mAb that reacted with major grass pollen allergens was successfully identified and shown to inhibit the human IgE-allergen interaction. Biopanning of a phage-displayed random peptide library with this mAb yielded a 12-amino-acid mimotope. A synthetic peptide based on this 12-mer mimotope inhibited mAb and human IgE binding to grass pollen extracts. Our results indicate that such synthetic peptide mimotopes of allergens have potential as novel therapeutic agents.
Abstract:
The chemokine receptor CCR5 contains seven transmembrane-spanning domains. It binds chemokines and acts as a co-receptor for macrophage (m)-tropic (or R5) strains of HIV-1. Monoclonal antibodies (mAbs) to CCR5, 3A9 and 5C7, were used for biopanning a nonapeptide cysteine (C)-constrained phage-displayed random peptide library to ascertain contact residues and define tertiary structures of possible epitopes on CCR5. Reactivity of the antibodies with phagotopes was established by enzyme-linked immunosorbent assay (ELISA). mAb 3A9 identified a phagotope C-HASIYDFGS-C (3A9/1), and 5C7 most frequently identified C-PHWLRDLRV-C (5C7/1). The corresponding peptides were synthesized. Phagotopes and synthetic peptides reacted in ELISA with the corresponding antibodies, and the synthetic peptides inhibited antibody binding to the phagotopes. Reactivity by immunofluorescence of 3A9 with CCR5 was strongly inhibited by the corresponding peptide. Both mAbs 3A9 and 5C7 reacted similarly with the phagotope and the corresponding peptide selected by the alternative mAb. The sequences of the peptide inserts of the phagotopes could be aligned as mimotopes of the sequence of CCR5. For phage 3A9/1, the motif SIYD aligned to residues at the N terminus and FG to residues on the first extracellular loop; for 5C7/1, residues at the N terminus, the first extracellular loop, and possibly the third extracellular loop could be aligned and so would contribute to the mimotope. The synthetic peptides corresponding to the isolated phagotopes showed a CD4-dependent reactivity with gp120 of a primary, m-tropic HIV-1 isolate. Thus, the reactivity of antibodies raised to CCR5 against phage-displayed peptides defined mimotopes that reflect the binding sites of these antibodies and reveal part of the gp120 binding site on CCR5.
Abstract:
The sheep (Ovis aries) is favored by many musculoskeletal tissue engineering groups as a large animal model because of its docile temperament and ease of husbandry. The size and weight of sheep are comparable to those of humans, which allows for the use of implants and fixation devices used in human clinical practice. The construction of a complementary DNA (cDNA) library can capture the expression of genes in both a tissue- and time-specific manner. cDNA libraries have been a consistent source of gene discovery ever since the technology became commonplace more than three decades ago. Here, we describe the construction of a cDNA library using cells derived from sheep bones, based on the pBluescript cDNA kit. Thirty clones were picked at random and sequenced. This led to the identification of a novel gene, C12orf29, which our initial experiments indicate is involved in skeletal biology. We also describe a polymerase chain reaction-based cDNA clone isolation method that allows genes of interest to be isolated from a cDNA library pool. The techniques outlined here can be applied in-house by smaller tissue engineering groups to generate tools for biomolecular research for large preclinical animal studies, and they highlight the power of standard cDNA library protocols to uncover novel genes.
Abstract:
Network topology and routing are two important factors in determining the communication costs of big data applications at large scale. For a given Cluster, Cloud, or Grid (CCG) system, the network topology is fixed, and static or dynamic routing protocols are preinstalled to direct the network traffic; users cannot change them once the system is deployed. Hence, it is hard for application developers to identify the optimal network topology and routing algorithm for applications with distinct communication patterns. In this study, we design a CCG virtual system (CCGVS), which first uses container-based virtualization to allow users to create a farm of lightweight virtual machines on a single host. It then uses the software-defined networking (SDN) technique to control the network traffic among these virtual machines. Users can change the network topology and control the network traffic programmatically, thereby enabling application developers to evaluate their applications on the same system under different network topologies and routing algorithms. Preliminary experimental results with both synthetic big data programs and the NPB benchmarks show that CCGVS can reproduce application performance variations caused by network topology and routing algorithm.
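The abstract does not give CCGVS's API. As an illustrative stand-in, the sketch below uses the real Mininet Python API, which likewise emulates a farm of lightweight hosts wired through SDN-controlled switches on a single machine, making the topology a program parameter. Mininet must be installed and the script run with root privileges; the star topology and host count are arbitrary choices, not part of CCGVS.

```python
from mininet.topo import Topo
from mininet.net import Mininet

class StarTopo(Topo):
    """One switch with n hosts attached: a deliberately simple topology.
    Swapping in a different Topo subclass changes the emulated network
    without touching the application under test."""
    def build(self, n=4):
        s1 = self.addSwitch('s1')
        for i in range(1, n + 1):
            self.addLink(self.addHost('h%d' % i), s1)

net = Mininet(topo=StarTopo(n=4))
net.start()
net.pingAll()   # exercise the emulated network before running workloads
net.stop()
```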
Abstract:
Educating responsive graduates. Graduate competencies include reliability, communication skills, and the ability to work in teams. Students using collaborative technologies adapt to a new working environment, working in teams and using collaborative technologies for learning. Collaborative technologies were used not simply for the delivery of learning but innovatively, to supplement and enrich research-based learning, providing a space for active engagement and interaction with resources and the team. This promotes the development of responsive ‘intellectual producers’ who can effectively communicate, collaborate, and negotiate in complex work environments. Exploiting technologies. Students use ‘new’ technologies to work collaboratively, allowing them to experience the reality of distributed workplaces that incorporate both flexibility and ‘real’-time responsiveness. Students are responsible and accountable for individual and group work contributions in a highly transparent and readily accessible workspace. This experience provides a model of an effective learning tool. Navigating uncertainty and complexity. Collaborative technologies allow students to develop critical thinking and reflective skills as they develop a group product. In this forum, students build resilience by taking ownership of and managing group work, and by navigating the uncertainties and complexities of group dynamics as they constructively and professionally engage in team dialogue and learn to focus on the goal of the team task.
Abstract:
Many software applications extend their functionality by dynamically loading libraries into their allocated address space. However, shared libraries are often of unknown provenance and quality and may contain accidental bugs or, in some cases, deliberately malicious code. Most sandboxing techniques that address these issues require recompilation of the libraries using custom tool chains, require significant modifications to the libraries, do not retain the benefits of single address-space programming, do not completely isolate guest code, or incur substantial performance overheads. In this paper we present LibVM, a sandboxing architecture for isolating libraries within a host application without requiring any modifications to the shared libraries themselves, while still retaining the benefits of a single address space and also introducing a system call interposition layer that allows complete arbitration over a shared library's functionality. We show how to utilize contemporary hardware virtualization support towards this end with reasonable performance overheads; in the absence of such hardware support, our model can also be implemented using a software-based mechanism. We ensure that our implementation conforms as closely as possible to existing shared library manipulation functions, minimizing the effort needed to apply such isolation to existing programs. Our experimental results show that it is easy to gain immediate benefits in scenarios where the goal is to guard the host application against unintentional programming errors when using shared libraries, as well as in more complex scenarios where a shared library is suspected of being actively hostile. In both cases, no changes are required to the shared libraries themselves.
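LibVM's own interface is not given in the abstract. For context, here is a minimal Python sketch (using the standard ctypes module, assuming a Unix-like system) of the ordinary, unsandboxed loading path that such a sandbox must mirror: the guest library is mapped directly into the host's address space and its symbols are invoked with no arbitration layer in between.

```python
import ctypes
import ctypes.util

# Ordinary dynamic loading: the library is mapped straight into the
# host process and its code runs with the host's full privileges.
# This is exactly the trust boundary a LibVM-style sandbox inserts
# itself into, while keeping this same lookup-and-call interface.
libc_path = ctypes.util.find_library('c')   # e.g. 'libc.so.6' on Linux
libc = ctypes.CDLL(libc_path)

libc.printf.argtypes = [ctypes.c_char_p]
libc.printf(b"host and library share one address space\n")
```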
Abstract:
DNA obtained from a human sputum isolate of Mycobacterium tuberculosis, NTI-64719, which showed extensive dissemination in the guinea pig model resulting in a high score for virulence, was used to construct an expression library in the lambda ZAP vector. The size of DNA inserts in the library ranged from 1 to 3 kb, and recombinants represented 60% of the total plaques obtained. When probed with pooled serum from chronically infected tuberculosis patients, the library yielded 176 recombinants with a range of signal intensities. Among these, 93 recombinants were classified into 12 groups on the basis of DNA hybridization experiments. The polypeptides synthesized by the recombinants were predominantly LacZ fusion proteins. Serum obtained from patients who were clinically diagnosed to be in the early phase of M. tuberculosis infection was used to probe the 176 recombinants obtained. Interestingly, some recombinants that gave very strong signals in the original screen did not react with early-phase serum; conversely, others whose signals were extremely weak in the original screen gave very intense signals with serum from recently infected patients. This indicates the differential nature of either the expression of these antigens or the immune response elicited by them as a function of disease progression.
Abstract:
Sheona Thomson writes about the transformation of a public institution in relation to a study on post-occupancy. "Brisbanites like me, with memories of long hours of study in the former buildings of the State Library of Queensland, can only marvel at the living institution we have in our city today. For most of the 80s, our bookish pursuits were hosted in the fustily intimate reading rooms of Centennial Hall, the late 1950s extension to the nineteenth-century building (formerly housing the state museum) by F. D. G. Stanley in William Street on the north bank of the Brisbane River. At the time, the wheels of an expansive cultural ambition were turning, and piece by piece on the south bank of the river the rambling Queensland Cultural Centre was realized. The fourth stage of the complex opened in 1988 as the new home for the State Library and for many years after, countless studious, transient folk whiled away time in the deep interiors of the straight-faced concrete and glass edifice by Robin Gibson and Partners..."
Abstract:
A 26-hour English reading comprehension course was taught to two groups of second-year Finnish Pharmacy students: a virtual group (33 students) and a teacher-taught group (25 students). The aims of the teaching experiment were to find out: 1. What has to be taken into account when teaching English reading comprehension to students of pharmacy via the Internet using TopClass? 2. How will the learning outcomes of the virtual group and the control group differ? 3. How will the students and the Department of Pharmacy respond to the different and new method, i.e. the virtual teaching method? 4. Will it be possible to test English reading comprehension learning material using the groupware tool TopClass? The virtual exercises were written within the Internet authoring environment TopClass. The virtual group was given the reading material and grammar booklet on paper, but did the reading comprehension tasks (written by the teacher) autonomously via the Internet. The control group was taught by the same teacher in twelve two-hour sessions, while the virtual group could work independently within the given six weeks. Both groups studied the same material: ten pharmaceutical articles with reading comprehension tasks as well as grammar and vocabulary exercises. Both groups took the same final test. Students in both groups were asked to evaluate the course using a 1-to-5 rating scale and were also asked to assess their respective courses verbally. A detailed analysis of the different aspects of the student evaluation is given. Conclusions: 1. The virtual students learned pharmaceutical English relatively well, but not significantly better than the classroom students. 2. Overall student satisfaction in the virtual pharmacy English reading comprehension group was higher than in the teacher-taught control group. 3. Virtual learning is easier for linguistically more able students; less able students need more time with the teacher. 4. The sample in this study is rather small, but it is a pioneering study. 5. The Department of Pharmacy at the University of Helsinki wishes to incorporate virtual English reading comprehension teaching into its curriculum. 6. The sophisticated and versatile TopClass system is relatively easy for a traditional teacher to adopt and quite easy for the students to learn. It can be used, for example, for automatic checking of routine answers and for document transfer, both of which lighten the workload of both parties. It is especially convenient for teaching reading comprehension. Key words: English reading comprehension, teacher-taught class, virtual class, attitudes of students, learning outcomes
Abstract:
My thesis concerns the notion of existence as an encounter, as developed in the philosophy of Gilles Deleuze (1925–1995). What this denotes is a critical stance towards a major current in the Western philosophical tradition, which Deleuze calls representational thinking. Such thinking strives to provide a stable ground for identities by appealing to transcendent structures behind apparent reality and explaining the manifest diversity of the given by such notions as essence, idea, God, or the totality of the world. In contrast to this, Deleuze states that abstractions such as these do not explain anything; rather, they themselves need to be explained. Yet Deleuze does not appeal merely to the given. He sees that one must posit a genetic element that accounts for experience, and this element must not be naïvely traced from the empirical. Deleuze names his philosophy transcendental empiricism and seeks to bring together the approaches of both empiricism and transcendental philosophy. In chapter one I look into the motivations of Deleuze's transcendental empiricism and analyse it as an encounter between Deleuze's readings of David Hume and Immanuel Kant. This encounter concerns, first of all, the question of subjectivity and results in a conception of identity as a non-essential process. A pre-given concept of identity does not explain the nature of things; rather, the concept itself must be explained. From this point of view, the process of individualisation must become the central concern. In chapter two I discuss Deleuze's concept of the affect as the basis of identity and his affiliation with the theories of Gilbert Simondon and Jakob von Uexküll. From this basis develops a morphogenetic theory of individuation-as-process. In analysing such a process of individuation, the modal category of the virtual becomes of great value, being an open, indeterminate charge of potentiality. As the virtual concerns becoming, or the continuous process of actualisation, time rather than space will be the privileged field of consideration. Chapter three is devoted to the discussion of the temporal aspect of the virtual and difference-without-identity. The essentially temporal process of subjectification results in a conception of the subject as composition: an assemblage of heterogeneous elements. Art and aesthetic experience are therefore valued by Deleuze because they disclose the construct-like nature of subjectivity in the sensations they produce. Through the domain of the aesthetic, the subject is immersed in the network of affectivity that is the material diversity of the world. Chapter four addresses a phenomenon displaying this diversified identity: the simulacrum, an identity that is not grounded in an essence. Developed on the basis of the simulacrum, a theory of identity as assemblage emerges in chapter five. As the problematic of simulacra concerns perhaps foremost artistic presentation, I look into the identity of a work of art as assemblage. To take an example from concrete artistic practice and to remain within the problematic of the simulacrum, I finally address the question of reproduction, particularly in the case of recorded music and its identity as a work of art. In conclusion, I propose that by overturning its initial representational schema, phonographic music addresses its own medium and turns it into an inscription of difference, exposing the listener to an encounter with the virtual.
Abstract:
Representational Difference Analysis (RDA) is an established technique for isolating specific genetic differences between or within bacterial species. This method was used to investigate the genetic basis of serovar specificity and the relationship between serovar and virulence in Haemophilus parasuis. An RDA library of 96 clones was constructed using H. parasuis strains H425(P) (serovar 12) and HS1967 (serovar 4). Screening such a large clone library to determine which clones are strain-specific would typically involve separately labelling each clone for use in Southern hybridisation against genomic DNA from each of the strains. In this study, a novel application of reverse Southern hybridisation was used to screen the RDA library: genomic DNA from each strain was labelled and used to probe the library to identify strain-specific clones. This novel approach represents a significant improvement in methodology, being both rapid and efficient.