826 results for ligand-based virtual screening
Abstract:
The 3D Water Chemistry Atlas is an intuitive, open source, Web-based system that enables the three-dimensional (3D) sub-surface visualization of ground water monitoring data, overlaid on the local geological model (formation and aquifer strata). This paper first describes the results of evaluating existing virtual globe technologies, which led to the decision to use the Cesium open source WebGL Virtual Globe and Map Engine as the underlying platform. Next, it describes the backend database and the search, filtering, browse and analysis tools that were developed to enable users to interactively explore the groundwater monitoring data and interpret it spatially and temporally relative to the local geological formations and aquifers via the Cesium interface. The result is an integrated 3D visualization system that enables environmental managers and regulators to assess groundwater conditions, identify inconsistencies in the data, manage impacts and risks, and make more informed decisions about coal seam gas extraction, waste water extraction, and water reuse.
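The abstract does not give implementation details, but Cesium can ingest point data as CZML documents, which a backend can emit as plain JSON. The Python sketch below serialises groundwater bore readings into a minimal CZML document; the input field names (`bore_id`, `lon`, `lat`, `depth_m`) are a hypothetical schema for illustration, not the paper's actual data model.

```python
import json

def groundwater_czml(samples):
    """Build a minimal CZML document for Cesium from groundwater bore samples.

    samples: list of dicts with keys bore_id, lon, lat, depth_m (hypothetical schema).
    """
    # Every CZML document starts with a "document" packet declaring the version.
    packets = [{"id": "document", "name": "Groundwater monitoring", "version": "1.0"}]
    for s in samples:
        packets.append({
            "id": s["bore_id"],
            "position": {
                # CZML order is [longitude, latitude, height]; a negative height
                # places the point below the surface for sub-surface display.
                "cartographicDegrees": [s["lon"], s["lat"], -s["depth_m"]],
            },
            "point": {"pixelSize": 8},
        })
    return json.dumps(packets)
```

A viewer would then load the resulting string with Cesium's `CzmlDataSource`.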
Abstract:
Aim Simulation forms an increasingly vital component of clinical skills development in a wide range of professional disciplines. Simulation of clinical techniques and equipment is designed to better prepare students for placement by providing an opportunity to learn technical skills in a “safe” academic environment. In radiotherapy training over the last decade or so, this has predominantly comprised treatment planning software and small ancillary equipment such as mould room apparatus. Recent virtual reality developments have dramatically changed this approach. Innovative new simulation applications and file processing and interrogation software have helped to fill in the gaps to provide a streamlined virtual workflow solution. This paper outlines the innovations that have enabled this, along with an evaluation of the impact on students and educators. Method Virtual reality software and workflow applications have been developed to enable the following steps of radiation therapy to be simulated in an academic environment: CT scanning using a 3D virtual CT scanner simulation; batch CT duplication; treatment planning; 3D plan evaluation using a virtual linear accelerator; quantitative plan assessment; patient setup with lasers; and image-guided radiotherapy software. Results Evaluation of the impact of the virtual reality workflow system highlighted substantial time saving for academic staff as well as positive feedback from students relating to preparation for clinical placements. Students valued practice in the “safe” environment and the opportunity to understand the clinical workflow ahead of clinical department experience. Conclusion Simulation of most of the radiation therapy workflow and tasks is feasible using a raft of virtual reality simulation applications and supporting software. Benefits of this approach include time-saving, embedding of a case-study based approach, increased student confidence, and optimal use of the clinical environment.
Ongoing work seeks to determine the impact of simulation on clinical skills.
Abstract:
By the time students reach the middle years they have experienced many chance activities based on dice. Common among these are rolling one die to explore the relationship of frequency and theoretical probability, and rolling two dice and summing the outcomes to consider their probabilities. Although dice may be considered overused by some, the advantage they offer is a familiar context within which to explore much more complex concepts. If the basic chance mechanism of the device is understood, it is possible to enter quickly into an arena of more complex concepts. This is what happened with a two-hour activity engaged in by four classes of Grade 6 students in the same school. The activity targeted the concepts of variation and expectation. The teachers held extended discussions with their classes on variation and expectation at the beginning of the activity, with students contributing examples of the two concepts from their own experience. These notions are quite sophisticated for Grade 6, but the underlying concepts describe phenomena that students encounter every day. For example, time varies continuously; sporting results vary from game to game; the maximum temperature varies from day to day. However, there is an expectation about tomorrow’s maximum temperature based on the expert advice from the weather bureau. There may also be an expectation about a sporting result based on the participants’ previous results. It is this juxtaposition that makes life interesting. Variation hence describes the differences we see in phenomena around us. In a scenario displaying variation, expectation describes the effort to characterise or summarise the variation and perhaps make a prediction about the message arising from the scenario. The explicit purpose of the activity described here was to use the familiar scenario of rolling a die to expose these two concepts.
Because the students had previously experienced rolling physical dice they knew instinctively about the variation that occurs across many rolls and about the theoretical expectation that each side should “come up” one-sixth of the time. They had observed the instances of the concepts in action, but had not consolidated the underlying terminology to describe them. As the two concepts are so fundamental to understanding statistics, we felt it would be useful to begin building in the familiar environment of rolling a die. Because hand-held dice limit the explorations students can undertake, the classes used the software TinkerPlots (Konold & Miller, 2011) to simulate rolling a die multiple times.
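TinkerPlots is a purpose-built graphical tool, but the same die-rolling experiment can be sketched in a few lines of Python; the sample sizes below are arbitrary choices for illustration.

```python
import random
from collections import Counter

def roll_die(n_rolls, seed=None):
    """Simulate n_rolls of a fair six-sided die and return per-face counts."""
    rng = random.Random(seed)
    return Counter(rng.randint(1, 6) for _ in range(n_rolls))

# Expectation: each face should come up about one-sixth of the time.
# Variation: observed proportions differ from 1/6, and the spread between
# faces shrinks as the number of rolls grows.
for n in (30, 300, 3000):
    counts = roll_die(n, seed=42)
    props = {face: counts[face] / n for face in range(1, 7)}
    spread = max(props.values()) - min(props.values())
    print(n, {f: round(p, 3) for f, p in sorted(props.items())}, "spread:", round(spread, 3))
```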
Abstract:
Biopanning of phage-displayed random peptide libraries is a powerful technique for identifying peptides that mimic epitopes (mimotopes) for monoclonal antibodies (mAbs). However, peptides derived using polyclonal antisera may represent epitopes for a diverse range of antibodies. Hence following screening of phage libraries with polyclonal antisera, including autoimmune disease sera, a procedure is required to distinguish relevant from irrelevant phagotopes. We therefore applied the multiple sequence alignment algorithm PILEUP together with a matrix for scoring amino acid substitutions based on physicochemical properties to generate guide trees depicting relatedness of selected peptides. A random heptapeptide library was biopanned nine times using no selecting antibodies, immunoglobulin G (IgG) from sera of subjects with autoimmune diseases (primary biliary cirrhosis (PBC) and type 1 diabetes) and three murine ascites fluids that contained mAbs to overlapping epitope(s) on the Ross River Virus envelope protein 2. Peptides randomly sampled from the library were distributed throughout the guide tree of the total set of peptides whilst many of the peptides derived in the absence of selecting antibody aligned to a single cluster. Moreover peptides selected by different sources of IgG aligned to separate clusters, each with a different amino acid motif. These alignments were validated by testing all of the 53 phagotopes derived using IgG from PBC sera for reactivity by capture ELISA with antibodies affinity purified on the E2 subunit of the pyruvate dehydrogenase complex (PDC-E2), the major autoantigen in PBC: only those phagotopes that aligned to PBC-associated clusters were reactive. Hence the multiple sequence alignment procedure discriminates relevant from irrelevant phagotopes and thus a major difficulty with biopanning phage-displayed random peptide libraries with polyclonal antibodies is surmounted.
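PILEUP and the study's physicochemical substitution matrix are not reproduced in the abstract; as an illustration of the underlying idea only (grouping peptides whose residues share physicochemical character), the sketch below scores equal-length heptapeptides position by position with an invented four-class residue grouping, and clusters them greedily rather than building a true guide tree.

```python
from itertools import combinations

# Coarse physicochemical classes -- an illustrative grouping, not PILEUP's matrix.
GROUPS = {
    "hydrophobic": set("AVLIMFWP"),
    "polar": set("STNQCYG"),
    "positive": set("KRH"),
    "negative": set("DE"),
}
CLASS_OF = {aa: g for g, members in GROUPS.items() for aa in members}

def pair_score(p1, p2):
    """Positional similarity of two equal-length peptides:
    2 for an identical residue, 1 for the same physicochemical class, else 0."""
    score = 0
    for a, b in zip(p1, p2):
        if a == b:
            score += 2
        elif CLASS_OF.get(a) == CLASS_OF.get(b):
            score += 1
    return score

def cluster(peptides, threshold):
    """Greedy single-linkage grouping of peptides scoring >= threshold (union-find)."""
    parent = {p: p for p in peptides}
    def find(p):
        while parent[p] != p:
            p = parent[p]
        return p
    for p1, p2 in combinations(peptides, 2):
        if pair_score(p1, p2) >= threshold:
            parent[find(p2)] = find(p1)
    groups = {}
    for p in peptides:
        groups.setdefault(find(p), []).append(p)
    return list(groups.values())
```

Peptides selected by the same antibody source would tend to share a motif and fall into the same cluster, mimicking the alignment-derived guide-tree clusters described above.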
Abstract:
Species identification based on short sequences of DNA markers, that is, DNA barcoding, has emerged as an integral part of modern taxonomy. However, software for the analysis of large and multilocus barcoding data sets is scarce. The Basic Local Alignment Search Tool (BLAST) is currently the fastest tool capable of handling large databases (e.g. >5000 sequences), but its accuracy is a concern and it has been criticized for its local optimization. However, more accurate current software requires sequence alignment or complex calculations, which are time-consuming when dealing with large data sets during data preprocessing or during the search stage. Therefore, it is imperative to develop a practical program for both accurate and scalable species identification for DNA barcoding. In this context, we present VIP Barcoding: user-friendly software with a graphical user interface for rapid DNA barcoding. It adopts a hybrid, two-stage algorithm. First, an alignment-free composition vector (CV) method is utilized to reduce the search space by screening a reference database. The alignment-based K2P distance nearest-neighbour method is then employed to analyse the smaller data set generated in the first stage. In comparison with other software, we demonstrate that VIP Barcoding has (i) higher accuracy than Blastn and several alignment-free methods and (ii) higher scalability than alignment-based distance methods and character-based methods. These results suggest that this platform is able to deal with both large-scale and multilocus barcoding data with accuracy and can contribute to DNA barcoding for modern taxonomy. VIP Barcoding is free and available at http://msl.sls.cuhk.edu.hk/vipbarcoding/.
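The abstract does not specify VIP Barcoding's exact CV formulation, so the sketch below illustrates the two-stage idea with simplifications: a plain k-mer frequency vector with cosine similarity for the alignment-free screen, and the standard Kimura two-parameter (K2P) distance for the nearest-neighbour stage. For brevity it assumes the query and reference sequences are already aligned and equal-length, which stage 1 of the real tool does not require.

```python
import math
from collections import Counter

def kmer_vector(seq, k=3):
    """Alignment-free composition vector: normalised k-mer frequencies."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return {kmer: c / total for kmer, c in counts.items()}

def cosine(v1, v2):
    dot = sum(v1[kmer] * v2.get(kmer, 0.0) for kmer in v1)
    n1 = math.sqrt(sum(x * x for x in v1.values()))
    n2 = math.sqrt(sum(x * x for x in v2.values()))
    return dot / (n1 * n2)

PURINES = {"A", "G"}

def k2p_distance(s1, s2):
    """Kimura two-parameter distance between two aligned, gap-stripped sequences."""
    pairs = [(a, b) for a, b in zip(s1, s2) if a != "-" and b != "-"]
    transitions = transversions = 0
    for a, b in pairs:
        if a == b:
            continue
        if (a in PURINES) == (b in PURINES):
            transitions += 1    # A<->G or C<->T
        else:
            transversions += 1
    p, q = transitions / len(pairs), transversions / len(pairs)
    return -0.5 * math.log((1 - 2 * p - q) * math.sqrt(1 - 2 * q))

def identify(query, reference, k=3, shortlist=5):
    """Stage 1: CV screen down to a shortlist; stage 2: K2P nearest neighbour."""
    qvec = kmer_vector(query, k)
    ranked = sorted(reference, key=lambda name: -cosine(qvec, kmer_vector(reference[name], k)))
    return min(ranked[:shortlist], key=lambda name: k2p_distance(query, reference[name]))
```

The screening stage is what gives the scalability claimed above: the expensive distance is computed only for the shortlist, not the whole database.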
Abstract:
Network topology and routing are two important factors in determining the communication costs of big data applications at large scale. For a given Cluster, Cloud, or Grid (CCG) system, the network topology is fixed, and static or dynamic routing protocols are preinstalled to direct the network traffic; users cannot change them once the system is deployed. Hence, it is hard for application developers to identify the optimal network topology and routing algorithm for their applications with distinct communication patterns. In this study, we design a CCG virtual system (CCGVS), which first uses container-based virtualization to allow users to create a farm of lightweight virtual machines on a single host. Then, it uses software-defined networking (SDN) techniques to control the network traffic among these virtual machines. Users can change the network topology and control the network traffic programmatically, thereby enabling application developers to evaluate their applications on the same system with different network topologies and routing algorithms. Preliminary experimental results from both synthetic big data programs and the NPB benchmarks have shown that CCGVS can represent application performance variations caused by network topology and routing algorithm.
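As a toy illustration of why topology choice matters (not CCGVS's actual machinery), the sketch below compares the average shortest-path hop count, a crude proxy for communication cost, between a ring and a star topology built as adjacency lists.

```python
from collections import deque

def avg_hops(adj):
    """Average shortest-path hop count over all ordered node pairs (BFS per source)."""
    nodes = list(adj)
    total = pairs = 0
    for src in nodes:
        dist = {src: 0}
        q = deque([src])
        while q:                      # breadth-first search from src
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        for dst in nodes:
            if dst != src:
                total += dist[dst]
                pairs += 1
    return total / pairs

def ring(n):
    """Each node connects to its two neighbours on a cycle."""
    return {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}

def star(n):
    """Node 0 is the hub; every other node connects only to it."""
    adj = {0: list(range(1, n))}
    for i in range(1, n):
        adj[i] = [0]
    return adj
```

An all-to-all workload favours the star's short paths, while the star's hub becomes a bandwidth bottleneck that hop counts alone do not capture; CCGVS-style experiments expose exactly these trade-offs.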
Abstract:
Educating responsive graduates. Graduate competencies include reliability, communication skills and the ability to work in teams. Students using collaborative technologies adapt to a new working environment, working in teams and using collaborative technologies for learning. Collaborative technologies were used not simply for delivery of learning but innovatively to supplement and enrich research-based learning, providing a space for active engagement and interaction with resources and the team. This promotes the development of responsive ‘intellectual producers’, able to effectively communicate, collaborate and negotiate in complex work environments. Exploiting technologies. Students use ‘new’ technologies to work collaboratively, allowing them to experience the reality of distributed workplaces incorporating both flexibility and ‘real’ time responsiveness. Students are responsible and accountable for individual and group work contributions in a highly transparent and readily accessible workspace. This experience provides a model of an effective learning tool. Navigating uncertainty and complexity. Collaborative technologies allow students to develop critical thinking and reflective skills as they develop a group product. In this forum students build resilience by taking ownership and managing group work, and navigating the uncertainties and complexities of group dynamics as they constructively and professionally engage in team dialogue and learn to focus on the goal of the team task.
Abstract:
Objective To develop a child victimization survey among a diverse group of child protection experts and examine the performance of the instrument through a set of international pilot studies. Methods The initial draft of the instrument was developed after input from scientists and practitioners representing 40 countries. Volunteers from the larger group of scientists participating in the Delphi review of the ICAST P and R reviewed the ICAST C by email in 2 rounds resulting in a final instrument. The ICAST C was then translated and back translated into six languages and field tested in four countries using a convenience sample of 571 children 12–17 years of age selected from schools and classrooms to which the investigators had easy access. Results The final ICAST C Home has 38 items and the ICAST C Institution has 44 items. These items serve as screeners and positive endorsements are followed by queries for frequency and perpetrator. Half of respondents were boys (49%). Endorsement for various forms of victimization ranged from 0 to 51%. Many children report violence exposure (51%), physical victimization (55%), psychological victimization (66%), sexual victimization (18%), and neglect in their homes (37%) in the last year. High rates of physical victimization (57%), psychological victimization (59%), and sexual victimization (22%) were also reported in schools in the last year. Internal consistency was moderate to high (alpha between .685 and .855) and missing data low (less than 1.5% for all but one item). Conclusions In pilot testing, the ICAST C identifies high rates of child victimization in all domains. Rates of missing data are low, and internal consistency is moderate to high. Pilot testing demonstrated the feasibility of using child self-report as one strategy to assess child victimization. Practice implications The ICAST C is a multi-national, multi-lingual, consensus-based survey instrument. 
It is available in six languages for international research to estimate child victimization. Assessing the prevalence of child victimization is critical in understanding the scope of the problem, setting national and local priorities, and garnering support for program and policy development aimed at child protection.
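The internal consistency reported above (alpha between .685 and .855) is Cronbach's alpha. A minimal Python sketch of the computation, applied to made-up item scores rather than ICAST C data:

```python
from statistics import variance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a respondents-by-items score matrix.

    item_scores: list of rows, one row per respondent, one column per item.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores)).
    """
    k = len(item_scores[0])
    items = list(zip(*item_scores))              # transpose: one tuple per item
    item_vars = sum(variance(col) for col in items)
    total_var = variance([sum(row) for row in item_scores])
    return k / (k - 1) * (1 - item_vars / total_var)
```

Perfectly covarying items give alpha = 1, while items that cancel each other out drive alpha toward 0, which is why values near .7 or above are usually read as acceptable consistency.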
Abstract:
Several mechanisms have been proposed to explain the action of enzymes at the atomic level. Among them, the recent proposals involving short hydrogen bonds as a step in catalysis by Gerlt and Gassman [1] and proton transfer through low barrier hydrogen bonds (LBHBs) [2, 3] have attracted attention. There are several limitations to experimentally testing such hypotheses. Recent developments in computational methods facilitate the study of active site-ligand complexes to high levels of accuracy. Our previous studies, which involved the docking of the dinucleotide substrate UpA to the active site of RNase A [4, 5], enabled us to obtain a realistic model of the ligand-bound active site of RNase A. From these studies, based on empirical potential functions, we were able to obtain the molecular dynamics averaged coordinates of RNase A bound to the ligand UpA. A quantum mechanical study is required to investigate the catalytic process, which involves the cleavage and formation of covalent bonds. In the present study, we have investigated the strengths of some of the hydrogen bonds between the active site residues of RNase A and UpA at the ab initio quantum chemical level, using the molecular dynamics averaged coordinates as the starting point. The 49-atom system and other model systems were optimized at the 3-21G level and the energies of the optimized systems were obtained at the 6-31G* level. The results clearly indicate the strengthening of hydrogen bonds between neutral residues due to the presence of charged species at appropriate positions. Such a strengthening manifests itself in the form of short hydrogen bonds and a low barrier for proton transfer. In the present study, the proton transfer between the 2'-OH of ribose (from the substrate) and the imidazole group of H12 of RNase A is influenced by K41, which plays a crucial role in strengthening the neutral hydrogen bond, reducing the barrier for proton transfer.
Abstract:
This paper presents a motion control system for guidance of an underactuated Unmanned Underwater Vehicle (UUV) on a helical trajectory. The control strategy is developed using Port-Hamiltonian theory and interconnection and damping assignment passivity-based control. Using energy routing, the trajectory of a virtual fully actuated plant is guided onto a vector field. A tracking controller is then used that commands the underactuated plant to follow the velocity of the virtual plant. An integral controller is inserted between the two control layers, which adds robustness and disturbance rejection to the design.
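The Port-Hamiltonian and IDA-PBC design itself is beyond a short sketch. The fragment below only illustrates the layered structure described above, a virtual reference moving along a helix with a velocity-tracking layer following it, using a plain proportional law plus velocity feedforward as a stand-in for the paper's actual controllers; all gains and trajectory parameters are invented.

```python
import math

def helix_reference(t, radius=2.0, pitch=0.5, omega=0.3):
    """Position and velocity of a reference point moving along a helix."""
    x = radius * math.cos(omega * t)
    y = radius * math.sin(omega * t)
    z = pitch * omega * t
    vx = -radius * omega * math.sin(omega * t)
    vy = radius * omega * math.cos(omega * t)
    vz = pitch * omega
    return (x, y, z), (vx, vy, vz)

def track(pos, ref_pos, ref_vel, kp=1.0):
    """Commanded velocity: reference velocity plus proportional position correction."""
    return tuple(v + kp * (r - p) for p, r, v in zip(pos, ref_pos, ref_vel))

def simulate(t_end=20.0, dt=0.01):
    """Integrate the commanded velocity (Euler) and return the final tracking error."""
    pos = (2.0, 0.5, 0.0)              # start off the helix
    t = 0.0
    while t < t_end:
        ref_pos, ref_vel = helix_reference(t)
        vel = track(pos, ref_pos, ref_vel)
        pos = tuple(p + v * dt for p, v in zip(pos, vel))
        t += dt
    return math.dist(pos, helix_reference(t)[0])
```

With the feedforward term, the position error decays exponentially toward the helix regardless of the starting offset.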
Abstract:
Solid materials can exist in different physical structures without a change in chemical composition. This phenomenon, known as polymorphism, has several implications for pharmaceutical development and manufacturing. Various solid forms of a drug can possess different physical and chemical properties, which may affect processing characteristics and stability, as well as the performance of a drug in the human body. Therefore, knowledge and control of the solid forms are fundamental to maintaining safety and high quality of pharmaceuticals. During manufacture, harsh conditions can give rise to unexpected solid phase transformations and therefore change the behavior of the drug. Traditionally, pharmaceutical production has relied on time-consuming off-line analysis of production batches and finished products. This has led to poor understanding of processes and drug products. Therefore, new powerful methods that enable real-time monitoring of pharmaceuticals during manufacturing processes are greatly needed. The aim of this thesis was to apply spectroscopic techniques to solid phase analysis within different stages of drug development and manufacturing, and thus, provide a molecular level insight into the behavior of active pharmaceutical ingredients (APIs) during processing. Applications to polymorph screening and different unit operations were developed and studied. A new approach to dissolution testing, which involves simultaneous measurement of drug concentration in the dissolution medium and in-situ solid phase analysis of the dissolving sample, was introduced and studied. Solid phase analysis was successfully performed during different stages, enabling a molecular level insight into the occurring phenomena. Near-infrared (NIR) spectroscopy was utilized in screening of polymorphs and processing-induced transformations (PITs). Polymorph screening was also studied with NIR and Raman spectroscopy in tandem.
Quantitative solid phase analysis during fluidized bed drying was performed with in-line NIR and Raman spectroscopy and partial least squares (PLS) regression, and different dehydration mechanisms were studied using in-situ spectroscopy and partial least squares discriminant analysis (PLS-DA). In-situ solid phase analysis with Raman spectroscopy during dissolution testing enabled analysis of dissolution as a whole, and provided a scientific explanation for changes in the dissolution rate. It was concluded that the methods applied and studied provide better process understanding and knowledge of the drug products, and therefore, a way to achieve better quality.
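The PLS models above were built from in-line NIR and Raman spectra with chemometrics tooling. As a minimal illustration of the regression step itself, here is a one-component PLS (NIPALS-style) in plain Python, run on made-up "spectra" rather than real spectroscopic data; production models would use several latent variables and proper preprocessing.

```python
def pls1_fit(X, y):
    """One-component PLS regression (NIPALS-style) in pure Python.

    X: list of spectra (one row per sample, one column per wavelength),
    y: list of reference values (e.g. solid-form fraction per sample).
    """
    n, m = len(X), len(X[0])
    x_mean = [sum(row[j] for row in X) / n for j in range(m)]
    y_mean = sum(y) / n
    Xc = [[row[j] - x_mean[j] for j in range(m)] for row in X]   # center X
    yc = [v - y_mean for v in y]                                  # center y
    # Weight vector: covariance of each wavelength channel with y, normalised.
    w = [sum(Xc[i][j] * yc[i] for i in range(n)) for j in range(m)]
    norm = sum(v * v for v in w) ** 0.5
    w = [v / norm for v in w]
    # Scores on the latent variable, and the regression coefficient on them.
    t = [sum(Xc[i][j] * w[j] for j in range(m)) for i in range(n)]
    b = sum(t[i] * yc[i] for i in range(n)) / sum(ti * ti for ti in t)
    return x_mean, y_mean, w, b

def pls1_predict(model, spectrum):
    """Project a new spectrum onto the latent variable and predict y."""
    x_mean, y_mean, w, b = model
    t = sum((spectrum[j] - x_mean[j]) * w[j] for j in range(len(w)))
    return y_mean + b * t
```

The weight vector concentrates the prediction on the wavelengths that covary with the property of interest, which is why PLS copes with highly collinear spectral channels where ordinary least squares would not.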
Abstract:
Background Based on conservative estimates of registrants with the National Diabetes Supply Scheme, we will soon pass 1.1 million Australians affected by all types of diabetes. The diabetes complications of foot ulceration and amputation are costly to all. These costs can be reduced with appropriate prevention strategies, starting with identifying people at risk through primary care diabetic foot screening. Yet levels of diabetic foot screening in Australia are difficult to quantify. This presentation aims to report on foot screening rates as recorded in existing academic literature, national health surveys and national database reports. Methods Literature searches included diabetic foot screening that occurred in the primary care setting for populations over 2000 people from 2002 to 2014. Searches were performed using Medline and CINAHL as well as internet searches of Organisation for Economic Co-operation and Development (OECD) countries' health databases. The focus is on type 1 and type 2 diabetes in adults, and not gestational diabetes or children. The two primary outcome measures were foot-screening rates as a percentage of the adult diabetic population and major lower limb amputation incidence rates from standardised OECD data. Results The most recent and accurate level for Australian population review was in the AUSDIAB (Australian Diabetes and Lifestyle Survey) from 2004. This survey reported screening in primary care to be as low as 50%. Countries such as the United Kingdom and United States of America have much higher reported rates of foot screening (67-86%) recorded using national databases and web-based initiatives that involve patients and clinicians. By comparison, major amputation rates for Australia were similar to the United Kingdom at 6.5 versus 5.1 per 100,000 population, but dissimilar to the United States of America at 17 per 100,000 population. Conclusions Australian rates of diabetic foot screening in primary care centres are ambiguous.
There is no direct relationship between foot screening levels in a primary care environment and major lower limb amputation, based on national health surveys and OECD data. Uptake of national registers, incentives and web-based systems improves levels of diabetic foot assessment, which are the first steps to a healthier diabetic population.
Abstract:
We present low-frequency electrical noise measurements in graphene-based field effect transistors. For single layer graphene (SLG), the resistance fluctuations are governed by the screening of the charge impurities by the mobile charges. However, in the case of bilayer graphene (BLG), the electrical noise is strongly connected to its band structure and, unlike single layer graphene, displays a minimum when the gap between the conduction and valence band is zero. Using double-gated BLG devices we have tuned the zero gap and charge neutrality points independently, which offers a versatile mechanism to investigate the low-energy band structure, charge localization and screening properties of bilayer graphene.
Abstract:
Objective To evaluate health practitioners’ confidence and knowledge of alcohol screening, brief intervention and referral after training in a culturally adapted intervention on alcohol misuse and well-being issues for trauma patients. Design Mixed methods, involving semi-structured interviews at baseline and a post-workshop questionnaire. Setting Targeted acute care within a remote area major tertiary referral hospital. Participants Ten key informants and 69 questionnaire respondents from relevant community services and hospital-based health care professionals. Intervention Screening and brief intervention training workshops and resources for 59 hospital staff. Main outcome measures Self-reported staff knowledge of alcohol screening, brief intervention and referral, and satisfaction with workshop content and format. Results After training, 44% of participants reported being motivated to implement alcohol screening and intervention. Satisfaction with training was high, and most participants reported that their knowledge of screening and brief intervention was improved. Conclusion Targeted educational interventions can improve the knowledge and confidence of inpatient staff who manage patients at high risk of alcohol use disorder. Further research is needed to determine the duration of the effect and influence on practice behaviour. Ongoing integrated training, linked with systemic support and established quality improvement processes, is required to facilitate sustained change and widespread dissemination.
Abstract:
The aim of the study was to analyze and facilitate collaborative design in a virtual learning environment (VLE). Discussions of virtual design in design education have typically focused on technological or communication issues, not on pedagogical issues. Yet in order to facilitate collaborative design, it is also necessary to address the pedagogical issues related to the virtual design process. In this study, the progressive inquiry model of collaborative designing was used to give a structural level of facilitation to students working in the VLE. According to this model, all aspects of inquiry, such as creating the design context, constructing a design idea, evaluating the idea, and searching for new information, can be shared in a design community. The study consists of three design projects: 1) designing clothes for premature babies, 2) designing conference bags for an international conference, and 3) designing tactile books for visually impaired children. These design projects constituted a continuum of design experiments, each of which highlighted certain perspectives on collaborative designing. The design experiments were organized so that the participants worked in design teams, both face-to-face and virtually. The first design experiment focused on peer collaboration among textile teacher students in the VLE. The second design experiment took into consideration end-users’ needs by using a participatory design approach. The third design experiment intensified computer-supported collaboration between students and domain experts. The virtual learning environments in these design experiments were designed to support knowledge-building pedagogy and progressive inquiry learning. These environments enabled a detailed recording of all computer-mediated interactions and data related to virtual designing. The data analysis was based on qualitative content analysis of design statements in the VLE.
This study indicated four crucial issues concerning collaborative design in the VLE in craft and design education. Firstly, using the collaborative design process in craft and design education gives rise to special challenges of building learning communities, creating appropriate design tasks for them, and providing tools for collaborative activities. Secondly, the progressive inquiry model of collaborative designing can be used as a scaffold for design thinking and for reflection on the design process. Thirdly, participation and distributed expertise can be facilitated by considering the key stakeholders who are related to the design task or design context, and getting them to participate in virtual designing. Fourthly, in the collaborative design process, it is important that team members create and improve visual and technical ideas together, not just agree or disagree about proposed ideas. Therefore, viewing the VLE as a medium for collaborative construction of the design objects appears crucial in order to understand and facilitate the complex processes in collaborative designing.