254 results for Core-specific Lectin
Abstract:
Tobacco plants were transformed with a chimeric transgene comprising sequences encoding β-glucuronidase (GUS) and the satellite RNA (satRNA) of cereal yellow dwarf luteovirus. When transgenic plants were infected with potato leafroll luteovirus (PLRV), which replicated the transgene-derived satRNA to a high level, the satellite sequence of the GUS:Sat transgene became densely methylated. Within the satellite region, all 86 cytosines in the upper strand and 73 of the 75 cytosines in the lower strand were either partially or fully methylated. In contrast, very low levels of DNA methylation were detected in the satellite sequence of the transgene in uninfected plants and in the flanking nonsatellite sequences in both infected and uninfected plants. Substantial amounts of truncated GUS:Sat RNA accumulated in the satRNA-replicating plants, and most of the molecules terminated at nucleotides within the first 60 bp of the satellite sequence. Whereas this RNA truncation was associated with high levels of satRNA replication, it appeared to be independent of the levels of DNA methylation in the satellite sequence, suggesting that it is not caused by methylation. All the sequenced GUS:Sat DNA molecules were hypermethylated in plants with replicating satRNA despite the phloem restriction of the helper PLRV. Also, small, sense and antisense ∼22 nt RNAs, derived from the satRNA, were associated with the replicating satellite. These results suggest that the sequence-specific DNA methylation spread into cells in which no satRNA replication occurred and that this was mediated by the spread of unamplified satRNA and/or its associated 22 nt RNA molecules.
Abstract:
Potato leafroll virus (PLRV) is a positive-strand RNA virus that generates subgenomic RNAs (sgRNAs) for expression of its 3'-proximal genes. Small RNA (sRNA) sequencing and mapping of the PLRV-derived sRNAs revealed coverage of the entire viral genome with the exception of four distinctive gaps. Remarkably, these gaps mapped to areas of the PLRV genome with extensive secondary structure, such as the internal ribosome entry site and the 5' transcriptional start sites of sgRNA1 and sgRNA2. The last gap mapped to ~500 nt from the 3' terminus of the PLRV genome and suggested the possible presence of an additional sgRNA for PLRV. Quantitative real-time PCR and northern blot analysis confirmed the expression of this sgRNA3, and subsequent analyses placed its 5' transcriptional start site at position 5347 of the PLRV genome. A regulatory role is proposed for the PLRV sgRNA3, as it encodes an RNA-binding protein with specificity for the 5' end of the PLRV genomic RNA.
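A minimal sketch of the gap-finding step implied above, assuming mapped sRNA reads are available as (start, length) coordinates on the viral genome; the function name, read format, and gap threshold are illustrative assumptions, not the paper's pipeline:

```python
# Sketch: locate zero-coverage gaps in small-RNA coverage of a viral genome.
# Read format, genome length, and minimum gap width are illustrative assumptions.

def coverage_gaps(reads, genome_length, min_gap=20):
    """Return (start, end) intervals with no sRNA coverage.

    reads: iterable of (start, length) tuples in 0-based genome coordinates.
    """
    depth = [0] * genome_length
    for start, length in reads:
        for pos in range(start, min(start + length, genome_length)):
            depth[pos] += 1

    gaps, gap_start = [], None
    for pos, d in enumerate(depth):
        if d == 0 and gap_start is None:
            gap_start = pos                      # entering an uncovered stretch
        elif d > 0 and gap_start is not None:
            if pos - gap_start >= min_gap:       # keep only substantial gaps
                gaps.append((gap_start, pos))
            gap_start = None
    if gap_start is not None and genome_length - gap_start >= min_gap:
        gaps.append((gap_start, genome_length))  # a gap running to the 3' terminus
    return gaps

# Toy run: two 22-nt reads on a 100-nt genome leave three uncovered stretches.
print(coverage_gaps([(10, 22), (60, 22)], 100, min_gap=5))
# -> [(0, 10), (32, 60), (82, 100)]
```

On real data the read coordinates would come from an alignment (e.g. a SAM/BAM file), and gaps that persist across libraries would be the candidates worth inspecting for features such as sgRNA transcriptional start sites.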
Abstract:
The expression patterns of GUS fusion constructs driven by the Agrobacterium rhizogenes RolC and the maize Sh (Shrunken: sucrose synthase-1) promoters were examined in transgenic potatoes (cv. Atlantic). RolC drove high-level gene expression in phloem tissue, bundle sheath cells and vascular parenchyma, but not in xylem or non-vascular tissues. Sh expression was exclusively confined to phloem tissue. Potato leafroll luteovirus (PLRV) replicates only in phloem tissues, and we show that when RolC is used to drive expression of the PLRV coat protein gene, virus-resistant lines can be obtained. In contrast, no significant resistance was observed when the Sh promoter was used.
Abstract:
Business Process Management (BPM) is accepted globally as an organizational approach to enhance productivity and drive cost efficiencies. Studies confirm a shortage of BPM-skilled professionals, with limited opportunities to develop the required BPM expertise. This study investigates this gap, starting from a critical analysis of BPM courses offered by Australian universities and training institutions. These courses were analyzed and mapped against a leading BPM capability framework to determine how well current BPM education and training offerings in Australia address the core capabilities required by BPM professionals globally. To determine the BPM skill sets sought by industry, online recruitment advertisements were collated, analyzed, and mapped against the same framework. The outcomes provide a detailed overview of the alignment between available BPM education/training and industry demand. These insights can help BPM professionals and their employers build awareness of the capabilities required for a BPM-mature organization. Universities and other training institutions will benefit from these results by understanding where demand lies, where the gaps are, and what other BPM education providers are supplying. This structured comparison method could continue to provide common ground for future discussion across university-industry boundaries and for the continuous alignment of their respective practices.
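A minimal sketch of the mapping step described above, treating course descriptions and job advertisements as plain text and the capability framework as a dictionary of capability areas to indicative keywords; the areas and keywords shown are illustrative placeholders, not the actual framework used in the study:

```python
# Sketch: map free-text course or job-ad descriptions onto BPM capability areas
# by keyword counting. Categories and keywords are illustrative placeholders.
from collections import Counter

CAPABILITY_KEYWORDS = {
    "Strategic Alignment": ["strategy", "alignment", "governance"],
    "Methods": ["bpmn", "process modelling", "lean", "six sigma"],
    "Information Technology": ["workflow", "automation", "process mining"],
    "People": ["change management", "training", "stakeholder"],
}

def map_to_framework(text):
    """Count keyword hits per capability area in one description."""
    text = text.lower()
    return Counter({
        area: sum(text.count(kw) for kw in kws)
        for area, kws in CAPABILITY_KEYWORDS.items()
    })

def aggregate(descriptions):
    """Sum capability coverage across a corpus of descriptions."""
    total = Counter()
    for desc in descriptions:
        total += map_to_framework(desc)
    return total

ads = ["Seeking BPM analyst with BPMN and process modelling skills",
       "Process improvement role: lean, six sigma, change management"]
print(aggregate(ads).most_common())
```

Running the same aggregation once over course outlines and once over job advertisements yields two capability profiles whose differences indicate the supply/demand gaps the study reports.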
Abstract:
Asking why is an important foundation of inquiry and fundamental to the development of reasoning skills and learning. Despite this, and despite the relentless and often disruptive nature of innovations in information and communications technology (ICT), sophisticated tools that directly support this basic act of learning appear to be undeveloped, not yet recognized, or in the very early stages of development. Why is this so? To this question, there is no single satisfactory answer; instead, numerous plausible explanations and related questions arise. After learning something, however, explaining why can be revealing of a person’s understanding (or lack of it). What then differentiates explanation from information; and, explanatory from descriptive content? What ICT scaffolding might support inquiry instigated by why-questioning? What is the role of reflective practice in inquiry-based learning? These and other questions have emerged from this investigation and underscore that why-questions often propagate further questions and are a catalyst for cognitive engagement and dialogue. This paper reports on a multi-disciplinary, theoretical investigation that informs the broad discourse on e-learning and points to a specific frontier for design and development of e-learning tools. Probing why reveals that versatile and ambiguous semantics present the core challenge – asking, learning, knowing, understanding, and explaining why.
Abstract:
Where a secured lender elects to appoint a receiver and manager, the appointment document standardly provides for the receiver and manager to act as the agent of the debtor. This article considers the significance of this agency in the context of three specific issues that have the potential to arise in the receivership of a corporate borrower across all Australian jurisdictions.
Abstract:
Current governance challenges facing the global games industry are heavily dominated by online games. Whilst much academic and industry attention has been afforded to Virtual Worlds, the more pressing contemporary challenges may arise in casual games, especially when found on social networks. As authorities are faced with an increasing volume of disputes between participants and platform operators, the likelihood of external regulation increases, and the effect that such regulation would have on the industry – both internationally and within specific regions – is unclear. Kelly (2010) argues that “when you strip away the graphics of these [social] games, what you are left with is simply a button [...] You push it and then the game returns a value of either Win or Lose”. He notes that while “every game developer wants their game to be played, preferably addictively, because it’s so awesome”, these mechanics lead not to “addiction of engagement through awesomeness” but “the addiction of compulsiveness”, surmising that “the reality is that they’ve actually sort-of kind-of half-intentionally built a virtual slot machine industry”. If such core elements of social game design are open to question, this gives cause also to question the real-money options offered to circumvent them. With players able to purchase virtual currency and speed the completion of tasks, the money invested by the 20% purchasing in-game benefits (Zainwinger, 2012) may well be the result of compulsion. The decision by the Japanese Consumer Affairs Agency to investigate the ‘Kompu Gacha’ mechanic (in which players are rewarded for completing a set of items obtained through purchasing virtual goods such as mystery boxes), and the resultant verdict that such mechanics should be regulated through gambling legislation, demonstrates that politicians are beginning to look at the mechanics deployed in these environments. Purewal (2012) states that “there’s a reasonable argument that complete gacha would be regulated under gambling law under at least some (if not most) Western jurisdictions”. This paper explores the governance challenges within these games and platforms, their role in the global industry, and current practice amongst developers in Australia and the United States to address such challenges.
Abstract:
Big Data presents many challenges related to volume, whether one is interested in studying past datasets or, even more problematically, attempting to work with live streams of data. The most obvious challenge, in a ‘noisy’ environment such as contemporary social media, is to collect the pertinent information: be that information for a specific study, tweets which can inform emergency services or other responders to an ongoing crisis, or signals which give an advantage to those involved in prediction markets. Often, such a process is iterative, with keywords and hashtags changing over time, and both collection and analytic methodologies need to be continually adapted in response to this changing information. While many of the datasets collected and analyzed are pre-formed, that is, built around a particular keyword, hashtag, or set of authors, they still contain a large volume of information, much of which is unnecessary for the current purpose and/or potentially useful for future projects. Accordingly, this panel considers methods for separating and combining data to optimize big data research and to report findings to stakeholders. The first paper considers possible coding mechanisms for incoming tweets during a crisis, taking a large stream of incoming tweets and selecting which of those need to be immediately placed in front of responders, for manual filtering and possible action. The paper suggests two solutions for this: content analysis and user profiling (a minimal scoring sketch follows this abstract). In the former case, aspects of the tweet are assigned a score to assess its likely relationship to the topic at hand and the urgency of the information, whilst the latter attempts to identify those users who are either serving as amplifiers of information or are known as authoritative sources. Through these techniques, the information contained in a large dataset can be filtered down to match the expected capacity of emergency responders, and knowledge of the core keywords or hashtags relating to the current event is constantly refined for future data collection. The second paper is also concerned with identifying significant tweets, but in this case tweets relevant to a particular prediction market: tennis betting. As increasing numbers of professional sportsmen and women create Twitter accounts to communicate with their fans, information is being shared regarding injuries, form, and emotions which has the potential to impact on future results. As has already been demonstrated with leading US sports, such information is extremely valuable. Tennis, like American Football (NFL) and Baseball (MLB), has paid subscription services which manually filter incoming news sources, including tweets, for information valuable to gamblers, gambling operators, and fantasy sports players. However, whilst such services remain niche operations, much of the value of the information is lost by the time it reaches one of them. The paper thus considers how information could be filtered from Twitter user lists and hashtag or keyword monitoring, assessing the value of the source, the information, and the prediction markets to which it may relate. The third paper examines methods for collecting Twitter data and following changes in an ongoing, dynamic social movement, such as the Occupy Wall Street movement. It involves the development of technical infrastructure to collect tweets and make them available for exploration and analysis.
A strategy to respond to changes in the social movement is also required, or the resulting tweets will only reflect the discussions and strategies the movement used at the time the keyword list was created; in this way, keyword creation is part strategy and part art. In this paper we describe strategies for the creation of a social media archive, specifically tweets related to the Occupy Wall Street movement, and methods for continuing to adapt data collection strategies as the movement’s presence on Twitter changes over time. We also discuss the opportunities and methods for extracting smaller slices of data from an archive of social media data to support a multitude of research projects in multiple fields of study. The common theme amongst these papers is that of constructing a dataset, filtering it for a specific purpose, and then using the resulting information to aid future data collection. The intention is that, through the papers presented and the subsequent discussion, the panel will inform the wider research community not only about the objectives and limitations of data collection, live analytics, and filtering, but also about current and in-development methodologies that could be adopted by those working with such datasets, and how such approaches could be customized depending on the project stakeholders.
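A minimal sketch of the content-scoring and capacity-filtering idea from the first paper, assuming tweets arrive as simple dicts; the keywords, weights, handles, and capacity value are illustrative assumptions, not values from the panel:

```python
# Sketch: score incoming tweets for topical relevance and urgency, boost known
# authoritative authors, and keep only as many as responders can handle.
# Keywords, weights, and the authority list are illustrative assumptions.

TOPIC_KEYWORDS = {"flood": 2, "evacuate": 3, "bridge": 1}
URGENCY_KEYWORDS = {"now": 2, "urgent": 3, "help": 2}
AUTHORITATIVE_USERS = {"qps_media", "bom_au"}  # hypothetical handles

def score_tweet(tweet):
    """Combine topical, urgency, and author-authority signals into one score."""
    text = tweet["text"].lower()
    topic = sum(w for kw, w in TOPIC_KEYWORDS.items() if kw in text)
    urgency = sum(w for kw, w in URGENCY_KEYWORDS.items() if kw in text)
    authority = 3 if tweet["user"] in AUTHORITATIVE_USERS else 0
    return topic + urgency + authority

def triage(tweets, capacity=50):
    """Return the highest-scoring tweets, capped at responder capacity."""
    ranked = sorted(tweets, key=score_tweet, reverse=True)
    return [t for t in ranked if score_tweet(t) > 0][:capacity]

incoming = [
    {"user": "resident1", "text": "Bridge is flooding, evacuate NOW"},
    {"user": "resident2", "text": "Nice weather today"},
]
print(triage(incoming, capacity=10))
```

In the iterative workflow the abstract describes, the keyword dictionaries would themselves be refined as high-scoring tweets reveal new terms for the ongoing event.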
Abstract:
Introduction: The clinically known importance of patient sex as a major risk factor for compromised bone healing is poorly reflected in animal models. Consequently, the underlying cellular mechanisms remain elusive. Because mesenchymal stem cells (MSCs) are postulated to regulate tissue regeneration and give rise to essential differentiated cell types, they may contribute to sex-specific differences in bone healing outcomes. Methods: We investigated sex-specific variations in bone healing and associated differences in MSC populations. A 1.5 mm osteotomy gap in the femora of 8 male and 8 female 12-month-old Sprague-Dawley rats was stabilized by an external fixator. Healing was analyzed in terms of biomechanical testing, bridging and callus size over time (radiography at 2, 4, and 6 weeks after surgery), and callus volume and geometry by μCT at final follow-up. MSCs were obtained from bone marrow samples of an age-matched group of 12 animals (6 per gender) and analyzed for numbers of colony-forming units (CFUs) and their capacity to differentiate and proliferate. The proportion of senescent cells was determined by β-galactosidase staining. Results: Sex-specific differences were indicated by a compromised mechanical competence of the callus in females compared with males (maximum torque at failure, p = 0.028). Throughout the follow-up, the cross-sectional area of callus relative to bone was reduced in females (p ≤ 0.01), and the bridging of callus was delayed (p(2 weeks) = 0.041). μCT revealed a reduced callus size (p = 0.003), mineralization (p = 0.003) and polar moment of inertia (p = 0.003) in female animals. The female bone marrow contained significantly fewer MSCs, represented by low CFU numbers in both femora and tibiae (p(femur) = 0.017, p(tibia) = 0.010). Functional characteristics of male and female MSCs were similar. Conclusion: Biomechanically compromised and radiographically delayed bone formation were distinctive in female rats. These differences were concomitant with a reduced number of MSCs, which may be causative for the suboptimal bone healing.
Abstract:
In this paper we focus specifically on explaining variation in core human values, and suggest that individual differences in values can be partially explained by personality traits and the perceived ability to manage emotions in the self and others (i.e. trait emotional intelligence). A sample of 209 university students was used to test hypotheses regarding several proposed direct and indirect relationships between personality traits, trait emotional intelligence and values. Consistent with the hypotheses, Harm Avoidance and Novelty Seeking were found to directly predict Hedonism, Conformity, and Stimulation. Harm Avoidance was also found to indirectly predict these values through the mediating effects of key subscales of trait emotional intelligence. Novelty Seeking was not found to be an indirect predictor of values. Results have implications for our understanding of the relationship between personality, trait emotional intelligence and values, and suggest a common basis in terms of approach and avoidance pathways.
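A minimal sketch of the kind of indirect-effect test implied above (a temperament trait predicting a value through a trait-emotional-intelligence subscale), using the product-of-coefficients approach on simulated data; the variable names and simulated effect sizes are illustrative assumptions, not the study's data or method:

```python
# Sketch: indirect (mediated) effect via the product-of-coefficients approach.
# Trait -> mediator -> value, with all data simulated purely for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 209  # sample size matching the abstract

harm_avoidance = rng.normal(size=n)                                       # predictor
ei_subscale = 0.4 * harm_avoidance + rng.normal(size=n)                   # path a
hedonism = 0.5 * ei_subscale + 0.1 * harm_avoidance + rng.normal(size=n)  # paths b, c'

def slopes(y, *predictors):
    """OLS coefficients of y on the predictors (intercept included)."""
    X = np.column_stack((np.ones(len(y)),) + predictors)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

a = slopes(ei_subscale, harm_avoidance)[0]                  # trait -> mediator
b, c_prime = slopes(hedonism, ei_subscale, harm_avoidance)  # mediator -> value, direct
print(f"indirect effect a*b = {a * b:.3f}, direct effect c' = {c_prime:.3f}")
```

In practice the indirect effect a*b would be tested with bootstrapped confidence intervals rather than read off the point estimates alone.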
Abstract:
High-resolution, high-contrast, three-dimensional images of live cell and tissue architecture can be obtained using second harmonic generation (SHG), which comprises non-absorptive frequency changes in an excitation laser line. SHG does not require any exogenous antibody or fluorophore labeling, and can generate images of several key endogenous biomolecules from unstained sections, in a wide variety of species and from different types of processed tissue. Here, we examined normal control human skin sections and human burn scar tissues using SHG on a multi-photon microscope (MPM). Examination and comparison of normal human skin and burn scar tissue demonstrated a clear arrangement of fibers in the dermis, similar to dermal collagen fiber signals. Fluorescence staining confirmed that the MPM-SHG signal colocalized with antibody staining for dermal collagen type I, but not with fibronectin or elastin. Furthermore, we were able to detect the collagen MPM-SHG signal in human frozen sections as well as in unstained paraffin-embedded tissue sections, which were then compared with hematoxylin and eosin staining of the identical sections. This same approach was also successful in localizing collagen in porcine and ovine skin samples, and may be particularly important when species-specific antibodies are not available. Collectively, our results demonstrate that MPM-SHG detection is a useful tool for high-resolution examination of collagen architecture in both normal and wounded human, porcine, and ovine dermal tissue.
Abstract:
It is widely acknowledged that effective asset management requires an interdisciplinary approach, in which synergies should exist between traditional disciplines such as accounting, engineering, finance, humanities, logistics, and information systems technologies. Asset management is also an important yet complex business practice. Business process modelling is proposed as an approach to manage the complexity of asset management through the modelling of asset management processes. A sound foundation for the systematic application and analysis of business process modelling in asset management is, however, yet to be developed. Fundamentally, a business process consists of activities (termed functions), events/states, and control flow logic. As both events/states and control flow logic are somewhat dependent on the functions themselves, it is a logical first step to identify the functions within a process. This research addresses the current gap in knowledge by developing a method to identify functions common to various industry types (termed core functions). This lays the foundation for extracting such functions, so as to identify both commonalities and variation points in asset management processes. The method combines manual text mining with a taxonomy-based approach. An example is presented.
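A minimal sketch of the commonality step such a method implies: extract candidate function terms from per-industry process descriptions against a taxonomy, then intersect them to surface core functions. The taxonomy terms and example texts are illustrative assumptions, and the naive keyword match stands in for the paper's manual text mining:

```python
# Sketch: identify "core functions" as taxonomy terms shared across industries.
# The keyword match below is a stand-in for manual text mining.

FUNCTION_TAXONOMY = {"inspect", "maintain", "repair", "replace", "procure", "dispose"}

def extract_functions(process_text):
    """Return taxonomy terms appearing in one industry's process description."""
    return FUNCTION_TAXONOMY & set(process_text.lower().split())

industry_docs = {
    "rail": "inspect track, maintain rolling stock, procure spares, replace sleepers",
    "water": "inspect pipes, maintain pumps, repair leaks, procure chemicals",
    "power": "inspect lines, maintain transformers, procure parts, dispose waste",
}

per_industry = {ind: extract_functions(doc) for ind, doc in industry_docs.items()}
core = set.intersection(*per_industry.values())       # common to all industries
variation = set.union(*per_industry.values()) - core  # variation points
print("core functions:", sorted(core))        # here: inspect, maintain, procure
print("variation points:", sorted(variation))
```

The intersection yields the core functions, while terms used by only some industries mark the variation points the abstract refers to.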
Abstract:
Agent-based modelling (ABM), like other modelling techniques, is used to answer specific questions about real-world systems that could otherwise be expensive or impractical to investigate. Its recent gain in popularity can be attributed to some degree to its capacity to use information at a fine level of detail of the system, both geographically and temporally, and to generate information at a higher level, where emerging patterns can be observed. The technique is data-intensive, as explicit data at a fine level of detail is used, and computer-intensive, as many interactions between agents, which can learn and have a goal, are required. With the growing availability of data and the increase in computer power, these concerns are, however, fading. Nonetheless, being able to update or extend the model as more information becomes available can become problematic, because of the tight coupling of the agents and their dependence on the data, especially when modelling very large systems. One large system to which ABM is currently applied is electricity distribution, where thousands of agents representing the network and the consumers’ behaviours interact with one another. A framework that aims at answering a range of questions regarding the potential evolution of the grid has been developed and is presented here. It uses agent-based modelling to represent the engineering infrastructure of the distribution network and has been built with flexibility and extensibility in mind. What distinguishes the method presented here from usual ABMs is that this ABM has been developed in a compositional manner. This encompasses not only the software tool, whose core is named MODAM (MODular Agent-based Model), but also the model itself. Using such an approach enables the model to be extended as more information becomes available, or modified as the electricity system evolves, leading to an adaptable model. Two well-known modularity principles in the software engineering domain are information hiding and separation of concerns. These principles were used to develop the agent-based model on top of OSGi and Eclipse plugins, which have good support for modularity. Information regarding the model entities was separated into a) assets, which describe the entities’ physical characteristics, and b) agents, which describe their behaviour according to their goals and previous learning experiences (a minimal sketch of this separation follows this abstract). This approach diverges from the traditional approach, where both aspects are often conflated. It has many advantages in terms of reusability of one or the other aspect for different purposes, as well as composability when building simulations. For example, the way an asset is used on a network can vary greatly while its physical characteristics stay the same – this is the case for two identical battery systems whose usage will vary depending on the purpose of their installation. While any battery can be described by its physical properties (e.g. capacity, lifetime, and depth of discharge), its behaviour will vary depending on who is using it and what their aim is. The model is populated using data describing both aspects (physical characteristics and behaviour) and can be updated as required depending on the simulation to be run. For example, data can be used to describe the environment to which the agents respond (e.g. weather for solar panels) or to describe the assets and their relations to one another (e.g. the network assets).
Finally, when running a simulation, MODAM calls on its module manager, which coordinates the different plugins, automates the creation of assets and agents using factories, and schedules their execution, which can proceed sequentially or in parallel for faster runs. Building agent-based models in this way has proven fast when adding new complex behaviours, as well as new types of assets. Simulations have been run to understand the potential impact of changes on the network in terms of assets (e.g. installation of decentralised generators) or behaviours (e.g. responses to different management aims). While this platform has been developed within the context of a project focussing on the electricity domain, the core of the software, MODAM, can be extended to other domains, such as transport, which is part of future work with the addition of electric vehicles.
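A minimal sketch of the asset/agent separation and factory composition described above, in Python rather than the Java/OSGi stack the abstract mentions; all class, function, and parameter names are illustrative assumptions, not MODAM's actual API:

```python
# Sketch: separate physical characteristics (Asset) from behaviour (Agent),
# with a factory composing them per simulation. Names are illustrative; this
# is not MODAM's actual API.
from dataclasses import dataclass

@dataclass
class BatteryAsset:
    """Physical characteristics only: identical hardware, any behaviour."""
    capacity_kwh: float
    depth_of_discharge: float
    charge_kwh: float = 0.0

class Agent:
    """Behaviour only: decides what to do with an asset at each step."""
    def __init__(self, asset):
        self.asset = asset
    def step(self, environment):
        raise NotImplementedError

class PeakShavingAgent(Agent):
    """Discharges the battery when demand is high, charges otherwise."""
    def step(self, environment):
        usable = self.asset.capacity_kwh * self.asset.depth_of_discharge
        if environment["demand_kw"] > 5.0 and self.asset.charge_kwh > 0:
            self.asset.charge_kwh = max(0.0, self.asset.charge_kwh - 1.0)
        elif self.asset.charge_kwh < usable:
            self.asset.charge_kwh += 1.0

def agent_factory(kind, asset):
    """Compose a behaviour with an asset; new behaviours plug in here."""
    return {"peak_shaving": PeakShavingAgent}[kind](asset)

# Two identical batteries that could be driven by different behaviours.
agents = [agent_factory("peak_shaving", BatteryAsset(10.0, 0.8)) for _ in range(2)]
for demand in [2.0, 7.0, 8.0]:           # sequential scheduling of one run
    for agent in agents:
        agent.step({"demand_kw": demand})
print(agents[0].asset.charge_kwh)
```

Keeping the asset description free of behaviour is what lets the same physical battery data be reused under a different agent when the management aim changes.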
Abstract:
Prescribing errors remain a significant cause of patient harm. Safe prescribing is not just about writing a prescription, but involves many cognitive and decision-making steps. A set of national prescribing competencies for all prescribers (including non-medical) is needed to guide education and training curricula, assessment and credentialing of individual practitioners. We have identified 12 core competencies for safe prescribing which embody the four stages of the prescribing process – information gathering, clinical decision making, communication, and monitoring and review. These core competencies, along with their learning objectives and assessment methods, provide a useful starting point for teaching safe and effective prescribing.
Context-specific stressors, work-related social support and work-family conflict: a mediation study
Abstract:
Understanding the antecedents of work-family conflict is important, as it allows organisations to effectively engage in work design for professional employees. This study examines the impact of sources of social support as antecedents of work-family conflict. The hypotheses were tested using Partial Least Squares modelling on a sample of 366 professional employees. The path model showed that context-specific stressors impacted positively on job demand, which in turn led to higher levels of work-family conflict. Contrary to our expectation, non-work-related social support did not have any statistically significant relationship with job demand or work-family conflict. In addition, individuals experiencing high job demands were found to obtain more social support from both work-related and non-work-related sources. Individuals with more work-related social support were likely to experience less work-family conflict. Surprisingly, non-work social support sources had no statistically significant relationship with work-family conflict.