330 results for BASIS-SET CONVERGENCE
Abstract:
IRE1 couples endoplasmic reticulum unfolded protein load to RNA cleavage events that culminate in the sequence-specific splicing of the Xbp1 mRNA and in the regulated degradation of diverse membrane-bound mRNAs. We report on the identification of a small molecule inhibitor that attains its selectivity by forming an unusually stable Schiff base with lysine 907 in the IRE1 endonuclease domain, explained by solvent inaccessibility of the imine bond in the enzyme-inhibitor complex. The inhibitor (abbreviated 4μ8C) blocks substrate access to the active site of IRE1 and selectively inactivates both Xbp1 splicing and IRE1-mediated mRNA degradation. Surprisingly, inhibition of IRE1 endonuclease activity does not sensitize cells to the consequences of acute endoplasmic reticulum stress, but rather interferes with the expansion of secretory capacity. Thus, the chemical reactivity and sterics of a unique residue in the endonuclease active site of IRE1 can be exploited by selective inhibitors to interfere with protein secretion in pathological settings.
Abstract:
A crucial task in contractor prequalification is to establish a set of decision criteria through which the capabilities of contractors are measured and judged. However, in the UK, there are no nationwide standards or guidelines governing the selection of decision criteria for contractor prequalification. The decision criteria are usually established by individual clients on an ad hoc basis. This paper investigates the divergence of decision criteria used by different client and consultant organisations in contractor prequalification through a large empirical survey conducted in the UK. The results indicate that there are significant differences in the selection and use of decision criteria for prequalification.
Abstract:
Two transgenic callus lines of rice, stably expressing a β-glucuronidase (GUS) gene, were supertransformed with a set of constructs designed to silence the resident GUS gene. An inverted-repeat (i/r) GUS construct, designed to produce mRNA with self-complementarity, was much more effective than simple sense and antisense constructs at inducing silencing. Supertransforming rice calluses with a direct-repeat (d/r) construct, although not as effective as those with the i/r construct, was also substantially more effective in silencing the resident GUS gene than the simple sense and antisense constructs. DNA hybridisation analyses revealed that every callus line supertransformed with either simple sense or antisense constructs, and subsequently showing GUS silencing, had the silence-inducing transgenes integrated into the plant genome in inverted-repeat configurations. The silenced lines containing i/r and d/r constructs did not necessarily have inverted-repeat T-DNA insertions. There was significant methylation of the GUS sequences in most of the silenced lines but not in the unsilenced lines. However, demethylation treatment of silenced lines with 5-azacytidine did not reverse the post-transcriptional gene silencing (PTGS) of GUS. Whereas the levels of RNA specific to the resident GUS gene were uniformly low in the silenced lines, RNA specific to the inducer transgenes accumulated to a substantial level, and the majority of the i/r RNA was unpolyadenylated. Altogether, these results suggest that both sense- and antisense-mediated gene suppression share a similar molecular basis, that unpolyadenylated RNA plays an important role in PTGS, and that methylation is not essential for PTGS.
Abstract:
In Roberts v Prendergast [2013] QCA 89 the respondent had offered to settle the appeal, purporting to make the offer under Chapter 9 Part 5 of the Uniform Civil Procedure Rules 1999 (Qld) (UCPR). Differing views were expressed in the Court of Appeal regarding the impact in the circumstances of the offer to settle, with the majority concluding that the appellant should pay the respondent’s costs on the standard basis.
Abstract:
Australia lacks a satisfactory national paradigm for assessing legal capacity in the context of testamentary documents, enduring powers of attorney and advance care directives. Capacity assessments are currently conducted on an ad hoc basis by legal and/or medical professionals, and the reliability of the assessment process depends on the skill set and mutual understanding of the professionals conducting it. The prevalence of diseases such as dementia is growing; because such diseases impair cognition, collaboration between the legal and medical professions is increasingly necessary when assessing the effect of mentally disabling conditions upon legal capacity. Miscommunication and lack of understanding between the legal and medical professionals involved could impede the development of a satisfactory paradigm. This article discusses legal capacity assessment in Australia and how to strengthen the relationship between the legal and medical professionals involved in capacity assessments. The development of a national paradigm would promote consistency and transparency of process, helping to improve the professional relationship and maximise the principles of autonomy, participation and dignity.
Abstract:
Textual documents have become an important and rapidly growing information source on the web, and text classification is one of the crucial technologies for information organisation and management. It has grown increasingly important and has attracted wide attention from researchers in different fields. This paper first introduces the main feature selection methods, implementation algorithms and applications of text classification. However, the knowledge extracted by current data-mining techniques for text classification contains considerable noise, which introduces uncertainty into both knowledge extraction and knowledge usage; more innovative techniques and methods are therefore needed to improve the performance of text classification. Further improving the knowledge extraction process and the effective utilisation of the extracted knowledge remains a critical and challenging step. A Rough Set decision-making approach is proposed that uses Rough Set decision techniques to classify more precisely those textual documents that classic text classification methods find difficult to separate. The purpose of this paper is to give an overview of existing text classification technologies; to demonstrate Rough Set concepts and a decision-making approach based on Rough Set theory for building a more reliable and effective text classification framework with higher precision; to set up an innovative evaluation metric, named CEI, which is effective for assessing the performance of similar research; and to propose a promising research direction for addressing the challenging problems in text classification, text mining and related fields.
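The Rough Set idea underlying the proposed approach can be sketched as lower and upper approximations of a document class under an indiscernibility relation. A minimal sketch, in which the documents, term features and class labels are hypothetical (the paper's actual feature sets and CEI metric are not reproduced here):

```python
from collections import defaultdict

def approximations(objects, features, target):
    """Rough-set lower/upper approximations of the set `target`.

    objects:  dict name -> dict of feature values (indiscernibility attributes)
    target:   set of object names (e.g. documents labelled with one class)
    """
    # Group objects that are indiscernible on the chosen features
    blocks = defaultdict(set)
    for name, feats in objects.items():
        blocks[tuple(feats[f] for f in features)].add(name)

    lower, upper = set(), set()
    for block in blocks.values():
        if block <= target:      # block lies entirely inside the class
            lower |= block
        if block & target:       # block overlaps the class
            upper |= block
    return lower, upper

# Hypothetical documents described by two boolean term features
docs = {
    "d1": {"ball": 1, "court": 1},
    "d2": {"ball": 1, "court": 1},
    "d3": {"ball": 1, "court": 0},
    "d4": {"ball": 0, "court": 0},
}
sports = {"d1", "d3"}
low, up = approximations(docs, ["ball", "court"], sports)
```

Here d1 and d2 are indiscernible, so d1 cannot enter the lower approximation; the boundary region (upper minus lower) is exactly the set of documents that are "difficult to separate" and that the Rough Set decision step targets.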
Abstract:
Fire incidents in buildings are common, so the fire safety design of framed structures is imperative, especially for unprotected or partly protected bare steel frames. However, software for structural fire analysis is not widely available. Performance-based structural fire design therefore needs to be built on user-friendly, conventional nonlinear analysis programs, so that engineers do not need to acquire new structural analysis software for structural fire analysis and design. Such a tool should simulate different fire scenarios and the associated detrimental effects efficiently, including second-order P-Δ and P-δ effects and material yielding. The nonlinear behaviour of a large-scale structure becomes complicated under fire, and its simulation therefore relies on an efficient and effective numerical analysis to cope with the intricate nonlinear effects due to fire. To this end, the present study uses the second-order elastic/plastic analysis software NIDA to predict the behaviour of bare steel framed structures at elevated temperatures, considering thermal expansion and material degradation due to heating. Degradation of material strength with increasing temperature is included through a set of temperature-stress-strain curves, mainly according to BS5950 Part 8, which implicitly allow for creep deformation. The finite element stiffness formulation of the beam-column elements is derived from the fifth-order PEP element, which facilitates computer modelling with one element per member. The Newton-Raphson method is used in the nonlinear solution procedure to trace the nonlinear equilibrium path at specified elevated temperatures. Several numerical and experimental verifications of framed structures are presented and compared against solutions in the literature. The proposed method permits engineers to adopt performance-based structural fire analysis and design using typical second-order nonlinear structural analysis software.
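The nonlinear solution step described above can be illustrated with a generic Newton-Raphson iteration on a one-degree-of-freedom softening spring whose stiffness degrades with temperature. The stiffness reduction factor and the cubic softening term below are illustrative stand-ins, not BS5950 Part 8 values or the PEP element formulation:

```python
def newton_raphson(residual, tangent, u0, tol=1e-10, max_iter=50):
    """Generic Newton-Raphson iteration: solve residual(u) = 0."""
    u = u0
    for _ in range(max_iter):
        r = residual(u)
        if abs(r) < tol:
            return u
        u -= r / tangent(u)  # Newton update using the tangent stiffness
    raise RuntimeError("Newton-Raphson failed to converge")

def solve_at_temperature(load, k0, reduction):
    """Equilibrium of a softening one-DOF spring: F_int(u) = k*u - c*u**3."""
    k = k0 * reduction            # stiffness degraded by elevated temperature
    c = 0.05 * k                  # mild geometric softening (illustrative)
    residual = lambda u: k * u - c * u**3 - load
    tangent = lambda u: k - 3 * c * u**2
    return newton_raphson(residual, tangent, u0=load / k)

# Displacement under the same load grows as stiffness degrades with heating
u_cold = solve_at_temperature(load=10.0, k0=100.0, reduction=1.0)
u_hot = solve_at_temperature(load=10.0, k0=100.0, reduction=0.6)
```

Tracing the equilibrium path at a sequence of temperatures amounts to repeating this solve with progressively reduced stiffness, using each converged displacement as the next starting point.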
Abstract:
Process mining has developed into a popular research discipline and nowadays its associated techniques are widely applied in practice. What is currently ill-understood is how the success of a process mining project can be measured and what the antecedent factors of process mining success are. We consider an improved, grounded understanding of these aspects of value to better manage the effectiveness and efficiency of process mining projects in practice. As such, we advance a model, tailored to the characteristics of process mining projects, which identifies and relates success factors and measures. We draw inspiration from the literature from related fields for the construction of a theoretical, a priori model. That model has been validated and re-specified on the basis of a multiple case study, which involved four industrial process mining projects. The unique contribution of this paper is that it presents the first set of success factors and measures on the basis of an analysis of real process mining projects. The presented model can also serve as a basis for further extension and refinement using insights from additional analyses.
Abstract:
This creative work is the production of the live and animated performance of The Empty City. With a significant period of creative development and script work behind it, the team engaged in a range of innovative performance-making practices in order to realise the work onstage as a non-verbal live and animated theatre work. This intermedial process was often led by music, and involved the creation and convergence of non-verbal action, virtual performers, performing objects and two simultaneous projections of animated images. The production opened at the Brisbane Powerhouse on June 27 2013, with a subsequent tour to Perth’s Awesome Festival in October 2013. Its technical achievements were noted in the critical responses. "The story is told on a striking set of two huge screens, the front one transparent, upon which still and moving images are projected, and between which Oliver performs and occasional “real” objects are placed. The effect is startling, and creates a cartoon three dimensionality like those old Viewmaster slide shows. The live action… and soundscape sync perfectly with the projected imagery to complete a dense, intricately devised and technically brilliant whole." (The West Australian 14.10.13)
Abstract:
Histological analysis of gill samples taken from individuals of Latris lineata reared in aquaculture in Tasmania, Australia, and those sampled from the wild revealed the presence of epitheliocystis-like basophilic inclusions. Subsequent morphological, in situ hybridization, and molecular analyses confirmed the presence of this disease and identified a Chlamydia-like organism associated with this condition; the criteria set by Fredericks and Relman's postulates were used to establish disease causation. Three distinct 16S rRNA genotypes were sequenced from 16 fish, and phylogenetic analyses of the nearly full-length 16S rRNA sequences generated for this bacterial agent indicated that they were nearly identical novel members of the order Chlamydiales. This new taxon formed a well-supported clade with "Candidatus Parilichlamydia carangidicola" from the yellowtail kingfish (Seriola lalandi). On the basis of sequence divergence over the 16S rRNA region relative to all other members of the order Chlamydiales, a new genus and species are proposed here for the Chlamydia-like bacterium from L. lineata, i.e., "Candidatus Similichlamydia latridicola" gen. nov., sp. nov.
Abstract:
We present a technique for delegating a short lattice basis that has the advantage of keeping the lattice dimension unchanged upon delegation. Building on this result, we construct two new hierarchical identity-based encryption (HIBE) schemes, with and without random oracles. The resulting systems are very different from earlier lattice-based HIBEs and in some cases result in shorter ciphertexts and private keys. We prove security from classic lattice hardness assumptions.
Abstract:
In Thomas Mann’s tetralogy of the 1930s and 1940s, Joseph and His Brothers, the narrator declares history is not only “that which has happened and that which goes on happening in time,” but it is also “the stratified record upon which we set our feet, the ground beneath us.” By opening up history to its spatial, geographical, and geological dimensions Mann both predicts and encapsulates the twentieth-century’s “spatial turn,” a critical shift that divested geography of its largely passive role as history’s “stage” and brought to the fore intersections between the humanities and the earth sciences. In this paper, I draw out the relationships between history, narrative, geography, and geology revealed by this spatial turn and the questions these pose for thinking about the disciplinary relationship between geography and the humanities. As Mann’s statement exemplifies, the spatial turn itself has often been captured most strikingly in fiction, and I would argue nowhere more so than in Graham Swift’s Waterland (1983) and Anne Michaels’s Fugitive Pieces (1996), both of which present space, place, and landscape as having a palpable influence on history and memory. The geographical/geological line that runs through both Waterland and Fugitive Pieces continues through Tim Robinson’s non-fictional, two-volume “topographical” history Stones of Aran. Robinson’s Stones of Aran—which is not history, not geography, and not literature, and yet is all three—constructs an imaginative geography that renders inseparable geography, geology, history, memory, and the act of writing.
Abstract:
Numeric set watermarking is a way to provide ownership proof for numerical data. Numerical data can be considered a primitive for multimedia types such as images and videos, since these are organised forms of numeric information; the capability to watermark numerical data directly therefore implies the capability to watermark multimedia objects and discourage information theft on social networking sites and the Internet in general. Unfortunately, very limited research has been done in the field of numeric set watermarking, owing to underlying limitations in the number of items in the set and the LSBs available for watermarking in each item. In 2009, Gupta et al. proposed a numeric set watermarking model that embeds watermark bits in the items of the set based on a hash value of the items’ most significant bits (MSBs). If an item is chosen for watermarking, a watermark bit is embedded in the least significant bits, and the replaced bit is inserted in the fractional value to provide reversibility. The authors show their scheme to be resilient against the traditional subset addition, deletion, and modification attacks as well as secondary watermarking attacks. In this paper, we present a bucket attack on this watermarking model. The attack consists of creating buckets of items with the same MSBs and determining whether the items in each bucket carry watermark bits. Experimental results show that the bucket attack is very strong, destroying the entire watermark with a success rate close to 100%. We examine the inherent weaknesses in the watermarking model of Gupta et al. that leave it vulnerable to the bucket attack and propose potential safeguards that can provide resilience against this attack.
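The core of the bucket attack can be sketched as follows. The 16-bit item format, the 8-bit MSB split, and the wholesale LSB wipe are hypothetical simplifications: in the actual scheme of Gupta et al., a keyed hash of the MSBs selects which items carry a bit and the displaced bit is stored in a fractional part, neither of which this toy reproduces.

```python
MSB_BITS = 8  # hypothetical split: top 8 bits group items, low bits carry marks

def msb_key(item, total_bits=16):
    """The most significant bits of an item, which the embedder hashes."""
    return item >> (total_bits - MSB_BITS)

def bucket_attack(items, total_bits=16):
    """Group items by shared MSBs, then wipe the suspect bit in each bucket.

    Because embedding depends only on a hash of the MSBs, all items in a
    bucket are treated identically by the embedder: either all carry a
    watermark bit or none do. Zeroing the LSB of every bucketed item
    therefore destroys the mark without knowledge of the secret key.
    """
    buckets = {}
    for x in items:
        buckets.setdefault(msb_key(x, total_bits), []).append(x)
    attacked = []
    for bucket in buckets.values():
        attacked.extend(x & ~1 for x in bucket)  # clear the LSB
    return attacked

marked = [0xAB01, 0xAB03, 0xCD02, 0xCD05]
clean = bucket_attack(marked)
```

The distortion introduced is at most one LSB per item, which is why the attack succeeds while keeping the data usable.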
Abstract:
Transposable elements, which are DNA sequences that can move between different sites in genomes, comprise approximately 40% of the genome of mammals and are emerging as important contributors to biological diversity. Here we report a transcription unit lying within intron 1 of the murine Magi1 (membrane associated guanylate kinase inverted 1) gene that codes for a cell-cell junction scaffolding protein. The transcription unit, termed Magi1OS (Magi1 Opposite Strand), originates from a region with tandem B1 short interspersed nuclear elements (SINEs) and is an antisense gene to Magi1. Magi1OS transcription initiates in a proximal B1 element that shows only 4% divergence from the consensus sequence, indicating that it has been recently inserted into the mouse genome and could be replication competent. Moreover, a chimaeric transcript may result from intra-chromosomal interaction and trans-splicing of the Magi1 antisense transcript (Magi1OS) and Ghrl, which codes for the multifunctional peptide hormone ghrelin. These two genes are 20 megabases apart on chromosome 6 and are transcribed in opposite directions. We propose that the Magi1OS locus may serve as a useful model system to study exaptation and retrotransposition of B1 SINEs, as well as to examine the mechanisms of intra-chromosomal trans-splicing.
Abstract:
A sub-domain smoothed Galerkin method is proposed to integrate the advantages of the mesh-free Galerkin method and the FEM. Arbitrarily shaped sub-domains are predefined in the problem domain with mesh-free nodes. In each sub-domain, based on the mesh-free Galerkin weak formulation, the local discrete equations are obtained using moving Kriging interpolation, similar to the discretisation of high-order finite elements. A strain smoothing technique is then applied to the nodal integration of each sub-domain by dividing it into several smoothing cells. Moreover, condensation of DOFs can be introduced into the local discrete equations to improve computational efficiency. The global governing equations of the present method are obtained, following the FEM scheme, by assembling the local discrete equations of all sub-domains, so the mesh-free properties of the Galerkin method are retained within each sub-domain. Several 2D elastic problems have been solved with this newly proposed method to validate its computational performance. These numerical examples show that the proposed sub-domain smoothed Galerkin method is a robust technique for solid mechanics problems, offering high computational efficiency, good accuracy, and good convergence.
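The strain smoothing step can be illustrated in isolation: within each smoothing cell, the pointwise strain field is replaced by a single area-weighted average, eps_bar = (1/A) * sum(eps_i * A_i). A minimal sketch with illustrative strain and area values (the full method assembles these cell contributions into the sub-domain stiffness, which is not shown here):

```python
def smoothed_strain(subcell_strains, subcell_areas):
    """Area-weighted average strain over one smoothing cell.

    Strain smoothing replaces the pointwise strain with one constant
    value per cell: eps_bar = (1/A) * sum(eps_i * A_i), where A is the
    total cell area.
    """
    total_area = sum(subcell_areas)
    weighted = sum(e * a for e, a in zip(subcell_strains, subcell_areas))
    return weighted / total_area

# Two subregions of one smoothing cell (illustrative values)
eps_bar = smoothed_strain([0.002, 0.004], [1.5, 0.5])
# (0.002 * 1.5 + 0.004 * 0.5) / 2.0 = 0.0025
```

Using this constant cell strain in the stiffness integral is what makes the nodal integration stable without requiring derivatives of the shape functions at the nodes.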