123 results for PLATFORMS
Abstract:
In this paper, I question modes of listening in network music performance environments, and specifically draw on my experience as a performer listening in these scenarios. I situate network listening within the context of current music making, and refer to changes in compositional practices that draw specific attention to listening. I argue that some of these compositional developments play a determining role in articulating a new discourse of listening. Erik Satie's concept of Furniture Music, Schaeffer's ideas on reduced listening, Oliveros' deep listening practices, as well as digital music platforms, all serve to show a development towards a proliferation of listening experiences. I expand this narrative to listening practices in network performance environments, and identify a specific bodily fragility in listening in and to the network. This fragile state of listening and de-centered kind of performative being allow me to draw parallels to the Japanese art form Butoh and Elaine Scarry's metaphor of beauty. My own performance experiences, set within the context of several critical texts, allow me to see network[ed] listening as an ideal corporeal state, which offers a rethinking of linear conceptions of the other and a subject's own relation with her world. Ultimately, network[ed] listening posits listening as a corporeal and multi-dimensional experience that is continuously being re-shaped by technological, socio-political and cultural concerns.
Abstract:
This focused review article discusses in detail all available high-resolution small-molecule ligand/G-quadruplex structural data derived from crystallographic and NMR-based techniques, in an attempt to understand key factors in ligand binding and to highlight the biological importance of these complexes. In contrast to duplex DNA, G-quadruplexes are four-stranded nucleic acid structures folded from guanine-rich repeat sequences, stabilized by the stacking of guanine G-quartets and extensive Watson-Crick/Hoogsteen hydrogen bonding. Thermally stable, these topologies can play a role in telomere regulation and gene expression. The core structures of G-quadruplexes form stable scaffolds, while the loops have been shown, by the addition of small-molecule ligands, to be sufficiently adaptable to generate new and extended binding platforms for ligands to associate, either by extending G-quartet surfaces or by forming additional planar dinucleotide pairings. Many of these structurally characterised loop rearrangements were totally unexpected, opening up new opportunities for the design of selective ligands. However, these rearrangements do significantly complicate attempts to rationally design ligands against well-defined but unbound topologies, as seen for the series of naphthalene diimide complexes. Drawing together previous findings, and with the introduction of two new crystallographic quadruplex/ligand structures, we aim to expand the understanding of possible structural adaptations available to quadruplexes in the presence of ligands, thereby aiding in the design of new selective entities. (C) 2011 Elsevier Masson SAS. All rights reserved.
Abstract:
Optical techniques toward the realization of sensitive and selective biosensing platforms have received considerable attention in recent times. Techniques based on interferometry, surface plasmon resonance, and waveguides have all proved popular, while spectroscopy in particular offers much potential. Raman spectroscopy is an information-rich technique in which the vibrational frequencies reveal much about the structure of a compound, but it is a weak process and offers poor sensitivity. In response to this problem, surface-enhanced Raman scattering (SERS) has received much attention, due to significant increases in sensitivity instigated by bringing the sample into contact with an enhancing substrate. Here we discuss a facile and rapid technique for the detection of pterins using SERS-active colloidal silver suspensions. Pterins are a family of biological compounds that are employed in nature in color pigmentation and as facilitators in metabolic pathways. In this work, small volumes of xanthopterin, isoxanthopterin, and 7,8-dihydrobiopterin have been examined while adsorbed to silver colloids. Limits of detection have been examined for both xanthopterin and isoxanthopterin using a 10-s exposure to a 12 mW 532 nm laser, which, while showing a trade-off between scan time and signal intensity, still provides the opportunity for the investigation of simultaneous detection of both pterins in solution. (C) 2011 Society of Photo-Optical Instrumentation Engineers (SPIE). [DOI: 10.1117/1.3600658]
Abstract:
Many different immunochemical platforms exist for the screening of naturally occurring contaminants in food, from low-cost enzyme-linked immunosorbent assays (ELISA) to expensive instruments such as optical biosensors based on the phenomenon of surface plasmon resonance (SPR). The primary aim of this study was to evaluate and compare a number of these platforms to assess their accuracy and precision when applied to naturally contaminated samples containing HT-2/T-2 mycotoxins. Other important factors considered were the speed of analysis, ease of use (sample preparation techniques and use of the equipment) and, ultimately, the cost implications. The three screening procedures compared were an SPR biosensor assay, a commercially available ELISA and an enzyme-linked immunomagnetic electrochemical array (ELIME array). The qualitative data for all methods demonstrated very good overall agreement with each other; however, on comparison with mass spectrometry confirmatory results, the ELISA and SPR assay performed slightly better than the ELIME array, exhibiting an overall agreement of 95.8% compared to 91.7%. Currently, SPR is more costly than the other two platforms and can only be used in the laboratory, whereas in theory both the ELISA and the ELIME array are portable and can be used in the field, although ultimately this depends on the sample preparation techniques employed. Sample preparation techniques varied across the methods evaluated: the ELISA was the simplest to perform, followed by the SPR method. The ELIME array involved an additional clean-up step, thereby increasing both the time and cost of analysis; therefore, in its current format, field use would not be an option for the ELIME array. In relation to speed of analysis, the ELISA outperformed the other methods.
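The overall-agreement figures quoted above (95.8% for the ELISA/SPR assays versus 91.7% for the ELIME array, against mass-spectrometry confirmation) are simple match percentages over qualitative positive/negative calls. A minimal sketch, using hypothetical sample calls (the study's actual sample data are not reproduced here):

```python
def overall_agreement(screen_calls, confirm_calls):
    """Percentage of samples where a qualitative screening call
    (positive/negative) matches the confirmatory result."""
    matches = sum(s == c for s, c in zip(screen_calls, confirm_calls))
    return 100.0 * matches / len(screen_calls)

# 24 hypothetical samples: mass-spectrometry calls, then two screens
ms    = ["+"] * 12 + ["-"] * 12
elisa = ["-"] + ms[1:]         # one disagreement  -> 23/24
elime = ["-", "-"] + ms[2:]    # two disagreements -> 22/24
print(round(overall_agreement(elisa, ms), 1))  # 95.8
print(round(overall_agreement(elime, ms), 1))  # 91.7
```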
Abstract:
Traditional static analysis fails to auto-parallelize programs with a complex control and data flow. Furthermore, thread-level parallelism in such programs is often restricted to pipeline parallelism, which can be hard for a programmer to discover. In this paper we propose a tool that, based on profiling information, helps the programmer to discover parallelism. The programmer hand-picks the code transformations from among the proposed candidates, which are then applied by automatic code transformation techniques.
This paper contributes to the literature by presenting a profiling tool for discovering thread-level parallelism. We track dependencies at the whole-data-structure level rather than at the element or byte level in order to limit the profiling overhead. We perform a thorough analysis of the needs and costs of this technique. Furthermore, we present and validate the belief that programs with complex control and data flow contain significant amounts of exploitable coarse-grain pipeline parallelism in the program's outer loops. This observation validates our approach to whole-data-structure dependencies. As state-of-the-art compilers focus on loops iterating over data structure members, this observation also explains why our approach finds coarse-grain pipeline parallelism in cases that have remained out of reach for state-of-the-art compilers. In cases where traditional compilation techniques do find parallelism, our approach allows the discovery of higher degrees of parallelism, yielding a 40% speedup over traditional compilation techniques. Moreover, we demonstrate real speedups on multiple hardware platforms.
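The coarse-grain pipeline parallelism described above can be illustrated with a minimal sketch: an outer loop whose iterations flow through two stages, each operating on a whole data item, run concurrently as threads with a bounded queue between them. The stage functions here are hypothetical stand-ins, not the paper's tool or benchmarks:

```python
import queue
import threading

def stage1(item):   # hypothetical first stage (e.g. produce a structure)
    return item * 2

def stage2(item):   # hypothetical second stage (e.g. consume/transform it)
    return item + 1

def pipeline(items):
    q = queue.Queue(maxsize=4)   # bounded buffer between pipeline stages
    results = []

    def producer():
        for it in items:
            q.put(stage1(it))
        q.put(None)              # sentinel: end of stream

    def consumer():
        while (it := q.get()) is not None:
            results.append(stage2(it))

    t1 = threading.Thread(target=producer)
    t2 = threading.Thread(target=consumer)
    t1.start(); t2.start(); t1.join(); t2.join()
    return results

print(pipeline([1, 2, 3]))  # [3, 5, 7]
```

Because the stages communicate only through whole items in a FIFO queue, dependence tracking at the whole-data-structure level is exactly what is needed to prove this transformation safe.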
Abstract:
The requirement for the use of Virtual Engineering, encompassing the construction of Virtual Prototypes using Multidisciplinary Design Optimisation, for the development of future aerospace platforms and systems is discussed. Some of the activities at the Virtual Engineering Centre, a University of Liverpool initiative, are described, and a number of case studies involving a range of applications of Virtual Engineering are illustrated.
Abstract:
The use of dataflow digital signal processing system modelling and synthesis techniques has been a fruitful research theme for many years and has yielded many powerful rapid system synthesis and optimisation capabilities. However, recent years have seen the spectrum of languages and techniques splinter in an application-specific manner, resulting in an ad-hoc design process which is increasingly dependent on the particular application under development. This poses a major problem for automated toolflows attempting to provide rapid system synthesis for a wide range of applications. By analysing a number of dataflow FPGA implementation case studies, this paper shows that, despite this, common traits may be found in current techniques, which fall largely into three classes. Further, it exposes limitations pertaining to their ability to adapt algorithm models to implementations for different operating environments and target platforms.
Abstract:
Thermoresponsive polymeric platforms are used to optimise drug delivery in pharmaceutical systems and bioactive medical devices. However, the practical application of these systems is compromised by their poor mechanical properties. This study describes the design of thermoresponsive semi-interpenetrating polymer networks (s-IPNs) based on cross-linked p(NIPAA) or p(NIPAA-co-HEMA) hydrogels containing poly(ε-caprolactone) (PCL), designed to address this issue. Using DSC, the lower critical solution temperatures of the co-polymer and p(NIPAA) matrices were circa 34 °C and 32 °C, respectively. PCL was physically dispersed within the hydrogel matrices, as confirmed using confocal scanning laser microscopy and DSC, and resulted in marked changes in the mechanical properties (ultimate tensile strength, Young's modulus) without adversely compromising the elongation properties. P(NIPAA) networks containing dispersed PCL exhibited thermoresponsive swelling properties following immersion in buffer (pH 7), with the equilibrium swelling ratio being greater at 20 °C than at 37 °C and greatest for p(NIPAA)/PCL systems at 20 °C. The incorporation of PCL significantly lowered the equilibrium swelling ratio of the various networks, but this was not deemed practically significant for s-IPNs based on p(NIPAA). Thermoresponsive release of metronidazole was observed from s-IPNs composed of p(NIPAA)/PCL at 37 °C but not from p(NIPAA-co-HEMA)/PCL at this temperature. In all other platforms, drug release at 20 °C was statistically similar to that at 37 °C and was diffusion controlled. This study has uniquely described a strategy by which thermoresponsive drug release may be performed from polymeric platforms with highly elastic properties.
It is proposed that these materials may be used clinically as bioactive endotracheal tubes designed to offer enhanced resistance to ventilator-associated pneumonia, a clinical condition associated with the use of endotracheal tubes, where stimulus-responsive drug release from biomaterials with significant mechanical properties would be advantageous. © 2012 Elsevier B.V. All rights reserved.
Abstract:
This study highlights the potential associated with utilising multi-component polymeric gels to formulate materials that possess unique rheological and mechanical properties. The synergistic effect (defined as the difference between the actual response of a binary mixture and the sum of the responses of the two components comprising the mixture) and the interaction between hydroxyethylcellulose (HEC) and sodium carboxymethylcellulose (NaCMC), polymers which are commonly employed as drug delivery platforms for implantable medical devices (1), were determined using dynamic, continuous shear and texture profile analysis. Increases in polymer concentration resulted in an increase in G′, G″ and η′, whereas tan δ decreased. Similarly, significant increases were also apparent in continuous shear and texture analyses. All binary mixtures showed positive synergy values, which may suggest associative interaction between the two components.
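The synergy value defined above (the measured response of the binary mixture minus the sum of the single-component responses) reduces to a one-line calculation; the response values below are hypothetical, for illustration only:

```python
def synergy(mixture_response, response_a, response_b):
    """Synergy value: measured response of the binary mixture minus the
    sum of the responses of the two single-polymer components."""
    return mixture_response - (response_a + response_b)

# Hypothetical rheological responses for a binary HEC/NaCMC gel
print(synergy(120.0, 40.0, 55.0))  # 25.0 -> positive: associative interaction
```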
Abstract:
The initial part of this paper reviews the early challenges (c. 1980) in achieving real-time silicon implementations of DSP computations. In particular, it discusses research on application-specific architectures, including bit-level systolic circuits that led to important advances in achieving the DSP performance levels then required. These were many orders of magnitude greater than those achievable using programmable (including early DSP) processors, and were demonstrated through the design of commercial digital correlator and digital filter chips. As is discussed, an important challenge was the application of these concepts to recursive computations as occur, for example, in Infinite Impulse Response (IIR) filters. An important breakthrough was to show how fine-grained pipelining can be used if arithmetic is performed most significant bit (msb) first. This can be achieved using redundant number systems, including carry-save arithmetic. This research and its practical benefits were again demonstrated through a number of novel IIR filter chip designs which, at the time, exhibited performance much greater than previous solutions. The architectural insights gained, coupled with the regular nature of many DSP and video processing computations, also provided the foundation for new methods for the rapid design and synthesis of complex DSP System-on-Chip (SoC) Intellectual Property (IP) cores. This included the creation of a wide portfolio of commercial SoC video compression cores (MPEG2, MPEG4, H.264) for very high performance applications ranging from cell phones to High Definition TV (HDTV). The work provided the foundation for systematic methodologies, tools and design flows, including high-level design optimizations based on "algorithmic engineering", and also led to the creation of the Abhainn tool environment for the design of complex heterogeneous DSP platforms comprising processors and multiple FPGAs.
The paper concludes with a discussion of the problems faced by designers in developing complex DSP systems using current SoC technology. © 2007 Springer Science+Business Media, LLC.
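The carry-save arithmetic named above removes the carry-propagation chain that makes recursive structures hard to pipeline: three operands are compressed to a redundant (partial-sum, shifted-carry) pair, with each bit position acting as an independent full adder. A minimal sketch of the bit-level idea (in software, for illustration; the paper concerns silicon implementations):

```python
def carry_save_add(a, b, c):
    """Reduce a + b + c to a redundant pair (ps, sc) with ps + sc == a + b + c.
    Each bit position acts as an independent full adder: sum bit is the XOR
    of the inputs, carry bit is their majority, shifted one place left.
    No carry ripples across bit positions."""
    ps = a ^ b ^ c                          # partial-sum bits
    sc = ((a & b) | (a & c) | (b & c)) << 1 # shifted carry bits
    return ps, sc

ps, sc = carry_save_add(13, 7, 5)
print(ps, sc, ps + sc)  # 15 10 25 -- the redundant pair represents 25
```

Because no carry propagates between bit positions, each position can sit in its own pipeline stage, which is what enables the fine-grained, msb-first pipelining of IIR filters described in the abstract.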
Abstract:
Background: Popular approaches in human tissue-based biomarker discovery include tissue microarrays (TMAs) and DNA microarrays (DMAs) for protein and gene expression profiling, respectively. The data generated by these analytic platforms, together with associated image, clinical and pathological data, currently reside on widely different information platforms, making searching and cross-platform analysis difficult. Consequently, there is a strong need to develop a single coherent database capable of correlating all available data types.
Method: This study presents TMAX, a database system to facilitate biomarker discovery tasks. TMAX organises a variety of biomarker discovery-related data into a single database. Both TMA and DMA experimental data are integrated in TMAX and connected through common DNA/protein biomarkers. Patient clinical data (including tissue pathological data), computer-assisted tissue images and associated analytic data are also included in TMAX to enable truly high-throughput processing of ultra-large digital slides, for both TMAs and whole-slide tissue digital slides. A comprehensive web front-end was built with embedded XML parser software and predefined SQL queries to enable rapid data exchange in the form of standard XML files.
Results & Conclusion: TMAX represents one of the first attempts to integrate TMA data with public gene expression experiment data. Experiments suggest that TMAX is robust in managing large quantities of data from different sources (clinical, TMA, DMA and image analysis). Its web front-end is user-friendly, easy to use and, most importantly, allows the rapid and easy exchange of biomarker discovery-related data. In conclusion, TMAX is a robust biomarker discovery data repository and research tool, which opens up opportunities for biomarker discovery and further integromics research.
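The export path described in the Method above (predefined SQL queries whose rows are serialised as standard XML files for cross-platform exchange) can be sketched as follows. The schema, biomarker name and element layout are hypothetical illustrations, not TMAX's actual design:

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical miniature schema standing in for the integrated database
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE biomarker (name TEXT, platform TEXT, score REAL)")
conn.executemany("INSERT INTO biomarker VALUES (?, ?, ?)",
                 [("TP53", "TMA", 0.82), ("TP53", "DMA", 0.67)])

# A predefined, parameterised query: all records for one biomarker
rows = conn.execute(
    "SELECT name, platform, score FROM biomarker WHERE name = ?", ("TP53",))

# Serialise the result set as a standard XML document for exchange
root = ET.Element("biomarkers")
for name, platform, score in rows:
    rec = ET.SubElement(root, "record", name=name, platform=platform)
    rec.text = str(score)

xml_out = ET.tostring(root, encoding="unicode")
print(xml_out)
```

Keeping the queries predefined and the output in a fixed XML shape is what lets heterogeneous consumers (TMA, DMA and image-analysis tools) parse the exchanged files without knowing the underlying schema.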
Abstract:
The cytogenetically normal subtype of acute myeloid leukemia (CN-AML) is associated with intermediate risk, which complicates therapeutic options. Lower overall HOX/TALE expression appears to correlate with more favorable prognosis and better response to treatment in some leukemias and solid cancers. The functional significance of the associated gene expression and response to chemotherapy is not known. Three independent microarray datasets obtained from large patient cohorts, along with quantitative PCR validation, were used to identify a four-gene HOXA/TALE signature capable of prognostic stratification. Biochemical analysis was used to identify interactions between the four encoded proteins, and targeted knockdown was used to examine the functional importance of sustained expression of the signature in leukemia maintenance and response to chemotherapy. An eleven-gene HOXA/TALE code identified in an Intermediate-risk group of patients (n=315) compared to a Favorable-risk group (n=105) was reduced to a four-gene signature of HOXA6, HOXA9, PBX3 and MEIS1 by iterative analysis of independent platforms. This signature maintained the Favorable/Intermediate risk partition and, where applicable, correlated with overall survival in CN-AML. We further show that cell growth and function are dependent on maintained levels of these core genes and that direct targeting of HOXA/PBX3 sensitizes CN-AML cells to standard chemotherapy. Together the data support a key role for HOXA/TALE in CN-AML and demonstrate that targeting of clinically significant HOXA/PBX3 elements may provide therapeutic benefit to these patients.
Abstract:
The spatial distributions of marine fauna and of pollution are both highly structured, and thus the resulting high levels of autocorrelation may invalidate conclusions based on classical statistical approaches. Here we analyse the close correlation observed between proxies for the disturbance associated with gas extraction activities and amphipod distribution patterns around four hydrocarbon platforms. We quantified the amount of variation independently accounted for by natural environmental variables, proxies for the disturbance caused by platforms, and spatial autocorrelation. This allowed us to demonstrate how each of these three factors significantly affects the community structure of amphipods. Sophisticated statistical techniques are required when taking spatial autocorrelation into account; nevertheless, our data demonstrate that this approach not only enables the formulation of robust statistical inferences but also provides a much deeper understanding of the subtle interactions between human disturbance and natural factors affecting the structure of marine invertebrate communities. (C) 2012 Elsevier Ltd. All rights reserved.
Abstract:
Au nanoparticles (AuNPs) have attracted great interest in the fabrication of various biosensor systems for the analysis of cellular and biomolecular recognition. In conjunction with the vast conjugation chemistry available, these materials are easily coupled with biomolecules such as nucleic acids, antigens or antibodies in order to achieve their many potential applications as ligand carriers or transducing platforms for preparation, detection and quantification purposes. Furthermore, the nanoparticles possess easily tuned and unique optical/physical/chemical characteristics and high surface areas, making them ideal candidates to this end. Here, sensing mechanisms based on localized surface plasmon resonance (LSPR), particle aggregation, catalytic properties, and fluorescence resonance energy transfer (FRET) of AuNPs, as well as barcoding technologies including DNA biobarcodes, will be discussed.
Abstract:
The advent of next generation sequencing (NGS) technologies has expanded the area of genomic research, offering high coverage and increased sensitivity over older microarray platforms. Although the current cost of next generation sequencing still exceeds that of microarray approaches, the rapid advances in NGS will likely make it the platform of choice for future research in differential gene expression. Connectivity mapping is a procedure for examining the connections among diseases, genes and drugs through differential gene expression; initially based on microarray technology, it has accumulated a large collection of compound-induced reference gene expression profiles. In this work, we aim to test the feasibility of incorporating NGS RNA-Seq data into the current connectivity mapping framework by utilizing the microarray-based reference profiles and constructing a differentially expressed gene signature from an NGS dataset. This allows connections to be established between the NGS gene signature and those microarray reference profiles, avoiding the cost of re-creating drug profiles with NGS technology. We examined the connectivity mapping approach on a publicly available NGS dataset with androgen stimulation of LNCaP cells, in order to extract candidate compounds that could inhibit the proliferative phenotype of LNCaP cells and to elucidate their potential in a laboratory setting. In addition, we also analyzed an independent microarray dataset of similar experimental settings. We found a high level of concordance between the top compounds identified using the gene signatures from the two datasets. The nicotine derivative cotinine was returned as the top candidate among the overlapping compounds, with potential to suppress this proliferative phenotype. Subsequent lab experiments validated this connectivity mapping hit, showing that cotinine inhibits cell proliferation in an androgen-dependent manner.
Thus the results in this study suggest a promising prospect of integrating NGS data with connectivity mapping. © 2013 McArt et al.
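The core of the connectivity mapping procedure described above can be sketched as rank-based scoring: an up/down gene signature from one platform is compared against a reference profile's expression-ranked gene list, and concordant placement (up-genes near the top, down-genes near the bottom) yields a positive connection score. The gene names and the particular scoring form below are illustrative assumptions, not the paper's exact statistic or data:

```python
def connection_score(signature, ref_profile):
    """signature: {gene: +1 (up-regulated) or -1 (down-regulated)};
    ref_profile: genes ordered from most up- to most down-regulated
    in a compound-induced reference experiment.
    Returns a score in [-1, 1]; positive means the reference mimics
    the signature, negative means it opposes (reverses) it."""
    n = len(ref_profile)
    # Map list position to a centred score: top -> +1, bottom -> -1
    rank_score = {g: 1 - 2 * i / (n - 1) for i, g in enumerate(ref_profile)}
    hits = [sign * rank_score[g] for g, sign in signature.items()
            if g in rank_score]
    return sum(hits) / len(hits)

# Hypothetical androgen-response signature and reference profile
sig = {"KLK3": +1, "TMPRSS2": +1, "CDKN1A": -1}
ref = ["KLK3", "TMPRSS2", "GAPDH", "ACTB", "CDKN1A"]
print(round(connection_score(sig, ref), 2))  # 0.83 -> strong positive connection
```

In a screen for compounds that inhibit a phenotype, one would look for reference profiles with strongly negative scores, i.e. compounds whose induced expression changes reverse the query signature.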