47 results for Software-based techniques
Abstract:
Smartphones have undergone a remarkable evolution over the last few years, from simple calling devices to full-fledged computing devices on which multiple services and applications run concurrently. Unfortunately, battery capacity has increased at a much slower pace and has become a main bottleneck for Internet-connected smartphones. Several software-based techniques have been proposed in the literature for improving battery life. The most common techniques include data compression, packet aggregation or batch scheduling, offloading partial computations to the cloud, and periodically switching off interfaces (e.g., WiFi or 3G/4G) for short intervals. However, there has been no focus on eliminating the energy wasted by background applications that extensively utilize smartphone resources such as the CPU, memory, GPS, and WiFi or 3G/4G data connections. In this paper, we propose an Application State Proxy (ASP) that suppresses/stops applications on smartphones and maintains their presence on another network device. The applications are resumed/restarted on smartphones only when an event occurs, such as the arrival of a new message. We present the key requirements for the ASP service and several possible architectural designs. In short, the ASP concept can significantly improve the battery life of smartphones by minimizing the resource usage of background applications.
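The abstract gives no implementation details, but the event-driven wake-up logic at the heart of the ASP concept can be sketched in a few lines. The following Python fragment is a purely illustrative reading of the idea; the class name, callback, and event queue are all hypothetical, not taken from the paper.

```python
import queue

class ApplicationStateProxy:
    """Illustrative sketch of the ASP idea: while an app is suspended on the
    phone, the proxy (running on another network device) keeps its network
    presence alive and wakes the app only when an event arrives.  All names
    here are hypothetical, not taken from the paper."""

    def __init__(self, app_id, wake_phone):
        self.app_id = app_id          # application whose presence is maintained
        self.wake_phone = wake_phone  # callback that resumes the app on the phone
        self.suspended = False
        self.pending = queue.Queue()  # events buffered while the app is down

    def suspend(self):
        # Phone-side runtime stops the app; its sessions stay open at the proxy.
        self.suspended = True

    def on_network_event(self, event):
        # Invoked by the proxy's long-lived connections (e.g., an IM session).
        if self.suspended:
            self.pending.put(event)       # hold the event for the resumed app
            self.wake_phone(self.app_id)  # e.g., a new message triggers resume
            self.suspended = False
        # If the app is running, it handles the event directly.
```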
Abstract:
Developing a desirable framework for handling inconsistencies in software requirements specifications is a challenging problem. It has been widely recognized that the relative priority of requirements can help developers make the trade-off decisions necessary for resolving conflicts. However, in distributed development approaches such as viewpoints-based development, different stakeholders may assign different levels of priority to the same shared requirements statement from their own perspectives. This disagreement in the local priority levels assigned to the same shared requirements statement often puts developers in a dilemma during inconsistency handling. The main contribution of this paper is a prioritized merging-based framework for handling inconsistency in distributed software requirements specifications. Given a set of distributed, inconsistent requirements collections with local prioritizations, we first construct a requirements specification with a prioritization from an overall perspective. We provide two approaches to constructing a requirements specification with a global prioritization: a merging-based construction and a priority vector-based construction. We then derive proposals for handling inconsistencies from the globally prioritized requirements specification in terms of prioritized merging. Moreover, from the overall perspective, these proposals may be viewed as the most appropriate modifications of the given inconsistent requirements specification with respect to the ordering relation over all the consistent subsets of the specification. Finally, we consider applying negotiation-based techniques to viewpoints so as to identify an acceptable common proposal from among these proposals.
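As a rough illustration of the idea (not the paper's actual merging-based or priority vector-based constructions), one can derive a single repair proposal by aggregating the stakeholders' local priorities into a global one and then admitting requirements greedily in global priority order, subject to consistency. The aggregation rule and the consistency oracle below are assumptions made for the sketch.

```python
def prioritized_merge(stakeholder_priorities, consistent):
    """Sketch of one way to derive a repair proposal from locally prioritized
    requirements.  `stakeholder_priorities` maps each requirement to a list of
    priority levels (one per stakeholder, lower = more important); `consistent`
    is an oracle deciding whether a set of requirements is consistent.  This is
    an illustrative greedy scheme, not the paper's exact constructions."""
    # Global priority: here, simply the best (minimum) local level.
    global_priority = {r: min(levels) for r, levels in stakeholder_priorities.items()}
    proposal = set()
    # Admit requirements in order of global priority, skipping any requirement
    # that would make the accumulated set inconsistent.
    for req in sorted(global_priority, key=global_priority.get):
        if consistent(proposal | {req}):
            proposal.add(req)
    return proposal
```

The greedy pass guarantees the result is consistent and maximal with respect to the chosen ordering, which mirrors the role the paper assigns to the ordering relation over consistent subsets.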
Abstract:
Accurate in silico models for the quantitative prediction of the activity of G protein-coupled receptor (GPCR) ligands would greatly facilitate the process of drug discovery and development. Several methodologies have been developed based on the properties of the ligands, on the direct study of receptor-ligand interactions, or on a combination of both approaches. Ligand-based three-dimensional quantitative structure-activity relationship (3D-QSAR) techniques, which do not require knowledge of the receptor structure, were historically the first to be applied to the prediction of the activity of GPCR ligands. They are generally endowed with robustness and good ranking ability; however, they are highly dependent on training sets. Structure-based techniques generally do not provide the level of accuracy necessary to yield meaningful rankings when applied to GPCR homology models. However, they are essentially independent of training sets and are sufficiently accurate to allow an effective discrimination between binders and nonbinders, thus qualifying as viable lead-discovery tools. The combination of ligand- and structure-based methodologies, in the form of receptor-based 3D-QSAR and ligand- and structure-based consensus models, results in robust and accurate quantitative predictions. The contribution of the structure-based component to these combined approaches is expected to become more substantial and effective in the future, as more sophisticated scoring functions are developed and more detailed structural information on GPCRs is gathered.
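The simplest form of consensus modelling combines a ligand-based prediction with a structure-based (docking) score. The sketch below is one generic way to do that, using rank normalization and a fixed weight; the weighting and normalization are assumptions for illustration only, not the specific consensus models discussed in the review.

```python
import numpy as np

def consensus_score(qsar_pred, docking_score, w=0.5):
    """Toy sketch of a ligand/structure-based consensus model: rank-normalize
    each score so the two scales are comparable, then take a weighted average.
    Higher output = predicted more active."""
    def rank_normalize(x):
        ranks = np.argsort(np.argsort(x))   # ranks 0..n-1
        return ranks / (len(x) - 1)         # map to [0, 1]
    qsar = np.asarray(qsar_pred, dtype=float)
    dock = np.asarray(docking_score, dtype=float)
    # Higher QSAR prediction = more active; lower (more negative) docking
    # score = better, so flip its sign before normalizing.
    return w * rank_normalize(qsar) + (1 - w) * rank_normalize(-dock)
```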
Abstract:
This research aims to use the multivariate geochemical dataset generated by the Tellus project to investigate the appropriate use of transformation methods for maintaining the integrity of geochemical data and the inherently constrained behaviour of multivariate relationships. The widely used normal score transform is compared with a stepwise conditional transform technique. The Tellus Project, managed by GSNI and funded by the Department of Enterprise, Trade and Investment and the EU's Building Sustainable Prosperity Fund, is the most comprehensive geological mapping project ever undertaken in Northern Ireland. Previous studies have demonstrated spatial variability in the Tellus data, but geostatistical analysis and interpretation of the datasets require a methodology that reproduces the inherently complex multivariate relations. Previous investigation of the Tellus geochemical data has included the use of Gaussian-based techniques. However, earth science variables are rarely Gaussian, so transformation of the data is integral to the approach, and the Tellus dataset provides an opportunity to investigate the appropriate use of transformation methods as required for Gaussian-based geostatistical analysis. In particular, the stepwise conditional transform is investigated and developed for the geochemical datasets obtained as part of the Tellus project. The transform is applied to four variables in a bivariate nested fashion because of the limited availability of data. Simulation of these transformed variables is then carried out, along with a corresponding back transformation to original units. Results show that the stepwise transform succeeds in reproducing both the univariate statistics and the complex bivariate relations exhibited by the data. Greater fidelity to multivariate relationships will improve uncertainty models, which are required for consequent geological, environmental and economic inferences.
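For reference, the baseline normal score transform against which the stepwise approach is compared maps each value to a standard normal quantile via its empirical CDF position. A minimal sketch (omitting the tie handling and declustering weights a production implementation would need):

```python
import numpy as np
from scipy import stats

def normal_score_transform(x):
    """Univariate normal score transform used in Gaussian-based geostatistics:
    replace each value by the standard normal quantile of its empirical CDF
    position.  Declustering weights and careful tie handling are omitted."""
    x = np.asarray(x, dtype=float)
    ranks = stats.rankdata(x)          # 1..n, average ranks for ties
    p = (ranks - 0.5) / len(x)         # plotting positions in (0, 1)
    return stats.norm.ppf(p)           # standard normal quantiles
```

The stepwise conditional transform applies the same idea to each successive variable within classes of the previously transformed variable, which is what allows it to preserve the bivariate relations that a variable-by-variable normal score transform destroys.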
Abstract:
Reliability has emerged as a critical design constraint, especially in memories. Designers go to great lengths to guarantee fault-free operation of the underlying silicon by adopting redundancy-based techniques, which essentially try to detect and correct every single error. However, such techniques come at the cost of large area, power and performance overheads, which makes many researchers doubt their efficiency, especially for error-resilient systems where 100% accuracy is not always required. In this paper, we present an alternative method that focuses on confining the output error induced by reliability issues. Focusing on memory faults, rather than correcting every single error, the proposed method exploits the statistical characteristics of the target application and replaces any erroneous data with the best available estimate of that data. To realize the proposed method, a RISC processor is augmented with custom instructions and special-purpose functional units. We apply the method on the enhanced processor by studying the statistical characteristics of the various algorithms involved in a popular multimedia application. Our experimental results show that, in contrast to state-of-the-art fault-tolerance approaches, we are able to reduce runtime and area overheads by 71.3% and 83.3%, respectively.
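The core policy of error confinement is easy to state in software, even though the paper realizes it with custom RISC instructions and functional units. The fragment below is an illustrative model only; the interface and the choice of "last known-good value" as the estimate are assumptions.

```python
def confine_error(addr, read_ok, value, estimates):
    """Sketch of statistical error confinement for memory faults: instead of
    correcting every error, a detected-faulty read is replaced by the best
    available application-specific estimate (here, the last known-good value
    at that address; a profiled stream mean would be another candidate)."""
    if read_ok:
        estimates[addr] = value   # remember the last known-good value
        return value
    # Fault detected: return the estimate rather than the corrupt word, so
    # the output error stays bounded for error-resilient applications.
    return estimates.get(addr, 0)
```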
Abstract:
Accurate and efficient grid-based techniques for the solution of the time-dependent Schrödinger equation for few-electron diatomic molecules irradiated by intense, ultrashort laser pulses are described. These are based on hybrid finite-difference/Lagrange-mesh techniques. The methods are applied in three scenarios, namely H2+ with fixed internuclear separation, H2+ with vibrating nuclei, and H2 with fixed internuclear separation, and illustrative results are presented.
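As a point of reference for the finite-difference side of such schemes, the sketch below propagates a one-dimensional time-dependent Schrödinger equation by one Crank-Nicolson step (atomic units). This is the textbook building block, not the paper's hybrid finite-difference/Lagrange-mesh method, which also includes the laser coupling and molecular structure.

```python
import numpy as np

def crank_nicolson_step(psi, V, dx, dt):
    """One Crank-Nicolson step for i dpsi/dt = [-1/2 d^2/dx^2 + V] psi on a
    uniform grid (3-point stencil, atomic units).  Unitary to O(dt^3) per
    step; boundaries are implicitly zero (hard walls)."""
    n = len(psi)
    # Discrete Hamiltonian: kinetic 3-point stencil plus diagonal potential.
    H = (np.diag(V + 1.0 / dx**2)
         + np.diag(-0.5 / dx**2 * np.ones(n - 1), 1)
         + np.diag(-0.5 / dx**2 * np.ones(n - 1), -1))
    A = np.eye(n) + 0.5j * dt * H   # implicit half step
    B = np.eye(n) - 0.5j * dt * H   # explicit half step
    return np.linalg.solve(A, B @ psi)
```

In practice the tridiagonal system would be solved in O(n) rather than with a dense solve; the dense form above is kept for clarity.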
Abstract:
In this paper, we present a new approach to visual speech recognition which improves contextual modelling by combining Inter-Frame Dependent and Hidden Markov Models. This approach captures contextual information in visual speech that may be lost using a Hidden Markov Model alone. We apply contextual modelling to a large speaker-independent isolated-digit recognition task and compare our approach to two commonly adopted feature-based techniques for incorporating speech dynamics. Results are presented for the baseline feature-based systems and the combined modelling technique. We show that both of these techniques achieve similar levels of performance when used independently; however, significant improvements can be achieved by combining the two. In particular, we report a relative Word Error Rate improvement in excess of 17% over our best baseline system.
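Delta (difference) features are the classic example of a feature-based technique for injecting speech dynamics into per-frame features before HMM modelling; the paper's baselines are of this general family, though their exact configuration is not reproduced here. A minimal sketch of the standard regression-based delta formula:

```python
import numpy as np

def delta_features(frames, window=2):
    """Compute delta features d_t = sum_k k*(c_{t+k} - c_{t-k}) / (2*sum_k k^2)
    for a (T, D) array of static features, with edge padding at the ends."""
    T = len(frames)
    padded = np.pad(frames, ((window, window), (0, 0)), mode="edge")
    num = sum(k * (padded[window + k : window + k + T]
                   - padded[window - k : window - k + T])
              for k in range(1, window + 1))
    den = 2 * sum(k * k for k in range(1, window + 1))
    return num / den
```

The deltas are typically appended to the static features, doubling the feature dimension fed to the HMM.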
Abstract:
Relatively little is known about the biology and ecology of the world's largest (heaviest) bony fish, the ocean sunfish Mola mola, despite its worldwide occurrence in temperate and tropical seas. Studies are now emerging that require many common perceptions about sunfish behaviour and ecology to be re-examined. Indeed, the long-held view that ocean sunfish are an inactive, passively drifting species seems to be entirely misplaced. Technological advances in marine telemetry are revealing distinct behavioural patterns and protracted seasonal movements. Extensive forays by ocean sunfish into the deep ocean have been documented, and broad-scale surveys, together with molecular and laboratory-based techniques, are addressing the connectivity and trophic role of these animals. These emerging molecular and movement studies suggest that locally distinct populations may be prone to depletion through bycatch in commercial fisheries. Rising interest in ocean sunfish, highlighted by the increase in recent publications, warrants a thorough review of the biology and ecology of this species. Here we review the taxonomy, morphology, geography, diet, locomotion, vision, movements, foraging ecology, reproduction and species interactions of M. mola. We present a summary of current conservation issues and suggest methods for addressing fundamental gaps in our knowledge.
Developing a simple, rapid method for identifying and monitoring jellyfish aggregations from the air
Abstract:
Within the marine environment, aerial surveys have historically centred on apex predators, such as pinnipeds, cetaceans and sea birds. However, it is becoming increasingly apparent that the utility of this technique may also extend to subsurface species such as pre-spawning fish stocks and aggregations of jellyfish that occur close to the surface. In light of this, we tested the utility of aerial surveys to provide baseline data for 3 poorly understood scyphozoan jellyfish found throughout British and Irish waters: Rhizostoma octopus, Cyanea capillata and Chrysaora hysoscella. Our principal objectives were to develop a simple sampling protocol to identify and quantify surface aggregations, assess their consistency in space and time, and consider the overall applicability of this technique to the study of gelatinous zooplankton. This approach provided a general understanding of range and relative abundance for each target species, with greatest suitability to the study of R. octopus. For this species it was possible to identify and monitor extensive, temporally consistent and previously undocumented aggregations throughout the Irish Sea, an area spanning thousands of square kilometres. This finding has pronounced implications for ecologists and fisheries managers alike and, moreover, draws attention to the broad utility of aerial surveys for the study of gelatinous aggregations beyond the range of conventional ship-based techniques.
Abstract:
This focused review article discusses in detail all available high-resolution small-molecule ligand/G-quadruplex structural data derived from crystallographic and NMR-based techniques, in an attempt to understand the key factors in ligand binding and to highlight the biological importance of these complexes. In contrast to duplex DNA, G-quadruplexes are four-stranded nucleic acid structures folded from guanine-rich repeat sequences and stabilized by the stacking of guanine G-quartets and extensive Watson-Crick/Hoogsteen hydrogen bonding. Thermally stable, these topologies can play a role in telomere regulation and gene expression. The core structures of G-quadruplexes form stable scaffolds, while the loops have been shown, upon the addition of small-molecule ligands, to be sufficiently adaptable to generate new and extended binding platforms for ligands to associate with, either by extending G-quartet surfaces or by forming additional planar dinucleotide pairings. Many of these structurally characterised loop rearrangements were totally unexpected, opening up new opportunities for the design of selective ligands. However, these rearrangements significantly complicate attempts to rationally design ligands against well-defined but unbound topologies, as seen for the series of naphthalene diimide complexes. Drawing together previous findings, and with the introduction of two new crystallographic quadruplex/ligand structures, we aim to expand the understanding of the structural adaptations available to quadruplexes in the presence of ligands, thereby aiding the design of new selective entities.
Abstract:
Conflicting results have been reported on the detection of paramyxovirus transcripts in Paget's disease, and a possible explanation is differences in the sensitivity of RT-PCR methods for detecting virus. In a blinded study, we found no evidence to suggest that laboratories that failed to detect viral transcripts had less sensitive RT-PCR assays, and we did not detect measles or distemper transcripts in Paget's samples using the most sensitive assays evaluated.
Introduction: There is conflicting evidence on the possible role of persistent paramyxovirus infection in Paget's disease of bone (PDB). Some workers have detected measles virus (MV) or canine distemper virus (CDV) transcripts in cells and tissues from patients with PDB, but others have failed to confirm this finding. A possible explanation might be differences in the sensitivity of RT-PCR methods for detecting virus. Here we performed a blinded comparison of the sensitivity of different RT-PCR-based techniques for MV and CDV detection in different laboratories and used the most sensitive assays to screen for evidence of viral transcripts in bone and blood samples derived from patients with PDB.
Materials and Methods: Participating laboratories analyzed samples spiked with known amounts of MV and CDV transcripts and control samples that did not contain viral nucleic acids. All analyses were performed on a blinded basis.
Results: The limit of detection for CDV was 1000 viral transcripts in three laboratories (Aberdeen, Belfast, and Liverpool) and 10,000 transcripts in another laboratory (Manchester). The limit of detection for MV was 16 transcripts in one laboratory (NIBSC), 1000 transcripts in two laboratories (Aberdeen and Belfast), and 10,000 transcripts in two laboratories (Liverpool and Manchester). An assay previously used by a U.S.-based group to detect MV transcripts in PDB had a sensitivity of 1000 transcripts. One laboratory (Manchester) detected CDV transcripts in a negative control and in two samples that had been spiked with MV. None of the other laboratories had false-positive results for MV or CDV, and no evidence of viral transcripts was found on analysis of 12 PDB samples using the most sensitive RT-PCR assays for MV and CDV.
Conclusions: We found that RT-PCR assays used by different laboratories differed in their sensitivity for detecting CDV and MV transcripts, but found no evidence to suggest that laboratories that previously failed to detect viral transcripts had less sensitive RT-PCR assays than those that detected viral transcripts. False-positive results were observed in one laboratory, and we failed to detect paramyxovirus transcripts in PDB samples using the most sensitive assays evaluated. Our results show that the failure of some laboratories to detect viral transcripts is unlikely to be caused by problems with assay sensitivity and highlight the fact that contamination can be an issue when searching for pathogens by sensitive RT-PCR-based techniques.
Abstract:
This article discusses a recent ensemble composition entitled Starbog, which was toured and broadcast in Britain in 2006. The composition of Starbog focused on developing working methods that combined computer-based techniques (using OpenMusic) with more subconscious means of generating musical ideas. The challenge in achieving this was as much aesthetic/philosophical as technical, and the present article is intended as a 'sounding' that focuses on the influence OpenMusic has had on the composer's music, rather than documenting the often simple application of algorithms.
Abstract:
Despite considerable advances in reducing the production of dioxin-like toxicants in recent years, contamination of the food chain still occasionally occurs, resulting in huge losses to the agri-food sector and risks to human health through exposure. Dioxin-like toxicity is exhibited by a range of stable and bioaccumulative compounds, including polychlorinated dibenzo-p-dioxins (PCDDs) and dibenzofurans (PCDFs), produced by certain types of combustion, and man-made coplanar polychlorinated biphenyls (PCBs), as found in electrical transformer oils. Because dioxin-like compounds act by a common mode of action, biomarker-based techniques are a potentially useful tool for detecting exposure; however, the influence of co-contaminating toxicants on such approaches needs to be considered. To assess the impact of possible interactions, the biological responses of H4IIE cells challenged with 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) in combination with PCB-52 and benzo[a]pyrene (BaP) were evaluated by a number of methods in this study. Ethoxyresorufin-O-deethylase (EROD) induction in TCDD-exposed cells was suppressed by increasing concentrations of PCB-52, PCB-153, or BaP up to 10 μM. BaP levels below 1 μM suppressed TCDD-stimulated EROD induction, but at higher concentrations EROD induction exceeded the maximum observed when cells were treated with TCDD alone. A similar biphasic interaction with TCDD co-exposure was noted for BaP in the AlamarBlue assay and, to a lesser extent, for PCB-52. Surface-enhanced laser desorption/ionization time-of-flight mass spectrometry (SELDI-TOF) profiling of the peptidomic responses of cells exposed to compound combinations was compared. Cells co-exposed to TCDD in the presence of BaP or PCB-52 produced the most differentiated spectra, with a substantial number of non-additive interactions observed. These findings suggest that interactions between dioxin and other toxicants create novel, additive, and non-additive effects, which may be more indicative of the types of responses seen in exposed animals than those of single exposures to the individual compounds.
Abstract:
In this paper we continue our investigation into the development of computational-science software based on the identification and formal specification of Abstract Data Types (ADTs) and their implementation in Fortran 90. In particular, we consider the consequences of using pointers when implementing a formally specified ADT in Fortran 90. Our aim is to highlight the resulting conflict between the goal of information hiding, which is central to the ADT methodology, and the space efficiency of the implementation. We show that the issue of storage recovery cannot be avoided by the ADT user, and we present a range of implementations of a simple ADT to illustrate various approaches to satisfactory storage management. Finally, we propose a set of guidelines for implementing ADTs using pointers in Fortran 90. These guidelines offer a graceful way to provide disposal operations in Fortran 90. Such an approach is desirable because Fortran 90 does not provide the automatic garbage collection offered by many object-oriented languages, including Eiffel, Java, Smalltalk, and Simula.
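The discipline the guidelines advocate, pairing every pointer-based ADT with an explicit disposal operation that the client must call, can be sketched as follows. The sketch is transliterated into Python purely for illustration (Python itself is garbage-collected); read `dispose` as the operation that would walk the structure and DEALLOCATE its hidden pointer components in Fortran 90.

```python
class Stack:
    """Illustrative ADT with an explicit disposal operation, mirroring the
    create/dispose discipline for pointer-based ADTs in Fortran 90.  The
    internal representation is hidden from clients (information hiding)."""

    def __init__(self):
        self._items = []   # stands in for a pointer-linked representation

    def push(self, x):
        self._items.append(x)

    def pop(self):
        return self._items.pop()

    def dispose(self):
        # In Fortran 90 this would DEALLOCATE each node; the client must
        # call it, since no collector will reclaim the storage.
        self._items = None
```

The tension the paper highlights is visible even here: `dispose` is part of the public interface only because storage recovery leaks through the abstraction boundary.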
Abstract:
Despite ethical and technical concerns, the in vivo method, more commonly referred to as the mouse bioassay (MBA), is employed globally as a reference method for phycotoxin analysis in shellfish. This is particularly the case for paralytic shellfish poisoning (PSP) and emerging toxin monitoring. A high-performance liquid chromatography method with fluorescence detection (HPLC-FLD) has been developed for PSP toxin analysis, but owing to difficulties and limitations in the method, this procedure has not been fully implemented as a replacement. Detection of the diarrhetic shellfish poisoning (DSP) toxins has moved towards LC-mass spectrometry (MS) analysis, whereas the analysis of the amnesic shellfish poisoning (ASP) toxin domoic acid is performed by HPLC. Although alternative methods of detection to the MBA have been described, each procedure is specific to a particular toxin and its analogues, with each group of toxins requiring separate analysis using different extraction procedures and analytical equipment. In addition, the detection of unregulated and emerging toxins must be considered when replacing the MBA. The ideal scenario for the monitoring of phycotoxins in shellfish and seafood would be multiple-toxin detection on a single bioanalytical sensing platform, i.e. 'an artificial mouse'. Immunologically based techniques, and in particular surface plasmon resonance (SPR) technology, have been shown to be a highly promising bioanalytical tool offering rapid, real-time detection and requiring minimal quantities of toxin standards. A Biacore Q and a prototype multiplex SPR biosensor were evaluated for their fitness for purpose for the simultaneous detection of key regulated phycotoxin groups and the emerging toxin palytoxin. The prototype, deemed more applicable because of its separate flow channels, achieved detection limits (IC20) of 4,000, 36, 144 and 46 μg/kg of mussel for domoic acid, okadaic acid, saxitoxin and palytoxin calibration curves in shellfish, respectively. A one-step extraction procedure demonstrated recoveries greater than 80% for all toxins. For validation of the method at the 95% confidence limit, the decision limits (CCα) determined from an extracted matrix curve were calculated to be 450, 36 and 24 μg/kg, and the detection capability (CCβ) as a screening method is ≤10 mg/kg, ≤160 μg/kg and ≤400 μg/kg for domoic acid, okadaic acid and saxitoxin, respectively.
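Detection limits such as the IC20 values quoted above are typically read off a fitted inhibition calibration curve. The sketch below shows one generic way to do this with a four-parameter logistic (4PL) model; the model choice and starting values are assumptions for illustration, not details taken from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, top, bottom, ic50, slope):
    """Four-parameter logistic for an inhibition-type SPR calibration curve:
    the signal falls from `top` towards `bottom` as toxin concentration rises."""
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** slope)

def fit_ic20(conc, signal):
    """Fit the 4PL to calibration data, then solve for the concentration
    giving 20 % inhibition of the maximal signal (the IC20)."""
    p0 = [max(signal), min(signal), np.median(conc), 1.0]  # rough start values
    (top, bottom, ic50, slope), _ = curve_fit(four_pl, conc, signal, p0=p0)
    # 20 % inhibition: the signal has dropped 20 % of the way from top to
    # bottom; invert the 4PL at that signal level.
    target = top - 0.2 * (top - bottom)
    return ic50 * ((top - bottom) / (target - bottom) - 1.0) ** (1.0 / slope)
```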