Abstract:
In this paper we address energy efficiency issues of Information Centric Networking (ICN) architectures. In the proposed framework, we investigate the impact of ICN architectures on the energy consumption of networking hardware devices and compare it with the energy consumption of other content dissemination methods. In particular, we investigate the consequences of caching in ICN from the energy-efficiency perspective, taking into account the energy consumption of the different hardware components in ICN architectures. Based on the results of the analysis, we address practical issues regarding the possible deployment and evolution of ICN from an energy-efficiency perspective. Finally, we summarize our findings and discuss future perspectives on the energy efficiency of Information-Centric Networks.
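The trade-off at the heart of this kind of analysis, in-network caching hardware versus additional transport hops, can be illustrated with a minimal back-of-the-envelope model. The Python sketch below is not the paper's framework; the per-bit energy figures, hop counts, and cache hit ratio are hypothetical placeholders chosen only to show the structure of such a comparison.

```python
def delivery_energy_joules(content_bits, hops_to_origin, hops_to_cache,
                           hit_ratio, e_transport=2e-8, e_cache=5e-9):
    """Toy per-request energy model for content delivery.

    e_transport: assumed energy per bit per hop for forwarding hardware (J/bit).
    e_cache:     assumed energy per bit for cache lookup/storage access (J/bit).
    Both values are illustrative placeholders, not measured figures.
    """
    # Cache hit: the content travels only from the in-network cache.
    hit_energy = content_bits * (hops_to_cache * e_transport + e_cache)
    # Cache miss: the content travels the full path from the origin server.
    miss_energy = content_bits * hops_to_origin * e_transport
    return hit_ratio * hit_energy + (1.0 - hit_ratio) * miss_energy


if __name__ == "__main__":
    # Compare an ICN-style cached delivery with a purely origin-based one.
    size = 8 * 10**9  # a 1 GB object, in bits
    icn = delivery_energy_joules(size, hops_to_origin=10, hops_to_cache=2, hit_ratio=0.6)
    origin_only = delivery_energy_joules(size, hops_to_origin=10, hops_to_cache=10, hit_ratio=0.0)
    print(f"ICN-style delivery:   {icn:.1f} J per request")
    print(f"Origin-only delivery: {origin_only:.1f} J per request")
```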
Abstract:
This paper proposes the Optimized Power save Algorithm for continuous Media Applications (OPAMA) to improve end-user device energy efficiency. OPAMA enhances the legacy Power Save Mode (PSM) of IEEE 802.11 by taking application-specific requirements into consideration, combined with data aggregation techniques. By establishing a balanced cost/benefit trade-off between performance and energy consumption, OPAMA is able to improve energy efficiency while keeping the end-user experience at a desired level. OPAMA was assessed in the OMNeT++ simulator using real traces of variable-bitrate video streaming applications. The results showed the capability to enhance energy efficiency, achieving savings of up to 44% when compared with the IEEE 802.11 legacy PSM.
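The core idea, trading a bounded amount of buffering delay for longer radio sleep periods, can be sketched as follows. This is a simplified illustration of aggregation-aware power saving under an assumed delay budget, not the actual OPAMA algorithm or the IEEE 802.11 PSM state machine; the parameter values are invented.

```python
def sleep_beacons(delay_budget_ms, beacon_interval_ms=100, max_skip=10):
    """How many consecutive beacon intervals the radio could sleep while
    still serving aggregated media frames within the application's delay budget.

    The parameters and the linear relationship are illustrative assumptions.
    """
    if delay_budget_ms < beacon_interval_ms:
        return 1  # wake at every beacon, as in legacy PSM
    return min(delay_budget_ms // beacon_interval_ms, max_skip)


# Example: a streaming application tolerating 400 ms of extra latency lets the
# station skip several beacons and drain the aggregated frames in one burst.
print(sleep_beacons(delay_budget_ms=400))  # -> 4
print(sleep_beacons(delay_budget_ms=50))   # -> 1 (latency-sensitive traffic)
```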
Abstract:
In sheep, small ruminant lentiviruses cause an incurable, progressive, lymphoproliferative disease that affects millions of animals worldwide. Known as ovine progressive pneumonia virus (OPPV) in the U.S. and Visna/Maedi virus (VMV) elsewhere, these viruses reduce an animal's health, productivity, and lifespan. Genetic variation in the ovine transmembrane protein 154 gene (TMEM154) has previously been associated with OPPV infection in U.S. sheep. Sheep with the ancestral TMEM154 haplotype encoding glutamate (E) at position 35, and either form of an N70I variant, were highly susceptible compared to sheep homozygous for the K35 missense mutation. Our overall aim was to characterize TMEM154 in sheep from around the world to develop an efficient genetic test for reduced susceptibility. The average frequency of TMEM154 E35 among 74 breeds was 0.51, indicating that highly susceptible alleles were present in most breeds around the world. Analysis of whole-genome sequences from an international panel of 75 sheep revealed more than 1,300 previously unreported polymorphisms in a 62 kb region containing TMEM154 and confirmed that the most susceptible haplotypes were distributed worldwide. Novel missense mutations were discovered in the signal peptide (A13V) and the extracellular domains (E31Q, I74F, and I102T) of TMEM154. A matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) assay was developed to detect these, along with six previously reported missense mutations and two deletion mutations in TMEM154. In blinded trials, the call rate for the eight most common coding polymorphisms was 99.4% for 499 sheep tested, and 96.0% of the animals were assigned paired TMEM154 haplotypes (i.e., diplotypes). The widespread distribution of highly susceptible TMEM154 alleles suggests that genetic testing and selection may improve the health and productivity of infected flocks.
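As an illustration of the kind of summary reported above (an allele frequency and paired-haplotype calls at a single site), the following sketch computes both from a handful of genotype records. The sample data are invented, and this is not the study's genotyping pipeline.

```python
from collections import Counter

# Hypothetical TMEM154 position-35 genotype calls for a few animals.
# 'E' marks the ancestral, highly susceptible allele; 'K' the K35 variant.
genotypes = ["EE", "EK", "KK", "EK", "EE", "KK", "EK"]

def allele_frequency(genotypes, allele):
    """Frequency of one allele among all observed allele copies."""
    counts = Counter("".join(genotypes))
    return counts[allele] / sum(counts.values())

def diplotypes(genotypes):
    """Group animals by their paired haplotype (diplotype) at this site."""
    return Counter("".join(sorted(g)) for g in genotypes)

print(f"E35 frequency: {allele_frequency(genotypes, 'E'):.2f}")
print(diplotypes(genotypes))
```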
Abstract:
Molecular data are now widely used in epidemiological studies to investigate the transmission, distribution, biology, and diversity of pathogens. Our objective was to establish recommendations that support good scientific reporting of molecular epidemiological studies and encourage authors to consider specific threats to valid inference. The Strengthening the Reporting of Molecular Epidemiology for Infectious Diseases (STROME-ID) statement builds upon the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) initiative. The STROME-ID statement was developed by a working group of epidemiologists, statisticians, bioinformaticians, virologists, and microbiologists with expertise in the control of infection and communicable diseases. The statement focuses on issues relating to the reporting of epidemiological studies of infectious diseases using molecular data that were not addressed by STROBE. STROME-ID addresses terminology, measures of genetic diversity within pathogen populations, laboratory methods, sample collection, use of molecular markers, molecular clocks, timeframes, multiple-strain infections, non-independence of infectious-disease data, missing data, ascertainment bias, consistency between molecular and epidemiological data, and ethical considerations with respect to infectious-disease research. In total, 20 items were added to the 22-item STROBE checklist. When used, the STROME-ID recommendations should advance the quality and transparency of scientific reporting, with clear benefits for evidence reviews and health-policy decision making.
Abstract:
A two-pronged approach for the automatic quantitation of multiple sclerosis (MS) lesions on magnetic resonance (MR) images has been developed. This method includes the design and use of a pulse sequence for improved lesion-to-tissue contrast (LTC) and seeks to identify and minimize the sources of false lesion classifications in segmented images. The new pulse sequence, referred to as AFFIRMATIVE (Attenuation of Fluid by Fast Inversion Recovery with MAgnetization Transfer Imaging with Variable Echoes), improves the LTC, relative to spin-echo images, by combining Fluid-Attenuated Inversion Recovery (FLAIR) and Magnetization Transfer Contrast (MTC). In addition to acquiring fast FLAIR/MTC images, the AFFIRMATIVE sequence simultaneously acquires fast spin-echo (FSE) images for spatial registration of images, which is necessary for accurate lesion quantitation. Flow has been found to be a primary source of false lesion classifications. Therefore, an imaging protocol and reconstruction methods are developed to generate "flow images" which depict both coherent (vascular) and incoherent (CSF) flow. An automatic technique is designed for the removal of extra-meningeal tissues, since these are known to be sources of false lesion classifications. A retrospective, three-dimensional (3D) registration algorithm is implemented to correct for patient movement which may have occurred between AFFIRMATIVE and flow imaging scans. Following application of these pre-processing steps, images are segmented into white matter, gray matter, cerebrospinal fluid, and MS lesions based on AFFIRMATIVE and flow images using an automatic algorithm. All algorithms are seamlessly integrated into a single MR image analysis software package. Lesion quantitation has been performed on images from 15 patient volunteers. The total processing time is less than two hours per patient on a SPARCstation 20. The automated nature of this approach should provide an objective means of monitoring the progression, stabilization, and/or regression of MS lesions in large-scale, multi-center clinical trials.
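The final step described above labels each voxel as white matter, gray matter, cerebrospinal fluid, or lesion based on its intensities in the AFFIRMATIVE and flow images. Below is a minimal nearest-class-mean sketch of that kind of two-channel classification; the class means and the tiny synthetic "slice" are invented for illustration and do not reproduce the dissertation's algorithm.

```python
import numpy as np

# Hypothetical class means in a 2-D feature space:
# (AFFIRMATIVE intensity, flow-image intensity), arbitrary units.
CLASS_MEANS = {
    "white matter": (0.45, 0.05),
    "gray matter":  (0.60, 0.05),
    "CSF":          (0.20, 0.60),
    "MS lesion":    (0.90, 0.05),
}

def classify(affirmative, flow):
    """Label each voxel with the class whose mean is nearest in feature space."""
    features = np.stack([affirmative, flow], axis=-1)          # (..., 2)
    names = list(CLASS_MEANS)
    means = np.array([CLASS_MEANS[n] for n in names])          # (4, 2)
    d2 = ((features[..., None, :] - means) ** 2).sum(axis=-1)  # (..., 4)
    return np.array(names)[d2.argmin(axis=-1)]

# Tiny synthetic 2x2 "slice" in each channel.
aff  = np.array([[0.44, 0.91], [0.21, 0.62]])
flow = np.array([[0.04, 0.06], [0.58, 0.03]])
print(classify(aff, flow))
```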
Abstract:
Many studies in biostatistics deal with binary data. Some of these studies involve correlated observations, which can complicate the analysis of the resulting data. Studies of this kind typically arise when a high degree of commonality exists between test subjects. If there is a natural hierarchy in the data, multilevel analysis is an appropriate tool for the analysis. Two examples are measurements on identical twins and studies of symmetrical organs or appendages, as in ophthalmic studies. Although this type of matching appears ideal for the purposes of comparison, analysis of the resulting data while ignoring the effect of intra-cluster correlation has been shown to produce biased results. This paper will explore the use of multilevel modeling of simulated binary data with predetermined levels of correlation. Data will be generated using the Beta-Binomial method with varying degrees of correlation between the lower-level observations. The data will be analyzed using the multilevel software package MLwiN (Woodhouse et al., 1995). Comparisons between the specified intra-cluster correlation of these data and the correlations estimated using multilevel analysis will be used to examine the accuracy of this technique in analyzing this type of data.
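A minimal sketch of the data-generation step described above follows. It draws cluster-level success probabilities from a Beta distribution whose parameters are chosen so that the intra-cluster correlation equals a specified value, using the standard beta-binomial identity ICC = 1/(a + b + 1). The cluster sizes and parameter values are arbitrary, and the sketch does not reproduce the thesis's simulation code.

```python
import numpy as np

def simulate_beta_binomial(n_clusters, cluster_size, p, icc, rng=None):
    """Generate correlated binary outcomes via the beta-binomial construction.

    p:   overall success probability.
    icc: desired intra-cluster correlation (0 < icc < 1).
    For a Beta(a, b) mixing distribution, ICC = 1 / (a + b + 1),
    so a + b = (1 - icc) / icc, with a / (a + b) = p.
    """
    rng = np.random.default_rng() if rng is None else rng
    total = (1.0 - icc) / icc
    a, b = p * total, (1.0 - p) * total
    cluster_probs = rng.beta(a, b, size=n_clusters)
    return rng.binomial(1, cluster_probs[:, None], size=(n_clusters, cluster_size))

# Example: 500 clusters of size 2 (e.g., paired organs), p = 0.3, ICC = 0.4.
data = simulate_beta_binomial(500, 2, p=0.3, icc=0.4, rng=np.random.default_rng(1))
print(data.mean())  # should be near 0.3
```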
Abstract:
After 5 years of conceptualizing, investigating, and writing about corrective experiences (CEs), we (the authors of this chapter) met to talk about what we learned. In this chapter, we summarize our joint understanding of (a) the definition of CEs; (b) the contexts in which CEs occur; (c) client, therapist, and external factors that facilitate CEs; (d) the consequences of CEs; and (e) ideas for future theoretical, clinical, empirical, and training directions. As will become evident, the authors of this chapter, who represent a range of theoretical orientations, reached consensus on some CE-related topics but encountered controversy and lively debate about other topics.
Abstract:
Well-being is an important component of physical and psychological health and an important source for individual development. The article aims to give an overview of different research traditions and definitions of well-being and to outline the basic ideas of research into well-being. It also examines well-being in school, the sources and predictors of well-being, as well as the function of well-being in educational settings. Both student and teacher well-being are considered.
Abstract:
A dynamical model, developed to account for the observed major variations of global ice mass and atmospheric CO2 during the late Cenozoic, is used to provide a quantitative demonstration of the possibility that the anthropogenically-forced increase of atmospheric CO2, if maintained over a long period of time (perhaps by tectonic forcing), could displace the climatic system from an unstable regime of oscillating ice ages into a more stable regime representative of the pre-Pleistocene. This stable regime is characterized by orbitally-forced oscillations that are of much weaker amplitude than prevailed during the Pleistocene.
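The qualitative transition described above, from a self-sustained oscillatory regime to a damped, stable one as a forcing parameter is pushed past a threshold, can be illustrated with a toy nonlinear oscillator (a Hopf normal form). This is only a schematic analogy, not the paper's ice-mass/CO2 model; the parameter mu below simply stands in for a slowly varying forcing.

```python
import numpy as np

def hopf_amplitude(mu, r0=0.5, dt=0.01, steps=20000):
    """Integrate dr/dt = mu*r - r**3 and return the long-run oscillation amplitude.

    For mu > 0 the system settles onto a limit cycle (sustained oscillations);
    for mu <= 0 the amplitude decays toward zero (a stable fixed point).
    """
    r = r0
    for _ in range(steps):
        r += dt * (mu * r - r**3)
    return r

for mu in (0.25, 0.0, -0.25):
    print(f"mu = {mu:+.2f} -> long-run amplitude ~ {hopf_amplitude(mu):.3f}")
```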
Abstract:
Source materials like fine art, over-sized, fragile maps, and delicate artifacts have traditionally been digitally converted through the use of controlled lighting and high-resolution scanners and camera backs. In addition, the capture of items such as general and special collections bound monographs has recently grown, both through consortial efforts like the Internet Archive's Open Content Alliance and locally at the individual institution level. These projects, in turn, have introduced increasingly higher-resolution consumer-grade digital single lens reflex cameras, or "DSLRs," as a significant part of the general cultural heritage digital conversion workflow. Central to the authors' discussion is the fact that both camera backs and DSLRs commonly share the ability to capture native raw file formats. Because these formats include such advantages as access to an image's raw mosaic sensor data within their architecture, many institutions choose raw for initial capture due to its high bit depth and unprocessed nature. However, to date these same raw formats, so important to many at the point of capture, have yet to be considered "archival" within most published still imaging standards, if they are considered at all. In many workflows, raw files are deleted after more traditionally "archival" uncompressed TIFF or JPEG 2000 files have been derived downstream from their raw source formats [1][2]. As a result, the authors examine the nature of raw anew and consider the basic questions: Should raw files be retained? What might their role be? Might they in fact form a new archival format space? Included in the discussion is a survey of assorted raw file types and their attributes. Also addressed are various sustainability issues as they pertain to archival formats, with special emphasis on both raw's positive and negative characteristics as they apply to archival practices. Current common archival workflows are also compared with possible raw-based ones. These comparisons are noted in the context of each approach's differing levels of usable captured image data, various preservation virtues, and the divergent ideas of strictly fixed renditions versus the potential for improved renditions over time. Special attention is given to the DNG raw format through a detailed inspection of a number of its structural components and the roles they play in the format's latest specification. Finally, an evaluation is made of both proprietary raw formats in general and DNG in particular as possible alternative archival formats for still imaging.
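As a concrete example of the derivation step discussed above (producing a traditionally "archival" uncompressed TIFF downstream from a raw source), the sketch below uses the third-party rawpy and tifffile libraries. These libraries, the file names, and the choice of a 16-bit output are assumptions for illustration, not a workflow endorsed by the authors.

```python
import rawpy       # assumed third-party dependency (LibRaw bindings)
import tifffile    # assumed third-party dependency for TIFF output

def derive_archival_tiff(raw_path, tiff_path):
    """Demosaic a raw capture and write a 16-bit TIFF derivative.

    Note that this derivative "bakes in" one rendition; retaining the raw
    (or DNG) source preserves the unprocessed sensor data for future
    reprocessing, which is the trade-off the paper examines.
    """
    with rawpy.imread(raw_path) as raw:
        rgb = raw.postprocess(output_bps=16)  # 16-bit RGB rendition
    tifffile.imwrite(tiff_path, rgb)

# Hypothetical usage:
# derive_archival_tiff("scan_0001.dng", "scan_0001_master.tif")
```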