964 results for norm-based coding


Relevance: 30.00%

Abstract:

In this work, we propose a novel network-coding-enabled NDN architecture for the delivery of scalable video. Our scheme uses network coding to address a problem that arises in the original NDN protocol, where optimal use of the bandwidth and caching resources requires coordination of the forwarding decisions. To optimize the performance of the proposed network-coding-based NDN protocol and make it suitable for the transmission of scalable video, we devise a novel rate allocation algorithm that decides the optimal rates of Interest messages sent by clients and intermediate nodes. This algorithm guarantees that the achieved flow of Data objects maximizes the average quality of the video delivered to the client population. To support the handling of Interest messages and Data objects when intermediate nodes perform network coding, we modify the standard NDN protocol and introduce the use of Bloom filters, which efficiently store additional information about the Interest messages and Data objects. The proposed architecture is evaluated for the transmission of scalable video over PlanetLab topologies. The evaluation shows that the proposed scheme performs very close to the optimum.
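
A minimal sketch of the Bloom-filter bookkeeping an intermediate node might perform, assuming double hashing over content names; the class and names are illustrative, not the paper's implementation:

```python
import hashlib

class BloomFilter:
    """Compact set membership with false positives but no false negatives."""

    def __init__(self, size_bits=1024, num_hashes=4):
        self.size = size_bits
        self.k = num_hashes
        self.bits = bytearray(size_bits // 8)

    def _indexes(self, item: str):
        # Double hashing: derive k bit positions from two base hashes.
        h = hashlib.sha256(item.encode()).digest()
        h1 = int.from_bytes(h[:8], "big")
        h2 = int.from_bytes(h[8:16], "big")
        return [(h1 + i * h2) % self.size for i in range(self.k)]

    def add(self, item: str):
        for i in self._indexes(item):
            self.bits[i // 8] |= 1 << (i % 8)

    def __contains__(self, item: str):
        return all(self.bits[i // 8] & (1 << (i % 8)) for i in self._indexes(item))

# A node could record which coded Data objects it has already seen or served:
seen = BloomFilter()
seen.add("/video/layer1/segment42/coded-block-7")
print("/video/layer1/segment42/coded-block-7" in seen)  # True
```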

Relevance: 30.00%

Abstract:

BACKGROUND Recent reports using administrative claims data suggest the incidence of community- and hospital-onset sepsis is increasing. Whether this reflects changing epidemiology, more effective diagnostic methods, or changes in physician documentation and medical coding practices is unclear. METHODS We performed a temporal-trend study from 2008 to 2012 using administrative claims data and patient-level clinical data of adult patients admitted to Barnes-Jewish Hospital in St. Louis, Missouri. Temporal trends and annual percent change were estimated using regression models with autoregressive integrated moving average errors. RESULTS We analyzed 62,261 inpatient admissions during the 5-year study period. 'Any SIRS' (i.e., SIRS on a single calendar day during the hospitalization) and 'multi-day SIRS' (i.e., SIRS on 3 or more calendar days), which both use patient-level data, and medical coding for sepsis (i.e., ICD-9-CM discharge diagnosis codes 995.91, 995.92, or 785.52) were present in 35.3%, 17.3%, and 3.3% of admissions, respectively. The incidence of admissions coded for sepsis increased 9.7% (95% CI: 6.1, 13.4) per year, while the patient data-defined events of 'any SIRS' decreased by 1.8% (95% CI: -3.2, -0.5) and 'multi-day SIRS' did not change significantly over the study period. Clinically-defined sepsis (defined as SIRS plus bacteremia) and severe sepsis (defined as SIRS plus hypotension and bacteremia) decreased at statistically significant rates of 5.7% (95% CI: -9.0, -2.4) and 8.6% (95% CI: -12.6, -4.4) annually. All-cause mortality, SIRS mortality, and SIRS and clinically-defined sepsis case fatality did not change significantly during the study period. Sepsis mortality based on ICD-9-CM codes, however, increased by 8.8% (95% CI: 1.9, 16.2) annually. CONCLUSIONS The incidence of sepsis, defined by ICD-9-CM codes, and sepsis mortality increased steadily without a concomitant increase in SIRS or clinically-defined sepsis. Our results highlight the need to develop strategies to integrate clinical patient-level data with administrative data to draw more accurate conclusions about the epidemiology of sepsis.
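
For readers unfamiliar with the annual percent change (APC) statistic, it is the exponentiated slope of a log-linear trend. A toy computation with made-up yearly rates (not the study's data, and ignoring the ARIMA error structure the paper used):

```python
import numpy as np

# Hypothetical sepsis-coded admissions per 1,000 admissions, 2008-2012.
years = np.array([2008, 2009, 2010, 2011, 2012], dtype=float)
rates = np.array([24.0, 26.5, 29.0, 31.8, 34.9])

# Fit log(rate) = a + b * year; APC = (e^b - 1) * 100.
slope, _intercept = np.polyfit(years, np.log(rates), 1)
apc = (np.exp(slope) - 1.0) * 100.0
print(f"annual percent change: {apc:.1f}%")  # roughly 9.8% per year
```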

Relevance: 30.00%

Abstract:

BACKGROUND Genome-wide association studies have linked CYP17A1, coding for the steroid hormone-synthesizing enzyme 17α-hydroxylase (CYP17A1), to blood pressure (BP). We hypothesized that the genetic signal may translate into a correlation of ambulatory BP (ABP) with apparent CYP17A1 activity in a family-based population study, and we estimated the heritability of CYP17A1 activity. METHODS In the Swiss Kidney Project on Genes in Hypertension, day and night urinary excretions of steroid hormone metabolites were measured in 518 participants (220 men, 298 women) randomly selected from the general population. CYP17A1 activity was assessed by 2 ratios of urinary steroid metabolites: one estimating the combined 17α-hydroxylase/17,20-lyase activity (ratio 1) and the other predominantly 17α-hydroxylase activity (ratio 2). A mixed linear model was used to investigate the association of ABP with log-transformed CYP17A1 activities, exploring effect modification by urinary sodium excretion. RESULTS Daytime ABP was positively associated with ratio 1 under conditions of high, but not low, urinary sodium excretion (P interaction <0.05). Ratio 2 was not associated with ABP. Heritability estimates (SE) for day and night CYP17A1 activities were 0.39 (0.10) and 0.40 (0.09) for ratio 1, and 0.71 (0.09) and 0.55 (0.09) for ratio 2 (P values <0.001). CYP17A1 activities, assessed with ratio 1, were lower in older participants. CONCLUSIONS Low apparent CYP17A1 activity (assessed with ratio 1) is associated with elevated daytime ABP when salt intake is high. CYP17A1 activity is heritable and diminished in the elderly. These observations highlight the modifying effect of salt intake on the association of CYP17A1 with BP.
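
A sketch of how such an effect-modification analysis could be set up as a mixed linear model with a family random effect; the variable names and synthetic data are illustrative, and the paper does not specify its software:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "family": rng.integers(0, 50, n),        # family clusters -> random intercept
    "log_ratio1": rng.normal(0.0, 0.5, n),   # log-transformed CYP17A1 activity ratio
    "high_na": rng.integers(0, 2, n),        # high vs low urinary sodium excretion
})
# Synthetic daytime systolic ABP with an interaction effect (sign arbitrary).
df["day_sbp"] = 120 + 3.0 * df.log_ratio1 * df.high_na + rng.normal(0, 8, n)

# The interaction term tests whether sodium modifies the activity-BP association.
model = smf.mixedlm("day_sbp ~ log_ratio1 * high_na", df, groups=df["family"]).fit()
print(model.summary())
```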

Relevance: 30.00%

Abstract:

This research project sought to answer the primary research question: What occurs when the music program in a church changes its emphasis from performance to education? This qualitative study of a church choir included participant observation of Wednesday evening and Sunday morning rehearsals over a 12-week period, individual interviews, group interviews, written responses, and written and visual assessment of musical skills. The goal was a rich description of the participants and the themes that emerged from the shift in emphasis. Data were analyzed inductively. Early analysis began with reflection in a researcher journal; following the completion of the study, the journal was entered into a word processor, as were transcriptions of videotaped rehearsals and written reflections from the participants. After all data had been reviewed repeatedly, they were coded, reexamined, and finally categorized into sub-themes and major themes. After coding and identification of major themes and sub-themes, the findings were challenged by looking for disconfirming evidence. Finally, after the completion of the analysis stage, member checks were conducted. The analysis revealed themes that could be associated either with the choir or with the director. The key themes primarily associated with the choir were: response to the change in rehearsal format; attitude toward learning; appropriateness of the community learning model; and members' perceptions of the results of the program. The key themes associated with the director were: the conductor assuming the role of educator; the conductor recognizing the choir as learners; the conductor treating rehearsals as a time for teaching and learning; and the conductor's perception of the effectiveness of the change in focus. The study concluded that a change in focus from performance to education did not noticeably improve the sound of the choir after twelve weeks. There were, however, indications that individual members were improving. Further study of the effects over a longer period of time is recommended.

Relevance: 30.00%

Abstract:

Breastfeeding and the use of human milk are widely accepted as the most complete form of nutrition for infants, and breastfeeding is associated with many positive health outcomes for both infants and mothers. The Healthy People 2000 goal of increasing breastfeeding rates in the early postpartum period to 75% fell short, with only 64% of mothers meeting this objective. Lack of support from healthcare providers and unsupportive hospital policies and practices are noted as barriers to the initiation and duration of breastfeeding. The purpose of this study was to evaluate implementation of the BFHI Ten Steps to Successful Breastfeeding at Texas Children's Hospital. The Baby-Friendly Hospital Initiative (BFHI) was developed in 1991 by the World Health Organization and the United Nations Children's Fund (UNICEF) to ensure that healthcare facilities offering maternity services adhere to the Ten Steps to Successful Breastfeeding and the International Code of Marketing of Breast-Milk Substitutes, and to create legislation protecting the rights of breastfeeding women. The instrument used in this study was the BFHI 100 Assessment Tool created by Dr. Laura Haiek, Director of Public Health in Montérégie, Quebec, and her staff at the Health and Social Services Agency of Quebec. The BFHI 100 tool assesses 100 different indicators of compliance with the BFHI through questionnaires administered to staff and administrators, pregnant and postpartum mothers, and an observer. The study concluded that although there is much room for improvement in educating breastfeeding mothers, overall, the mothers interviewed were satisfied with their level of care with regard to breastfeeding support. Areas for improvement include staff training, as some nursing staff admitted to relying on the lactation consultants to provide most of the breastfeeding education for mothers. Only a small percentage of mothers interviewed reported that their baby "roomed in" an average of 22 hours per day during their hospital stay; staff encouragement of the rooming-in practice will help to increase the proportion of mothers who allow their babies to room in. The current breastfeeding policy will also need to be revised and strengthened to be compliant with the Ten Steps. Ideally, Baby-Friendly practices will become the norm after staff are trained and policy revisions are made. Staff training and acceptance of breastfeeding as optimal nutrition for infants are the most critical factors that will ultimately drive change for the organization.

Relevance: 30.00%

Abstract:

NORM (Naturally Occurring Radioactive Material) waste policies for the nation's oil and gas producing states have existed since the 1980s; Louisiana was the first state to develop a NORM regulatory program, in 1989. Since that time, expectations for NORM waste policies have evolved as health, safety, environment, and social responsibility (HSE & SR) grows increasingly important to the public. The oil and gas industry's safety and environmental performance record will therefore face future challenges over its best practices for managing the co-production of NORM wastes. Within the United States, NORM is not federally regulated. The U.S. EPA claims it regulates NORM under CERCLA (Superfund) and the Clean Water Act; however, there are no universally applicable regulations for radium-based NORM waste. Individual states have therefore taken responsibility for developing NORM regulatory programs, because of the potential radiological risk NORM can pose to people (bone and lung cancer) and to the environment. This has led to inconsistencies in NORM waste policies as well as a NORM management gap in both state and federal regulatory structures. Fourteen different NORM regulations and guidelines were compared between Louisiana and Texas, the nation's top two petroleum producing states. Louisiana is the country's top crude oil producer when production from its Federal offshore waters is included, and fourth in crude oil production, behind Texas, Alaska, and California, when Federal offshore areas are excluded. Louisiana produces more petroleum products than any state but Texas. For these reasons, a comparative analysis between Louisiana and Texas was undertaken to identify differences in their NORM regulations and guidelines for managing, handling, and disposing of NORM wastes. Moreover, this analysis was undertaken because Texas is among the most explored and drilled regions worldwide and yet appears to lag behind its neighboring state in its NORM waste policy and in developing an industry standard for handling, managing, and disposing of NORM. As a result of this analysis, fourteen recommendations were identified.

Relevance: 30.00%

Abstract:

This study presents a robust method for ground plane detection in vision-based systems with a non-stationary camera. The proposed method is based on the reliable estimation of the homography between ground planes in successive images. This homography is computed using a feature matching approach which, in contrast to classical approaches to on-board motion estimation, does not require explicit ego-motion calculation. Instead, a novel homography calculation method based on a linear estimation framework is presented. This framework provides predictions of the ground plane transformation matrix that are dynamically updated with new measurements. The method is especially suited for challenging environments, in particular traffic scenarios, in which information is scarce and the homography computed from the images is often inaccurate or erroneous. The proposed estimation framework is able to remove erroneous measurements and to correct inaccurate ones, hence producing a reliable homography estimate at each instant. It is based on the evaluation of the difference between the predicted and the observed transformations, measured according to the spectral norm of the associated matrix of differences. Moreover, an example is provided of how the information extracted from ground plane estimation can be used for object detection and tracking. The method has been successfully demonstrated for the detection of moving vehicles in traffic environments.
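
A minimal sketch of the gating criterion described above, comparing predicted and observed homographies by the spectral norm of their difference; the threshold value is illustrative:

```python
import numpy as np

def homography_consistent(H_pred, H_obs, threshold=0.05):
    """Accept an observed homography only if it is close to the prediction."""
    # Normalize so the comparison is invariant to each homography's scale.
    Hp = H_pred / H_pred[2, 2]
    Ho = H_obs / H_obs[2, 2]
    # Spectral norm = largest singular value of the matrix of differences.
    deviation = np.linalg.norm(Hp - Ho, ord=2)
    return deviation < threshold

H_pred = np.eye(3)
H_obs = np.eye(3) + 0.01 * np.random.default_rng(0).normal(size=(3, 3))
print(homography_consistent(H_pred, H_obs))  # True for a small perturbation
```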

Relevance: 30.00%

Abstract:

In this work, gliadin proteins were used to analyse the genetic variability in a sample of the Spanish durum wheat collection conserved at the CRF-INIA. In total, 38 different alleles were identified at the loci Gli-A1, Gli-A3, Gli-B5, Gli-B1, Gli-A2 and Gli-B2. All the gliadin loci were polymorphic and possessed large genetic diversity, with small differentiation within varieties and large differentiation between them. The Gli-A2 and Gli-B2 loci were the most polymorphic, the most fixed within varieties and the most useful for distinguishing among varieties. In contrast, the Gli-B1 locus presented the least genetic variability of the four main loci Gli-A1, Gli-B1, Gli-A2 and Gli-B2. The Gli-B1 alleles coding for the gliadin γ-45, associated with good quality, had an accumulated frequency of 69.7%, showing that the Spanish germplasm could be a good source for quality breeding. The Spanish landraces studied showed new gliadin alleles not catalogued so far; these new alleles might be associated with specific factors of the Spanish environment. The large number of new alleles identified also indicates that the Spanish durum wheat germplasm is rather unique.
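
For reference, the gene diversity at a locus is commonly computed from allele frequencies as Nei's H = 1 - Σ p_i²; a toy computation with made-up frequencies (not the paper's data):

```python
# Hypothetical allele frequencies at one gliadin locus (must sum to 1).
freqs = [0.40, 0.25, 0.20, 0.10, 0.05]

# Nei's gene diversity: probability that two randomly drawn alleles differ.
h = 1.0 - sum(p * p for p in freqs)
print(f"gene diversity H = {h:.3f}")  # 0.725
```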

Relevance: 30.00%

Abstract:

This thesis deals with the problem of efficiently tracking 3D objects in sequences of images. We tackle the efficient 3D tracking problem by using direct image registration, posed as an iterative optimization procedure that minimizes a brightness error norm. We review the most popular iterative methods for image registration in the literature, turning our attention to those algorithms that use efficient optimization techniques. Two forms of efficient registration algorithms are investigated. The first type comprises the additive registration algorithms, which incrementally compute the motion parameters by linearly approximating the brightness error function. We centre our attention on Hager and Belhumeur's factorization-based algorithm for image registration. We propose a fundamental requirement that factorization-based algorithms must satisfy to guarantee good convergence, and introduce a systematic procedure that automatically computes the factorization. We also present two warp functions, for registering rigid and nonrigid 3D targets, that satisfy the requirement. The second type comprises the compositional registration algorithms, in which the brightness error function is written using function composition. We study the current approaches to compositional image alignment, and we emphasize the importance of the Inverse Compositional method, which is known to be the most efficient image registration algorithm. We introduce a new algorithm, Efficient Forward Compositional image registration, which avoids the need to invert the warping function and provides a new interpretation of the working mechanisms of inverse compositional alignment. Using this insight, we propose two fundamental requirements that guarantee the convergence of compositional image registration methods. Finally, we support our claims with extensive experimental testing on synthetic and real-world data. We propose a distinction between image registration and tracking when using efficient algorithms, and show that, depending on whether the fundamental requirements hold, some efficient algorithms are suitable for image registration but not for tracking.
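
A toy illustration of the additive, brightness-error-minimizing registration this line of work builds on, reduced to a single 1-D translation parameter estimated by Gauss-Newton; a sketch, not the thesis's factorization-based algorithm:

```python
import numpy as np

def register_translation_1d(template, image, iters=25):
    """Estimate the shift p minimizing ||image(x + p) - template(x)||^2."""
    x = np.arange(len(template), dtype=float)
    p = 0.0
    for _ in range(iters):
        warped = np.interp(x + p, x, image)   # resample the image on a shifted grid
        grad = np.gradient(warped)            # Jacobian of the warped image w.r.t. p
        residual = template - warped
        # Gauss-Newton update (scalar case of (J^T J)^-1 J^T r).
        p += grad.dot(residual) / (grad.dot(grad) + 1e-12)
    return p

x = np.linspace(0, 4 * np.pi, 400)
template = np.sin(x)
image = np.sin(x - 0.15)                       # template shifted by 0.15 rad
print(register_translation_1d(template, image))  # ~4.76 samples, i.e. the 0.15 rad shift
```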

Relevance: 30.00%

Abstract:

We present a novel framework for encoding latency analysis of arbitrary multiview video coding prediction structures. This framework avoids the need to consider a specific encoder architecture by assuming unlimited processing capacity on the multiview encoder. Under this assumption, only the influence of the prediction structure and the processing times needs to be considered, and the encoding latency is computed systematically by means of a graph model. The results obtained with this model are valid for a multiview encoder with sufficient processing capacity and serve as a lower bound otherwise. Furthermore, with the objective of low-latency encoder design with a low penalty on rate-distortion performance, the graph model allows us to identify the prediction relationships that add the most encoding latency. Experimental results for JMVM prediction structures illustrate how low-latency prediction structures with a low rate-distortion penalty can be derived in a systematic manner using the new model.
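
A sketch of the graph idea under the unlimited-processing assumption: with one node per frame and edges for prediction dependencies, each frame's earliest finish time is a longest-path computation over the DAG. The frames, dependencies, and processing times below are illustrative, not the paper's:

```python
from functools import lru_cache

# Prediction dependencies: frame -> frames it predicts from (must form a DAG).
preds = {"I0": [], "P4": ["I0"], "B2": ["I0", "P4"],
         "B1": ["I0", "B2"], "B3": ["B2", "P4"]}
proc_time = {"I0": 4, "P4": 2, "B2": 1, "B1": 1, "B3": 1}  # per-frame encoding times

@lru_cache(maxsize=None)
def finish(frame: str) -> int:
    """Earliest time a frame's encoding can finish (unlimited processors)."""
    start = max((finish(ref) for ref in preds[frame]), default=0)
    return start + proc_time[frame]

encoding_latency = max(finish(f) for f in preds)
print(encoding_latency)  # 8: critical path I0 -> P4 -> B2 -> {B1, B3}
```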

Relevance: 30.00%

Abstract:

ATM, SDH and satellite links were used in the last century as the contribution networks of broadcasters; however, the attractive price of IP networks has been changing the infrastructure of these networks over the last decade. Nowadays, IP networks are widely used, but their characteristics do not offer the level of performance required to carry high-quality video under certain circumstances. Data transmission is always subject to errors on the line. In the case of streaming, correction is attempted at the destination, while for file transfer, retransmissions are conducted and a reliable copy of the file is obtained; in the latter case, reception time is penalized because this type of traffic usually has low priority on the networks. In streaming, image quality is adapted to the line speed and line errors result in decreased quality at the destination, whereas in a file copy, the difference between coding speed and line speed, together with transmission errors, is reflected in an increased transmission time. How news or audiovisual programs are transferred from a remote office to the production centre depends on the time window and the type of line available; in many cases, it must be done in real time (streaming), with the resulting image degradation. The main purpose of this work is workflow optimization and image quality maximization. For that reason, a transmission model for multimedia files adapted to JPEG2000 is described, combining the advantages of file transfer with those of streaming while setting aside the disadvantages of each. The method is based on two patents and consists of the reliable transfer of the headers and data considered vital for reproduction, while the rest of the data is sent by streaming, allowing recovery operations and error concealment. Using this model, image quality is maximized within the available time window. In this paper, we first give a brief overview of broadcasters' requirements and the solutions offered by IP networks. We then focus on a different solution for video file transfer. We take the example of a broadcast centre with mobile units (unidirectional video link) and regional headends (bidirectional link), and present a video file transfer method that satisfies the broadcasters' requirements.
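
A sketch of the partitioning step implied above: in a JPEG2000 codestream the main header runs from the SOC marker up to the first tile-part marker SOT (0xFF90), so the vital bytes can be carved off for reliable transfer while the rest is streamed. This is a simplification under stated assumptions, not the patented method, which the paper does not detail:

```python
SOT_MARKER = b"\xff\x90"  # start of tile-part; the main header ends just before it

def split_j2k(codestream: bytes):
    """Split a JPEG2000 codestream into (vital header bytes, streamable body).

    Naive scan: a full parser would walk marker segments instead of
    searching, since 0xFF90 could in principle occur inside a segment.
    """
    cut = codestream.find(SOT_MARKER)
    if cut < 0:
        raise ValueError("no SOT marker found; not a valid JPEG2000 codestream")
    return codestream[:cut], codestream[cut:]

# header -> reliable channel (retransmitted); body -> streamed with concealment
```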

Relevance: 30.00%

Abstract:

The aim of automatic pathological voice detection systems is to serve as tools for medical specialists, enabling a more objective, less invasive and improved diagnosis of diseases. In this respect, the gold standard for such systems includes the use of an optimized representation of the spectral envelope for characterization, based either on cepstral coefficients from the mel-scaled Fourier spectral envelope (Mel-Frequency Cepstral Coefficients) or on an all-pole estimation (Linear Prediction Coding Cepstral Coefficients), together with Gaussian Mixture Models for the subsequent classification. However, recently proposed GMM-based classifiers and nuisance mitigation techniques, such as those employed in speaker recognition, have not been widely considered in pathology detection tasks. The present work tests whether the employment of such speaker recognition tools can improve the performance of pathology detection systems, specifically in the automatic detection of Obstructive Sleep Apnea. The testing procedure employs an Obstructive Sleep Apnea database in conjunction with GMM-based classifiers. The results show that an improved performance can be obtained using this approach.
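
A compact sketch of the baseline pipeline named above (frame-level MFCCs plus one Gaussian Mixture Model per class), using librosa and scikit-learn on synthetic audio; it is not the paper's system, and the hyperparameters are illustrative:

```python
import numpy as np
import librosa
from sklearn.mixture import GaussianMixture

def mfcc_frames(y, sr=16000, n_mfcc=13):
    """Frame-level MFCCs, shape (frames, n_mfcc)."""
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc).T

# Synthetic stand-ins for healthy / pathological recordings (2 s at 16 kHz).
rng = np.random.default_rng(0)
t = np.linspace(0, 2.0, 32000)
healthy = np.sin(2 * np.pi * 140 * t)
apnea = np.sin(2 * np.pi * 140 * t) + 0.3 * rng.normal(size=t.size)

# One GMM per class, trained on that class's pooled frames.
gmm_h = GaussianMixture(n_components=8, covariance_type="diag").fit(mfcc_frames(healthy))
gmm_a = GaussianMixture(n_components=8, covariance_type="diag").fit(mfcc_frames(apnea))

# Classify a test utterance by its average log-likelihood ratio.
test = mfcc_frames(apnea)
llr = gmm_a.score(test) - gmm_h.score(test)  # score() = mean log-likelihood per frame
print("pathological" if llr > 0 else "healthy")
```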

Relevance: 30.00%

Abstract:

This thesis presents a novel framework for the analysis and optimization of the encoding and decoding delay for multiview video. The objective of this framework is to provide a systematic methodology for the analysis of the delay in multiview encoders and decoders, together with useful tools for the design of multiview encoders/decoders for applications with low delay requirements. The proposed framework first characterizes the elements that influence delay performance: i) the multiview prediction structure, ii) the hardware model of the encoder/decoder, and iii) the frame processing times. Second, it provides algorithms for the computation of the encoding/decoding delay of any arbitrary multiview prediction structure. The core of this framework is a methodology for the analysis of the multiview encoding/decoding delay that is independent of the hardware architecture of the encoder/decoder, completed with a set of models that particularize this delay analysis to the characteristics of the hardware architecture of the encoder/decoder. Among these models, those based on graph theory acquire special relevance due to their capacity to decouple the influence of the different elements on the delay performance of the encoder/decoder, by means of an abstraction of its processing capacity. To illustrate possible applications of this framework, this thesis presents examples of its use in design problems that affect multiview encoders and decoders. This application scenario covers the following cases: strategies for the design of prediction structures that take delay requirements into consideration in addition to rate-distortion performance; design of the number of processors and analysis of processor speed requirements in multiview encoders/decoders given a target delay; and comparative analysis of the encoding delay performance of multiview encoders with different processing capabilities and hardware implementations.

Relevance: 30.00%

Abstract:

We present an adaptive unequal error protection (UEP) strategy built on the 1-D interleaved parity Application Layer Forward Error Correction (AL-FEC) code for protecting the transmission of stereoscopic 3D video content encoded with Multiview Video Coding (MVC) through IP-based networks. Our scheme targets the minimization of quality degradation produced by packet losses during video transmission in time-sensitive application scenarios. To that end, based on a novel packet-level distortion model, it selects in real time the most suitable packets within each Group of Pictures (GOP) to be protected and the most convenient FEC technique parameters, i.e., the size of the FEC generator matrix. In order to make these decisions, it considers the relevance of the packet, the behavior of the channel, and the available bitrate for protection purposes. Simulation results validate both the distortion model introduced to estimate the importance of packets and the optimization of the FEC technique parameter values.
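
The 1-D interleaved parity code named above arranges media packets in rows of L columns and XORs down each column, yielding L parity packets that each protect one packet per row. A minimal sketch with illustrative packet contents:

```python
def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def interleaved_parity(packets, L):
    """1-D interleaved parity: one XOR parity packet per column.

    Assumes equal-sized packets (real systems pad to a common length)
    and a whole number of rows.
    """
    assert len(packets) % L == 0, "need a whole number of rows"
    parity = [bytes(len(packets[0]))] * L
    for i, pkt in enumerate(packets):
        col = i % L
        parity[col] = xor_bytes(parity[col], pkt)
    return parity

# 6 media packets, L=3 columns, D=2 rows -> 3 parity packets.
media = [bytes([i] * 8) for i in range(6)]
fec = interleaved_parity(media, L=3)
# One lost packet per column is recoverable by XORing the survivors with the parity.
```

Enlarging the matrix (more rows per column) lowers the FEC overhead but weakens protection against bursty losses, which is why the scheme tunes the generator matrix size per GOP.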

Relevance: 30.00%

Abstract:

The availability of electronic health data favors scientific advances through the creation of repositories for secondary use, and data anonymization is a mandatory step to comply with current legislation. A service is presented for the pseudonymization of electronic healthcare record (EHR) extracts, aimed at facilitating the exchange of clinical information for secondary use in compliance with legislation on data protection. According to ISO/TS 25237, pseudonymization is a particular type of anonymization. The tool performs anonymization while maintaining three quasi-identifiers (gender, date of birth and place of residence) at a degree of specification selected by the user. The developed system is based on the ISO/EN 13606 standard, using characteristics of the standard that are specifically favorable for anonymization. The service is made up of two independent modules: the demographic server and the pseudonymizing module. The demographic server supports the permanent storage of the demographic entities and the management of the identifiers. The pseudonymizing module anonymizes the ISO/EN 13606 extracts. The pseudonymizing process consists of four phases: storage of the demographic information included in the extract, substitution of the identifiers, elimination of the demographic information from the extract, and elimination of key data in free-text fields. The described pseudonymizing system was used in three telemedicine research projects with satisfactory results. A problem was detected with the data type of a demographic data field, and a proposal for modification was prepared for the group in charge of drafting and revising the ISO/EN 13606 standard.
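
A minimal sketch of the pseudonymization pattern described: direct identifiers are replaced by a keyed pseudonym that only the demographic server can map back, while the three quasi-identifiers are kept at a user-selected level of generalization. The field names and HMAC scheme are illustrative assumptions, not the ISO/EN 13606 service's actual design:

```python
import hmac
import hashlib

SECRET_KEY = b"held-only-by-the-demographic-server"  # illustrative key management

def pseudonym(patient_id: str) -> str:
    """Deterministic pseudonym; re-identification requires the key."""
    return hmac.new(SECRET_KEY, patient_id.encode(), hashlib.sha256).hexdigest()[:16]

def pseudonymize(record: dict, dob_precision: str = "year") -> dict:
    """Strip identifiers; keep generalized gender, date of birth and residence."""
    dob = record["date_of_birth"]  # "YYYY-MM-DD"
    return {
        "pseudonym": pseudonym(record["patient_id"]),
        "gender": record["gender"],
        "date_of_birth": dob[:4] if dob_precision == "year" else dob[:7],
        "residence": record["residence_region"],   # region, not a full address
        "clinical_data": record["clinical_data"],  # free text needs its own scrubbing
    }

extract = {"patient_id": "NHS-123456", "gender": "F", "date_of_birth": "1980-06-02",
           "residence_region": "Madrid", "clinical_data": "..."}
print(pseudonymize(extract))
```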