973 results for DEFINITION


Relevance:

20.00%

Publisher:

Abstract:

Multimedia distribution through wireless networks in the home environment offers advantages that have fueled industry interest in recent years, such as simple connectivity and data delivery to a variety of devices. Together with High-Definition (HD) content, multimedia wireless networks have been proposed for several applications, such as IPTV and digital TV distribution to multiple devices in the home. For these scenarios, we propose a multicast distribution system for High-Definition video over 802.11 wireless networks based on rate-limited packet retransmission. We develop a limited-rate ARQ system that retransmits packets according to the importance of their content (prioritization scheme) and to their delay limitations (delay control). The performance of the proposed ARQ system is evaluated and compared with a similarly rate-limited ARQ algorithm. The results show a higher packet recovery rate and improved video quality for the proposed system.
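The prioritization and delay-control logic described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the `Packet` fields and the selection rule (drop expired packets, then retransmit the most important ones within the rate budget) are hypothetical.

```python
from dataclasses import dataclass, field
import heapq

@dataclass(order=True)
class Packet:
    priority: int                              # lower = more important (e.g. I-frame data)
    deadline_ms: float = field(compare=False)  # playout deadline for this packet
    seq: int = field(compare=False)            # sequence number

def select_retransmissions(lost, now_ms, budget):
    """Pick up to `budget` lost packets to retransmit, most important first,
    skipping packets whose playout deadline has already passed."""
    viable = [p for p in lost if p.deadline_ms > now_ms]   # delay control
    heapq.heapify(viable)                                  # prioritization by content importance
    return [heapq.heappop(viable) for _ in range(min(budget, len(viable)))]
```

The rate limit is modeled by `budget` (retransmissions allowed per scheduling interval); expired packets are never retransmitted because they could not be played out in time anyway.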

Relevance:

20.00%

Publisher:

Abstract:

A full-reference high-definition video quality metric built from full-reference ratios. Visual Quality Assessment (VQA) is one of the greatest unsolved challenges in the multimedia environment. Video quality has a very high impact on the end user's (consumer's) perception of services based on the delivery of multimedia content, and it is therefore a key factor in the assessment of the new paradigm known as Quality of Experience (QoE). Video quality measurement models can be grouped into several branches according to the technical basis of the measurement system, the most important being those that employ psychovisual models aimed at reproducing the characteristics of the Human Visual System (HVS), and those that instead take an engineering approach in which the quality computation is based on extracting intrinsic image features and comparing them. Despite the advances made in this field in recent years, research on video quality metrics, whether in the presence of the reference (so-called full-reference models), in the presence of part of it (reduced-reference models), or even in its absence (no-reference models), still has a long way to go and many goals to reach. Among these, the measurement of high-definition signals, especially the very high quality signals used in the early stages of the value chain, is of special interest because of its influence on the final quality of the service, and no reliable measurement models exist today.
This doctoral thesis presents a full-reference quality measurement model that we have called PARMENIA (PArallel Ratios MEtric from iNtrInsic features Analysis), based on the weighting of four quality ratios computed from intrinsic image features: the Fidelity Ratio, computed from the morphological (Beucher) gradient; the Visual Similarity Ratio, computed from the visually significant points of the image through local contrast filtering; the Sharpness Ratio, derived from the Haralick contrast texture statistic; and the Complexity Ratio, obtained from the homogeneity definition of the Haralick texture statistics set. PARMENIA is novel in its use of mathematical morphology and Haralick statistics as the basis of a quality metric, since these techniques have traditionally been tied to remote sensing and object segmentation. In addition, the formulation of the metric as a weighted set of ratios is equally novel, since it draws both on structural similarity models and on more classical models based on the perceptibility of the error introduced by compression-related signal degradation. PARMENIA shows results with a very high correlation with the MOS scores obtained from the subjective user tests carried out for its validation. The selected working corpus comes from internationally validated sets of sequences, so that the reported results are of the highest possible quality and rigor.
The methodology followed consisted of generating a set of test sequences of different qualities by encoding with different quantization steps, obtaining their subjective scores through subjective quality tests (based on ITU Recommendation BT.500), and validating PARMENIA by computing its correlation with these subjective values, quantified through the Pearson correlation coefficient. Once the ratios had been validated, their influence on the final measure optimized, and their high correlation with perception confirmed, a second evaluation was carried out on sequences from the HDTV Test Dataset 1 of the Video Quality Experts Group (VQEG), where the results obtained showed its clear advantages. Abstract: Visual Quality Assessment has so far been one of the most intriguing challenges in the media environment. The progressive evolution towards higher resolutions together with higher required quality (e.g. high definition and better image quality) calls for redefined quality measurement models. Given the growing interest in multimedia service delivery, perceptual quality measurement has become a very active area of research. First, this work introduces a classification of objective video quality metrics based on their underlying methodologies and approaches for measuring video quality, to sum up the state of the art. This doctoral thesis then describes an enhanced solution for full-reference objective quality measurement, based on mathematical morphology, texture features, and visual similarity information, that provides a normalized metric which we have called PARMENIA (PArallel Ratios MEtric from iNtrInsic features Analysis), with highly correlated MOS scores.
The PARMENIA metric is based on the pooling of different quality ratios obtained from three different approaches: Beucher's gradient, local contrast filtering, and the contrast and homogeneity Haralick texture features. The metric's performance is excellent and improves on the current state of the art by providing a wide dynamic range that makes it easier to discriminate between coded sequences of very similar quality, especially at very high bit rates, whose quality is currently transparent to quality metrics. PARMENIA introduces a degree of novelty with respect to other working metrics: on the one hand, it exploits structural information variation to build the metric's kernel, but complements the measure with texture information and a ratio of visually meaningful points that is closer to typical error-sensitivity-based approaches. We would like to point out that the PARMENIA approach is the only metric built upon full-reference ratios that uses mathematical morphology and texture features (typically used in segmentation) for quality assessment. On the other hand, it yields results with a wide dynamic range that allows measuring the quality of high-definition sequences from bit rates of hundreds of megabits per second (Mbps) down to typical distribution rates (5-6 Mbps) and even streaming rates (1-2 Mbps). Thus, a direct correlation between PARMENIA and MOS scores is easily constructed. PARMENIA may further enhance the number of available choices in objective quality measurement, especially for very high quality HD materials. All these results come from a validation carried out on internationally validated datasets, on which subjective tests based on the ITU-R BT.500 methodology were performed. The Pearson correlation coefficient was calculated to verify the accuracy and reliability of PARMENIA.
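The pooling of quality ratios described above can be sketched as follows. This is a toy illustration, not the PARMENIA implementation: it computes only two ratios (a fidelity ratio from Beucher gradients and a crude sharpness proxy, standing in for the Haralick-based ratios), and the weights are hypothetical.

```python
import numpy as np

def morph_gradient(img):
    """Beucher gradient: grayscale dilation minus erosion over a 3x3 window."""
    h, w = img.shape
    p = np.pad(img, 1, mode='edge')
    shifts = np.stack([p[i:i + h, j:j + w] for i in range(3) for j in range(3)])
    return shifts.max(axis=0) - shifts.min(axis=0)

def ratio(a, b, eps=1e-9):
    """Symmetric similarity ratio in [0, 1] between two feature maps
    (equals 1 only when the maps are identical)."""
    return float((2 * a * b + eps).sum() / (a ** 2 + b ** 2 + eps).sum())

def parmenia_like_score(ref, dist, weights=(0.4, 0.6)):
    """Weighted pooling of per-feature ratios (two of them here; the real
    metric pools four, with tuned weights)."""
    r_fidelity = ratio(morph_gradient(ref), morph_gradient(dist))
    r_sharpness = ratio(np.abs(np.diff(ref, axis=0)), np.abs(np.diff(dist, axis=0)))
    w = np.asarray(weights, float) / sum(weights)
    return float(w[0] * r_fidelity + w[1] * r_sharpness)
```

For identical reference and distorted frames every ratio is 1 and the pooled score is 1; any degradation of the gradient or detail structure pulls the score below 1.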

Relevance:

20.00%

Publisher:

Abstract:

Definition and study of the innovative Natura façade, made of independent pre-vegetated, water-storing panels.

Relevance:

20.00%

Publisher:

Abstract:

The term "Smart Product" has become common in recent years, reflecting growing interest in these kinds of products within the consumer goods industry and their impact on everyday life and industry. Nevertheless, the term "Smart Product" is used with different meanings in different contexts and application domains. Using the term with different meanings and underlying semantics can create serious misunderstandings and dissent. The aim of this paper is to analyze the different definitions of Smart Product available in the literature, and to explore their commonalities and differences, in order to provide a consensus definition that satisfies, and can therefore be used by, all parties. To embrace the identified definitions, the concept of "Smart Thing" is introduced. The methodology used was a systematic literature review. The definition is expressed as an ontology.

Relevance:

20.00%

Publisher:

Abstract:

No reference to domotics can be found in the first half of the 20th century. The best-known authors, and those who have documented this discipline, set its origin in the 1970s, when X10 technology began to be used, but it was not until 1988 that the Larousse Encyclopedia included a definition of "Smart Building". Furthermore, even today there is no single widely accepted definition, and for that reason many other expressions, namely "Intelligent Buildings", "Domotics", "Digital Home" or "Home Automation", have appeared to describe automated buildings and homes. The lack of a clear definition of "Smart Buildings" not only hinders the development of a common international framework for research in this field, but also creates insecurity in the potential users of these buildings: users do not know what such buildings offer, which hampers the dissemination of the culture of building automation in society. Thus, the main purpose of this paper is to propose a definition of the expression "Smart Buildings" that satisfactorily describes the meaning of this discipline. To achieve this aim, a thorough review of the origin of the term itself and of the historical background before the emergence of domotics was conducted, followed by a critical discussion of existing definitions of "Smart Buildings" and other similar terms. The scope of each definition has been analyzed, inaccuracies have been discarded, and commonalities have been compared. Throughout the discussion, definitions were found that bring the term "Smart Buildings" close to disciplines such as computer science, robotics and telecommunications. However, many other definitions emphasize, in a more abstract way, the role of these new buildings in society and in the future of mankind.

Relevance:

20.00%

Publisher:

Abstract:

Experimental software engineering includes several processes, the most representative being running experiments, running replications, and synthesizing the results of multiple replications. Of these processes, only the first is relatively well established in software engineering. Problems of information management and communication among researchers are among the obstacles to progress in the replication and synthesis processes. Software engineering experimentation has expanded considerably over the last few years, bringing with it proposals for experimental process support. However, few of these proposals provide integral support, including the replication and synthesis processes; most focus on experiment execution. This paper proposes an infrastructure providing integral support for the experimental research process, specializing in the replication and synthesis of a family of experiments. The research has been divided into stages or phases whose transition milestones are marked by the attainment of their goals. Each goal corresponds exactly to an artifact or product. Within each stage, we adopt cycles of successive approximations (generate-and-test cycles), where each approximation incorporates a different viewpoint or input. Each cycle ends with the approval of the product.

Relevance:

20.00%

Publisher:

Abstract:

In recent years, interest has grown in addressing the problem of fuel poverty, which already affects a huge number of European citizens. In 2013, the European Parliament, through several resolutions, called on the Commission and Member States to develop legislation and policies to tackle the energy vulnerability of households. In 2000, the UK Government, through the Warm Homes and Energy Conservation Act, established that a person may be regarded as fuel poor if they are a member of a household that cannot be kept warm at reasonable cost. Nevertheless, in order to establish the incidence of fuel poverty among Spanish households, the adequate thresholds for indoor temperatures must first be understood. The research presented here proposes new indoor temperature thresholds for fuel-poor households based on adaptive comfort models.
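Adaptive comfort models tie acceptable indoor temperatures to the prevailing outdoor temperature rather than fixing a single set-point. As a sketch of the general form only, the snippet below uses the ASHRAE 55 adaptive neutral-temperature equation with its 80% acceptability band; these are not the thresholds this paper proposes.

```python
def adaptive_comfort_range(t_rm_outdoor):
    """Adaptive comfort window (deg C) from the prevailing mean outdoor
    temperature, using the ASHRAE 55 neutral-temperature equation
    (t_neutral = 0.31 * t_out + 17.8) and the +/-3.5 deg C band for
    80% acceptability. Illustrative only, not the paper's thresholds."""
    t_neutral = 0.31 * t_rm_outdoor + 17.8
    return t_neutral - 3.5, t_neutral + 3.5
```

For example, a prevailing mean outdoor temperature of 20 °C gives a neutral temperature of 24.0 °C and an acceptable indoor range of roughly 20.5-27.5 °C; a fuel-poverty threshold would sit at or below the lower bound.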

Relevance:

20.00%

Publisher:

Abstract:

Video quality assessment needs to correspond to human perception. Pixel-based metrics (PSNR or MSE) fail in many circumstances because they do not take into account the spatio-temporal properties of human visual perception. In this paper we propose a new pixel-weighted method to improve video quality metrics for artifact evaluation. The method applies a psychovisual model based on motion, level of detail, pixel location, and the presence of human faces, which brings the quality score closer to the human eye's response. Subjective tests were carried out to tune the psychovisual model, demonstrating the noticeable improvement of an algorithm that weights pixels according to the analyzed factors instead of treating them equally. The analysis demonstrates the need for models adapted to the specific visualization of content, and the model represents an advance in quality assessment applied to sequences when a specific artifact is analyzed.
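A pixel-weighted metric of this kind can be illustrated as a weighted PSNR. This is a minimal sketch: the weight map is taken as an input, standing in for the paper's psychovisual model of motion, detail, location and faces, which is not reproduced here.

```python
import numpy as np

def weighted_psnr(ref, dist, weights, peak=255.0):
    """PSNR with a per-pixel psychovisual weight map: errors in salient
    regions (motion, detail, faces) count more than errors elsewhere.
    With a uniform weight map this reduces to ordinary PSNR."""
    w = weights / weights.sum()                       # normalize weights to sum to 1
    err = np.asarray(ref, float) - np.asarray(dist, float)
    wmse = float((w * err ** 2).sum())                # weighted mean squared error
    return float('inf') if wmse == 0.0 else float(10 * np.log10(peak ** 2 / wmse))
```

A saliency-aware weight map (e.g. larger weights on detected faces or high-motion blocks) would then penalize artifacts where viewers actually look.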

Relevance:

20.00%

Publisher:

Abstract:

Hydrogen–deuterium exchange experiments have been used previously to investigate the structures of well defined states of a given protein. These include the native state, the unfolded state, and any intermediates that can be stably populated at equilibrium. More recently, the hydrogen–deuterium exchange technique has been applied in kinetic labeling experiments to probe the structures of transiently formed intermediates on the kinetic folding pathway of a given protein. From these equilibrium and nonequilibrium studies, protection factors are usually obtained. These protection factors are defined as the ratio of the rate of exchange of a given backbone amide when it is in a fully solvent-exposed state (usually obtained from model peptides) to the rate of exchange of that amide in some state of the protein or in some intermediate on the folding pathway of the protein. This definition is straightforward for the case of equilibrium studies; however, it is less clear-cut for the case of transient kinetic intermediates. To clarify the concept for the case of burst-phase intermediates, we have introduced and mathematically defined two different types of protection factors: one is Pstruc, which is more related to the structure of the intermediate, and the other is Papp, which is more related to the stability of the intermediate. Kinetic hydrogen–deuterium exchange data from disulfide-intact ribonuclease A and from cytochrome c are discussed to explain the use and implications of these two definitions.
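The protection factor defined verbally above can be written compactly. This restates the paper's definition: $k_{\mathrm{ex}}^{\mathrm{free}}$ is the exchange rate of a given backbone amide in a fully solvent-exposed state (usually obtained from model peptides) and $k_{\mathrm{ex}}^{\mathrm{obs}}$ is its observed exchange rate in the protein state or folding intermediate of interest.

```latex
P \;=\; \frac{k_{\mathrm{ex}}^{\mathrm{free}}}{k_{\mathrm{ex}}^{\mathrm{obs}}},
\qquad P \ge 1 \ \text{for a protected amide}
```

For equilibrium states this ratio is well defined; the paper's two quantities, Pstruc and Papp, refine it for transiently formed burst-phase intermediates, weighting it toward the intermediate's structure and stability, respectively.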

Relevance:

20.00%

Publisher:

Abstract:

Rheumatoid arthritis (RA) is an autoimmune disease associated with the HLA-DR4 and DR1 alleles. The target autoantigen(s) in RA is unknown, but type II collagen (CII) is a candidate, and the DR4- and DR1-restricted immunodominant T cell epitope in this protein corresponds to amino acids 261–273 (CII 261–273). We have defined MHC and T cell receptor contacts in CII 261–273 and provide strong evidence that this peptide corresponds to the peptide binding specificity previously found for RA-associated DR molecules. Moreover, we demonstrate that HLA-DR4 and human CD4 transgenic mice homozygous for the I-Abβ0 mutation are highly susceptible to collagen-induced arthritis and describe the clinical course and histopathological changes in the affected joints.

Relevance:

20.00%

Publisher:

Abstract:

Normal human luminal and myoepithelial breast cells separately purified from a set of 10 reduction mammoplasties by using a double antibody magnetic affinity cell sorting and Dynabead immunomagnetic technique were used in two-dimensional gel proteome studies. A total of 43,302 proteins were detected across the 20 samples, and a master image for each cell type comprising a total of 1,738 unique proteins was derived. Differential analysis identified 170 proteins that were elevated 2-fold or more between the two breast cell types, and 51 of these were annotated by tandem mass spectrometry. Muscle-specific enzyme isoforms and contractile intermediate filaments including tropomyosin and smooth muscle (SM22) alpha protein were detected in the myoepithelial cells, and a large number of cytokeratin subclasses and isoforms characteristic of luminal cells were detected in this cell type. A further 134 nondifferentially regulated proteins were also annotated from the two breast cell types, making this the most extensive study to date of the protein expression map of the normal human breast and the basis for future studies of purified breast cancer cells.

Relevance:

20.00%

Publisher:

Abstract:

Inteins are protein-splicing elements, most of which contain conserved sequence blocks that define a family of homing endonucleases. Like group I introns that encode such endonucleases, inteins are mobile genetic elements. Recent crystallography and computer modeling studies suggest that inteins consist of two structural domains that correspond to the endonuclease and the protein-splicing elements. To determine whether the bipartite structure of inteins is mirrored by the functional independence of the protein-splicing domain, the entire endonuclease component was deleted from the Mycobacterium tuberculosis recA intein. Guided by computer modeling studies, and taking advantage of genetic systems designed to monitor intein function, the 440-aa Mtu recA intein was reduced to a functional mini-intein of 137 aa. The accuracy of splicing of several mini-inteins was verified. This work not only substantiates structure predictions for intein function but also supports the hypothesis that, like group I introns, mobile inteins arose by an endonuclease gene invading a sequence encoding a small, functional splicing element.

Relevance:

20.00%

Publisher:

Abstract:

The Escherichia coli MG1655 genome has been completely sequenced. The annotated sequence, biochemical information, and other information were used to reconstruct the E. coli metabolic map. The stoichiometric coefficients for each metabolic enzyme in the E. coli metabolic map were assembled to construct a genome-specific stoichiometric matrix. The E. coli stoichiometric matrix was used to define the system's characteristics and the capabilities of E. coli metabolism. The effects of gene deletions in the central metabolic pathways on the ability of the in silico metabolic network to support growth were assessed, and the in silico predictions were compared with experimental observations. It was shown that based on stoichiometric and capacity constraints the in silico analysis was able to qualitatively predict the growth potential of mutant strains in 86% of the cases examined. Herein, it is demonstrated that the synthesis of in silico metabolic genotypes based on genomic, biochemical, and strain-specific information is possible, and that systems analysis methods are available to analyze and interpret the metabolic phenotype.
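The core objects described above — a stoichiometric matrix S, the steady-state balance S·v = 0, and in silico gene deletions — can be sketched on a toy network. This is a hypothetical four-reaction example, far smaller than the genome-scale E. coli map, and the real analysis solves a linear program (flux balance analysis) over such constraints rather than checking fixed flux vectors.

```python
import numpy as np

# Toy network (hypothetical):
#   R1: -> A (uptake)   R2: A -> B   R3: A -> B (isozyme)   R4: B -> (biomass drain)
# Rows are metabolites A and B; columns are reactions R1..R4.
# Each entry is the stoichiometric coefficient of that metabolite in that reaction.
S = np.array([[1.0, -1.0, -1.0,  0.0],
              [0.0,  1.0,  1.0, -1.0]])

def balanced(v):
    """Steady-state mass balance: S @ v must vanish for every metabolite."""
    return bool(np.allclose(S @ v, 0.0))

def delete_reaction(v, j):
    """Model a gene deletion by forcing the corresponding flux to zero."""
    v = np.array(v, float)
    v[j] = 0.0
    return v
```

Deleting one of the parallel reactions R2/R3 still admits a balanced flux distribution (the network reroutes, so growth is predicted), whereas deleting the biomass drain R4 leaves no balanced flux with nonzero uptake — the in silico analogue of an essential gene.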

Relevance:

20.00%

Publisher:

Abstract:

Recent studies have demonstrated the importance of recipient HLA-DRB1 allele disparity in the development of acute graft-versus-host disease (GVHD) after unrelated donor marrow transplantation. The role of HLA-DQB1 allele disparity in this clinical setting is unknown. To elucidate the biological importance of HLA-DQB1, we conducted a retrospective analysis of 449 HLA-A, -B, and -DR serologically matched unrelated donor transplants. Molecular typing of HLA-DRB1 and HLA-DQB1 alleles revealed 335 DRB1 and DQB1 matched pairs; 41 DRB1 matched and DQB1 mismatched pairs; 48 DRB1 mismatched and DQB1 matched pairs; and 25 DRB1 and DQB1 mismatched pairs. The conditional probabilities of grades III-IV acute GVHD were 0.42, 0.61, 0.55, and 0.71, respectively. The relative risk of acute GVHD associated with a single locus HLA-DQB1 mismatch was 1.8 (1.1, 2.7; P = 0.01), and the risk associated with any HLA-DQB1 and/or HLA-DRB1 mismatch was 1.6 (1.2, 2.2; P = 0.003). These results provide evidence that HLA-DQ is a transplant antigen and suggest that evaluation of both HLA-DQB1 and HLA-DRB1 is necessary in selecting potential donors.
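The relative risks quoted above have the usual cohort form: the probability of the outcome in the exposed group divided by that in the reference group. A minimal sketch with a Wald-type 95% confidence interval on the log scale is shown below; the counts are hypothetical, and the paper's estimates come from its own conditional-probability analysis, not from this simple formula.

```python
import math

def relative_risk(events_exposed, n_exposed, events_ref, n_ref):
    """Relative risk of an outcome (e.g. grades III-IV acute GVHD) in an
    exposed group (e.g. HLA-DQB1 mismatched) versus a reference group,
    with a 95% CI from the standard error of log(RR) (Katz method)."""
    p1 = events_exposed / n_exposed
    p0 = events_ref / n_ref
    rr = p1 / p0
    se = math.sqrt(1 / events_exposed - 1 / n_exposed
                   + 1 / events_ref - 1 / n_ref)
    lo = rr * math.exp(-1.96 * se)
    hi = rr * math.exp(1.96 * se)
    return rr, lo, hi
```

A confidence interval excluding 1 (as for the 1.8 and 1.6 estimates reported above) indicates a statistically significant increase in risk for the mismatched group.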