981 results for Digital distribution right
Abstract:
Human beings have always strived to preserve their memories and spread their ideas. In the beginning this was done through human interpretation, such as telling stories and creating sculptures. Later, technological progress made it possible to create a recording of a phenomenon: first as an analogue recording onto a physical object, and later digitally, as a sequence of bits to be interpreted by a computer. By the end of the 20th century, technological advances had made it feasible to distribute media content over a computer network instead of on physical objects, thus enabling the concept of digital media distribution. Many digital media distribution systems already exist, and their continued, and in many cases increasing, use indicates strong interest in their future enhancement and enrichment. By examining these digital media distribution systems, we have identified three main areas of possible improvement: network structure and coordination, transport of content over the network, and the encoding used for the content. In this thesis, our aim is to show that improvements in performance, efficiency and availability can be achieved in conjunction with improvements in software quality and reliability through the use of formal methods: mathematical approaches to reasoning about software that allow us to prove its correctness, along with other desirable properties. We envision a complete media distribution system based on a distributed architecture, such as peer-to-peer networking, in which different parts of the system have been formally modelled and verified. Starting with the network itself, we show how it can be formally constructed and modularised in the Event-B formalism, such that the modelling of one node is separated from the modelling of the network itself. We also show how the piece selection algorithm in the BitTorrent peer-to-peer transfer protocol can be adapted for on-demand media streaming, and how this adaptation can be modelled in Event-B.
Furthermore, we show how modelling a single peer in Event-B can give results similar to simulating an entire network of peers. Going further, we introduce a formal specification language for content transfer algorithms, and show that such a language can make these algorithms easier to understand. We also show that generating Event-B code from this language results in less complexity than creating the models from written specifications. Finally, we consider the decoding part of a media distribution system by showing how video decoding can be done in parallel, based on formally defined dependencies between frames and blocks in a video sequence; we show that this step, too, can be performed in a way that is mathematically proven correct. Most of the modelling and proving in this thesis is tool-based. This demonstrates the advance of formal methods as well as their increased reliability, and thus argues for their more widespread use in the future.
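As a rough illustration of the streaming adaptation this abstract describes, a piece selector might combine a deadline-ordered window near the playback position with BitTorrent's usual rarest-first rule outside it. The sketch below is hypothetical (the function name and parameters are not from the thesis, whose actual contribution is an Event-B model):

```python
# Hypothetical sketch of a streaming-adapted BitTorrent piece selector:
# pieces inside a playout window ahead of the playback position are
# fetched in order (earliest deadline first); beyond the window we
# fall back to the classic rarest-first rule.

def select_piece(playback_pos, window, missing, availability):
    """Return the index of the next piece to request, or None.

    playback_pos  -- index of the piece currently being played
    window        -- number of pieces treated as urgent
    missing       -- set of piece indices not yet downloaded
    availability  -- dict: piece index -> number of peers holding it
    """
    urgent = [p for p in missing
              if playback_pos <= p < playback_pos + window]
    if urgent:
        return min(urgent)  # earliest deadline first
    rest = [p for p in missing if p >= playback_pos + window]
    if rest:
        # rarest first: prefer the piece held by the fewest peers
        return min(rest, key=lambda p: availability.get(p, 0))
    return None
```

Note that pieces behind the playback position are never requested, which is the key departure from plain rarest-first for on-demand streaming.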
Abstract:
BACKGROUND: Social networks are common in digital health. A new stream of research is beginning to investigate the mechanisms of digital health social networks (DHSNs): how they are structured, how they function, and how their growth can be nurtured and managed. DHSNs increase in value as additional content is added, and the structure of networks may resemble the characteristics of power laws. Power laws are contrary to traditional Gaussian averages in that they describe correlated phenomena. OBJECTIVES: The objective of this study is to investigate whether the distribution frequency in four DHSNs can be characterized as following a power law. A second objective is to describe the method used to make the comparison. METHODS: Data from four DHSNs—Alcohol Help Center (AHC), Depression Center (DC), Panic Center (PC), and Stop Smoking Center (SSC)—were compared to power law distributions. To assist future researchers and managers, the 5-step methodology used to analyze and compare the datasets is described. RESULTS: All four DHSNs were found to have right-skewed distributions, indicating the data were not normally distributed. When power trend lines were added to each frequency distribution, R² values indicated that, to a very high degree, the variance in post frequencies can be explained by actor rank (AHC .962, DC .975, PC .969, SSC .95). Spearman correlations provided further evidence of the strength and statistical significance of the relationship (AHC .987, DC .967, PC .983, SSC .993; P<.001). CONCLUSIONS: This is the first study to investigate power distributions across multiple DHSNs, each addressing a unique condition. The results indicate that despite vast differences in theme, content, and length of existence, DHSNs follow the properties of power laws. The structure of DHSNs is important, as it gives researchers and managers insight into the nature and mechanisms of network functionality.
The 5-step process undertaken to compare actor contribution patterns can be replicated in networks that are managed by other organizations, and we conjecture that patterns observed in this study could be found in other DHSNs. Future research should analyze network growth over time and examine the characteristics and survival rates of superusers.
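The rank/frequency comparison this abstract reports can be illustrated with a minimal sketch: sort actors by post count, fit a power trend line freq ≈ a·rank^b by least squares in log-log space, and report the R² of that fit. This is an assumed reconstruction of the general technique, not the study's actual 5-step methodology; the function name and data are illustrative, and actors with zero posts must be excluded before taking logarithms.

```python
import math

def power_fit_r2(post_counts):
    """Fit freq = a * rank^b in log-log space; return (b, r_squared).

    post_counts -- iterable of per-actor post counts (all > 0)
    """
    freqs = sorted(post_counts, reverse=True)        # rank 1 = top poster
    xs = [math.log(rank) for rank in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx                                    # power-law exponent (slope)
    a = my - b * mx                                  # log-intercept
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return b, 1 - ss_res / ss_tot
```

An R² near 1 on such a fit is the kind of evidence the study reports (e.g. .95–.975 across the four networks), indicating that actor rank explains most of the variance in post frequency.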
Abstract:
The classification of galaxies as star forming or active is generally done in the ([O III]/H beta, [N II]/H alpha) plane. The Sloan Digital Sky Survey (SDSS) has revealed that, in this plane, the distribution of galaxies looks like the two wings of a seagull. Galaxies in the right wing are referred to as Seyfert/LINERs, leading to the idea that non-stellar activity in galaxies is a very common phenomenon. Here, we argue that a large fraction of the systems in the right wing could actually be galaxies which stopped forming stars. The ionization in these 'retired' galaxies would be produced by hot post-asymptotic giant branch stars and white dwarfs. Our argumentation is based on a stellar population analysis of the galaxies via our STARLIGHT code and on photoionization models using the Lyman continuum radiation predicted for this population. The proportion of LINER galaxies that can be explained in such a way is, however, uncertain. We further show how observational selection effects account for the shape of the right wing. Our study suggests that nuclear activity may not be as common as thought. If retired galaxies do explain a large part of the seagull's right wing, some of the work concerning nuclear activity in galaxies, as inferred from SDSS data, will have to be revised.
Abstract:
We present the results of searches for dipolar-type anisotropies in different energy ranges above 2.5 × 10^17 eV with the surface detector array of the Pierre Auger Observatory, reporting on both the phase and the amplitude measurements of the first harmonic modulation in the right-ascension distribution. Upper limits on the amplitudes are obtained, which provide the most stringent bounds at present, being below 2% at 99% C.L. for EeV energies. We also compare our results to those of previous experiments as well as with some theoretical expectations. © 2011 Elsevier B.V. All rights reserved.
Abstract:
This thesis is about new digital moving image recording technologies and how they augment the distribution of creativity and the flexibility in moving image production systems, while also imposing constraints on how images flow through the production system. The central concept developed in this thesis is ‘creative space’, which links quality and efficiency in moving image production to time for creative work, the capacity of digital tools, user skills and the constitution of digital moving image material. The empirical evidence of this thesis is primarily based on semi-structured interviews conducted with Swedish film and TV production representatives. This thesis highlights the importance of pre-production technical planning and proposes a design management support tool (MI-FLOW) as a way to leverage functional workflows, which are a prerequisite for efficient and cost-effective moving image production.
Abstract:
Nowadays the real contribution of light to accelerating the chemical reaction in dental bleaching is viewed with skepticism, mostly because the actual mechanisms of its contribution are still obscure. Objectives: To determine the influence of the pigment of three colored bleaching gels on light distribution and absorption in teeth; for this experiment we used bovine teeth and three colored bleaching gels. It is well known that dark molecules absorb light and increase the local temperature, raising the bleaching rate; these molecules are located at the interface between the enamel and dentin. Methods: This study was conducted using an argon laser at 455 nm with 150 mW of intensity and an LED with the same characteristics, together with three colored gels (green, blue and red); digital images were captured with a CCD camera connected to a PC. The images were processed in a mathematical environment (MATLAB R12). Results: The results show that the color of the bleaching gel significantly influences the absorption of light at specific sites in the teeth. Conclusions: This poor absorption may be one of the major factors behind the skepticism about the contribution of light to the process that can be observed in the literature today.
Abstract:
Two case studies are presented on root analysis under field and laboratory conditions based on digital image analysis. Grapevine (Vitis vinifera L.) and date palm (Phoenix dactylifera L.) root systems were analyzed by both the monolith and the trench wall method, aided by digital image analysis. Correlations between root parameters and their fractional distribution over the soil profile were obtained, as well as estimates of root diameter. The results showed the feasibility of digital image analysis for evaluating root distribution.
Abstract:
This paper presents a distribution feeder simulation in VHDL-AMS, using the standard IEEE 13-node test feeder as an example. All calculations are first performed in an electronic spreadsheet in order to develop the VHDL-AMS model. The simulation results are compared with those from the well-known MATLAB/Simulink environment in order to verify the feasibility of VHDL-AMS modeling for a standard electrical distribution feeder, using the SystemVision™ software. This paper aims to present the first major developments toward a future real-time digital simulator applied to electrical power distribution systems. © 2012 IEEE.
Abstract:
Graduate program in Digital Television: Information and Knowledge - FAAC
Abstract:
North Pacific right whales (Eubalaena japonica) were extensively exploited in the 19th century, and their recovery was further retarded (severely so in the eastern population) by illegal Soviet catches in the 20th century, primarily in the 1960s. Monthly plots of right whale sightings and catches from both the 19th and 20th centuries are provided, using data summarized by Scarff (1991, from the whale charts of Matthew Fontaine Maury) and Brownell et al. (2001), respectively. Right whales had an extensive offshore distribution in the 19th century, and were common in areas (such as the Gulf of Alaska and Sea of Japan) where few or no right whales occur today. Seasonal movements of right whales are apparent in the data, although to some extent these reflect survey and whaling effort. That said, these seasonal movements indicate a general northward migration in spring from lower latitudes, and major concentrations above 40°N in summer. Sightings diminished and occurred further south in autumn, and few animals were recorded anywhere in winter. These north-south migratory movements support the hypothesis of two largely discrete populations of right whales in the eastern and western North Pacific. Overall, these analyses confirm that the size and range of the right whale population is now considerably diminished in the North Pacific relative to the situation during the peak period of whaling for this species in the 19th century. For management purposes, new surveys are urgently required to establish the present distribution of this species; existing data suggest that the Bering Sea, the Gulf of Alaska, the Okhotsk Sea, the Kuril Islands and the coast of Kamchatka are the areas with the greatest likelihood of finding right whales today.
Abstract:
Aims: We sought to analyse local distribution of aortic annulus and left ventricular outflow tract (LVOT) calcification in patients undergoing transcatheter aortic valve replacement (TAVR) and its impact on aortic regurgitation (AR) immediately after device placement. Methods and results: A group of 177 patients with severe aortic stenosis undergoing multislice computed tomography of the aortic root followed by TAVR were enrolled in this single-centre study. Annular and LVOT calcifications were assessed per cusp using a semi-quantitative grading system (0: none; 1 [mild]: small, non-protruding calcifications; 2 [moderate]: protruding [>1 mm] or extensive [>50% of cusp sector] calcifications; 3 [severe]: protruding and extensive calcifications). Any calcification of the annulus or LVOT was present in 107 (61%) and 63 (36%) patients, respectively. Prevalence of annulus/LVOT calcifications in the left coronary cusp was 42% and 25%, respectively, in the non-coronary cusp 28% and 13%, in the right coronary cusp 13% and 5%. AR grade 2 to 4 assessed by the method of Sellers immediately after TAVR device implantation was observed in 55 patients (31%). Multivariate regression analysis revealed that the overall annulus calcification (OR [95% CI] 1.48 [1.10-2.00]; p=0.0106), the overall LVOT calcification (1.93 [1.26-2.96]; p=0.0026), any moderate or severe LVOT calcification (5.37 [1.52-18.99]; p=0.0092), and asymmetric LVOT calcification were independent predictors of AR. Conclusions: Calcifications of the aortic annulus and LVOT are frequent in patients undergoing TAVR, and both the distribution and the severity of calcifications appear to be independent predictors of aortic regurgitation after device implantation.
Abstract:
Vita.
Abstract:
The MESA Puget Sound Project is sponsored by the National Oceanic and Atmospheric Administration and the Environmental Protection Agency.
Abstract:
The international perspectives on these issues are especially valuable in an increasingly connected, but still institutionally and administratively diverse world. The research addressed in several chapters in this volume includes issues around technical standards bodies like EpiDoc and the TEI, engaging with ways these standards are implemented, documented, taught, used in the process of transcribing and annotating texts, and used to generate publications and as the basis for advanced textual or corpus research. Other chapters focus on various aspects of philological research and content creation, including collaborative or community driven efforts, and the issues surrounding editorial oversight, curation, maintenance and sustainability of these resources. Research into the ancient languages and linguistics, in particular Greek, and the language teaching that is a staple of our discipline, are also discussed in several chapters, in particular for ways in which advanced research methods can lead into language technologies and vice versa and ways in which the skills around teaching can be used for public engagement, and vice versa. A common thread through much of the volume is the importance of open access publication or open source development and distribution of texts, materials, tools and standards, both because of the public good provided by such models (circulating materials often already paid for out of the public purse), and the ability to reach non-standard audiences, those who cannot access rich university libraries or afford expensive print volumes. Linked Open Data is another technology that results in wide and free distribution of structured information both within and outside academic circles, and several chapters present academic work that includes ontologies and RDF, either as a direct research output or as essential part of the communication and knowledge representation. 
Several chapters focus not on the literary and philological side of classics, but on the study of cultural heritage, archaeology, and the material supports on which original textual and artistic material are engraved or otherwise inscribed, addressing both the capture and analysis of artefacts in both 2D and 3D, the representation of data through archaeological standards, and the importance of sharing information and expertise between the several domains both within and without academia that study, record and conserve ancient objects. Almost without exception, the authors reflect on the issues of interdisciplinarity and collaboration, the relationship between their research practice and teaching and/or communication with a wider public, and the importance of the role of the academic researcher in contemporary society and in the context of cutting edge technologies. How research is communicated in a world of instant-access blogging and 140-character micromessaging, and how our expectations of the media affect not only how we publish but how we conduct our research, are questions about which all scholars need to be aware and self-critical.