1000 results for Digital simulation


Relevance:

20.00%

Publisher:

Abstract:

The purpose of this thesis is to explore different digital content management models and to propose a process for properly managing the content of an organization's website. This process also briefly defines the roles and responsibilities of the different actors involved. To create this process, the thesis is divided into two parts. First, the theoretical analysis identifies the two main content management models: content management standardization and content management adaptation, the latter also called the content management localization model. Each model is analyzed through a SWOT analysis to identify its particularities and to determine which is the better option for given organizational objectives. In the empirical part, the thesis measures organizational website performance by comparing two data sets. On the one hand, the international website is analyzed to identify the results of content management standardization. On the other hand, content management adaptation, also called the content management localization model, is analyzed through the key metrics of the Dutch page of the same organization. The resulting output is a process model for localization, as well as recommendations on how to proceed when creating a digital content management strategy. However, more research is recommended to provide more comprehensive managerial solutions.

Relevance:

20.00%

Publisher:

Abstract:

This study was conducted with the objective of implementing a computational model in the EXTEND™ simulation language to: a) simulate the dynamics of the activities of a poultry slaughterhouse; and b) carry out sensitivity analysis experiments. To this end, a dynamic, stochastic and discrete model was implemented. The real system modelled is located in the southwestern region of Paraná, Brazil, where around 500,000 birds are slaughtered per day on three processing lines operating in three daily shifts. For model validation, data from three shifts were collected and the output variables obtained from the real system were compared with those generated by the model, namely: i) processing time; ii) total live weight; iii) usable live weight; iv) by-product weight; v) total production weight; vi) whole-chicken weight; and vii) total weight of cuts. The implemented model proved to be applicable, since the mean percentage errors were below 1.13%. The sensitivity analysis experiment, carried out by changing the line processing speeds to 7,000, 8,000 and 9,000 chickens per hour, yielded the following mean values for the processing-time variable: 8.69, 7.86 and 7.86 hours, respectively. In addition, the experiment showed that a processing speed of 9,000 chickens per hour does not directly reduce processing time, since the arrival rate of the loads may have caused idle periods at the slaughterhouse.
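
The abstract above describes a dynamic, stochastic, discrete-event model built in EXTEND™. As a rough illustration of what such a model involves, the minimal sketch below uses Python with the SimPy library instead of EXTEND™; the line speed is one of the values tested above, but the truck headway and load-size distributions are assumed purely for illustration and are not the thesis's data.

```python
# Minimal sketch (not the thesis's EXTEND model): a dynamic, stochastic,
# discrete-event simulation of one poultry processing line, with assumed
# truck-arrival and load-size distributions.
import random
import simpy

LINE_SPEED = 9000           # chickens per hour (one of the tested speeds)
DAILY_TARGET = 500000 // 3  # chickens per line per day (illustrative split)
MEAN_LOAD = 4500            # chickens per truck load (assumed)
MEAN_HEADWAY = 0.5          # hours between truck arrivals (assumed)

def truck_arrivals(env, line):
    """Generate truck loads until the daily target has arrived."""
    delivered = 0
    while delivered < DAILY_TARGET:
        yield env.timeout(random.expovariate(1 / MEAN_HEADWAY))
        load = int(random.gauss(MEAN_LOAD, 300))
        delivered += load
        env.process(process_load(env, line, load))

def process_load(env, line, load):
    """Occupy the processing line for load / speed hours."""
    with line.request() as req:
        yield req
        yield env.timeout(load / LINE_SPEED)

random.seed(42)
env = simpy.Environment()
line = simpy.Resource(env, capacity=1)
env.process(truck_arrivals(env, line))
env.run()
print(f"Processing time for one line: {env.now:.2f} h")
```

Because arrivals are stochastic, the line can sit idle between loads even at the highest speed, which is the effect the sensitivity experiment above points to.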

Relevance:

20.00%

Publisher:

Abstract:

The presence of microorganisms in food products during production, storage, transport and packaging is unavoidable. Understanding the behaviour of microbial growth is therefore very important for food safety and shelf-life estimation. In this work, the supply chain of raw, salted and cooked chicken breasts was simulated: 20 days at -18 ± 0.5 °C (simulating industrial dispatch in Brazil and transport by ship to Europe) and, after thawing, 21 days at 4 ± 0.5 °C (simulating shelf life in a supermarket). The products were analysed for Pseudomonas spp., Salmonella spp., Listeria monocytogenes, Staphylococcus spp. and total viable microorganisms (mesophiles and psychrotrophs). Plate-count analyses were followed by classical biochemical tests to confirm typical colonies. Salmonella spp. and Listeria monocytogenes were not detected in any sample. In terms of total viable counts, the microbial load remained stable at low detection levels during the first 20 days (at -18 °C). After thawing, microbial growth curves were observed. According to the microbiological safety parameters, times between 9 and 11 days at 4 °C proved to be the limits for guaranteeing the quality of these products.
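
For readers unfamiliar with how growth curves like those observed after thawing are usually described, the sketch below shows the modified Gompertz model commonly used in predictive microbiology. The parameter values and the spoilage threshold are hypothetical and are not the fitted values from this study.

```python
# Illustrative sketch only: the modified Gompertz curve often used in
# predictive microbiology to describe growth after thawing. All parameter
# values below are assumed, not the study's results.
import math

def gompertz_log_count(t, log_n0=2.0, A=6.0, mu_max=0.8, lag=2.0):
    """Modified Gompertz model (Zwietering et al., 1990).

    t       -- storage time at 4 °C, in days
    log_n0  -- initial count, log10 CFU/g (assumed)
    A       -- asymptotic increase, log10 CFU/g (assumed)
    mu_max  -- maximum growth rate, log10 CFU/g per day (assumed)
    lag     -- lag time, days (assumed)
    """
    return log_n0 + A * math.exp(-math.exp(mu_max * math.e / A * (lag - t) + 1))

# Days at 4 °C until a hypothetical spoilage threshold of 7 log10 CFU/g
threshold = 7.0
for day in range(0, 22):
    if gompertz_log_count(day) >= threshold:
        print(f"Threshold reached around day {day}")
        break
```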

Relevance:

20.00%

Publisher:

Abstract:

This thesis concerns the exhaustion of copyright's distribution right in intangible transfers of video games. It analyses whether, under the current law of the European Union, the phenomenon of digital exhaustion exists, especially in relation to games. The thesis examines the consumer's position in the market for copyright-protected goods, using the video games market as an example of the wider effect of recent technological developments on consumers. The research conducted for the thesis is mostly legal-dogmatic, although comparative analysis, law and economics, and law and technology methods are also used. The thesis evaluates the most recent case law of the European Court of Justice to analyse the current state of digital exhaustion, and adopts the consumers' point of view when assessing the effects of its existence. It introduces the current state of technology in the field of video games from a legal perspective. Furthermore, it analyses the effects on consumers of a scenario in which no digital exhaustion exists in the future; under recent European case law, such a scenario currently seems realistic. The main conclusion of my research is that the consumer's position in the market for digital goods has deteriorated, and that the probable exclusion of exhaustion for digital goods is further evidence of this development. Most importantly, however, the present state of affairs, in which there is no certainty about whether digital exhaustion exists, creates injustice from the consumers' point of view. Accordingly, action by the EU legislator or the Court of Justice of the European Union is required to clarify the issue.

Relevance:

20.00%

Publisher:

Abstract:

INTRODUCTION: One of the greatest challenges in the management of arterial hypertension is adequate blood pressure control. To achieve this goal, home blood pressure monitoring (HBPM) with automatic devices has become widespread. However, part of the medical-scientific community still questions its validity, believing that home blood pressure readings may be incorrect. OBJECTIVE: To evaluate the agreement between simultaneous blood pressure (BP) measurements obtained with the conventional auscultatory method and with the automatic digital method usually employed in HBPM. METHODS: Using a "Y" connection, a single cuff was coupled to a validated automatic digital device (OMRON 705IT) and to a mercury-column sphygmomanometer, allowing BP to be measured simultaneously by both methods. BP was determined in 423 individuals (normotensive and hypertensive), with the cuff size matched to arm circumference. RESULTS: Values are mean ± standard deviation (SD) (minimum-maximum): age 40.8 ± 16.3 years (18-92), arm circumference 28.2 ± 3.7 cm (19-42), auscultatory systolic BP (SBP) 127.6 ± 22.8 mmHg (69-223), automatic SBP 129.5 ± 23.0 mmHg (56-226), auscultatory diastolic BP (DBP) 79.5 ± 12.6 mmHg (49-135), automatic DBP 79.0 ± 12.6 mmHg (48-123). The mean SBP difference between the two methods was 1.9 mmHg (-15 to +19) and the mean DBP difference was 0.5 mmHg (-19 to +13). Pearson correlation coefficients between the methods were r = 0.97 for SBP and r = 0.91 for DBP. Bland-Altman analysis showed clinically acceptable agreement between the methods. CONCLUSION: BP measured by the automatic digital method shows good agreement with the conventional auscultatory method and should be used to aid the diagnosis and control of arterial hypertension.
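
As a brief illustration of the Bland-Altman analysis mentioned above, the sketch below computes the bias, 95% limits of agreement and Pearson correlation for a small made-up set of paired readings; the numbers are not the study's data.

```python
# Sketch of a Bland-Altman agreement analysis of the kind reported above,
# using hypothetical paired systolic readings (mmHg), NOT the study data.
import numpy as np

ausc = np.array([118, 132, 145, 126, 160, 110, 138, 124])  # auscultatory
auto = np.array([120, 130, 148, 128, 158, 112, 140, 127])  # automatic

diff = auto - ausc                 # per-pair difference between methods
mean_pair = (auto + ausc) / 2      # per-pair mean (x-axis of the plot)

bias = diff.mean()                 # mean difference (systematic bias)
sd = diff.std(ddof=1)              # SD of the differences
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd  # 95% limits of agreement

r = np.corrcoef(ausc, auto)[0, 1]  # Pearson correlation, as also reported

print(f"Bias: {bias:.1f} mmHg, limits of agreement: {loa_low:.1f} to {loa_high:.1f} mmHg")
print(f"Pearson r: {r:.2f}")
```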

Relevance:

20.00%

Publisher:

Abstract:

Human beings have always strived to preserve their memories and spread their ideas. In the beginning this was done through human interpretation, such as telling stories and creating sculptures. Later, technological progress made it possible to create a recording of a phenomenon: first as an analogue recording onto a physical object, and later digitally, as a sequence of bits to be interpreted by a computer. By the end of the 20th century, technological advances had made it feasible to distribute media content over a computer network instead of on physical objects, thus enabling the concept of digital media distribution. Many digital media distribution systems already exist, and their continued, and in many cases increasing, usage indicates high interest in their future enhancement and enrichment. By looking at these digital media distribution systems, we have identified three main areas of possible improvement: network structure and coordination, transport of content over the network, and the encoding used for the content. In this thesis, our aim is to show that improvements in performance, efficiency and availability can be made in conjunction with improvements in software quality and reliability through the use of formal methods: mathematical approaches to reasoning about software so that we can prove its correctness together with other desirable properties. We envision a complete media distribution system based on a distributed architecture, such as peer-to-peer networking, in which the different parts of the system have been formally modelled and verified. Starting with the network itself, we show how it can be formally constructed and modularised in the Event-B formalism, such that the modelling of one node is separated from the modelling of the network itself. We also show how the piece-selection algorithm in the BitTorrent peer-to-peer transfer protocol can be adapted for on-demand media streaming, and how this can be modelled in Event-B. Furthermore, we show how modelling a single peer in Event-B can give results similar to simulating an entire network of peers. Going further, we introduce a formal specification language for content-transfer algorithms and show that having such a language can make these algorithms easier to understand. We also show that generating Event-B code from this language can result in less complexity than creating the models from written specifications. Finally, we consider the decoding part of a media distribution system by showing how video decoding can be done in parallel, based on formally defined dependencies between frames and blocks in a video sequence; this step, too, can be performed in a way that is mathematically proven correct. Our modelling and proving in this thesis is largely tool-based, which demonstrates the progress of formal methods as well as their increased reliability, and thus advocates their more widespread use in the future.
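
As an informal illustration of the kind of adaptation of BitTorrent's piece selection for on-demand streaming that the abstract mentions (not the thesis's Event-B model), the sketch below prefers in-order pieces near the playback position and falls back to rarest-first elsewhere; the function and parameter names, and the window size, are assumptions made for this example.

```python
# Illustrative piece-selection policy for on-demand streaming: pieces close
# to the playback position are requested in order, and rarest-first is used
# outside that window. Hypothetical API, not the thesis's specification.
from typing import Dict, Set, Optional

def select_piece(have: Set[int],
                 peer_has: Set[int],
                 availability: Dict[int, int],
                 playback_pos: int,
                 window: int = 16) -> Optional[int]:
    """Return the next piece index to request from this peer, or None."""
    candidates = peer_has - have
    if not candidates:
        return None
    # 1. In-order within the playback window, to keep the player fed.
    urgent = [p for p in candidates if playback_pos <= p < playback_pos + window]
    if urgent:
        return min(urgent)
    # 2. Otherwise rarest-first, to keep piece availability balanced.
    return min(candidates, key=lambda p: availability.get(p, 0))

# Tiny usage example with made-up swarm state
have = {0, 1, 2}
peer_has = {3, 4, 10, 50, 51}
availability = {3: 5, 4: 2, 10: 7, 50: 1, 51: 9}
print(select_piece(have, peer_has, availability, playback_pos=3))  # -> 3
```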

Relevance:

20.00%

Publisher:

Abstract:

User experience is a crucial element in interactive storytelling, so it is important to recognise the different aspects of a positive user experience in an interactive story. Towards that goal, the first half of this thesis goes through the different elements that make up the user experience, with a strong focus on agency. Agency can be understood as the user's ability to affect the story, or the world in which the story is told, through interesting and satisfying choices. The freedoms granted by agency are not completely compatible with traditional storytelling, so we also go through some of the issues of agency-centric design philosophies and explore alternative schools of thought. The core purpose of this thesis is to determine the most important aspects of agency with regard to a positive user experience and to find ways for authors to improve the overall quality of the user experience in interactive stories. The latter half of the thesis deals with the research conducted on this matter, carried out by analysing data from an online survey together with data gathered by the interactive storytelling system made specifically for this research (Regicide). The most important parts of this research deal with influencing perceived agency and facilitating an illusion of agency in different ways, and with comparing user experiences across these different test environments. The most important findings include the importance of focused agency, of controlling the context and setting in which that agency takes place, and of ensuring user competency within an interactive storytelling system. Another essential conclusion concerns communication between the user and the system: the primary goal of influencing perceived agency should be to ensure that the user is aware of all the theoretical agency they possess.