948 results for Eclectic paradigm
Abstract:
Demands for functionality enhancements, cost reductions and power savings clearly point to the introduction of multi- and many-core platforms in real-time embedded systems. However, compared to uni-core platforms, many-cores face additional problems, namely the lack of scalable coherence mechanisms and the need to perform migrations. These problems have to be addressed before such systems can be considered for integration into the real-time embedded domain. We have devised several agreement protocols that solve some of the aforementioned issues. The protocols allow applications to plan and organise their future executions both temporally and spatially (i.e. when and where the next job will be executed). Decisions can be driven by several factors, e.g. load balancing, energy savings and thermal issues. All presented protocols are described analytically, with particular emphasis on their real-time behaviour and worst-case performance. The underlying assumptions are based on the multi-kernel model and the message-passing paradigm, which governs the communication between the interacting instances.
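The abstract does not detail the protocols themselves; purely as an illustration of a message-passing agreement round in the multi-kernel spirit, the Python sketch below has per-core kernels bid for the placement of the next job. All names and the scoring rule are assumptions for illustration, not the paper's actual design:

```python
import queue

class CoreKernel:
    def __init__(self, core_id, load, temperature):
        self.core_id = core_id
        self.load = load                  # current utilisation, 0..1
        self.temperature = temperature    # degrees C
        self.inbox = queue.Queue()        # message-passing channel (no shared state)

    def bid(self):
        # Lower score = better placement; mixes load balancing and thermal
        # concerns, two of the driving factors named in the abstract.
        return self.load + 0.01 * self.temperature

def agree_on_placement(initiator, cores, job_id):
    """One agreement round: request bids, collect replies, announce winner."""
    for core in cores:
        core.inbox.put(("REQUEST", job_id, initiator.core_id))
    bids = {core.core_id: core.bid() for core in cores}
    winner = min(bids, key=bids.get)   # the spatial decision: where the job runs
    for core in cores:
        core.inbox.put(("DECIDE", job_id, winner))
    return winner

cores = [CoreKernel(i, load=0.2 * i, temperature=40 + 5 * i) for i in range(4)]
print(agree_on_placement(cores[0], cores, job_id=17))  # lightest, coolest core wins
```

A real protocol of this kind would also have to bound the number and latency of exchanged messages so that the round itself admits a worst-case timing analysis.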
Abstract:
This paper presents part of a study that aimed to understand how the emergence of algebraic thinking takes place in a group of four-year-old children, as well as its relationship to the exploration of children's literature. To further deepen and guide this study, the following research questions were formulated: (1) How can children's literature help preschoolers identify patterns?; (2) What strategies and thinking processes do children use to create, analyze and generalize repeating and growing patterns?; (3) What strategies do children use to identify the unit of repeat of a pattern?; and (4) What factors influence the identification of patterns? The paper focuses only on the strategies and thinking processes that children use to create, analyze and generalize repeating patterns. The study was developed with a group of 14 preschoolers in a private school in Lisbon and was carried out with all of the children. A qualitative research methodology under the interpretive paradigm was chosen, emphasizing meanings and processes. The researcher took the dual role of teacher-researcher, conducting the study with her own group and in its own natural environment. Participant observation and document analysis (audio and video recordings, photos and children's productions) were used as data collection methods. Data collection took place from October 2013 to April 2014. The results of the study indicate that the children mastered the concept of repeating patterns: they were able to identify the unit of repeat and to create and analyze various repeating patterns, evolving from simpler to more complex forms.
Abstract:
There is no single definition of a long-term memory process. Such a process is generally defined as a series whose correlogram decays slowly, or whose spectrum is infinite at frequency zero. A series with this property is also said to be characterised by long-range dependence and non-periodic long cycles, or the feature is described through the correlation structure of the series at long lags, or it is conventionally expressed in terms of a power-law decay of the autocovariance function. The growing interest of international research in this topic is justified by the search for a better understanding of the dynamic nature of financial asset price time series. First, the lack of consistency among results calls for new studies and the use of several complementary methodologies. Second, the confirmation of long-memory processes has relevant implications for (1) theoretical and econometric modelling (i.e., martingale price models and technical trading rules), (2) statistical tests of equilibrium and pricing models, (3) optimal consumption/savings and portfolio decisions, and (4) the measurement of efficiency and rationality. Third, empirical scientific questions remain regarding the identification of the general theoretical market model best suited to modelling the diffusion of the series. Fourth, regulators and risk managers need to know whether there are persistent, and therefore inefficient, markets that may consequently produce abnormal returns. The aim of the dissertation's research is twofold. On the one hand, it intends to provide additional knowledge for the long-memory debate by examining the behaviour of the daily return series of the main EURONEXT stock indices. On the other hand, it intends to contribute to the improvement of the capital asset pricing model (CAPM) by considering an alternative risk measure capable of overcoming the constraints of the efficient market hypothesis (EMH) in the presence of financial series whose processes lack independent and identically distributed (i.i.d.) increments. The empirical study indicates that long-maturity treasury bonds (OTs) can be used as an alternative in computing market returns, given that their behaviour in sovereign debt markets reflects investors' confidence in the financial condition of states and measures how investors assess the respective economies based on the overall performance of their assets. Although the price diffusion model defined by geometric Brownian motion (gBm) purports to provide a good fit to financial time series, its assumptions of normality, stationarity and independence of the residual innovations are contradicted by the empirical data analysed. Therefore, in the search for evidence of the long-memory property in these markets, rescaled-range analysis (R/S) and detrended fluctuation analysis (DFA) are applied, under the fractional Brownian motion (fBm) approach, to estimate the Hurst exponent H for the complete data series and to compute the "local" Hurst exponent H_t over moving windows. In addition, statistical hypothesis tests are performed using the rescaled-range test (R/S), the modified rescaled-range test (M-R/S) and the fractional differencing (GPH) test.

In terms of a single conclusion from all the methods about the nature of dependence in the stock market in general, the empirical results are inconclusive. This means that the degree of long-term memory, and thus any classification, depends on each particular market. Nevertheless, the mostly positive overall results support the presence of long memory, in the form of persistence, in the stock returns of Belgium, the Netherlands and Portugal. This suggests that these markets are subject to greater predictability (the "Joseph effect"), but also to trends that may be unexpectedly interrupted by discontinuities (the "Noah effect"), and therefore tend to be riskier to trade. Although the evidence of fractal dynamics has weak statistical support, in line with most international studies it refutes the random-walk hypothesis with i.i.d. increments, which underlies the weak form of the EMH. Accordingly, contributions to the improvement of the CAPM are proposed through a new fractal capital market line (FCML) and a new fractal security market line (FSML). The proposal suggests that the risk element (for the market and for an asset) be given by the Hurst exponent H for long lags of stock returns. The exponent H measures the degree of long-term memory in stock indices, both when the return series follow an uncorrelated i.i.d. process described by gBm (where H = 0.5, confirming the EMH and making the CAPM adequate) and when they follow a process with statistical dependence described by fBm (where H differs from 0.5, rejecting the EMH and making the CAPM inadequate). The advantage of the FCML and the FSML is that the long-memory measure, defined by H, is the appropriate reference for expressing risk in models applicable to data series following i.i.d. processes as well as processes with nonlinear dependence. These formulations thus encompass the EMH as a possible particular case.
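As a worked illustration of the estimation step described above, the following minimal Python sketch computes the Hurst exponent H by classic rescaled-range (R/S) analysis; the window scheme is simplified, and the moving-window "local" H_t variant, the M-R/S correction and the GPH test are omitted:

```python
import numpy as np

def hurst_rs(returns, min_chunk=8):
    """Estimate the Hurst exponent H of a return series via R/S analysis."""
    returns = np.asarray(returns, dtype=float)
    n = len(returns)
    sizes, rs_means = [], []
    size = min_chunk
    while size <= n // 2:
        rs_vals = []
        for i in range(n // size):            # non-overlapping windows
            chunk = returns[i * size:(i + 1) * size]
            profile = np.cumsum(chunk - chunk.mean())  # cumulative deviations
            r = profile.max() - profile.min()          # range of the profile
            s = chunk.std(ddof=0)                      # window std. deviation
            if s > 0:
                rs_vals.append(r / s)
        sizes.append(size)
        rs_means.append(np.mean(rs_vals))
        size *= 2
    # log(R/S) ~ H * log(size) + c, so the slope of the fit estimates H
    h, _ = np.polyfit(np.log(sizes), np.log(rs_means), 1)
    return h

print(hurst_rs(np.random.normal(size=4096)))  # ~0.5 for an i.i.d. series
```

For a series with i.i.d. increments (the gBm / weak-form EMH case) the estimate should hover around H = 0.5; persistent series, such as those reported above for Belgium, the Netherlands and Portugal, would yield H > 0.5.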
Abstract:
When the Internet was born, the purpose was to interconnect computers to share digital data at a large scale. When embedded systems were born, on the other hand, the objective was to control system components under real-time constraints through sensing devices, typically at small to medium scales. With the great evolution of Information and Communication Technology (ICT), the tendency is towards ubiquitous and pervasive computing that controls everything (physical processes and physical objects), anytime and at a large scale. This new vision recently gave rise to the paradigm of Cyber-Physical Systems (CPS). In this position paper, we provide a realistic vision of the concept of the Cyber-Physical Internet (CPI), discuss its design requirements and present the limitations of current networking abstractions in fulfilling these requirements. We also debate whether it is more productive to adopt a system-integration approach or a radical design approach for building large-scale CPS. Finally, we present a sample of real-time challenges that must be considered in the design of the Cyber-Physical Internet.
Abstract:
Wireless sensor networks (WSNs) are one of today's most prominent instantiations of the ubiquitous computing paradigm. In order to achieve high levels of integration, WSNs need to be conceived considering requirements beyond the system's mere functionality. While Quality-of-Service (QoS) is traditionally associated with bit/data rate, network throughput, message delay and bit/packet error rate, we believe that this concept is too narrow, in the sense that these properties alone do not reflect the overall quality of service provided to the user/application. Other non-functional properties, such as scalability, security or energy sustainability, must also be considered in the system design. This paper identifies the most important non-functional properties that affect the overall quality of the service provided to the users, outlining their relevance, state of the art and future research directions.
Abstract:
"Cooperating objects" (COs) is a recently coined term used to signify the convergence of classical embedded computer systems, wireless sensor networks, and robotics and control. We present the essential elements of a reference architecture for scalable data processing under the CO paradigm.
Abstract:
The paper presents the central discourse of the knowledge-based society. As early as the 1960s, the debate on the industrial society raised the question of whether a paradigm shift towards a knowledge-based society could be observed. Some prominent authors had already foreseen 'knowledge' displacing 'labour' and 'capital' as the main driving forces of capitalist development. Today, at the political level and in many scientific disciplines, the assumption that we are already living in a knowledge-based society seems self-evident. Although we still lack a theory of the knowledge-based society and a methodological gap remains regarding the empirical indicators, the vision of a knowledge-based society shapes, at the very least, the perception of Western societies. In a first step, the author pinpoints the assumptions about the knowledge-based society on three levels: the societal, the organisational and the individual. These assumptions rest on the following topics: a) the role of information and communication technologies; b) the dynamic development of globalisation as an 'evolutionary' process; c) the increasing importance of knowledge management within organisations; d) the changing role of the state within economic processes. Not only the differentiation between the levels but also the revision of the assumptions of a knowledge-based society will show that the topics raised in the debates cannot be considered the result of a profound societal paradigm shift. What does seem striking, however, is the normative and virtual shift towards a concept of modernity that focuses strongly on the role of technology as a driving force, as well as on global economic markets, which has to be accepted. Therefore, according to the official debate, successfully adapting to these processes seems the only way to reach the knowledge-based society. Analysing the societal changes on these three levels, the label 'knowledge-based society' can be viewed critically. The key question posed by Theodor W. Adorno at the 16th Congress of Sociology in 1968 has therefore not lost its topicality: facing the societal changes of his time, he asked whether we are still living in the industrial society or already in a post-industrial one. Thinking about the knowledge-based society in terms of these two options would enrich the whole debate with regard to social inequality, processes of political and economic exclusion and, not least, the power relationships between social groups.
Abstract:
Wireless Sensor Networks (WSNs) are highly distributed systems in which resource allocation (bandwidth, memory) must be performed efficiently to provide a minimum acceptable Quality of Service (QoS) to the regions where critical events occur. In fact, if resources are statically assigned, independently of the location and instant of the events, these resources will certainly be misused. In other words, it is more efficient to dynamically grant more resources to the sensor nodes affected by critical events, thus providing better network resource management and reducing end-to-end delays in event notification and tracking. In this paper, we discuss the use of a WSN management architecture based on the active network management paradigm to provide real-time tracking and reporting of dynamic events while ensuring efficient resource utilization. The active network management paradigm allows packets to transport not only data but also program scripts that are executed in the nodes to dynamically modify the operation of the network. This presumes the use of a runtime execution environment (middleware) in each node to interpret the script, as sketched below. We consider hierarchical WSN topologies (e.g. cluster-tree, two-tiered architectures), since they have been used to improve the timing performance of WSNs by supporting deterministic medium access control protocols.
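To make the paradigm concrete, here is a minimal, hypothetical sketch of node middleware interpreting a script carried inside a packet. All names are illustrative, and for safety the "script" is reduced to a declarative settings update rather than arbitrary code (a real middleware would sandbox and validate it):

```python
import ast

class SensorNode:
    def __init__(self, node_id):
        self.node_id = node_id
        self.config = {"sampling_rate_hz": 1, "bandwidth_share": 0.1}

    def handle_packet(self, packet):
        # The data payload is processed as usual ...
        data = packet.get("data")
        # ... and an optional script reconfigures the node at runtime,
        # e.g. granting more resources where a critical event occurred.
        script = packet.get("script")
        if script:
            # literal_eval restricts the script to a plain dict of settings
            self.config.update(ast.literal_eval(script))
        return data

node = SensorNode("n7")
node.handle_packet({
    "data": {"temp": 81.5},
    "script": "{'sampling_rate_hz': 50, 'bandwidth_share': 0.6}",
})
print(node.config)  # the node now samples faster near the critical event
```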
Abstract:
The virtual enterprise paradigm seems a fitting response to market instability and the volatile nature of business opportunities, increasing enterprises' interest in such forms of networked organisations. The dynamic environment of a virtual enterprise requires that partners in the consortium own reconfigurable shop floors. This paper presents new approaches to shop floor control that meet the requirements of the new industrial paradigms and argues for work re-organization at the shop floor level.
Abstract:
Master's degree in Special Education: Multiple Disabilities and Cognition Problems
Abstract:
Dissertation presented to the Escola Superior de Educação de Lisboa to obtain the degree of Master in Mathematics Education in Preschool Education and in the 1st and 2nd Cycles of Basic Education
Abstract:
Final Internship Report presented to the Escola Superior de Dança to obtain the degree of Master in Dance Teaching.
Abstract:
The growing heterogeneity of networks, devices and consumption conditions calls for flexible and adaptive video coding solutions. The compression power of the HEVC standard and the benefits of the distributed video coding paradigm make it possible to design novel scalable coding solutions with improved error robustness and low encoding complexity, while still achieving competitive compression efficiency. In this context, this paper proposes a novel scalable video coding scheme using an HEVC Intra compliant base layer and a distributed coding approach in the enhancement layers (EL). This design inherits the HEVC compression efficiency while providing low encoding complexity at the enhancement layers. The temporal correlation is exploited at the decoder to create the EL side information (SI) residue, an estimation of the original residue. The EL encoder sends only the data that cannot be inferred at the decoder, thus exploiting the correlation between the original and SI residues; however, this correlation must be characterized with an accurate correlation model to obtain coding efficiency improvements. Therefore, this paper proposes a correlation modeling solution to be used at both the encoder and the decoder, without requiring a feedback channel. Experimental results confirm that the proposed scalable coding scheme has lower encoding complexity and provides BD-Rate savings of up to 3.43% in comparison with the HEVC Intra scalable extension under development. © 2014 IEEE.
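The paper's actual correlation model is not reproduced in the abstract; as a generic illustration of the kind of model commonly used in distributed video coding, the sketch below fits a zero-mean Laplacian to the correlation noise between the original and SI residues. A feedback-free decoder would have to estimate this without access to the original residue, e.g. from previously decoded data; all names here are illustrative:

```python
import numpy as np

def laplacian_alpha(original_residue, si_residue):
    """Fit a zero-mean Laplacian to the correlation noise between the
    original residue and its side-information (SI) estimate."""
    noise = np.asarray(original_residue, float) - np.asarray(si_residue, float)
    variance = noise.var()
    # For a zero-mean Laplacian pdf f(x) = (alpha/2) * exp(-alpha*|x|),
    # the variance is 2 / alpha^2, hence:
    return np.sqrt(2.0 / variance)

# Toy usage: the smaller the mismatch, the larger alpha (a tighter model)
# and the fewer bits the EL encoder needs to send.
rng = np.random.default_rng(0)
orig = rng.normal(size=1000)
si = orig + rng.laplace(scale=0.2, size=1000)  # SI = noisy estimate of orig
print(laplacian_alpha(orig, si))               # ~5.0 for scale 0.2
```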
Abstract:
Emergent architectures and paradigms targeting reconfigurable manufacturing systems increasingly rely on intelligent modules to maximize the robustness and responsiveness of modern installations. Although intelligent behaviour significantly reduces the occurrence of faults and breakdowns, it neither excludes them nor prevents equipment's normal wear. Adequate maintenance is fundamental to extending equipment's life cycle, and the ability of each intelligent device to take an active role in maintenance support is of major importance. Furthermore, this paradigm shift towards "embedded intelligence", supported by cross-platform technologies, induces relevant organizational and functional changes in local maintenance teams. The possibility of outsourcing maintenance activities with the guarantee of a timely response, through the use of pervasive networking technologies, on the one hand, and the optimization of local maintenance staff, on the other, are examples of how IT is changing the maintenance scenario. The concept of e-maintenance is, in this context, emerging as a new discipline with well-defined socio-economic challenges. This paper proposes a high-level maintenance architecture supporting the management of maintenance teams and offering contextualized operational support. All the functionalities hosted by the architecture are offered to the rest of the system as network services. Any intelligent module implementing the services' interface can report diagnostics, prognostics and maintenance recommendations, enabling the core of the platform to decide on the best course of action.
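The services' interface itself is not specified in the abstract; the following is a hypothetical Python sketch of the reporting contract an intelligent module might implement, with the platform core deciding on a course of action. All names and the decision rule are assumptions for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class MaintenanceReport:
    module_id: str
    diagnostics: list = field(default_factory=list)     # detected faults
    prognostics: list = field(default_factory=list)     # remaining-life estimates
    recommendations: list = field(default_factory=list)

class MaintenanceService:
    """Network-facing service: collects module reports, decides on action."""
    def __init__(self):
        self.reports = []

    def submit(self, report: MaintenanceReport) -> str:
        self.reports.append(report)
        # The platform core would rank actions here, e.g. dispatch the
        # local team or outsource, based on severity and response time.
        return "dispatch" if report.diagnostics else "log"

svc = MaintenanceService()
print(svc.submit(MaintenanceReport("press-04", diagnostics=["bearing wear"])))
```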
Abstract:
Master's degree in Occupational Safety and Hygiene