978 results for Memory-concentration Test
Abstract:
This paper presents a novel formalization for identifying the five-parameter model of a photovoltaic cell. An optimization procedure, cast as a feasibility problem, finds the parameters tuned at the open-circuit, maximum-power, and short-circuit points, yielding the data needed to plot the I-V curve. A comparison with experimental results is presented for two monocrystalline PV modules.
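As a hedged illustration of the standard single-diode five-parameter model the abstract refers to, the Python sketch below solves the implicit diode equation to trace an I-V curve once the five parameters (Iph, I0, n, Rs, Rsh) are known. The parameter values and the 60-cell module are assumptions for the example; the paper's optimization procedure is not reproduced.

```python
import numpy as np
from scipy.optimize import brentq

# Single-diode five-parameter model (values below are illustrative, not from the paper):
#   I = Iph - I0*(exp((V + I*Rs)/(n*Ns*Vt)) - 1) - (V + I*Rs)/Rsh
Iph, I0, n, Rs, Rsh = 8.0, 1e-9, 1.2, 0.2, 300.0   # photocurrent [A], saturation current [A], ideality, series/shunt resistances [ohm]
Ns, Vt = 60, 0.0257                                 # cells in series, thermal voltage [V]

def current(V):
    """Solve the implicit diode equation for the current I at terminal voltage V."""
    f = lambda I: (Iph - I0 * np.expm1((V + I * Rs) / (n * Ns * Vt))
                   - (V + I * Rs) / Rsh - I)
    return brentq(f, -2 * Iph, 2 * Iph)   # bracketed root search

# Trace the I-V curve from short circuit (V = 0) towards open circuit
for V in np.linspace(0.0, 38.0, 8):
    print(f"V = {V:5.1f} V  ->  I = {current(V):6.3f} A")
```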
Abstract:
The recent trend of chip architectures towards higher numbers of heterogeneous cores, with non-uniform memory and non-coherent caches, brings renewed attention to the use of Software Transactional Memory (STM) as a fundamental building block for developing parallel applications. Nevertheless, although STM promises to ease concurrent and parallel software development, it relies on the possibility of aborting conflicting transactions to maintain data consistency, which impacts the responsiveness and timing guarantees required by embedded real-time systems. In these systems, contention delays must be (efficiently) limited so that the response times of tasks executing transactions are upper-bounded and task sets can be feasibly scheduled. In this paper we assess the use of STM in the development of embedded real-time software, arguing that the amount of contention can be reduced if read-only transactions access recent consistent data snapshots, progressing in a wait-free manner. We show how the required number of versions of a shared object can be calculated for a set of tasks. We also outline an algorithm to manage conflicts between update transactions that prevents starvation.
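A minimal sketch of how a multi-version shared object can give read-only transactions access to recent consistent snapshots without blocking or aborting, in the spirit of the abstract. The class below, including its names and eviction policy, is an illustrative assumption rather than the paper's algorithm; max_versions stands in for the version count the paper derives from the task-set analysis.

```python
import threading
from collections import deque

class VersionedObject:
    """Keeps the last `max_versions` timestamped versions of a value. Writers
    serialize on a lock; read-only transactions only scan already-published
    versions, so they neither block writers nor abort."""

    def __init__(self, initial, max_versions):
        # In the paper, the number of retained versions would follow from the
        # task-set analysis; here it is simply a constructor argument.
        self._versions = deque([(0, initial)], maxlen=max_versions)
        self._lock = threading.Lock()
        self._clock = 0

    def write(self, value):
        with self._lock:
            self._clock += 1
            self._versions.append((self._clock, value))

    def read(self, start_ts):
        # Return the newest version no later than the reader's start timestamp.
        for ts, value in reversed(list(self._versions)):
            if ts <= start_ts:
                return value
        raise RuntimeError("snapshot evicted: too few versions for this task set")

obj = VersionedObject(initial=0, max_versions=3)
obj.write(10); obj.write(20)
print(obj.read(start_ts=1))   # sees version 1 -> 10
print(obj.read(start_ts=2))   # sees version 2 -> 20
```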
Abstract:
The usage of COTS-based multicores is becoming widespread in the field of embedded systems. Providing real-time guarantees at design time is a prerequisite for deploying real-time systems on these multicores. This necessitates considering the impact of contention for shared low-level hardware resources on the Worst-Case Execution Time (WCET) of the tasks. As a step towards this aim, this paper first identifies the different factors that make WCET analysis a challenging problem on a typical COTS-based multicore system. Then, we propose, and prove correct, a method to determine tight upper bounds on the WCET of tasks when they are co-scheduled on different cores.
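For intuition only: a common additive style of WCET bound under a round-robin-like bus arbiter charges each bus access of the task under analysis with at most one access from every other core. The sketch below expresses that generic bound under those stated assumptions; it is not the paper's (more refined) method, and all names are illustrative.

```python
# Additive WCET bound under an assumed round-robin-like arbiter (a generic
# textbook-style bound, not the method proposed in the paper).
def wcet_upper_bound(wcet_isolation, bus_accesses, cores, slot_len):
    """Execution time in isolation plus the worst-case arbitration delay
    charged to each of the task's own bus accesses."""
    per_access_delay = (cores - 1) * slot_len
    return wcet_isolation + bus_accesses * per_access_delay

print(wcet_upper_bound(wcet_isolation=2_000_000,  # cycles, task alone on the chip
                       bus_accesses=10_000,       # worst-case off-chip accesses
                       cores=4,
                       slot_len=40))              # worst-case cycles per bus access
```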
Abstract:
The current industry trend is towards using Commercial Off-The-Shelf (COTS) multicores for developing real-time embedded systems, as opposed to custom-made hardware. In a typical implementation of such COTS-based multicores, multiple cores access the main memory via a shared bus. This often leads to contention on this shared channel, which increases the response time of the tasks. Analyzing this increased response time under shared-bus contention is challenging on COTS-based systems, mainly because bus arbitration protocols are often undocumented and the exact instants at which the shared bus is accessed by tasks are not explicitly controlled by the operating system scheduler; they are instead a result of cache misses. This paper makes three contributions towards analyzing tasks scheduled on COTS-based multicores. Firstly, we describe a method to model the memory access patterns of a task. Secondly, we apply this model to analyze the worst-case response time of a set of tasks. Thirdly, although the parameters required to obtain the request profile can be derived by static analysis, we provide an alternative method to obtain them experimentally using performance monitoring counters (PMCs). We also compare our work against an existing approach and show that ours outperforms it by providing a tighter upper bound on the number of bus requests generated by a task.
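One hedged way to measure such parameters with PMCs on Linux is `perf stat`, using last-level-cache misses as a proxy for requests that reach the shared bus, as sketched below. The event names are platform-dependent assumptions, and the paper's exact counters and request-profile model may differ.

```python
import subprocess

def count_llc_misses(cmd):
    """Run `cmd` under `perf stat` and return the total last-level-cache
    misses, a common proxy for memory accesses that reach the shared bus."""
    events = "LLC-load-misses,LLC-store-misses"
    result = subprocess.run(
        ["perf", "stat", "-x", ",", "-e", events, "--"] + cmd,
        capture_output=True, text=True, check=True)
    total = 0
    for line in result.stderr.splitlines():   # with -x, perf emits CSV on stderr
        fields = line.split(",")
        if fields and fields[0].strip().isdigit():   # skips "<not supported>" etc.
            total += int(fields[0])
    return total

# Example (hypothetical workload binary):
# print(count_llc_misses(["./my_task"]))
```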
Abstract:
Master's degree in Accounting and Financial Analysis
Abstract:
There is no single definition of a long-memory process. Such a process is generally defined as a series whose correlogram decays slowly or whose spectrum is infinite at frequency zero. It is also said that a series with this property is characterized by long-range dependence and by non-periodic long cycles, that this characteristic describes the correlation structure of a series at long lags, or that it is conventionally expressed in terms of a power-law decay of the autocovariance function. The growing interest of international research in this topic is justified by the search for a better understanding of the dynamic nature of the time series of financial asset prices. First, the lack of consistency among existing results calls for new studies and for the use of several complementary methodologies. Second, the confirmation of long-memory processes has relevant implications for (1) theoretical and econometric modelling (i.e., martingale price models and technical trading rules), (2) statistical tests of equilibrium and pricing models, (3) optimal consumption/saving and portfolio decisions, and (4) the measurement of efficiency and rationality. Third, empirical scientific questions remain regarding the identification of the general theoretical market model best suited to modelling the diffusion of the series. Fourth, regulators and risk managers need to know whether there are persistent, and therefore inefficient, markets that may consequently produce abnormal returns. The aim of this dissertation is twofold. On the one hand, it seeks to provide additional knowledge for the long-memory debate by examining the behaviour of the daily return series of the main EURONEXT stock indices. On the other hand, it seeks to contribute to the improvement of the capital asset pricing model (CAPM) by considering an alternative risk measure capable of overcoming the constraints of the efficient market hypothesis (EMH) in the presence of financial series generated by processes without independent and identically distributed (i.i.d.) increments. The empirical study indicates that long-maturity treasury bonds (OTs) may be used as an alternative in the calculation of market returns, given that their behaviour in sovereign debt markets reflects investors' confidence in the financial condition of states and measures how investors evaluate the respective economies based on the performance of their assets as a whole. Although the price diffusion model defined by geometric Brownian motion (gBm) purports to provide a good fit to financial time series, its assumptions of normality, stationarity and independence of the residual innovations are contradicted by the empirical data analysed. Therefore, in the search for evidence of the long-memory property in these markets, rescaled-range analysis (R/S) and detrended fluctuation analysis (DFA) are employed, under the fractional Brownian motion (fBm) approach, to estimate the Hurst exponent H for the complete data series and to compute the "local" Hurst exponent H_t over moving windows. In addition, statistical hypothesis tests are carried out using the rescaled-range test (R/S), the modified rescaled-range test (M-R/S) and the fractional differencing test (GPH).
Regarding a single conclusion from all the methods about the nature of dependence in the stock market in general, the empirical results are inconclusive. That is, the degree of long memory, and hence any classification, depends on each particular market. Nevertheless, the mostly positive overall results support the presence of long memory, in the form of persistence, in the stock returns of Belgium, the Netherlands and Portugal. This suggests that these markets are subject to greater predictability (the "Joseph effect"), but also to trends that may be unexpectedly interrupted by discontinuities (the "Noah effect"), and therefore tend to be riskier to trade. Although the evidence of fractal dynamics has weak statistical support, in line with most international studies, it refutes the random-walk hypothesis with i.i.d. increments, which is the basis of the EMH in its weak form. In view of this, contributions to improving the CAPM are put forward through the proposal of a new fractal capital market line (FCML) and a new fractal security market line (FSML). The proposal suggests that the risk element (for the market and for an asset) be given by the Hurst exponent H for long lags of stock returns. The exponent H measures the degree of long memory in stock indices, both when the return series follow an uncorrelated i.i.d. process described by gBm (where H = 0.5, confirming the EMH and making the CAPM adequate) and when they follow a process with statistical dependence described by fBm (where H differs from 0.5, rejecting the EMH and rendering the CAPM inadequate). The advantage of the FCML and the FSML is that the long-memory measure defined by H is an adequate reference for expressing risk in models applicable to data series following i.i.d. processes as well as processes with non-linear dependence. These formulations thus encompass the EMH as one possible particular case.
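As a hedged illustration of the dissertation's core tool, the sketch below estimates the Hurst exponent H by classical rescaled-range (R/S) analysis; the thesis also applies DFA and the modified R/S and GPH tests, which are not reproduced here.

```python
import numpy as np

def hurst_rs(returns, min_chunk=8):
    """Classical rescaled-range (R/S) estimate of the Hurst exponent H:
    H is the slope of log(R/S) against log(window size)."""
    x = np.asarray(returns, dtype=float)
    sizes, rs_means = [], []
    n = min_chunk
    while n <= len(x) // 2:
        chunks = x[:len(x) // n * n].reshape(-1, n)
        dev = chunks - chunks.mean(axis=1, keepdims=True)
        z = dev.cumsum(axis=1)                  # cumulative deviation series
        r = z.max(axis=1) - z.min(axis=1)       # range
        s = chunks.std(axis=1, ddof=0)          # scale
        rs = r[s > 0] / s[s > 0]
        sizes.append(n); rs_means.append(rs.mean())
        n *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_means), 1)
    return slope

rng = np.random.default_rng(0)
print(hurst_rs(rng.standard_normal(4096)))   # i.i.d. noise: H close to 0.5
```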
Abstract:
Wireless sensor networks (WSNs) are emerging as underlying infrastructures for new classes of large-scale networked embedded systems. WSN system designers must, however, fulfill the quality-of-service (QoS) requirements imposed by the applications and users. Very harsh and dynamic physical environments and extremely limited energy, computing, memory and communication resources at each node are major obstacles to satisfying QoS metrics such as reliability, timeliness, and system lifetime. The limited communication range of WSN nodes, link asymmetry, and the characteristics of the physical environment lead to a major source of QoS degradation in WSNs: the "hidden-node problem." In wireless contention-based medium access control (MAC) protocols, when two nodes that are not visible to each other transmit to a third node that is visible to both, a collision occurs, called a hidden-node or blind collision. This problem greatly impacts network throughput, energy efficiency and message transfer delays, and it grows dramatically with the number of nodes. This paper proposes H-NAMe, a very simple yet extremely efficient hidden-node avoidance mechanism for WSNs. H-NAMe relies on a grouping strategy that splits each cluster of a WSN into disjoint groups of non-hidden nodes, and it scales to multiple clusters via a cluster-grouping strategy that guarantees no interference between overlapping clusters. Importantly, H-NAMe is instantiated in IEEE 802.15.4/ZigBee, currently the most widespread communication technologies for WSNs, with only minor add-ons and ensuring backward compatibility with the protocol standards. H-NAMe was implemented and exhaustively tested on an experimental test-bed based on off-the-shelf technology, showing that it increases network throughput and transmission success probability up to twice the values obtained without it. H-NAMe's effectiveness was also demonstrated in a target-tracking application with mobile robots over a WSN deployment.
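For intuition, a hedged sketch of the grouping idea: partition a cluster so that every pair of nodes in a group hears each other, eliminating intra-group hidden-node collisions. The greedy rule below is an illustrative assumption, not H-NAMe's actual group-join mechanism.

```python
def split_into_nonhidden_groups(nodes, hears):
    """`hears[a]` is the set of nodes that node `a` can sense (assumed
    symmetric here). Returns groups in which all members hear each other."""
    groups = []
    for node in nodes:
        for group in groups:
            if all(member in hears[node] for member in group):
                group.append(node)      # node hears every current member
                break
        else:
            groups.append([node])       # no compatible group: start a new one
    return groups

# Toy topology: A and B hear each other; C hears only B, so A and C are hidden
hears = {"A": {"B"}, "B": {"A", "C"}, "C": {"B"}}
print(split_into_nonhidden_groups(["A", "B", "C"], hears))  # [['A', 'B'], ['C']]
```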
Abstract:
Contention on the memory bus in COTS-based multicore systems is becoming a major determining factor in the execution time of a task. Analyzing this extra execution time is non-trivial because (i) bus arbitration protocols in such systems are often undocumented and (ii) the times at which the memory bus is requested are not explicitly controlled by the operating system scheduler; they are instead a result of cache misses. We present a method for finding an upper bound on the extra execution time of a task due to contention on the memory bus in COTS-based multicore systems. This method makes no assumptions about the bus arbitration protocol (other than that it is work-conserving).
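For intuition only: under a work-conserving arbiter, every cycle the task stalls, the bus is serving some other core's request, so the stall time cannot exceed the traffic the other cores can issue during the task's execution window. The sketch below expresses that simple bound with illustrative names; the paper's analysis is more involved.

```python
def extra_time_bound(other_core_requests, access_latency):
    """Upper bound on stall cycles under a work-conserving arbiter:
    other_core_requests lists the worst-case bus requests each other core
    can issue while the task runs; access_latency is the worst-case number
    of cycles a single bus access occupies the bus."""
    return sum(other_core_requests) * access_latency

print(extra_time_bound(other_core_requests=[5000, 5000, 5000],
                       access_latency=40))   # -> 600000 cycles
```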
Abstract:
The foreseen evolution of chip architectures towards higher numbers of heterogeneous cores, with non-uniform memory and non-coherent caches, brings renewed attention to the use of Software Transactional Memory (STM) as an alternative to lock-based synchronisation. However, STM relies on the possibility of aborting conflicting transactions to maintain data consistency, which impacts the responsiveness and timing guarantees required by real-time systems. In these systems, contention delays must be (efficiently) limited so that the response times of tasks executing transactions are upper-bounded and task sets can be feasibly scheduled. In this paper we defend the role of the transaction contention manager in reducing the number of transaction retries and in helping the real-time scheduler ensure schedulability. For this purpose, the contention management policy should be aware of on-line scheduling information.
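A hedged sketch of a scheduling-aware contention management policy of the kind the abstract advocates: on a conflict, abort the transaction of the less urgent task. Earliest absolute deadline is used here as an illustrative policy; the abstract does not commit to a specific rule.

```python
def resolve_conflict(tx_a, tx_b):
    """Each transaction carries its task's absolute deadline (on-line
    scheduling information). Returns the transaction that must abort and
    retry: the one whose task is less urgent."""
    return tx_a if tx_a["deadline"] > tx_b["deadline"] else tx_b

loser = resolve_conflict({"task": "T1", "deadline": 150},
                         {"task": "T2", "deadline": 90})
print(f"abort and retry: {loser['task']}")   # T1 (later deadline)
```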
Abstract:
Portugal has a marked aging tendency, with an elderly population (individuals over 65 years old) of 19.2% and an average life expectancy of 79.2 years. It is therefore important to maintain autonomy and independence for as long as possible. The concept of functional ability arises from the need to evaluate the capacity to conduct daily activities independently. It can be estimated with the 6-minute walk test (6MWT) and other validated tests. This test is simple, reliable and valid, and consists of a daily activity (walking). The goal of this study was to verify associations between functional capacity measured with two different instruments (the 6MWT and the Composite Physical Function (CPF) scale) and between those results and characterization variables.
Abstract:
Environment monitoring plays an important role in occupational exposure assessment. However, due to several factors, it is done with insufficient frequency and normally does not provide the information necessary to choose the most adequate safety measures to avoid or control exposure. Identifying all the tasks performed in each workplace and conducting a task-based exposure assessment helps refine the exposure characterization and reduce assessment errors. A task-based assessment can also provide a better evaluation of exposure variability than assessing personal exposures with continuous 8-hour time-weighted average measurements. Health effects related to particle exposure have mainly been investigated with mass-measuring instruments or gravimetric analysis. More recently, however, some studies support that size distribution and particle number concentration may have advantages over particle mass concentration for assessing the health effects of airborne particles. Several exposure assessments were performed in different occupational settings (a bakery, a grill house, the cork industry and a horse stable), applying these two resources: task-based exposure assessment and particle number concentration by size. The task-based approach made it possible to identify the tasks with higher exposure to the smallest particles (0.3 μm) in the different occupational settings. The data obtained allow a more concrete and effective risk assessment and the identification of priorities for safety investments.
Abstract:
Shape Memory Alloy (SMA) Ni-Ti films have attracted much interest as functional and smart materials due to their unique properties. However, important issues remain unresolved, such as the formation of film texture and its control, as well as substrate effects. The main challenge is thus not only the control of the microstructure, including stoichiometry and precipitates, but also the identification and control of the preferential orientation, since it is a crucial factor in determining the shape memory behaviour. The aim of this PhD thesis is to study the optimisation of the deposition conditions of Ni-Ti films so as to obtain material fully crystallized at the end of the deposition, and to establish a clear relationship between the substrates and texture development. To this end, a two-magnetron sputter deposition chamber was used, allowing the substrate to be heated and a bias voltage to be applied to it. The chamber can be mounted on the six-circle diffractometer of the Rossendorf Beamline (ROBL) at the European Synchrotron Radiation Facility (ESRF), Grenoble, France, enabling in-situ characterization of the films by X-ray diffraction (XRD) during their growth and annealing. The in-situ studies make it possible to identify the different steps of the structural evolution during deposition with a given set of parameters, as well as to evaluate the effect of changing parameters on the structural characteristics of the deposited film. Besides the in-situ studies, complementary ex-situ characterization techniques such as XRD at a laboratory source, Rutherford backscattering spectroscopy (RBS), Auger electron spectroscopy (AES), cross-sectional transmission electron microscopy (X-TEM), scanning electron microscopy (SEM), and electrical resistivity (ER) measurements during temperature cycling were used for a fine structural characterization. In this study, mainly naturally and thermally oxidized Si(100) substrates, TiN buffer layers of different thicknesses (the crystallographic orientation of the topmost TiN layer is thickness-dependent) and MgO(100) single crystals were used as substrates. The chosen experimental procedure led to a controlled composition and preferential orientation of the films. The type of substrate plays an important role in the texture of the sputtered Ni-Ti films and, according to the ER results, the distinct crystallographic orientations of the Ni-Ti films influence their phase transformation characteristics.
Abstract:
OBJECTIVE To evaluate the larvicidal activity of Azadirachta indica, Melaleuca alternifolia and Carapa guianensis essential oils and fermented extract of Carica papaya against Aedes aegypti (Linnaeus, 1762) (Diptera: Culicidae). METHODS The larvicide test was performed in triplicate with 300 larvae per experimental group, using third-stage larvae exposed for 24 h. The groups were: positive controls with industrial larvicide (BTI) at concentrations of 0.37 ppm (PC1) and 0.06 ppm (PC2); treated with compounds of essential oils and fermented extract at 50.0% concentration (G1), 25.0% concentration (G2) and 12.5% concentration (G3); and negative control groups using water (NC1) and dimethyl (NC2). The larvae were monitored every 60 min by direct visualization. RESULTS No mortality occurred in groups NC1 and NC2 over the 24-h exposure period, whereas there was 100% mortality in the PC1 and PC2 groups compared to NC1 and NC2. Mortality rates of 65.0%, 50.0% and 78.0% were observed in groups G1, G2 and G3, respectively, compared with NC1 and NC2. CONCLUSIONS The combination of the three essential oils from Azadirachta indica, Melaleuca alternifolia and Carapa guianensis with the fermented extract of Carica papaya was efficient at all concentrations. Therefore, it can be used in control programs targeting third-stage larvae of the Aedes aegypti Liverpool strain.
Abstract:
The ART-WiSe (Architecture for Real-Time communications in Wireless Sensor Networks) framework aims at the design of new communication architectures and mechanisms for time-sensitive Wireless Sensor Networks (WSNs). We adopted a two-tiered architecture in which an overlay Wireless Local Area Network (Tier 2) serves as a backbone for a WSN (Tier 1), relying on existing standard communication protocols and commercial off-the-shelf (COTS) technologies: IEEE 802.15.4/ZigBee for Tier 1 and IEEE 802.11 for Tier 2. In this context, a test-bed application is being developed for assessing, validating and demonstrating the ART-WiSe architecture. A pursuit-evasion application was chosen because it fulfils a number of requirements: it is feasible and appealing, and it imposes some stress on the architecture in terms of timeliness. To develop the test-bed on top of the aforementioned technologies, an implementation of the IEEE 802.15.4/ZigBee protocols is being carried out, since no open-source implementation is available to the community. This paper highlights some relevant aspects of the ART-WiSe architecture, provides some intuition on the protocol stack implementation and presents a general view of the envisaged test-bed application.
Abstract:
This report describes the development of a test-bed application for the ART-WiSe framework, with the aim of providing a means to assess, validate and demonstrate that architecture. The chosen application is a kind of pursuit-evasion game in which a remote-controlled robot, navigating through an area covered by a wireless sensor network (WSN), is detected and continuously tracked by the WSN. A centralized control station then takes the appropriate actions for a pursuit robot to chase and "capture" the intruder. This kind of application imposes stringent timing requirements on the underlying communication infrastructure. It also involves interesting research problems in WSNs, such as tracking, localization, cooperation between nodes, energy concerns and mobility. Additionally, it can easily be ported to a real-world application; surveillance and search-and-rescue operations are two examples where this kind of functionality can be applied. This is still a first approach to the test-bed application, and this development effort will be pushed forward until all the objectives envisaged for the ART-WiSe architecture are accomplished.