40 results for time study
Abstract:
The objective of this article is to contribute to the discussion of long-term memory, focusing on the behavior of the main Portuguese stock index. The first four moments are calculated using time windows of increasing size and sliding time windows of fixed size equal to 50 days, and suggest that daily returns are non-ergodic and non-stationary. Since the series appears to be best described by a fractional Brownian motion approach, we use rescaled-range analysis (R/S) and detrended fluctuation analysis (DFA). The findings indicate evidence of long-term memory in the form of persistence. This evidence of fractal structure suggests that the market is subject to greater predictability and contradicts the efficient market hypothesis in its weak form, raising issues for the theoretical modeling of asset pricing. In addition, we carried out a more localized (in time) study to identify the evolution of the degree of long-term dependence over time, using 200-day and 400-day windows. The results show a switching feature in the index, from persistent to anti-persistent, quite evident from 2010 onwards.
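As a rough illustration of the rescaled-range procedure mentioned above (a minimal sketch, not the authors' implementation; the data here is simulated), the Hurst exponent H is estimated as the slope of log(R/S) against log(window size), with H > 0.5 indicating persistence and H < 0.5 anti-persistence:

```python
# Minimal R/S (rescaled-range) sketch, assuming a plain NumPy environment.
import numpy as np

def hurst_rs(returns, window_sizes=(8, 16, 32, 64, 128)):
    rs_means = []
    for n in window_sizes:
        rs_vals = []
        for i in range(len(returns) // n):
            seg = returns[i * n:(i + 1) * n]
            dev = np.cumsum(seg - seg.mean())      # mean-adjusted cumulative deviations
            r = dev.max() - dev.min()              # range of the cumulative series
            s = seg.std(ddof=1)                    # standard deviation of the segment
            if s > 0:
                rs_vals.append(r / s)
        rs_means.append(np.mean(rs_vals))
    slope, _ = np.polyfit(np.log(window_sizes), np.log(rs_means), 1)
    return slope                                   # estimated Hurst exponent

rng = np.random.default_rng(0)
print(hurst_rs(rng.standard_normal(4000)))         # close to 0.5 for i.i.d. returns
```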
Abstract:
Prepared for presentation at the Portuguese Finance Network International Conference 2014, Vilamoura, Portugal, June 18-20
Abstract:
Nowadays, with technological advances emerging constantly, there are new areas that every organization must treat as an important focus, one of them being industrial robotics. Motivated to increase its output, improve working conditions for its employees, and improve the organization of its internal logistics, Grohe Portugal, and more specifically its assembly department, found it relevant to carry out a methods and time study, calculating the potential output gains from introducing robotics in the cartridge assembly lines. The main objectives were therefore to restructure the entire layout of these lines, focusing on automating one or more operations and thus achieving a significant improvement in the output of these lines with the shortest possible payback. This dissertation therefore presents the work carried out at Grohe Portugal, whose goal was to study and automate the cartridge assembly lines, as well as to improve some assembly lines taking ergonomic factors into account. Regarding the automation of the cartridge line, it was important to study all of the following aspects: the use of robotic units; ergonomics; productivity gains; automating or semi-automating operations; simplifying assembly processes; simplifying setups; requesting quotations; and drawing up a technical specification document. To carry out this project, the work was broken down into several stages, most notably: analysis and study of the assembly methods and sequences; survey of all the components and assembly operations required to obtain the final cartridge; time study of all of these assembly operations; characterization of a new layout for the lines with the introduction of the most suitable robotic units possible; preparation of a technical specification document to be sent to companies so that they could submit quotations and indicate the robotic units best suited to the intended tasks; and automation of the cartridge line. Regarding the design of new assembly lines taking ergonomic factors into account, in order to improve the supply systems and the operators' working conditions, several steps were taken, namely: identification of all the assembly processes performed by the operators on the line to be improved; study and definition of the arrangement of the components on the new line, as well as how they are supplied; 3D design of the new assembly line using the SolidWorks software; and practical construction of the line, following and assisting the tooling team. The final outcome of the work was very positive, both in the automation of the cartridge assembly lines, where the entire study was completed successfully, and in the ergonomic improvement of the assembly lines, with gains achieved in several quality indices, supply times, line organization, and lighting conditions, resulting in a positive assessment by the employees who work on them every day.
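As a small illustration of the time-study step mentioned above (a minimal sketch with purely hypothetical operation names, observed times, rating, and allowance factor, not the values used at Grohe Portugal), standard times are typically derived as average observed time x performance rating x (1 + allowances):

```python
# Minimal time-study sketch (illustrative values only).
observed_times = {               # hypothetical cartridge-assembly operations, seconds
    "place_body": [4.1, 4.3, 3.9, 4.2],
    "insert_seal": [2.6, 2.4, 2.7, 2.5],
    "press_fit_cap": [5.8, 6.1, 5.9, 6.0],
}
rating = 1.05        # assumed operator performance rating
allowances = 0.12    # assumed rest/delay allowance

cycle_time = 0.0
for op, times in observed_times.items():
    normal = (sum(times) / len(times)) * rating          # normal time
    standard = normal * (1 + allowances)                  # standard time
    cycle_time += standard
    print(f"{op}: standard time = {standard:.2f} s")

print(f"Estimated manual cycle time: {cycle_time:.2f} s")
```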
Abstract:
Graphics processor units (GPUs) today can be used for computations that go beyond graphics, and such use can attain a performance that is orders of magnitude greater than that of a normal processor. The software executing on a graphics processor is composed of a set of (often thousands of) threads which operate on different parts of the data and thereby jointly compute a result which is delivered to another thread executing on the main processor. Hence the response time of a thread executing on the main processor depends on the finishing time of the threads executing on the GPU. Therefore, we present a simple method for calculating an upper bound on the finishing time of threads executing on a GPU, in particular NVIDIA Fermi. Developing such a method is nontrivial because threads executing on a GPU share hardware resources at very fine granularity.
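For intuition only, the following sketch shows the kind of coarse bound one might compute under strong simplifying assumptions (blocks run to completion, a known worst-case execution time per block, greedy dispatch across streaming multiprocessors); it is not the paper's method, and the Fermi-like numbers are hypothetical:

```python
import math

# Coarse upper bound on a kernel's finishing time (illustrative only).
def finishing_time_upper_bound(num_blocks, wcet_block, num_sms, blocks_per_sm):
    slots = num_sms * blocks_per_sm            # blocks resident on the GPU at once
    waves = math.ceil(num_blocks / slots)      # sequential "waves" of blocks
    return waves * wcet_block                  # each wave bounded by one block's WCET

# Hypothetical Fermi-like configuration: 14 SMs, 8 resident blocks per SM.
print(finishing_time_upper_bound(num_blocks=1024, wcet_block=200_000,
                                 num_sms=14, blocks_per_sm=8))
```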
Abstract:
Simulation analysis is an important approach to developing and evaluating systems in terms of development time and cost. This paper demonstrates the application of the Time Division Cluster Scheduling (TDCS) tool for the configuration of IEEE 802.15.4/ZigBee beacon-enabled cluster-tree WSNs using simulation analysis, as an illustrative example that confirms the practical applicability of the tool. The simulation study analyses how the number of retransmissions impacts the reliability of data transmission, the energy consumption of the nodes, and the end-to-end communication delay, based on a simulation model implemented in the Opnet Modeler. The configuration parameters of the network are obtained directly from the TDCS tool. The simulation results show that the number of retransmissions impacts the reliability, the energy consumption and the end-to-end delay, in a way that improving one may degrade the others.
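The trade-off described above can be sketched in a few lines (an illustrative back-of-the-envelope model, not the TDCS/Opnet model; the loss probability, energy, and delay figures are assumptions): allowing more retransmissions raises per-hop reliability while increasing expected energy and delay.

```python
# Single-hop retransmission trade-off, assuming independent per-attempt loss.
def hop_metrics(p_loss, max_retx, tx_energy_mj=0.05, tx_delay_ms=15.0):
    attempts = max_retx + 1
    reliability = 1 - p_loss ** attempts                       # P(at least one attempt succeeds)
    expected_attempts = sum(p_loss ** k for k in range(attempts))
    return reliability, expected_attempts * tx_energy_mj, expected_attempts * tx_delay_ms

for r in range(4):
    rel, energy, delay = hop_metrics(p_loss=0.3, max_retx=r)
    print(f"R={r}: reliability={rel:.3f}, energy={energy:.3f} mJ, delay={delay:.1f} ms")
```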
Abstract:
Presented at the 23rd International Conference on Real-Time Networks and Systems (RTNS 2015), Main Track, 4-6 November 2015, Lille, France.
Abstract:
Dissertation submitted for the Master's degree in Accounting and Finance. Supervisor: Paulino Manuel Leite da Silva, MSc.
Abstract:
Pregnant women, as a result of their physiological and biomechanical changes, are a population at risk for pain or injuries of the musculoskeletal system, particularly in the lower limbs and spine. The objectives of this study were to evaluate: (i) foot pain and comfort during gait, without any insole in the pregnant women and in the control group, and with a rearfoot insole and with a full-length insole (in the pregnant women); (ii) the distribution of plantar pressures; and (iii) the ground reaction forces under the same experimental conditions. We also evaluated the duration of the different phases of the gait cycle in the pregnant women, with and without insoles, and in the control group without insoles. Our results showed that: (i) pregnant women take longer to complete the stance phase of gait; (ii) they report a significant increase in foot pain compared with the control group; (iii) pregnant women feel less pain and more comfort when walking with insoles, especially with the full-length insole; and (iv) the full-length insole redistributes forces, lowers pressure values, and increases the contact area between the foot and the ground. Our results suggest that the use of a full-length silicone insole during gait may be effective in improving pain symptoms and increasing comfort in pregnant women.
Abstract:
The use of distributed energy resources based on naturally intermittent power sources, such as wind generation, in power systems requires the development of new, adequate operation management and control methodologies. A short-term Energy Resource Management (ERM) methodology performed in two phases is proposed in this paper. The first phase addresses the day-ahead ERM scheduling and the second deals with the five-minute-ahead ERM scheduling. ERM scheduling is a complex optimization problem due to the large number of variables and constraints. In this paper the main goal is to minimize the operation costs from the point of view of a virtual power player that manages the network and the existing resources. The optimization problem is solved by a deterministic mixed-integer non-linear programming approach. A case study considering a distribution network with 33 buses, 66 distributed generation units, 32 loads with demand response contracts, 7 storage units, and 1000 electric vehicles has been implemented in a simulator developed in the scope of the presented work, in order to validate the proposed short-term ERM methodology considering the dynamic power system behavior.
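As a greatly simplified illustration of the scheduling idea (a linear toy problem, not the paper's mixed-integer non-linear formulation; resource names, costs, limits, and demand are all assumed), the day-ahead stage can be pictured as dispatching resources to meet demand at minimum cost:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical resources and demand (illustrative values only).
costs = np.array([0.04, 0.09, 0.15])            # EUR/kWh: wind, CHP, external supplier
p_max = np.array([80.0, 120.0, 200.0])          # kW limit per resource
demand = np.array([150.0, 180.0, 220.0, 160.0]) # kW per period

T, R = len(demand), len(costs)
c = np.tile(costs, T)                            # objective: total energy cost

# One balance constraint per period: resource outputs sum to demand.
A_eq = np.zeros((T, T * R))
for t in range(T):
    A_eq[t, t * R:(t + 1) * R] = 1.0

res = linprog(c, A_eq=A_eq, b_eq=demand,
              bounds=[(0, p_max[i % R]) for i in range(T * R)])
print(res.x.reshape(T, R))                       # optimal schedule per period and resource
```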
Abstract:
This paper proposes an energy resources management methodology based on three distinct time horizons: day-ahead scheduling, hour-ahead scheduling, and real-time scheduling. Each scheduling process requires updating the generation and consumption operating conditions and the status of the stationary and electric vehicle storage. Besides the new operating conditions, more accurate forecasts of wind generation and consumption are needed, obtained from short-term and very short-term forecasting methods. A case study considering a distribution network with intensive use of distributed generation and electric vehicles is presented.
Abstract:
In this paper we present VERITAS, a tool focused on verification, one of the most important processes in knowledge maintenance during the development of KBS. The verification and validation (V&V) process is part of a wider process denominated knowledge maintenance, in which an enterprise systematically gathers, organizes, shares, and analyzes knowledge to accomplish its goals and mission. The V&V process states whether the software requirements specifications have been correctly and completely fulfilled. The methodologies proposed in software engineering have proved to be inadequate for Knowledge Based Systems (KBS) validation and verification, since KBS present some particular characteristics. VERITAS is an automatic tool developed for KBS verification which is able to detect a large number of knowledge anomalies. It addresses many relevant aspects considered in real applications, like the usage of rule triggering selection mechanisms and temporal reasoning.
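To make the notion of knowledge anomalies concrete, the sketch below (a hedged illustration, not VERITAS itself; the rule base is invented) detects two simple anomaly types, duplicate rules and circular rule chains, for rules written as (set of premises, conclusion):

```python
# Toy rule base: (premises, conclusion).
rules = [
    ({"a", "b"}, "c"),
    ({"c"}, "d"),
    ({"d"}, "a"),          # closes a cycle a -> c -> d -> a
    ({"a", "b"}, "c"),     # duplicate of the first rule
]

# Duplicate rules: same premises and same conclusion.
seen, duplicates = set(), []
for premises, conclusion in rules:
    key = (frozenset(premises), conclusion)
    if key in seen:
        duplicates.append(key)
    else:
        seen.add(key)

# Circular chains: follow "premise fact -> concluded fact" edges depth-first.
edges = {}
for premises, conclusion in rules:
    for p in premises:
        edges.setdefault(p, set()).add(conclusion)

def reaches(start, target, visited=None):
    visited = visited or set()
    for nxt in edges.get(start, ()):
        if nxt == target or (nxt not in visited and reaches(nxt, target, visited | {nxt})):
            return True
    return False

cycles = sorted(fact for fact in edges if reaches(fact, fact))
print("duplicate rules:", duplicates)
print("facts on a circular chain:", cycles)
```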
Abstract:
In competitive electricity markets with deep concerns for the efficiency level, demand response programs gain considerable significance. As demand response levels have decreased after the introduction of competition in the power industry, new approaches are required to take full advantage of demand response opportunities. This paper presents DemSi, a demand response simulator that allows studying demand response actions and schemes in distribution networks. It undertakes the technical validation of the solution using realistic network simulation based on PSCAD. The use of DemSi by a retailer in a situation of energy shortage is presented. Load reduction is obtained using a consumer-based price elasticity approach supported by real-time pricing. Non-linear programming is used to maximize the retailer's profit, determining the optimal solution for each envisaged load reduction. The solution determines the price variations considering two different approaches, price variations determined for each individual consumer or for each consumer type, showing that the approach used does not significantly influence the retailer's profit. The paper presents a case study in a 33-bus distribution network with 5 distinct consumer types. The obtained results and conclusions show the adequacy of the used methodology and its importance for supporting retailers' decision making.
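The price-elasticity mechanism mentioned above can be illustrated as follows (a minimal sketch with assumed consumer types, elasticities, and prices, not DemSi's model): the relative load change is the elasticity times the relative price change.

```python
# Elasticity-based load reduction from a real-time price increase (illustrative).
consumers = {
    #  type         baseline_kW  elasticity  base_price  new_price (EUR/kWh)
    "domestic":    (4.0,        -0.20,       0.15,       0.22),
    "commerce":    (25.0,       -0.35,       0.14,       0.20),
    "industrial":  (120.0,      -0.50,       0.11,       0.16),
}

total_reduction = 0.0
for name, (load, elasticity, p0, p1) in consumers.items():
    delta_load = load * elasticity * (p1 - p0) / p0        # linear elasticity model
    total_reduction += -delta_load
    print(f"{name}: {-delta_load:.2f} kW reduction")
print(f"total load reduction: {total_reduction:.2f} kW")
```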
Abstract:
Reaching a decision when many users meet in the same place is difficult, and solving a problem can take a long time because of the diversity of opinions. TAmI (Group Decision Making Toolkit) is a system for group decision making in Ambient Intelligence [1]. The system is composed of IGATA [2], WebMeeting, and the related database system. However, because the IP address and password are sent without any encryption, they are exposed to attackers, who can use them for malicious purposes. As a result, even if an attacker manipulates the outcome, the participating members cannot detect it. Therefore, in this paper we study how to apply user authentication to TAmI.
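One basic building block for such authentication is to never store or compare plain-text passwords; the sketch below (a hedged illustration, not TAmI's actual mechanism) stores a salted password hash and verifies a login attempt against it:

```python
import hashlib, hmac, secrets

def make_record(password: str) -> dict:
    """Create a salted PBKDF2 hash of the password for storage."""
    salt = secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return {"salt": salt, "hash": digest}

def verify(password: str, record: dict) -> bool:
    """Re-hash the attempt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), record["salt"], 100_000)
    return hmac.compare_digest(candidate, record["hash"])

record = make_record("meeting-2024!")
print(verify("meeting-2024!", record))   # True
print(verify("wrong-guess", record))     # False
```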
Abstract:
Introduction: A major focus of the data mining process, especially machine learning research, is to automatically learn to recognize complex patterns and help make adequate decisions based strictly on the acquired data. Since imaging techniques like MPI (Myocardial Perfusion Imaging in Nuclear Cardiology) can take up a large part of the daily workflow and generate gigabytes of data, computerized analysis may offer advantages over human analysis: shorter time, homogeneity and consistency, automatic recording of analysis results, relatively low cost, etc. Objectives: The aim of this study was to evaluate the efficacy of this methodology in the assessment of MPI stress studies and in deciding whether or not to continue the evaluation of each patient. The objective was to automatically classify each patient test into one of three groups: "Positive", "Negative" and "Indeterminate". "Positive" cases would proceed directly to the rest part of the exam, "Negative" cases would be exempted from continuation, and only the "Indeterminate" group would require clinician analysis, thus saving clinician effort, increasing workflow fluidity at the technologist level, and probably saving patients time. Methods: The WEKA v3.6.2 open-source software was used for a comparative analysis of three WEKA algorithms ("OneR", "J48" and "Naïve Bayes") in a retrospective study of the "SPECT Heart Dataset", available at the University of California, Irvine Machine Learning Repository, using the corresponding clinical results, signed off by expert nuclear cardiologists, as the reference. For evaluation purposes, criteria such as "Precision", "Incorrectly Classified Instances" and "Receiver Operating Characteristics (ROC) Areas" were considered. Results: The interpretation of the data suggests that the Naïve Bayes algorithm has the best performance among the three selected algorithms. Conclusions: It is believed, and apparently supported by the findings, that machine learning algorithms could significantly assist, at an intermediate level, in the analysis of scintigraphic data obtained in MPI, namely after stress acquisition, potentially increasing the efficiency of the entire system and easing the roles of both technologists and nuclear cardiologists. In the continuation of this study, it is planned to use more patient information and to significantly increase the study population in order to improve system accuracy.
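The triage idea described above, classifying as "Positive", "Negative" or deferring to an "Indeterminate" group, can be sketched as follows (a hedged illustration using scikit-learn instead of WEKA and synthetic data instead of the UCI SPECT set; the confidence thresholds are assumptions):

```python
import numpy as np
from sklearn.naive_bayes import BernoulliNB
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the SPECT data: 22 binary partial-diagnosis features.
rng = np.random.default_rng(42)
X = rng.integers(0, 2, size=(267, 22))
y = (X[:, :8].sum(axis=1) + rng.normal(0, 1, 267) > 4).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = BernoulliNB().fit(X_tr, y_tr)

# Route low-confidence predictions to "Indeterminate" for clinician review.
proba = model.predict_proba(X_te)[:, 1]
decision = np.where(proba > 0.8, "Positive",
                    np.where(proba < 0.2, "Negative", "Indeterminate"))
for group in ("Positive", "Negative", "Indeterminate"):
    print(group, int((decision == group).sum()))
```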
Abstract:
Introduction: Although relative uptake values are not the most important objective of a 99mTc-DMSA scan, they are important quantitative information. In most dynamic renal scintigraphies, attenuation correction is essential if one wants to obtain a reliable result from the quantification process. In DMSA scans, however, the absence of significant background and the lower attenuation in pediatric patients mean that such attenuation correction techniques are usually not applied. The geometric mean is the most common method, but it requires the acquisition of an extra anterior projection, which a large number of NM departments do not acquire. This method and the attenuation factors proposed by Tonnesen will be correlated with the absence of attenuation correction procedures. Material and Methods: Images from 20 individuals (aged 3 years +/- 2) were used and the two attenuation correction methods applied. The mean acquisition time (time post DMSA administration) was 3.5 hours +/- 0.8 h. Results: The absence of attenuation correction showed a good correlation with both attenuation methods (r=0.73 +/- 0.11), and the mean difference in uptake values between the different methods was 4 +/- 3. The correlation was higher at lower ages. The correlation between the two attenuation correction methods was higher than their correlation with the "no attenuation correction" method (r=0.82 +/- 0.8), and the mean difference in uptake values was 2 +/- 2. Conclusion: The decision not to apply any attenuation correction can be justified by the minor differences observed in the relative kidney uptake values. Nevertheless, if an accurate value of the relative kidney uptake is needed, then an attenuation correction method should be used. The attenuation correction factors proposed by Tonnesen are easy to implement and thus become a practical alternative, namely when the anterior projection needed for the geometric mean methodology is not acquired.
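For reference, the geometric mean method mentioned above combines anterior and posterior counts per kidney before computing the relative uptake; the sketch below uses purely hypothetical, background-subtracted counts:

```python
import math

counts = {
    #          anterior  posterior  (hypothetical background-subtracted counts)
    "left":   (5200,     6800),
    "right":  (4300,     6100),
}

gm = {k: math.sqrt(a * p) for k, (a, p) in counts.items()}   # geometric mean per kidney
total = sum(gm.values())
for kidney, value in gm.items():
    print(f"{kidney} kidney relative uptake: {100 * value / total:.1f}%")
```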