7 results for Continuous-time Markov Process
at Universidade Federal do Rio Grande do Norte (UFRN)
Abstract:
The thesis presents body poetry and its inscription in myth and Butô dance. The argument highlights the sensitive dimension present in these itineraries as a possibility for enabling the emergence of a knowledge inscribed in the body, bringing a kind of rationality that links the fragments, that allows knowledge to break through the barriers of disciplinary isolation, that abandons certainties and follows the paths of creation, and that gives the body new space and time, featuring epistemological, ethical, and aesthetic elements that can make a sensitive education possible. Throughout this path, we understand sensitive education as an education that considers the relinking of logical, analogical, symbolic, and artistic knowledge and therefore reconsiders the very act of knowing as a continuous, unfinished process. Sensitive education is also understood as a recovery of bodily experience and its sensitive nature, as well as of its role in giving meaning to our reading of the world. It includes the body's memory, history, and creativity, opening it to innovation, change, the amplification of the senses, and dialogue with other bodies and with the world, since the body exists within them. This is an investigation of a phenomenological nature that brings philosophy and art into dialogue, pointing out the implications of this reflection for studies of the body and education. We find it necessary to attend to the language of the body, which allows one to think through movement and to articulate a thought that arises from the joints, the guts, and the whole body. This incarnate reason gives rise to expressive bodily action, which makes us move in order to mean, to communicate, to inaugurate senses. Among these senses, we present a possible approach that brings elements of Butô dance into the teaching of physical education, as a way of sensitive education expressed through body poetry.
Abstract:
In recent years, the Brazilian construction industry has undergone changes such as currency stabilization, increasing competition, a shortage of skilled labor, and growing quality demands from customers, which have led companies in the sector to seek solutions in new management practices in order to become more efficient. One such alternative is known as Lean Construction, which is derived from the Toyota Production System. Lean Construction's main goals are to reduce the share of activities that do not add value, increase product value by considering customer needs, reduce variability and production cycle time, simplify processes by reducing the number of parts or steps, increase flexibility in product execution and process transparency, focus control on the overall process, introduce continuous improvement, maintain a balance between improvements in flows and conversions, and learn from practices adopted by competitors. However, the construction industry is characterized by nomadic activity that delivers a unique product with a high production cost, and by great inertia toward behavioral change, which makes it difficult to implement the Lean Construction philosophy in companies. In this sense, the main objective of this study is to develop a methodology for implementing the principles of Lean Construction. The proposed implementation method was designed with the aid of the 5W2H tool, and the implementation process is divided into three phases, as sketched below. The first aims to understand, at a macro level, the current operation of the company, identify its target audience, and determine which products and services it offers to the market. The second phase describes which actions should be taken and which documents need to be created or modified. Finally, the third phase establishes how to control and monitor the established processes: through strategic planning, the company's goals are set along with their respective targets and indicators in order to keep the system working, aiming at continuous improvement with a focus on the customer. The methodology was evaluated in a case study of a medium-sized construction company with more than 18 years of activity, certified for almost 10 years under ISO 9001 and at level A of PBQP-H. We also conclude that this implementation process can be used by any developer and/or builder.
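To make the 5W2H structure concrete, the sketch below (Python, with entirely hypothetical content; the study itself does not prescribe any code) shows how one action of the second phase could be recorded so that every action answers What, Why, Where, When, Who, How, and How much:

```python
# A minimal, hypothetical 5W2H action-plan record for one phase-2 action.
action = {
    "phase": 2,                                            # second implementation phase
    "what":     "Standardize the concrete-pouring checklist",
    "why":      "Reduce rework (activity that adds no value)",
    "where":    "All active construction sites",
    "when":     "Within 60 days",
    "who":      "Quality manager with the site engineers",
    "how":      "Revise the ISO 9001 procedure and train the crews",
    "how_much": "Training hours only; no capital expenditure",
}

for question, answer in action.items():
    print(f"{question:>8}: {answer}")
```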
Abstract:
This study uses a computational model that combines the statistical characteristics of the wind with the reliability characteristics of a wind turbine, such as failure and repair rates, representing the wind farm as a Markov process in order to estimate the annual energy generated and compare it with a real case. The model can also be used in reliability studies and provides performance indicators that help in analyzing the feasibility of setting up a wind farm, provided that the power curve is known and wind speed measurements are available. To validate the model, simulations were run using the database of the PETROBRAS wind farm in Macau. The results were very close to the real ones, confirming that the model successfully reproduces the behavior of all the components involved. Finally, the results of this model were compared with the annual energy estimated by modeling the wind distribution with a Weibull statistical distribution.
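As a rough illustration of the ingredients the abstract names, the sketch below combines a two-state (up/down) Markov availability model for a single turbine with Weibull-distributed hourly wind speeds to estimate annual energy. All rates, the power curve, and the Weibull parameters are hypothetical placeholders, not values from the thesis:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical parameters (illustrative only, not taken from the thesis).
FAIL_RATE    = 1.0 / 1900.0  # failures per hour (mean time to failure ~1900 h)
REPAIR_RATE  = 1.0 / 80.0    # repairs per hour (mean time to repair ~80 h)
SHAPE, SCALE = 2.0, 8.5      # Weibull shape and scale [m/s] for the site wind
HOURS = 8760                 # one year of hourly steps

def power_curve(v):
    """Simplified power curve [kW]: cut-in 3 m/s, rated at 12 m/s, cut-out 25 m/s."""
    rated = 1500.0
    return np.where(v < 3.0, 0.0,
           np.where(v < 12.0, rated * ((v - 3.0) / 9.0) ** 3,
           np.where(v < 25.0, rated, 0.0)))

# Two-state Markov chain (1 = available, 0 = under repair), sampled hourly;
# exponential holding times give these one-hour transition probabilities.
p_fail   = 1.0 - np.exp(-FAIL_RATE)
p_repair = 1.0 - np.exp(-REPAIR_RATE)

state, up = 1, np.empty(HOURS, dtype=bool)
for t in range(HOURS):
    if state == 1 and rng.random() < p_fail:
        state = 0
    elif state == 0 and rng.random() < p_repair:
        state = 1
    up[t] = state == 1

wind = SCALE * rng.weibull(SHAPE, HOURS)            # hourly wind speeds
energy_kwh = float(np.sum(power_curve(wind) * up))  # produces only while up

print(f"availability ~ {up.mean():.3f}, annual energy ~ {energy_kwh / 1e3:.1f} MWh")
```

The same run yields the availability indicator directly, and replacing the sampled wind with a fitted Weibull density is what enables the comparison mentioned at the end of the abstract.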
Abstract:
The seismic method is of extreme importance in geophysics. Mainly associated with oil exploration, this line of research concentrates most of the investment in the area. Acquisition, processing, and interpretation of seismic data are the stages that make up a seismic study; seismic processing in particular is focused on imaging, that is, on producing images that represent the geological structures in the subsurface. Seismic processing has evolved significantly in recent decades, driven by the demands of the oil industry and by hardware advances that delivered greater storage and digital processing capacity, enabling the development of more sophisticated processing algorithms such as those that use parallel architectures. One of the most important steps in seismic processing is imaging. Migration of seismic data is one of the techniques used for imaging, with the goal of obtaining a seismic section image that represents the geological structures as accurately and faithfully as possible. The result of migration is a 2D or 3D image in which it is possible to identify faults and salt domes, among other structures of interest such as potential hydrocarbon reservoirs. However, a migration performed with quality and accuracy can be very time consuming, due to the heuristics of the mathematical algorithm and the extensive amount of input and output data involved, and may take days, weeks, or even months of uninterrupted execution on supercomputers, representing large computational and financial costs that could make these methods unviable. Aiming at performance improvement, this work parallelized the core of a Reverse Time Migration (RTM) algorithm using the Open Multi-Processing (OpenMP) parallel programming model, given the large computational effort required by this migration technique. Furthermore, speedup and efficiency analyses were performed and, ultimately, the degree of algorithmic scalability was identified with respect to the technological advances expected from future processors.
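The computational core that RTM spends most of its time in is an explicit finite-difference propagation of the wave equation, run forward for the source wavefield and backward in time for the recorded data. The sketch below shows that stencil for the 2D acoustic case in NumPy, a simplified stand-in for the compiled loop nest that the thesis annotates with OpenMP directives; the grid size, velocity, and step sizes are hypothetical:

```python
import numpy as np

def step_wave(p_prev, p_curr, vel2_dt2, dx2):
    """One explicit finite-difference time step of the 2D acoustic wave
    equation -- the stencil at the heart of RTM propagation. In a compiled
    kernel, the grid loop behind this slicing is the region one would mark
    with '#pragma omp parallel for'."""
    lap = np.zeros_like(p_curr)
    lap[1:-1, 1:-1] = (
        p_curr[2:, 1:-1] + p_curr[:-2, 1:-1] +
        p_curr[1:-1, 2:] + p_curr[1:-1, :-2] -
        4.0 * p_curr[1:-1, 1:-1]
    ) / dx2
    # p(t+dt) = 2 p(t) - p(t-dt) + v^2 dt^2 * Laplacian(p)
    return 2.0 * p_curr - p_prev + vel2_dt2 * lap

# Illustrative run: constant-velocity model, impulsive source in the middle.
nz, nx, dx, dt, v = 200, 200, 10.0, 1e-3, 2000.0   # hypothetical values
p_prev, p_curr = np.zeros((nz, nx)), np.zeros((nz, nx))
p_curr[nz // 2, nx // 2] = 1.0
for _ in range(300):
    p_prev, p_curr = p_curr, step_wave(p_prev, p_curr, (v * dt) ** 2, dx ** 2)
print("wavefield energy:", float(np.sum(p_curr ** 2)))
```

Each time step depends only on the two previous wavefields, so the cells of one step can be updated independently, which is exactly what makes this kernel a natural target for OpenMP loop parallelism.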
Abstract:
Through the adoption of the software product line (SPL) approach, several benefits are achieved compared with conventional development processes, which are based on building a single software system at a time. Developing an SPL differs from traditional software construction in that it has two essential phases: domain engineering, when the common and variable elements of the SPL are defined and implemented; and application engineering, when one or more applications (specific products) are derived by reusing the artifacts created in domain engineering. The testing activity is also fundamental and aims to detect defects in the artifacts produced during SPL development. However, the characteristics of an SPL bring new challenges to this activity that must be considered. Several approaches have recently been proposed for the product line testing process, but they have proved limited and provide only general guidelines. In addition, there is a lack of tools to support variability management and the customization of automated test cases for SPLs. In this context, this dissertation proposes a systematic approach to software product line testing. The approach offers: (i) automated SPL test strategies to be applied in domain and application engineering; (ii) explicit guidelines to support the implementation and reuse of automated test cases at the unit, integration, and system levels in domain and application engineering; and (iii) tool support for automating variability management and the customization of test cases. The approach is evaluated through its application to a software product line for web systems. The results show that the proposed approach can help developers deal with the challenges imposed by the characteristics of SPLs during the testing process.
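To give a flavor of what customizing automated test cases by variability can mean in practice, the sketch below (plain Python unittest; a hypothetical illustration, not the dissertation's tooling) binds each test case to a feature, so that deriving a product configuration in application engineering automatically selects which domain-engineering tests run:

```python
import unittest

# Hypothetical product configuration derived in application engineering.
PRODUCT_CONFIG = {"search": True, "shopping_cart": False}

def requires_feature(name):
    """Skip a test case when the derived product does not select the feature."""
    return unittest.skipUnless(
        PRODUCT_CONFIG.get(name, False),
        f"feature '{name}' not selected in this product")

class CatalogTests(unittest.TestCase):
    def test_common_core(self):
        # Commonality: this case runs for every product of the line.
        self.assertEqual("abc".upper(), "ABC")

    @requires_feature("search")
    def test_search_feature(self):
        # Variability: only products with 'search' exercise this case.
        self.assertEqual("abc".find("b"), 1)

    @requires_feature("shopping_cart")
    def test_cart_feature(self):
        # Skipped for this configuration, since 'shopping_cart' is absent.
        self.fail("should never run for this product")

if __name__ == "__main__":
    unittest.main()
```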
Abstract:
In the work reported here, we present theoretical and numerical results for a risk model with interest rate and proportional reinsurance, based on the article "Inequalities for the ruin probability in a controlled discrete-time risk process" by Rosário Romera and Maikol Diasparra (see [5]). Recursive and integral equations, as well as upper bounds for the ruin probability, are given under three different approaches, namely the classical Lundberg inequality, an inductive approach, and a martingale approach. Non-parametric density estimation techniques are used to derive upper bounds for the ruin probability, and the algorithms used in the simulation are presented.
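As a point of reference for the bounds the abstract mentions, the ruin probability of such a model can also be estimated by straightforward Monte Carlo. The sketch below simulates a discrete-time surplus with interest and proportional reinsurance of the general form X_{n+1} = (1 + r) X_n + c(b) - b Y_{n+1}; the loadings, claim distribution, and retention level are hypothetical choices, not the parameters of [5]:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters (illustrative only, not the values used in [5]).
R      = 0.02   # interest rate per period
B      = 0.7    # retention level: insurer keeps 70% of each claim
THETA  = 0.3    # insurer's premium loading
XI     = 0.4    # reinsurer's loading (xi > theta, so ceding risk costs money)
MEAN_Y = 1.0    # mean claim size (exponential claims assumed)
X0     = 5.0    # initial surplus
N      = 200    # time horizon in periods
PATHS  = 200_000

# Premium retained per period: gross premium minus the reinsurance premium.
c_b = (1.0 + THETA) * MEAN_Y - (1.0 + XI) * (1.0 - B) * MEAN_Y

x = np.full(PATHS, X0)
alive = np.ones(PATHS, dtype=bool)           # paths not yet ruined
for _ in range(N):
    y = rng.exponential(MEAN_Y, PATHS)       # this period's claims
    x[alive] = (1.0 + R) * x[alive] + c_b - B * y[alive]  # surplus recursion
    alive &= x >= 0.0                        # ruin when the surplus goes negative

print(f"Monte Carlo ruin probability over {N} periods: {1.0 - alive.mean():.4f}")
```

An analytical upper bound such as the Lundberg-type inequality should dominate this empirical estimate; comparing the two is a simple sanity check on both the bound and the simulation.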
Abstract:
The great amount of data generated as a result of automation and process supervision in industry implies two problems: a large demand for disk storage and the difficulty of streaming the data through a telecommunications link. Lossy data compression algorithms emerged in the 1990s with the goal of solving these problems and, as a consequence, industry started to use them in supervision systems to compress data in real time. These algorithms were designed to eliminate redundant and undesired information in an efficient and simple way. However, their parameters must be set for each process variable, which makes configuration impracticable in systems that monitor thousands of variables. In this context, this work proposes the Adaptive Swinging Door Trending algorithm, an adaptation of Swinging Door Trending in which the main parameters are adjusted dynamically by analyzing the signal's trend in real time. A comparative performance analysis of lossy compression algorithms applied to time series of process variables and to dynamometer cards is also presented. The algorithms used in the comparison were piecewise linear methods and transform-based methods.
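For readers unfamiliar with the base algorithm, the sketch below implements classic Swinging Door Trending with a fixed compression deviation. The dissertation's adaptive variant adjusts this deviation online from the observed trend, which is not reproduced here; this is a minimal sketch of the fixed-parameter baseline only:

```python
import numpy as np

def sdt_compress(times, values, dev):
    """Classic Swinging Door Trending: archive a sample whenever the incoming
    point no longer fits inside the 'door' (a corridor of half-width `dev`
    around a line from the last archived point)."""
    archived = [(times[0], values[0])]
    t0, v0 = times[0], values[0]
    slope_up, slope_low = -float("inf"), float("inf")
    prev = (times[0], values[0])
    for t, v in zip(times[1:], values[1:]):
        dt = t - t0
        # Tightest slope bounds consistent with every sample seen so far.
        slope_up = max(slope_up, (v - (v0 + dev)) / dt)
        slope_low = min(slope_low, (v - (v0 - dev)) / dt)
        if slope_up > slope_low:          # doors closed: corridor is empty
            archived.append(prev)         # archive the last sample that fit
            t0, v0 = prev                 # restart the door from it
            slope_up = (v - (v0 + dev)) / (t - t0)
            slope_low = (v - (v0 - dev)) / (t - t0)
        prev = (t, v)
    archived.append(prev)                 # always keep the final sample
    return archived

# Example: compress a noisy sine wave (hypothetical signal).
t = np.arange(200.0)
x = np.sin(t / 15.0) + 0.01 * np.random.default_rng(1).standard_normal(200)
kept = sdt_compress(t, x, dev=0.05)
print(f"kept {len(kept)} of {len(t)} samples")
```

The per-variable tuning problem the abstract describes is visible here: `dev` must be chosen for each signal's amplitude and noise level, which is precisely the parameter the adaptive variant sets dynamically.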