992 results for Recursive real numbers


Relevance:

20.00%

Publisher:

Abstract:

This thesis deals with distance transforms, which are a fundamental issue in image processing and computer vision. Two new distance transforms for gray level images are presented, and as a new application, they are applied to gray level image compression. Both new distance transforms extend the well known distance transform algorithm developed by Rosenfeld, Pfaltz and Lay. With some modification, their algorithm, which calculates a distance transform on binary images with a chosen kernel, has been made to calculate a chessboard-like distance transform with integer numbers (DTOCS) and a real-valued distance transform (EDTOCS) on gray level images. Both distance transforms, the DTOCS and the EDTOCS, require only two passes over the gray level image and are extremely simple to implement. Only two image buffers are needed: the original gray level image and the binary image which defines the region(s) of calculation. No other image buffers are needed even if more than one iteration round is performed. For large neighborhoods and complicated images the two-pass distance algorithm has to be applied to the image more than once, typically 3 to 10 times. Different types of kernels can be adopted. It is important to notice that no other existing transform calculates the same kind of distance map as the DTOCS. All other gray-weighted distance algorithms, such as GRAYMAT, find the minimum path joining two points by the smallest sum of gray levels, or weight the distance values directly by the gray levels in some manner. The DTOCS does not weight them that way: it gives a weighted version of the chessboard distance map, where the weights are not constant but are the gray value differences of the original image. The difference between the DTOCS map and other distance transforms for gray level images is shown. The difference between the DTOCS and the EDTOCS is that the EDTOCS calculates these gray level differences in a different way.
It propagates local Euclidean distances inside a kernel. Analytical derivations of some results concerning the DTOCS and the EDTOCS are presented. Distance transforms are commonly used for feature extraction in pattern recognition and learning; their use in image compression is very rare. This thesis introduces a new application area for distance transforms. Three new image compression algorithms based on the DTOCS and one based on the EDTOCS are presented. Control points, i.e. points that are considered fundamental for the reconstruction of the image, are selected from the gray level image using the DTOCS and the EDTOCS. The first group of methods selects the maxima of the distance image as new control points, and the second group of methods compares the DTOCS distance to the binary image chessboard distance. The effect of applying threshold masks of different sizes along the threshold boundaries is studied. The time complexity of the compression algorithms is analyzed both analytically and experimentally. It is shown that the time complexity of the algorithms is independent of the number of control points, i.e. of the compression ratio. A new morphological image decompression scheme, the 8 kernels' method, is also presented. Several decompressed images are presented. The best results are obtained using the Delaunay triangulation. The obtained image quality equals that of the DCT images with a 4 x 4
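
The two-pass calculation described above can be sketched as follows. The abstract does not give the exact DTOCS kernel or local step cost, so the cost of |gray difference| + 1 per chessboard step, and all names below, are illustrative assumptions rather than the thesis's actual formulation:

```python
import numpy as np

def dtocs(gray, region, iterations=3):
    """Two-pass chamfer-style sketch of a DTOCS-like transform.

    gray   : 2-D array of gray values
    region : boolean mask; True where distances are computed,
             False marks the zero-distance source pixels
    The local step cost between 8-neighbours p and q is assumed
    to be |gray[p] - gray[q]| + 1 (illustrative, not the thesis's
    exact definition).
    """
    d = np.where(region, np.inf, 0.0)  # only two buffers: gray and mask
    h, w = gray.shape
    fwd = ((-1, -1), (-1, 0), (-1, 1), (0, -1))   # causal neighbours
    bwd = ((1, 1), (1, 0), (1, -1), (0, 1))       # anti-causal neighbours
    for _ in range(iterations):  # complicated images may need 3-10 rounds
        for neighbours, ys, xs in (
            (fwd, range(h), range(w)),                       # forward pass
            (bwd, range(h - 1, -1, -1), range(w - 1, -1, -1))  # backward pass
        ):
            for y in ys:
                for x in xs:
                    if not region[y, x]:
                        continue
                    for dy, dx in neighbours:
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w:
                            cost = abs(float(gray[y, x]) - float(gray[ny, nx])) + 1.0
                            d[y, x] = min(d[y, x], d[ny, nx] + cost)
    return d
```

On a flat image the gray differences vanish and the result reduces to the plain chessboard distance, which matches the abstract's description of the DTOCS as a weighted chessboard map.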

Relevance:

20.00%

Publisher:

Abstract:

The aim of this paper was to determine the 10-HDA content in pure royal jelly and in products containing royal jelly, using an HPLC methodology. 10-HDA is the natural indicator of the presence of royal jelly in products and also attests to the authenticity of pure royal jelly. The chromatographic conditions used were: isocratic system, C18-H column, autosampler, diode array UV-VIS detector (225 nm), mobile phase of methanol/water (45:55) at pH 2.5, and α-naphthol as internal standard. The 10-HDA content obtained for laboratory samples was 2.37% for pure royal jelly, and varied from 0.15% for honey with 10% royal jelly to 2.10% for honey with 90% royal jelly. For commercial products, the 10-HDA content varied from not detectable to 0.026%. The recovery test presented a minimum of 100.44%. The detection limit was 45.92 ng/mL and the quantification limit was 76.53 ng/mL.

Relevance:

20.00%

Publisher:

Abstract:

This paper re-examines the null of stationarity of the real exchange rate for a panel of seventeen developed OECD countries during the post-Bretton Woods era. Our analysis simultaneously considers both the presence of cross-section dependence and multiple structural breaks, which have not received much attention in previous panel tests of long-run PPP. Empirical results indicate that there is little evidence in favor of the PPP hypothesis when the analysis does not account for structural breaks. This conclusion is reversed when structural breaks are considered in the computation of the panel statistics. We also compute point estimates of the half-life separately for the idiosyncratic and common factor components and find that it is always below one year.
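
The half-life statistic cited above is conventionally recovered from an estimated AR(1) persistence parameter. A minimal sketch, where the monthly frequency and the value of rho are illustrative assumptions rather than estimates from the paper:

```python
import math

def half_life(rho, period_years=1.0 / 12.0):
    """Half-life of a deviation under an AR(1) process
    q_t = rho * q_{t-1} + e_t, i.e. the time until a shock
    has decayed to half its initial size.

    rho          : persistence parameter, 0 < rho < 1
    period_years : length of one observation period in years
                   (monthly data assumed here for illustration)
    """
    periods = math.log(0.5) / math.log(rho)  # rho**periods == 0.5
    return periods * period_years
```

With monthly data, for example, rho = 0.93 gives a half-life of roughly 0.8 years, i.e. consistent with the below-one-year estimates reported in the abstract.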

Relevance:

20.00%

Publisher:

Abstract:

This final degree project attempts, through the real case of an ERP implementation in a micro-SME, to adapt software implementation and project management techniques to the socioeconomic reality of the great majority of Spanish companies, using a free software ERP as the guiding element. The aim is a triple objective: maintaining the quality ratios expected of any software engineering project, at costs affordable for a micro-SME, while satisfying the business needs.

Relevance:

20.00%

Publisher:

Abstract:

Joints intended for welding frequently show variations in geometry and position, and it is unfortunately not possible to apply a single set of operating parameters to ensure constant quality. The cause of this difficulty lies in a number of factors, including inaccurate joint preparation and joint fit-up, tack welds, and thermal distortion of the workpiece. In plasma arc keyhole welding of butt joints, deviations in the gap width may cause weld defects such as an incomplete weld bead, excessive penetration and burn-through. Manual adjustment of welding parameters to compensate for variations in the gap width is very difficult, and unsatisfactory weld quality is often obtained. In this study a control system for plasma arc keyhole welding has been developed and used to study the effects of real-time control of welding parameters on gap tolerance during welding of austenitic stainless steel AISI 304L. The welding tests demonstrated the beneficial effect of real-time control on weld quality. Compared with welding using constant parameters, the maximum tolerable gap width with an acceptable weld quality was 47% higher when using the real-time controlled parameters for a plate thickness of 5 mm. In addition, burn-through occurred only at significantly larger gap widths when parameters were controlled in real time. Increased gap tolerance enables joints to be prepared and fitted up less accurately, saving time and preparation costs for welding. In addition to the control system, a novel technique for back-face monitoring is described in this study. The test results showed that the technique could be successfully applied to penetration monitoring when welding non-magnetic materials. The results also imply that it is possible to measure the dimensions of the plasma efflux or weld root and to use this information in a feedback control system and, thus, maintain the required weld quality.
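
The feedback loop described above, in which a measured quantity such as gap width drives a welding parameter, can be sketched as a simple PI controller. The abstract does not disclose the study's actual control law, gains, or nominal current, so everything below (the PI structure, the linear gap-to-current correction, all numeric values) is an illustrative assumption:

```python
def control_step(gap_width, state, dt=0.01):
    """One update of a hypothetical PI controller that lowers the
    welding current as the measured gap widens.

    gap_width : measured gap width in mm
    state     : dict carrying the integral term between calls
    dt        : sampling interval in seconds
    All constants are illustrative, not values from the study.
    """
    NOMINAL_CURRENT = 160.0  # A, assumed nominal keyhole current
    REFERENCE_GAP = 0.0      # mm, gap at which no correction is applied
    KP, KI = 8.0, 2.0        # assumed proportional and integral gains

    error = gap_width - REFERENCE_GAP
    state['integral'] += error * dt
    correction = KP * error + KI * state['integral']
    return NOMINAL_CURRENT - correction  # wider gap -> reduced current
```

The same structure would accept the back-face monitoring signal (plasma efflux or weld root dimensions) as its measured input, as the final sentence of the abstract suggests.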

Relevance:

20.00%

Publisher:

Abstract:

This thesis is done as a complementary part of the active magnetic bearing (AMB) control software development project at Lappeenranta University of Technology. The main focus of the thesis is to examine the idea of a real-time operating system (RTOS) framework that operates in a dedicated digital signal processor (DSP) environment. General-purpose real-time operating systems do not necessarily provide a sufficient platform for periodic control algorithms. In addition, the application program interfaces found in real-time operating systems are commonly non-existent or provided as chip-support libraries, thus hindering platform-independent software development. Hence, two divergent real-time operating systems and additional periodic extension software, together with the framework design, are examined to find solutions to the research problems. The research is carried out by tracing the selected real-time operating system, formulating requirements for the system, and designing the real-time operating system framework (OSFW). The OSFW is formed by programming the framework and combining the outcome with the RTOS and the periodic extension. The system is tested and the functionality of the software is evaluated in the theoretical context of Rate Monotonic Scheduling (RMS) theory. The performance of the OSFW and the substance of the approach are discussed in relation to the research theme. The findings of the thesis demonstrate that the developed real-time operating system framework is a viable groundwork solution for periodic control applications.
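
The RMS theory mentioned above provides a classic sufficient schedulability test for periodic task sets: total processor utilisation must stay below the Liu-Layland bound n(2^(1/n) - 1). A minimal sketch of that test (the function name and task tuples are illustrative, not from the thesis):

```python
def rms_schedulable(tasks):
    """Sufficient (not necessary) Rate Monotonic schedulability test.

    tasks : list of (execution_time, period) pairs, in the same
            time unit; priorities are implied by shorter periods.
    Returns True when total utilisation C1/T1 + ... + Cn/Tn is at
    or below the Liu-Layland bound n * (2**(1/n) - 1).
    """
    n = len(tasks)
    utilisation = sum(c / t for c, t in tasks)
    bound = n * (2 ** (1.0 / n) - 1)  # approaches ln(2) ~ 0.693 as n grows
    return utilisation <= bound
```

A periodic control task set that fails this test may still be schedulable (the bound is only sufficient), which is exactly why an RTOS framework built for periodic control benefits from such an analytical check before deployment.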

Relevance:

20.00%

Publisher:

Abstract:

We show how the familiar phenomenological way of combining the Q2 (photon virtuality) and t (squared momentum transfer) dependences of the scattering amplitude in Deeply Virtual Compton Scattering (DVCS) [1, 2] and Vector Meson Production (VMP) [2] processes can be understood in an off-mass-shell generalization of dual amplitudes with Mandelstam analyticity [3]. By comparing different approaches, we also managed to constrain the numerical values of the free parameters.

Relevance:

20.00%

Publisher:

Abstract:

This thesis describes the integration of a real-time simulator environment with a motion platform and a haptic device as part of the Kvalive project. Several programs running on two computers were developed to control the different devices of the environment. User tests were conducted to identify the improvements needed to make the simulator more realistic. The research also yielded new ideas for improving the simulator and directions for further research.

Relevance:

20.00%

Publisher:

Abstract:

This thesis introduces a real-time simulation environment based on the multibody simulation approach. The environment consists of components that are used in conventional product development, including computer-aided drawing, visualization, dynamic simulation and finite element software architecture, data transfer and haptics. These components are combined to perform as a coupled system on one platform. The environment is used to simulate mobile and industrial machines at different stages of a product's lifetime; consequently, the demands of the simulated scenarios vary. In this thesis, a real-time simulation environment based on the multibody approach is used to study a reel mechanism of a paper machine and a gantry crane. These case systems are used to demonstrate the usability of the real-time simulation environment for fault detection purposes and in the context of a training simulator. In order to describe the dynamic performance of a mobile or industrial machine, the nonlinear equations of motion must be defined. In this thesis, the dynamic behaviour of machines is modelled using the multibody simulation approach. A multibody system may consist of rigid and flexible bodies that are joined by kinematic joint constraints, while force components are used to describe the actuators. The strength of multibody dynamics lies in its ability to describe, in a systematic manner, nonlinearities arising from wear of components, friction, large rotations or contact forces. For this reason, the interfaces between subsystems such as the mechanics, hydraulics and control systems of a mechatronic machine can be defined and analyzed in a straightforward manner.
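
The constrained equations of motion described above are commonly written in the Lagrange-multiplier (saddle-point) form, where the mass matrix, constraint Jacobian and applied forces are assembled and solved together. A minimal sketch, using a planar pendulum (a point mass on a rigid rod) as the constrained body; the formulation is standard multibody practice, not necessarily the thesis's specific implementation, and all numeric values are illustrative:

```python
import numpy as np

def constrained_accelerations(M, F, J, gamma):
    """Solve the acceleration-level constrained equations of motion:

        [ M  J^T ] [ qdd    ]   [ F     ]
        [ J  0   ] [ lambda ] = [ gamma ]

    M     : n x n mass matrix
    F     : applied (external) forces
    J     : m x n constraint Jacobian dC/dq
    gamma : right-hand side of the constraint at acceleration level
    Returns the accelerations qdd and the Lagrange multipliers,
    which carry the constraint (joint) forces.
    """
    n, m = M.shape[0], J.shape[0]
    A = np.block([[M, J.T], [J, np.zeros((m, m))]])
    b = np.concatenate([F, gamma])
    sol = np.linalg.solve(A, b)
    return sol[:n], sol[n:]

# Planar pendulum hanging at rest: constraint C = x^2 + y^2 - L^2 = 0.
# At q = (0, -L) with zero velocity, the acceleration must vanish and
# the multiplier balances gravity through the rod.
mass, L, g = 1.0, 1.0, 9.81
q = np.array([0.0, -L])
M = mass * np.eye(2)
F = np.array([0.0, -mass * g])      # gravity only
J = np.array([[2 * q[0], 2 * q[1]]])  # dC/dq
gamma = np.array([0.0])               # -Jdot @ qdot = 0 at rest
qdd, lam = constrained_accelerations(M, F, J, gamma)
```

The same assembly extends directly to many bodies and joints: each kinematic joint contributes rows to J, and actuators enter through F, which is what makes the interfaces between mechanics, hydraulics and control straightforward to define.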

Relevance:

20.00%

Publisher:

Abstract:

The provision of Internet access to large numbers of users has traditionally been under the control of operators, who have built closed access networks for connecting customers. As the access network (i.e. the last mile to the customer) is generally the most expensive part of the network because of the vast amount of cable required, many operators have been reluctant to build access networks in rural areas. There are problems also in urban areas, as incumbent operators may use various tactics to make it difficult for competitors to enter the market. Open access networking, where the goal is to connect multiple operators and other types of service providers to a shared network, changes the way in which networks are used. This change in network structure dismantles vertical integration in service provision and enables true competition, as no service provider can prevent others from competing in the open access network. This thesis describes the development from traditional closed access networks towards open access networking and analyses different types of open access solutions. The thesis introduces a new open access network approach (the Lappeenranta Model) in greater detail and compares it to other types of open access networks. The thesis shows that end users and service providers see local open access and services as beneficial. In addition, the thesis discusses open access networking in a multidisciplinary fashion, focusing on the real-world challenges of open access networks.

Relevance:

20.00%

Publisher:

Abstract:

A generalized off-shell unitarity relation for the two-body scattering T matrix in a many-body medium at finite temperature is derived through a consistent real-time perturbation expansion by means of Feynman diagrams. We comment on perturbation schemes at finite temperature in connection with an erroneous formulation of the Dyson equation in a recently published paper.

Relevance:

20.00%

Publisher:

Abstract:

In August 1868, a team of surveyors from the Junta General de Estadística, directed by José Giralt Torner, began the field work required to produce a reliable cartographic representation of the Real Sitio de Riofrío (Segovia). The survey was part of a more ambitious project whose purpose was the inventory and demarcation of the territorial patrimony of the Crown. The plans produced by the surveyors of the Junta de Estadística, preserved unpublished in the archive of the Instituto Geográfico Nacional, constitute a documentary collection of notable geohistorical value, without parallel among the iconographic sources of the nineteenth century. This article, based essentially on primary sources, gives an account of the surviving cartography, describes the surveying technique, and identifies the protagonists of these works.

Relevance:

20.00%

Publisher:

Abstract:

We examine the trajectories of real unit labour costs (RULCs) in a selection of Eurozone economies. Strong asymmetries in the convergence process of the RULCs and their components (real wages, capital intensity, and technology) are uncovered through decomposition and cluster analyses. In the last three decades, the PIIGS (Portugal, Ireland, Italy, Greece, and Spain) succeeded in reducing their RULCs by more than their northern partners. With the exception of Ireland, however, technological progress was weak; it was through capital intensification that the periphery economies gained efficiency and competitiveness. Cluster heterogeneity, and the lack of robustness in cluster composition, reflects the difficulties in achieving real convergence and, by extension, nominal convergence. We conclude by identifying technology as the key convergence factor, and call for renewed attention to real convergence indicators to strengthen the process of European integration.
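
The decomposition mentioned above rests on the standard definition of the RULC as the real wage relative to real labour productivity, with productivity in turn attributable to capital intensity and technology. A minimal sketch under an assumed Cobb-Douglas technology (the functional form, the value of alpha, and all names are illustrative assumptions, not the paper's specification):

```python
def real_unit_labour_cost(real_wage, output, employment):
    """RULC as the ratio of the real wage to real labour
    productivity (output per worker)."""
    productivity = output / employment
    return real_wage / productivity

def productivity_cobb_douglas(tfp, capital_intensity, alpha=0.3):
    """Attribute productivity to technology (TFP) and capital
    intensity K/L via an assumed Cobb-Douglas form A * (K/L)**alpha;
    alpha is illustrative."""
    return tfp * capital_intensity ** alpha
```

Under this accounting, a falling RULC can come either from rising TFP (the technology channel the paper finds weak outside Ireland) or from a rising capital-labour ratio (the capital intensification channel it finds dominant in the periphery).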

Relevance:

20.00%

Publisher:

Abstract:

In the past decades, drug discovery practice has moved away from the complexity of the formerly used phenotypic screening in animals to focus on assessing drug effects on isolated protein targets. The search has been for drugs that exclusively and potently hit one selected target, thought to be critical for a given disease, while not affecting any other target, so as to avoid the occurrence of side effects. However, reality does not conform to these expectations; on the contrary, this approach has coincided with increased attrition figures in late-stage clinical trials, precisely due to lack of efficacy and safety. In this context, a network biology perspective of human disease and treatment has burst into the drug discovery scenario to bring it back to the consideration of the complexity of living organisms, and particularly of the (patho)physiological environment where protein targets are (mal)functioning and where drugs have to exert their restoring action. Under this perspective, it has been found that there is usually not one but several disease-causing genes and, therefore, not one but several relevant protein targets to be hit, which do not work in isolation but in a highly interconnected manner, and that most known drugs are inherently promiscuous. In this light, the rationale behind the currently prevailing single-target-based drug discovery approach might even seem a utopia, while, conversely, the notion that the complexity of human disease must be tackled with complex polypharmacological therapeutic interventions constitutes a difficult-to-refuse argument that is spurring the development of multitarget therapies.