920 results for Nationalities, Principle of.
Abstract:
A novel electrostatic precipitator, CAROLA®, has been developed for the collection of fine oil mists. It operates on the principle of unipolar particle charging in a corona discharge and particle precipitation in the field of the space charge. The pilot precipitator was tested at different gas temperatures. It is shown that increasing the gas temperature changes the characteristics of the corona discharge and the particle size distribution, especially for sub-micron droplets. The CAROLA® precipitator was used to collect oil mist from the pyrolysis gases of the HALOCLEAN® plant, which processed biomass at a rate of 15-30 kg/h. The particle mass concentration in the raw gas was over 100 g/Nm³. The precipitator operated at 10-12 kV with corona currents up to 0.1 mA. The single-stage electrostatic precipitator achieved a mass collection efficiency of 97-99.5% for the pyrolysis oil mist.
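For context, here is a minimal sketch of how the collection efficiency of a single-stage ESP is commonly estimated, using the standard Deutsch-Anderson relation; this relation and all numerical values below are assumptions chosen only to land in the 97-99.5% efficiency range reported above, not parameters taken from the paper.

```python
# Illustrative sketch (not from the paper): the Deutsch-Anderson relation
# eta = 1 - exp(-w * A / Q) links ESP collection efficiency to the particle
# migration velocity w, the collecting area A and the gas flow rate Q.
# All numbers below are hypothetical placeholders.
import math

def deutsch_anderson_efficiency(migration_velocity_m_s, collecting_area_m2, gas_flow_m3_s):
    """Fractional collection efficiency of a single-stage ESP."""
    return 1.0 - math.exp(-migration_velocity_m_s * collecting_area_m2 / gas_flow_m3_s)

for w in (0.035, 0.05):  # assumed migration velocities in m/s
    eta = deutsch_anderson_efficiency(w, collecting_area_m2=10.0, gas_flow_m3_s=0.1)
    print(f"w = {w} m/s -> efficiency = {eta:.1%}")
```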
Abstract:
This investigation originated from work by Dr. A.H. McIlraith of the National Physical Laboratory who, in 1966, described a new type of charged-particle oscillator. It uses two equal cylindrical electrodes to constrain the particles in such a way that they follow extremely long oscillatory paths between the electrodes under the influence of an electrostatic field alone. The object of this work has been to study the principle of the oscillator in detail and to investigate its properties and applications. Any device capable of creating long electron trajectories has potential application in the field of ultra-high-vacuum technology, so a critical review of the problems associated with the production and measurement of ultra-high vacuum was considered relevant in the initial stages of the work. The oscillator has been applied with a considerable degree of success as a high-energy electrostatic ion source, which offers several advantages over existing ion sources: it can be operated at much lower pressures and without the need for a magnetic field. The oscillator principle has also been applied as a thermionic ionization gauge, which has been compared with other ionization gauges at pressures as low as 5 × 10⁻¹¹ Torr and exhibits a number of advantages over most existing gauges. Finally, the oscillator has been used in an evaporation ion pump and has exhibited fairly high pumping speeds for argon relative to those for nitrogen. This investigation supports the original work of Dr. A.H. McIlraith and shows that his proposed oscillator has considerable potential in the fields of vacuum technology and electron physics.
Abstract:
Patient and public involvement has been at the heart of UK health policy for more than two decades. This commitment to putting patients at the centre of the British National Health Service (NHS) has become a core principle helping to ensure equity, patient safety and effectiveness in the health system. The recent Health and Social Care Act 2012 is the most significant reform of the NHS since its foundation in 1948. More radically, this legislation undermines the principles of patient and public involvement and public accountability, and returns the power to prioritise health services to an unaccountable medical elite. The legislation marks a sea change in the approach to patient and public involvement in the UK and signals a shift in the UK government's commitment to patient-centred care. © 2013 John Wiley & Sons Ltd.
Abstract:
Development engineers work with languages intended for software or hardware system design, while test engineers use languages suited to verification, analysis of system properties and testing. Automatic interfaces between languages of these kinds are necessary to avoid ambiguous interpretations of system model specifications and inconsistencies in the initial requirements for system development. This paper proposes an algorithm for the automatic translation of MSC (Message Sequence Chart) diagrams compliant with the MSC'2000 standard into Petri Nets. Each input MSC diagram is translated into a Petri Net (PN); the resulting PNs are then composed sequentially to synthesize the whole system as one final combined PN. The principle of this composition is defined through the basic element of the MSC language, conditions. During translation, a reference table is built to maintain a consistent correspondence between the descriptions of the input system in the MSC language and in the PN format. This table is needed to present the results of analysis and verification performed on the PN in the MSC diagram format familiar to the development engineer. The proof of the algorithm's correctness is based on the process algebra ACP. The most significant feature of the algorithm is its way of handling conditions. The direction for future work is the development of an integrated, partially or completely automated technological process that will allow a system to be designed, tested and verified within a single framework.
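As a concrete illustration of the composition step, the following minimal sketch fuses per-diagram Petri nets through places labelled with shared MSC conditions, while a reference table records the MSC origin of each fused place. The data structures, place names and table format are assumptions made for illustration, not the paper's actual implementation.

```python
# Sketch only: each MSC diagram is assumed to have already been translated into a
# Petri net whose boundary places carry MSC condition names. Nets are composed by
# fusing places with the same condition label, mirroring the idea of using MSC
# conditions as the glue between partial nets.
from dataclasses import dataclass, field

@dataclass
class PetriNet:
    places: set = field(default_factory=set)       # place names; boundary places carry condition labels
    transitions: set = field(default_factory=set)
    arcs: set = field(default_factory=set)         # (source, target) pairs

def compose_sequentially(pn_a, pn_b):
    """Fuse identically labelled condition places of two nets into one combined net."""
    shared = pn_a.places & pn_b.places             # condition places common to both nets
    combined = PetriNet(
        places=pn_a.places | pn_b.places,
        transitions=pn_a.transitions | pn_b.transitions,
        arcs=pn_a.arcs | pn_b.arcs,
    )
    # The reference table records which MSC condition each fused place came from, so
    # analysis results on the PN can be mapped back to the original MSC diagrams.
    reference_table = {place: f"MSC condition '{place}'" for place in shared}
    return combined, reference_table

net1 = PetriNet({"Idle", "Busy"}, {"t_request"}, {("Idle", "t_request"), ("t_request", "Busy")})
net2 = PetriNet({"Busy", "Idle"}, {"t_release"}, {("Busy", "t_release"), ("t_release", "Idle")})
combined, table = compose_sequentially(net1, net2)
print(len(combined.places), len(combined.transitions), table)
```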
Abstract:
MSC 2010: 26A33, 70H25, 46F12, 34K37. Dedicated to the 80th birthday of Prof. Rudolf Gorenflo.
Abstract:
Big data comes in various ways, types, shapes, forms and sizes. Indeed, almost all areas of science, technology, medicine, public health, economics, business, linguistics and social science are bombarded by ever-increasing flows of data begging to be analyzed efficiently and effectively. In this paper, we propose a rough outline of a possible taxonomy of big data, along with some of the most commonly used tools for handling each particular category of bigness. The dimensionality p of the input space and the sample size n are usually the main ingredients in the characterization of data bigness. The specific statistical machine learning technique used to handle a particular big data set depends on which category it falls within the bigness taxonomy. Large p, small n data sets, for instance, require a different set of tools from the large n, small p variety. Among other tools, we discuss Preprocessing, Standardization, Imputation, Projection, Regularization, Penalization, Compression, Reduction, Selection, Kernelization, Hybridization, Parallelization, Aggregation, Randomization, Replication and Sequentialization. It is important to emphasize right away that the so-called no free lunch theorem applies here, in the sense that there is no universally superior method that outperforms all other methods on all categories of bigness. It is also important to stress that simplicity, in the sense of Ockham's razor and its non-plurality principle of parsimony, tends to reign supreme when it comes to massive data. We conclude with a comparison of the predictive performance of some of the most commonly used methods on a few data sets.
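To make the taxonomy concrete, here is a toy dispatcher that maps the sample size n and dimensionality p to families of tools named in the paper. The thresholds and groupings are arbitrary assumptions used only for illustration, not the authors' definitions.

```python
# Illustrative sketch only: route a data set to tool families by its (n, p) "bigness".
def suggest_tools(n: int, p: int, large: int = 10_000):
    if p > large and n < large:      # "large p, small n": e.g. genomics-style data
        return ["Regularization", "Penalization", "Selection", "Projection"]
    if n > large and p < large:      # "large n, small p": e.g. transactional data
        return ["Parallelization", "Aggregation", "Randomization", "Sequentialization"]
    if n > large and p > large:      # both large
        return ["Compression", "Reduction", "Kernelization", "Hybridization"]
    return ["Preprocessing", "Standardization", "Imputation"]   # moderate data: standard toolbox

print(suggest_tools(n=500, p=50_000))    # large p, small n
print(suggest_tools(n=2_000_000, p=40))  # large n, small p
```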
Abstract:
2000 Mathematics Subject Classification: 62G07, 62L20.
Abstract:
Electrically excited synchronous machines with brushes and slip rings are popular but can hardly be used in flammable and explosive environments. This paper proposes a new brushless electrically excited synchronous motor with a hybrid rotor. It eliminates the brushes and slip rings, improving the reliability and cost-effectiveness of the traction drive. The proposed motor is characterized by two sets of stator windings with two different pole numbers, which provide excitation and drive torque independently. This paper introduces the structure and operating principle of the machine, followed by an analysis of the air-gap magnetic field using the finite-element method. The influence of the excitation winding's pole number on the coupling capability is studied and the operating characteristics of the machine are simulated. These are further examined by experimental tests on a 16 kW prototype motor. The machine is shown to have good static and dynamic performance, meeting the stringent requirements of traction applications.
Abstract:
The author briefly summarizes the main concepts and problems associated with the pricing of derivative products. The theory of derivative pricing exploits the redundancy among products on the market to determine the relative prices of individual products. This, however, can be done only in a complete market, and thus only in a complete market is it possible to omit the concept of utility functions from the theory and from the practice built upon it; for this reason the principle of risk-neutral pricing is misleading. To put it another way, the theory of derivatives can free itself from the concept of utility functions only at the price of imposing restrictions on the market structure that do not hold in reality. It is essential to emphasize this both in market practice and in teaching.
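For reference, the standard relation behind the risk-neutral pricing principle discussed above (textbook background, not material taken from the article): under an equivalent martingale measure Q, the time-0 price of a payoff X delivered at maturity T with constant risk-free rate r is

```latex
% Risk-neutral pricing of a payoff X at maturity T under an equivalent martingale measure Q:
V_0 = \mathbb{E}^{\mathbb{Q}}\!\left[ e^{-rT} X \right]
```

By the second fundamental theorem of asset pricing, Q is unique precisely when the market is complete; in an incomplete market many such measures exist, and selecting one implicitly reintroduces preferences, which is the restriction the author's argument turns on.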
Abstract:
Our aim was to approach an important and readily investigable phenomenon, connected to a relatively simple but real field situation, in such a way that the results of field observations could be directly compared with the predictions of a simulation model-system based on a simple mathematical apparatus, and at the same time to obtain a hypothesis-system that creates the theoretical opportunity for a later series of experimental studies. As the phenomenon of study, we chose the seasonal coenological changes of an aquatic and semiaquatic Heteroptera community. Based on the observed data, we developed an ecological model-system that is suitable for generating realistic patterns closely resembling the observed temporal patterns, with whose help predictions can be made for alternative climatic situations not experienced before (e.g. climate change), and which, furthermore, can simulate experimental circumstances. The stable coenological state-plane, constructed on the principle of indirect ordination, is suitable for the unified handling of monitoring and simulation data series, as well as for their comparison. On the state-plane, deviations between empirical and model-generated data can be observed and analysed which could otherwise remain hidden.
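To illustrate the idea of a shared state-plane, here is a minimal sketch in which indirect ordination is exemplified by a principal component analysis; this is an assumption, since the abstract does not name the ordination method, and the abundance data below are made up for illustration.

```python
# Illustrative sketch only: project both monitored and simulated weekly abundance
# matrices (samples x Heteroptera taxa) onto the same two leading principal axes,
# giving a common "state-plane" on which the two trajectories can be compared.
import numpy as np

rng = np.random.default_rng(1)
observed = rng.poisson(lam=5.0, size=(30, 8)).astype(float)   # 30 sampling dates, 8 taxa (placeholder)
simulated = rng.poisson(lam=5.0, size=(30, 8)).astype(float)  # model-generated series (placeholder)

# PCA fitted on the observed data: centre, then take the two leading eigenvectors
mean = observed.mean(axis=0)
cov = np.cov(observed - mean, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
axes = eigvecs[:, ::-1][:, :2]                                # first two principal axes

obs_plane = (observed - mean) @ axes                          # observed trajectory on the state-plane
sim_plane = (simulated - mean) @ axes                         # simulated trajectory on the same plane
print(obs_plane.shape, sim_plane.shape)
```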
Abstract:
Combinatorial designs are used to construct key predistribution schemes for wireless sensor networks in communications, which helps in building secure channels. Private-key cryptography is then used to determine a common key between a pair of nodes in the sensor network. Wireless sensor networks using key predistribution schemes have many useful applications in military and civil operations. When designs are efficiently implemented on sensor networks, the result is blocks with unique keys. One such implementation is a transversal design, which follows the principle of simple key establishment. The analysis of such designs and the modeling of the key schemes are the subjects of this project.
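As an illustration of how a transversal design yields simple key establishment, here is a minimal sketch in the style of the well-known Lee-Stinson construction; this is an assumption for illustration, since the abstract does not specify the exact design used in the project, and the parameters are toy values.

```python
# Transversal-design key predistribution sketch: for a prime p and block size k <= p,
# node (a, b) in Z_p x Z_p is preloaded with the k key identifiers
# {(x, (a*x + b) mod p) : x = 0..k-1}. Any two distinct nodes then share at most one
# key identifier, which they can use to establish a pairwise secure channel.
p, k = 7, 4   # toy parameters: p must be prime, k <= p

def key_ring(a: int, b: int) -> set:
    """Key identifiers preloaded onto the sensor node labelled (a, b)."""
    return {(x, (a * x + b) % p) for x in range(k)}

node_u, node_v = (2, 3), (5, 1)
shared = key_ring(*node_u) & key_ring(*node_v)
print("shared key identifiers:", shared)   # at most one element
```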
Abstract:
The main focus of this research is to design and develop a high-performance linear actuator based on a four-bar mechanism. The present work includes the detailed analysis (kinematics and dynamics), design, implementation and experimental validation of the newly designed actuator. High performance is characterized by the acceleration of the actuator end effector. The principle of the newly designed actuator is to network the four-bar rhombus configuration (where some bars are extended to form an X shape) to attain high acceleration. Firstly, a detailed kinematic analysis of the actuator is presented and its kinematic performance is evaluated through MATLAB simulations. The dynamic equation of the actuator is obtained using the Lagrangian formulation, and a SIMULINK control model of the actuator is developed from it. In addition, a Bond Graph methodology is presented for the dynamic simulation; the Bond Graph model comprises the modeling of the individual components of the actuator along with its control. The required torque was simulated using the Bond Graph model. Results indicate that high acceleration (around 20g) can be achieved with a modest torque input (3 N-m or less). A practical prototype of the actuator was designed using SOLIDWORKS and then produced to verify the proof of concept. The design goal was to achieve a peak acceleration of more than 10g at the middle point of the travel when the end effector traverses the stroke length (around 1 m). The actuator is primarily designed to operate in standalone condition and later to be used in a 3RPR parallel robot. A DC motor drives the actuator, and a quadrature encoder attached to the motor is used to control the end effector. The associated control scheme is analyzed and integrated with the physical prototype. In standalone experiments, the end effector achieved an acceleration of around 17g (over a stroke from 0.2 m to 0.78 m), and the results are in good agreement with those of the developed dynamic model. Finally, a Design of Experiments (DOE) based statistical approach is introduced to identify the parametric combination that yields the greatest performance, with data collected using the Bond Graph model. This approach is helpful in designing the actuator without much complexity.
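A rough, heavily simplified sketch of the kind of kinematic relation involved follows; the rhombus geometry and all numbers below are assumptions made for illustration, not the actual model or parameters of this actuator.

```python
# Simplified sketch: treat the actuator as a rhombus of link length L whose far vertex
# is the end effector. If the driven half-angle is theta(t), the displacement along the
# stroke axis is x = 2*L*cos(theta), so by differentiation
#   x'  = -2*L*sin(theta)*theta'
#   x'' = -2*L*(cos(theta)*theta'**2 + sin(theta)*theta'')
# The numbers below are placeholders showing how input angular motion maps to
# end-effector acceleration.
import math

L = 0.5            # assumed link length [m]
theta = math.radians(60)
theta_dot = 8.0    # assumed angular velocity [rad/s]
theta_ddot = 40.0  # assumed angular acceleration [rad/s^2]

x = 2 * L * math.cos(theta)
x_dot = -2 * L * math.sin(theta) * theta_dot
x_ddot = -2 * L * (math.cos(theta) * theta_dot**2 + math.sin(theta) * theta_ddot)

print(f"x = {x:.3f} m, v = {x_dot:.2f} m/s, a = {x_ddot:.1f} m/s^2 ({abs(x_ddot)/9.81:.1f} g)")
```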
Abstract:
The aim of this Thesis is to study the multi-frequency properties of the Ultra-Luminous Infrared Galaxy (ULIRG) IRAS 00183-7111 (I00183) at z = 0.327, connecting ALMA sub-mm/mm observations with those at high energies in order to place constraints on the properties of its central power source and to verify whether the gas traced by the CO may be responsible for the obscuration observed in X-rays. I00183 was selected from the so-called Spoon diagnostic diagram (Spoon et al. 2007) for mid-infrared spectra of infrared galaxies, which is based on the equivalent width of the 6.2 μm Polycyclic Aromatic Hydrocarbon (PAH) emission feature versus the 9.7 μm silicate strength. Such features are a powerful tool to investigate the contributions of star formation and AGN activity in this class of objects. I00183 was selected from the top-left region of the plot, where the most obscured sources, characterized by a strong Si absorption feature, are located. To link the sub-mm/mm to the X-ray properties of I00183, ALMA archival Cycle 0 data in Band 3 (87 GHz) and Band 6 (270 GHz) were calibrated and analyzed using the CASA software. ALMA Cycle 0 was the Early Science program, for which data reprocessing is strongly recommended; the main work of this Thesis therefore consisted in reprocessing the raw data to improve upon the available archival products and results, which had been obtained using standard procedures. The high-energy data consist of Chandra, XMM-Newton and NuSTAR observations, which provide broad coverage of the spectrum in the 0.5-30 keV energy range. Chandra and XMM-Newton archival data were used, with exposure times of 22 and 22.2 ks, respectively; their reduction was carried out using the CIAO and SAS software. The 100 ks NuSTAR data are still proprietary, and the spectra were obtained by courtesy of the PI (K. Iwasawa). A detailed spectral analysis was performed using the XSPEC software; the spectral shape was reproduced starting from simple phenomenological models, and more physical models were then introduced to account for the complex mechanisms at work in this source. In Chapter 1, an overview of the scientific background is given, with a focus on the target, I00183, and the Spoon diagnostic diagram from which it was originally selected. In Chapter 2, the basic principles of interferometry are briefly introduced, with a description of the calibration theory applied to interferometric observations. In Chapter 3, ALMA and its capabilities, both current and future, are described, along with the structure of the ALMA archive. In Chapter 4, the calibration of the ALMA data is presented and discussed, together with the resulting imaging products. In Chapter 5, the analysis and discussion of the main results obtained from the ALMA data are presented. In Chapter 6, the X-ray observations, data reduction and spectral analysis are reported, with a brief introduction to the basic principles of X-ray astronomy and to the instruments with which the observations were carried out. Finally, the overall work is summarized, with particular emphasis on the main results and possible future perspectives.
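As a rough indication of what a first-pass phenomenological fit might look like in XSPEC's Python interface, here is a hypothetical sketch; the file name, model expression and starting values are placeholders chosen for illustration, not the data products or models actually adopted in the Thesis.

```python
# Sketch of a simple obscured power-law fit with PyXspec (placeholder inputs).
from xspec import Spectrum, Model, Fit

spec = Spectrum("i00183_chandra.pha")   # hypothetical grouped spectrum with response loaded
spec.ignore("**-0.5 30.0-**")           # keep roughly the 0.5-30 keV band used in the analysis

model = Model("phabs*zphabs*powerlaw")  # Galactic + intrinsic absorption times a power law
model.zphabs.Redshift.values = 0.327    # redshift of I00183
model.powerlaw.PhoIndex.values = 1.8    # assumed starting photon index

Fit.query = "yes"
Fit.perform()
print("fit statistic:", Fit.statistic)
```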
Abstract:
Laser trackers have been widely used in many industries to meet increasingly demanding accuracy requirements. In laser tracker measurement, it is complex and difficult to perform an accurate error analysis and uncertainty evaluation. This paper first reviews the working principle of single-beam laser trackers and the state of the art of the key technologies from both industrial and academic efforts, followed by a comprehensive analysis of the uncertainty sources. A generic laser tracker modelling method is formulated and the framework of a virtual laser tracking system (VLS) is proposed. The VLS can be used for measurement planning, measurement accuracy optimization and uncertainty evaluation. The completed virtual laser tracking system should take into consideration all the uncertainty sources affecting coordinate measurement and establish an uncertainty model that behaves identically to the real system. © Springer-Verlag Berlin Heidelberg 2010.
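To illustrate the kind of uncertainty evaluation a virtual tracker enables, here is a minimal Monte Carlo sketch for a single target point; the measurement model is the standard spherical-to-Cartesian conversion, while the noise levels and target coordinates are made-up placeholders, not specifications of any real instrument.

```python
# Monte Carlo propagation of range and angle noise through a laser tracker's
# spherical measurement model (illustrative assumptions only).
import numpy as np

rng = np.random.default_rng(0)
n_samples = 100_000

# Nominal spherical measurement of one target: range [m], azimuth and elevation [rad]
r, az, el = 5.0, np.radians(30.0), np.radians(10.0)

# Assumed standard uncertainties (placeholders): 10 um ranging noise, 2 urad angular noise
sigma_r, sigma_ang = 10e-6, 2e-6

r_s = r + rng.normal(0.0, sigma_r, n_samples)
az_s = az + rng.normal(0.0, sigma_ang, n_samples)
el_s = el + rng.normal(0.0, sigma_ang, n_samples)

# Propagate through the spherical-to-Cartesian measurement model
x = r_s * np.cos(el_s) * np.cos(az_s)
y = r_s * np.cos(el_s) * np.sin(az_s)
z = r_s * np.sin(el_s)

xyz = np.column_stack([x, y, z])
print("std dev per axis [um]:", xyz.std(axis=0) * 1e6)
```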
Abstract:
“Spaces of Order” argues that the African novel should be studied as a revolutionary form characterized by aesthetic innovations that are not comprehensible in terms of the novel's European archive of forms. It does this by mapping an African spatial order that undermines the spatial problematic at the formal and ideological core of the novel: the split between a private, subjective interior and an abstract, impersonal outside. The project opens with an examination of spatial fragmentation as figured in the “endless forest” of Amos Tutuola's The Palm-Wine Drinkard (1952). The second chapter studies Chinua Achebe's Things Fall Apart (1958) as a fictional world built around a peculiar category of space, the “evil forest,” which constitutes an African principle of order and modality of power. Chapter three returns to Tutuola via Ben Okri's The Famished Road (1991) and shows how the dispersal of fragmentary spaces of exclusion and terror within the colonial African city helps us conceive of political imaginaries outside the nation and other forms of liberal political community. The fourth chapter shows Nnedi Okorafor, in her 2014 science-fiction novel Lagoon, rewriting Things Fall Apart as an alien-encounter narrative in which Africa takes center stage in a planetary, multi-species drama. Spaces of Order is thus a study of the African novel as a new logic of world-making altogether.