938 results for Operating system
Abstract:
The expression "paradigm" was introduced into the philosophy of science by Thomas Kuhn to name the particular way a research trend views the subject it examines: researchers sharing a paradigm pursue similar questions with similar methods and similar concepts. The author introduced the expression "system paradigm" in a 1999 article centred on the systems operating in society. This paper takes the theoretical ideas of that article further, based on experience of the post-socialist transformation. The first part compares the socialist and capitalist systems, describes their main features, and establishes that all former socialist countries except North Korea and Cuba have adopted the capitalist system. The second part adds a typology of the varieties of capitalism by politico-governmental form, distinguishing three marked types: democracy, autocracy and dictatorship. Huntington wrote of a third wave of democratization; this study concludes that the wave has ebbed, with only a tenth of the population of the 47 post-socialist countries living in democracies, while autocracy or dictatorship rules in the rest. The third part applies this conceptual and analytical framework to Hungary: capitalism prevails there, with autocracy as the politico-governmental form, and it shares essential features with other capitalist countries and other autocracies. This is compatible with recognizing that some features of less than fundamental importance are unique to Hungary ("hungarikums") and differ from the traits of every other country.
Abstract:
This research has explored the relationship between system test complexity and tacit knowledge. The thesis proposes that the process of system testing (comprising test planning, test development, test execution, test fault analysis, test measurement and case management) is directly affected both by complexity associated with the system under test and by other sources of complexity that are independent of the system under test but related to the wider process of system testing. While a certain amount of knowledge related to the system under test is inherent, tacit in nature and therefore difficult to make explicit, it was found that a significant amount of knowledge relating to these other sources of complexity can indeed be made explicit. While the importance of explicit knowledge has been reinforced by this research, there is no evidence to suggest that the availability of tacit knowledge to a test team is any less important to the process of system testing when operating in a traditional software development environment. Participants commonly expressed the sentiment that even though a considerable amount of explicit knowledge relating to the system is freely available, a good deal of the knowledge relating to the system under test that is demanded for effective system testing is actually tacit in nature (approximately 60% of participants operating in a traditional development environment and 60% of participants operating in an agile development environment expressed similar sentiments). To cater for the availability of tacit knowledge relating to the system under test, and indeed of both the explicit and tacit knowledge required by system testing in general, an appropriate knowledge management structure needs to be in place. This appears to be required irrespective of the development methodology employed.
Abstract:
Wireless sensor networks (WSNs) have shown wide applicability to many fields, including monitoring of environmental, civil, and industrial settings. WSNs, however, are resource constrained by many competing factors that span their hardware, software, and networking. One of the central resource constraints is the charge consumption of WSN nodes. With finite energy supplies, low charge consumption is needed to ensure long lifetimes and the success of WSNs. This thesis details the design of a power system to support long-term operation of WSNs. The power system’s development occurred in parallel with a custom WSN from the Queen’s MEMS Lab (QML-WSN), with the goal of supporting a 1+ year lifetime without sacrificing functionality. The final power system design utilizes a TPS62740 DC-DC converter with AA alkaline batteries to efficiently supply the nodes while providing battery monitoring functionality and an expansion slot for future development. Testing tools for measuring current draw and charge consumption were created, along with analysis and processing software. Through their use, the charge consumption of the power system was drastically lowered, and issues in the QML-WSN were identified and resolved, including the proper shutdown of accelerometers and an incorrect microcontroller unit (MCU) power pin connection. Controlled current profiling revealed unexpected behaviour of nodes and detailed current-voltage relationships. These relationships were used with a lifetime projection model to estimate a lifetime of 521 to 551 days, depending on the mode of operation. The power system and QML-WSN were tested over a long-term trial lasting 272+ days in an industrial testbed monitoring an air compressor pump. Environmental factors were found to influence the behaviour of nodes, leading to increased charge consumption, while a node in an office setting was still operating at the conclusion of the trial. This agrees with the lifetime projection and gives a strong indication that a 1+ year lifetime is achievable. Additionally, a lightweight charge consumption model was developed which allows the charge consumption of nodes in a distributed WSN to be monitored. This model was tested in a laboratory setting, demonstrating over 95% accuracy for high packet reception rate WSNs across varying data rates, battery supply capacities, and runtimes up to full battery depletion.
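In outline, a lifetime projection of this kind reduces to a charge budget: usable battery capacity divided by the node's average current draw. A minimal sketch of that arithmetic follows; the capacity, current, and duty-cycle figures are illustrative assumptions, not values taken from the thesis.

```python
# Hypothetical lifetime projection: usable battery capacity divided by the
# node's average current draw. All numbers are illustrative assumptions.

AA_CAPACITY_MAH = 2500.0   # assumed usable capacity of the AA alkaline supply (mAh)
SLEEP_CURRENT_MA = 0.05    # assumed node sleep current (mA)
ACTIVE_CURRENT_MA = 15.0   # assumed current while sampling and transmitting (mA)
DUTY_CYCLE = 0.01          # assumed fraction of time the node is active

def projected_lifetime_days(capacity_mah, sleep_ma, active_ma, duty_cycle):
    """Estimate node lifetime in days from an average-current charge budget."""
    avg_current_ma = duty_cycle * active_ma + (1.0 - duty_cycle) * sleep_ma
    return capacity_mah / avg_current_ma / 24.0

days = projected_lifetime_days(AA_CAPACITY_MAH, SLEEP_CURRENT_MA, ACTIVE_CURRENT_MA, DUTY_CYCLE)
print(f"projected lifetime: {days:.0f} days")
```

With these assumed figures the estimate happens to land near the 521-551 day range reported above, but the point is the structure of the calculation rather than the numbers.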
Abstract:
Future power systems are expected to integrate large-scale stochastic and intermittent generation and load due to reduced use of fossil fuel resources, including renewable energy sources (RES) and electric vehicles (EV). Inclusion of such resources poses challenges for the dynamic stability of synchronous transmission and distribution networks, not least in terms of generation, where system inertia may no longer be wholly governed by large-scale generation but displaced by small-scale and localised generation. Energy storage systems (ESS) can limit the impact of dispersed and distributed generation by offering supporting reserve while accommodating large-scale EV connection, with the latter (load) also participating in storage provision. In this paper, a local energy storage system (LESS) is proposed. The structure, requirements and optimal sizing of the LESS are discussed. Three operating modes are detailed: 1) storage pack management; 2) normal operation; and 3) contingency operation. The proposed LESS scheme is evaluated using simulation studies based on data obtained from the Northern Ireland regional and residential network.
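As a rough illustration of how a scheme like this might dispatch between its three modes, the sketch below selects a mode from grid frequency and pack state of charge; the criteria and thresholds are assumptions made for illustration, not the paper's control logic.

```python
# Sketch of mode selection for a local energy storage system (LESS). The three
# modes come from the paper; the selection criteria and thresholds below are
# illustrative assumptions only.

from enum import Enum, auto

class LessMode(Enum):
    STORAGE_PACK_MANAGEMENT = auto()  # maintaining/balancing the storage packs
    NORMAL_OPERATION = auto()         # routine charge/discharge support
    CONTINGENCY_OPERATION = auto()    # reserve support during grid disturbances

def select_mode(grid_frequency_hz, pack_soc):
    """Pick an operating mode from grid frequency and pack state of charge."""
    NOMINAL_HZ = 50.0                  # assumed nominal frequency
    CONTINGENCY_BAND_HZ = 0.2          # assumed deviation that triggers contingency support
    SOC_MAINTENANCE_BAND = (0.2, 0.9)  # assumed state-of-charge limits needing pack management

    if abs(grid_frequency_hz - NOMINAL_HZ) > CONTINGENCY_BAND_HZ:
        return LessMode.CONTINGENCY_OPERATION
    if not (SOC_MAINTENANCE_BAND[0] <= pack_soc <= SOC_MAINTENANCE_BAND[1]):
        return LessMode.STORAGE_PACK_MANAGEMENT
    return LessMode.NORMAL_OPERATION

print(select_mode(49.7, 0.55))  # -> LessMode.CONTINGENCY_OPERATION
```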
Abstract:
Utilization of renewable energy sources and energy storage systems is increasing as new policies for the energy industry are fostered. However, the increase in distributed generation hinders the reliability of power systems. To stabilize them, the virtual power plant (VPP) emerges as a novel power grid management system. The VPP coordinates the participation of different distributed energy resources and energy storage systems. This paper defines the core technologies of the VPP, demand response and ancillary services, with reference to cases in Korea, America and Europe. It also suggests solutions for applying the VPP to the V2G market in order to restructure the national power industry in Korea.
Abstract:
This Master’s thesis examines the implementation of management system standard requirements integrated in the organization. The aim is to determine how requirements from the management system standards ISO 14001:2015 and ISO 9001:2015 can be integrated and implemented into the existing ISO 9001:2008 compliant management system. The research was carried out as action research, utilizing an operating model for the integrated use of management system standards created by the International Organization for Standardization. The phases of the operating model were applied to the target organization. The similarity and integration potential of the relevant standards were assessed using comparative matrices. Allocation of the requirements and conformity assessment of the processes were carried out by gap analysis. The main results indicate that the requirements of the relevant standards are principally equivalent or serve the same kind of purpose. The results also identify the most important processes of the target organization in terms of requirement compliance, as well as the requirements that affect each process the most. Prioritizing compliance for the most important processes and implementing the requirements with the greatest effect creates an opportunity for organizations to implement the integrated requirements effectively.
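A gap analysis of this kind can be pictured as a mapping from standard clauses to the processes they are allocated to, with non-conformities grouped per process. The sketch below uses hypothetical clause allocations, not the target organization's actual mapping.

```python
# Minimal gap-analysis sketch: allocate standard requirements to processes and
# group the non-conformities per process. Clause titles and process names are
# hypothetical placeholders.

requirements = {
    "ISO 9001:2015 7.5 Documented information":      {"process": "Document control",  "conforms": True},
    "ISO 14001:2015 6.1 Risks and opportunities":    {"process": "Risk management",   "conforms": False},
    "ISO 14001:2015 9.1 Monitoring and measurement": {"process": "Quality assurance", "conforms": False},
}

def gap_report(reqs):
    """Group non-conforming requirements by the process they are allocated to."""
    gaps = {}
    for clause, info in reqs.items():
        if not info["conforms"]:
            gaps.setdefault(info["process"], []).append(clause)
    return gaps

for process, clauses in gap_report(requirements).items():
    print(process, "->", clauses)
```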
Abstract:
China's water pollution control law stipulates the Water Pollution Discharge Permit (WPDP) institution and authorizes the State Council to draft the regulations for its implementation and enforcement. However, to date, national regulations have not been established, and the permitting system has been operating according to provincial regulations. In contrast, the effluent permit system in the USA has been operating for more than 40 years and has achieved relatively successful results. The CWA/NPDES experience offers a valuable reference for China’s water permit system.
THE COSTS OF RAISING EQUITY RATIO FOR BANKS: Evidence from publicly listed banks operating in Finland
Abstract:
The solvency of banks differs from that of other corporations. A bank's equity ratio is lower than in corporations in other fields of business. However, a functioning banking industry has a huge impact on the whole of society. Bank equity ratios need to be higher because that makes the banking industry more stable, as the probability of banks failing decreases. If a bank fails, the government compensates the deposits, since it has granted the bank's depositors deposit insurance; this means that the payment ultimately comes from taxpayers. The economic debate has long concentrated on the costs of raising the equity ratio. It has been a common belief that raising the equity ratio increases banks' funding costs correspondingly, and that these costs are passed on to bank customers as higher service charges. Despite this common belief, the actual response of funding costs to a higher equity ratio has been studied only a little in Europe, and no such study has been conducted in Finland. Before one can assess whether the greater stability of the banking industry brought about by higher equity levels compensates for the extra funding costs, the actual increase in funding costs must be calculated. Currently the banking industry is governed by complex and heavy regulation, and maintaining such a complex system imposes major costs in itself. This research leans on the Modigliani-Miller theorem, which shows that a firm's financing structure is irrelevant to its funding costs. In addition, it follows the calculations of Miles, Yang and Marcheggiano (2012) and Vale (2011), who calculate funding costs after the doubling of specific banks' equity ratios. The Finnish banks studied in this research are Nordea and Danske Bank, because they are the two largest banks operating in Finland and both have a company form that enables the calculations. To calculate the costs of halving their leverage, this study used the Capital Asset Pricing Model (CAPM). Halving the leverage of Danske Bank raised its funding costs by 16-257 basis points, depending on the method of assessment. For Nordea the increase in funding costs was 11-186 basis points when its leverage was halved. Based on the results of this study, doubling the equity ratio does not increase a bank's funding costs one-for-one; in fact, the increase is quite modest. More solvent banks would greatly increase the stability of the banking industry while the increase in funding costs is low. If the costs of bank regulation exceed the increase in funding costs from a higher equity ratio, higher equity can be considered a better way of stabilizing the banking industry than heavy regulation.
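The CAPM-based reasoning can be illustrated numerically: under Modigliani-Miller, halving leverage roughly halves the equity beta, so the required return on equity falls even as the more expensive equity share of funding grows. The inputs below are illustrative assumptions, not the figures used for Nordea or Danske Bank.

```python
# Illustrative CAPM sketch of how doubling a bank's equity ratio (halving its
# leverage) changes its weighted average funding cost. All inputs are assumed.

RISK_FREE = 0.02        # assumed risk-free rate
MARKET_PREMIUM = 0.05   # assumed equity market risk premium
COST_OF_DEBT = 0.015    # assumed average cost of debt funding

def cost_of_equity(beta):
    """CAPM required return on equity."""
    return RISK_FREE + beta * MARKET_PREMIUM

def funding_cost(equity_ratio, equity_beta):
    """Weighted average cost of equity and debt funding."""
    return (equity_ratio * cost_of_equity(equity_beta)
            + (1.0 - equity_ratio) * COST_OF_DEBT)

base = funding_cost(0.05, 1.2)         # 5% equity ratio, assumed equity beta of 1.2
naive = funding_cost(0.10, 1.2)        # equity doubled, beta (wrongly) held fixed
mm_adjusted = funding_cost(0.10, 0.6)  # Modigliani-Miller: halved leverage roughly halves beta

print(f"naive increase:       {(naive - base) * 10000:.0f} bp")
print(f"MM-adjusted increase: {(mm_adjusted - base) * 10000:.0f} bp")
```

Holding beta fixed overstates the cost increase; once beta is adjusted for the lower leverage the increase becomes small, which is the mechanism behind the finding that the rise in funding costs is modest.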
Abstract:
The OPIT program is briefly described. OPIT is a basis-set-optimising, self-consistent field, molecular orbital program for calculating properties of closed-shell ground states of atoms and molecules. A file handling technique is then put forward which enables core storage to be used efficiently in large FORTRAN scientific applications programs. Hashing and list processing techniques, of the type frequently used in writing system software and computer operating systems, are here applied to the creation of data files (integral label and value lists etc.). Files consist of a chained series of blocks which may exist in core or on backing store or both. Efficient use of core store is achieved and the processes of file deletion, file re-writing and garbage collection of unused blocks can be easily arranged. The scheme is exemplified with reference to the OPIT program. A subsequent paper will describe a job scheduling scheme for large programs of this sort.
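The chained-block scheme can be sketched in outline: fixed-size blocks are drawn from a common pool, linked into per-file chains, and returned to a free list when a file is deleted so they can be reused. The sketch below (in Python rather than the original FORTRAN) is a hypothetical illustration of the idea, not the OPIT code.

```python
# Sketch of a chained-block file store: fixed-size blocks drawn from a common
# pool, linked into per-file chains, with deleted files' blocks returned to a
# free list for reuse (garbage collection). Illustration only.

BLOCK_SIZE = 4  # deliberately tiny so the chaining is easy to see

class BlockStore:
    def __init__(self):
        self.blocks = []   # pool: each entry is [data_list, index_of_next_block]
        self.free = []     # indices of blocks available for reuse
        self.files = {}    # file name -> index of first block (dict stands in for the hash table)

    def _alloc(self):
        if self.free:                       # reuse a garbage-collected block first
            return self.free.pop()
        self.blocks.append([[], None])
        return len(self.blocks) - 1

    def write(self, name, items):
        head = prev = None
        for i in range(0, len(items), BLOCK_SIZE):
            idx = self._alloc()
            self.blocks[idx] = [list(items[i:i + BLOCK_SIZE]), None]
            if prev is None:
                head = idx
            else:
                self.blocks[prev][1] = idx  # chain the previous block to this one
            prev = idx
        self.files[name] = head

    def read(self, name):
        idx, out = self.files[name], []
        while idx is not None:
            data, idx = self.blocks[idx]
            out.extend(data)
        return out

    def delete(self, name):
        idx = self.files.pop(name)
        while idx is not None:              # return the whole chain to the free list
            self.free.append(idx)
            idx = self.blocks[idx][1]

store = BlockStore()
store.write("integral_list", list(range(10)))
print(store.read("integral_list"))          # [0, 1, ..., 9]
store.delete("integral_list")               # its blocks become reusable
```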
Abstract:
The search for patterns or motifs in data represents an area of key interest to many researchers. In this paper we present the Motif Tracking Algorithm (MTA), a novel immune-inspired pattern identification tool that is able to identify unknown motifs which repeat within time series data. The power of the algorithm derives from its use of a small number of parameters with minimal assumptions. The algorithm searches from a completely neutral perspective that is independent of the data being analysed and the underlying motifs. In this paper the MTA is applied to the search for patterns within sequences of low-level system calls between the Linux kernel and the operating system’s user space. The MTA is able to compress the data found in large system call data sets to a limited number of motifs which summarise that data. The motifs provide a resource from which a profile of executed processes can be built. The potential of these profiles and new implications for security research are highlighted. A higher-level system call language for measuring similarity between patterns of such calls is also suggested.
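As a much simpler stand-in for what the MTA produces, the sketch below counts fixed-length subsequences of system call symbols that repeat within a trace; it illustrates the kind of motif summary described, not the immune-inspired algorithm itself.

```python
# Count fixed-length subsequences of system call symbols that repeat in a
# trace. A simplistic stand-in for motif discovery, not the MTA.

from collections import Counter

def repeating_motifs(calls, length=3, min_count=2):
    """Return subsequences of the given length occurring at least min_count times."""
    windows = [tuple(calls[i:i + length]) for i in range(len(calls) - length + 1)]
    return {motif: n for motif, n in Counter(windows).items() if n >= min_count}

# Hypothetical trace of low-level system call names (illustrative only).
trace = ["open", "read", "write", "close",
         "open", "read", "write", "close",
         "open", "stat", "close"]
print(repeating_motifs(trace))
# -> {('open', 'read', 'write'): 2, ('read', 'write', 'close'): 2, ('write', 'close', 'open'): 2}
```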
Abstract:
Today, providing drinking water and process water is one of the major problems in most countries; surface water often needs to be treated to achieve the necessary quality, and technological and financial difficulties place great restrictions on operating the treatment units. Although water supply by simple and cheap systems has been one of the important objectives of scientific and research centers around the world, a great percentage of the population in developing countries, especially in rural areas, still does not benefit from good-quality water. One of the large and available sources of acceptable water is sea water. There are two ways to treat sea water: evaporation and reverse osmosis (RO). Nowadays RO systems are used for desalination because of their low cost and ease of operation and maintenance. Sea water should be pretreated before RO plants, because raw sea water contains constituents that can decrease the yield of the membranes in the RO system. The subject of this research may be useful in this respect, and we hope to achieve complete success in the design and construction of useful pretreatment systems for RO plants. One of the most important units in a sea water pretreatment plant is filtration. The conventional method is pressurized sand filters, and this research concerns a new filter called the continuous backwash sand filter (CBWSF). The CBWSF designed and tested in this research may be used more economically and with less difficulty. It consists of two main parts: the shell body and the central part, comprising an airlift pump, raw water feeding pipe, air supply hose, backwash chamber and sand washer, as well as inlet and outlet connections. The CBWSF is a continuously operating filter, i.e. the filter does not have to be taken out of operation for backwashing or cleaning. Inlet water is fed through the sand bed while the sand bed moves downwards. The water is filtered while the sand becomes dirty. Simultaneously, the dirty sand is cleaned in the sand washer and the suspended solids are discharged in the backwash water. We analyze the behavior of the CBWSF in the pretreatment of sea water in place of a pressurized sand filter. One important factor that is harmful to RO membranes is bio-fouling, which is measured by the Silt Density Index (SDI). This research has focused on decreasing SDI and turbidity (NTU). Based on this goal, a pretreatment prototype was designed and manufactured for testing; the system design mainly followed the design fundamentals of the CBWSF. The automatic backwash sand filter can be used in small as well as large water supply schemes. In large water treatment plants, the filter units perform the filtration and backwash stages separately, while in small treatment plants the unit is usually compacted to reduce energy consumption. The analysis of the system showed that it may be used feasibly for water treatment, especially for limited populations. Construction is rapid, simple and economical, and its performance is high because no moving mechanical part is used in it, so it may be proposed as an effective method to improve water quality and consequently the level of hygiene in remote parts of the country.
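The SDI mentioned here is, in its common 15-minute form, computed from the times taken to filter two equal samples through a 0.45 micron membrane at constant pressure. A small sketch of that calculation follows; the sample times are hypothetical, not measurements from this research.

```python
# Silt Density Index (SDI): time two 500 mL samples through a 0.45 micron
# filter at constant pressure, T minutes apart. Sample times below are
# hypothetical, not measurements from this research.

def silt_density_index(t_initial_s, t_final_s, elapsed_min=15.0):
    """SDI = 100 * (1 - t_initial / t_final) / elapsed_min."""
    return 100.0 * (1.0 - t_initial_s / t_final_s) / elapsed_min

# Example: first 500 mL collected in 30 s, second 500 mL in 45 s after 15 minutes.
print(f"SDI15 = {silt_density_index(30.0, 45.0):.1f}")  # about 2.2
```

Lower SDI values indicate a lower fouling potential of the feed water for the RO membranes, which is why the pretreatment aims to reduce it.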
Abstract:
The selection of optimal operating conditions for an industrial acrylonitrile recovery unit was conducted by systematic application of response surface methodology, with minimum energy consumption as the objective and product specifications as process constraints. Unit models and the plant simulation were validated against operating data and information. A sensitivity analysis was carried out to identify the set of parameters that most strongly affect the trajectories of the system while keeping product specifications. The results suggest that energy savings of up to 10% are possible by systematically adjusting operating conditions.
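Response surface methodology of this kind can be sketched as fitting a quadratic model to a handful of operating points and then optimizing it subject to the specification constraints. The variables, data, and purity model below are illustrative assumptions, not the plant's models or data.

```python
# Response-surface sketch: fit quadratic surfaces for energy use and product
# purity from a few (hypothetical) operating points, then minimize energy
# subject to a purity constraint. All data and variable choices are assumed.

import numpy as np
from scipy.optimize import minimize

# Hypothetical runs: (reflux ratio, feed temperature degC) -> energy (MW), purity (%)
X = np.array([[2.0, 60.0], [2.5, 60.0], [3.0, 60.0],
              [2.0, 70.0], [2.5, 70.0], [3.0, 70.0],
              [2.0, 80.0], [2.5, 80.0], [3.0, 80.0]])
energy = np.array([4.1, 4.6, 5.3, 3.9, 4.3, 5.0, 4.0, 4.5, 5.2])
purity = np.array([98.2, 99.1, 99.5, 98.0, 98.9, 99.4, 97.8, 98.7, 99.3])

def quad_design(x):
    """Quadratic model terms: 1, r, t, r*t, r^2, t^2."""
    r, t = x[..., 0], x[..., 1]
    return np.stack([np.ones_like(r), r, t, r * t, r**2, t**2], axis=-1)

coef_e, *_ = np.linalg.lstsq(quad_design(X), energy, rcond=None)
coef_p, *_ = np.linalg.lstsq(quad_design(X), purity, rcond=None)

def energy_model(x):
    return quad_design(np.asarray(x, dtype=float)) @ coef_e

def purity_model(x):
    return quad_design(np.asarray(x, dtype=float)) @ coef_p

res = minimize(energy_model, x0=[2.5, 70.0],
               bounds=[(2.0, 3.0), (60.0, 80.0)],
               constraints=[{"type": "ineq", "fun": lambda x: purity_model(x) - 99.0}])
print("operating point:", res.x, "energy:", float(res.fun))
```

The optimum typically ends up on the purity constraint boundary, which mirrors the trade-off described in the abstract between energy savings and keeping product specifications.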
Abstract:
Part 16: Performance Measurement Systems