996 results for Uniform Commercial Code


Relevance: 20.00%

Abstract:

The mobile networks of earlier and current generations, the 2G and 3G networks, provide users with voice and packet services at high transmission rates and good quality over the same core network. When developing the next generation of mobile networks, the current quality of service needs to be maintained. This thesis concentrates on the next generation mobile network, especially on the evolution of the packet network part. The new mobile network places requirements on the common packet backbone network, the Mobile Packet Backbone Network, which is also discussed in this study. The next generation mobile network, called LTE/SAE, is currently under testing. The test system, called the Container Trial System, is a miniature LTE/SAE site. LTE/SAE is studied in this thesis with a focus on the evolved packet core, the SAE part of the architecture. The empirical part of the study compares the LTE/SAE Container Trial System with commercial network designs and additionally produces documentation for internal personnel and customers. The research is performed by comparing the documentation and specifications of both the Container Trial System and the commercial network. Since the commercial LTE network has not yet been built, the comparison is theoretical. A further purpose is to identify design choices that could be made differently in the next version of the Container Trial System.

Relevance: 20.00%

Abstract:

The behavior of nuclear power plants must be known in all operational situations. Thermal hydraulics computer codes are used to simulate the behavior of the plants, and these codes must be validated before they can be used reliably; validation compares simulation results against experimental results. In this thesis, a model of the PWR PACTEL steam generator was prepared with the TRAC/RELAP Advanced Computational Engine (TRACE) code. In the future, the simulation results can be compared against results from the Advanced Process Simulator analysis software. The development of the PWR PACTEL vertical steam generator model is presented in this thesis, and example loss-of-feedwater transient simulations were carried out with the model.

Relevance: 20.00%

Abstract:

Water-in-crude-oil emulsions are formed during petroleum production, and asphaltenes play an important role in their stabilization. Demulsifiers are added to destabilize such emulsions; however, the demulsification mechanism is not completely understood. In this paper, the performance of commercial poly(ethylene oxide-b-propylene oxide) demulsifiers was studied using synthetic water-in-oil emulsions and model systems (asphaltenes in organic solvent). No change in asphaltene aggregate size induced by the demulsifier was observed. The demulsification performance decreased as the asphaltene aggregate size increased, which suggests that the demulsification mechanism is related to the voids between the aggregates adsorbed on the water droplet surface.

Relevance: 20.00%

Abstract:

Transitional flow past a three-dimensional circular cylinder is a widely studied phenomenon, since the problem is of interest for many technical applications. In the present work, flow past a circular cylinder is simulated numerically using a commercial CFD code (ANSYS Fluent 12.1) with large eddy simulation (LES) and RANS approaches (the κ-ε model and the Shear-Stress Transport (SST) κ-ω model). The turbulent flow at ReD = 1000 and 3900 is simulated to investigate the force coefficients, the Strouhal number, the flow separation angle, the pressure distribution on the cylinder, and the complex three-dimensional vortex shedding in the cylinder wake region. The numerical results extracted from these simulations are in good agreement with the experimental data (Zdravkovich, 1997). Moreover, the influence of grid refinement and time-step size has been examined. Numerical calculation of turbulent cross-flow in a staggered tube bundle continues to attract interest because of its importance in engineering applications and because this complex flow represents a challenging problem for CFD. In the present work, time-dependent two-dimensional simulations using the κ-ε, κ-ω, and SST models are performed for a subcritical flow through a staggered tube bundle. The predicted turbulence statistics (mean and r.m.s. velocities) are in good agreement with the experimental data (Balabani, 1996). Turbulent quantities such as the turbulent kinetic energy and its dissipation rate are predicted with the RANS models and compared with each other. The sensitivity to grid and time-step size has been analyzed, and a sensitivity study of the model constants has been carried out with the κ-ε model; it was observed that the predicted turbulence statistics and turbulent quantities are very sensitive to the model constants.
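
For reference, the main dimensionless quantities compared in the study are given below in their standard form; the free-stream velocity U∞ and the cylinder diameter D are assumed as the reference scales, since the abstract does not state the exact normalization used.

\[
\mathrm{Re}_D = \frac{U_\infty D}{\nu}, \qquad
\mathrm{St} = \frac{f_s D}{U_\infty}, \qquad
C_D = \frac{F_D}{\tfrac{1}{2}\,\rho\, U_\infty^2\, D\, L},
\]

where ν is the kinematic viscosity, f_s the vortex-shedding frequency, ρ the fluid density, F_D the drag force, and L the cylinder span.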

Relevance: 20.00%

Abstract:

One filler often used in flexible polyurethane foams is calcium carbonate (CaCO3), because it is non-abrasive, non-toxic, and facilitates pigmentation. However, it has been observed that the excess of commercial CaCO3 used in industry can cause permanent deformation and damage the quality of the final product. The effect of different concentrations of commercial CaCO3 in flexible foams was therefore studied. Different concentrations of CaCO3 were used in the synthesis of flexible polyurethane foams, which were then subjected to morphological and thermal analyses to assess the alterations caused by the progressive introduction of this filler.

Relevance: 20.00%

Abstract:

Due to increasing waterborne transportation in the Gulf of Finland, the risk of a hazardous accident is growing, and therefore manifold preventive actions are needed. As the main legislative authority in the maritime community, the International Maritime Organization (IMO) has laid down laws and recommendations that are applied, for example, to the safe operation of ships and to pollution prevention. One of these compulsory requirements, the ISM Code, demands a proactive attitude from both top management and operational workers in shipping companies. In this study, a cross-sectional approach was taken to analyse whether the ISM Code has actively enhanced maritime safety in the Gulf of Finland. The analysis included: 1) the performance of the ISM Code in Finnish shipping companies, 2) statistical measurements of maritime safety, 3) the influence of corporate top management on the safety culture, and 4) a comparison of safety management practices in shipping companies and port operations by Finnish maritime and port authorities. The main finding was that the maritime safety culture has developed in the right direction since the launch of the ISM Code in the 1990s. However, this study does not conclusively prove that the improvements are a consequence of the ISM Code. Accident-prone ships can be recognized by their behaviour, and there are lessons to be learned from the safety culture of high-standard safety disciplines such as air traffic. In addition, the reporting of accidents and near misses should be more widely practised in the shipping industry. In conclusion, there is still much to be improved in the maritime safety culture of the Finnish shipping industry; for example, a "no blame culture" needs to be adopted.

Relevance: 20.00%

Abstract:

The development of correct programs is a core problem in computer science. Although formal verification methods for establishing correctness with mathematical rigor are available, programmers often find these difficult to put into practice. One hurdle is deriving the loop invariants and proving that the code maintains them. So-called correct-by-construction methods aim to alleviate this issue by integrating verification into the programming workflow. Invariant-based programming is a practical correct-by-construction method in which the programmer first establishes the invariant structure and then incrementally extends the program in steps, adding code and proving after each addition that the code is consistent with the invariants. In this way, the program is kept internally consistent throughout its development, and the construction of the correctness arguments (proofs) becomes an integral part of the programming workflow. A characteristic of the approach is that programs are described as invariant diagrams, a graphical notation similar to the state charts familiar to programmers. Invariant-based programming is a new method that has not yet been evaluated in large-scale studies. The most important prerequisite for feasibility on a larger scale is a high degree of automation. The goal of the Socos project has been to build tools to assist the construction and verification of programs using the method. This thesis describes the implementation and evaluation of a prototype tool in the context of the Socos project. The tool supports the drawing of the diagrams, the automatic derivation and discharging of verification conditions, and interactive proofs. It is used to develop programs that are correct by construction. The tool consists of a diagrammatic environment connected to a verification condition generator and an existing state-of-the-art theorem prover. Its core is a semantics for translating diagrams into verification conditions, which are sent to the underlying theorem prover. We describe a concrete method for 1) deriving sufficient conditions for total correctness of an invariant diagram; 2) sending the conditions to the theorem prover for simplification; and 3) reporting the results of the simplification to the programmer in a way that is consistent with the invariant-based programming workflow and that allows errors in the program specification to be detected efficiently. The tool uses an efficient automatic proof strategy to prove as many conditions as possible automatically and lets the remaining conditions be proved interactively. The tool is based on the verification system PVS and uses the SMT (Satisfiability Modulo Theories) solver Yices as a catch-all decision procedure. Conditions that are not discharged automatically may be proved interactively using the PVS proof assistant. The programming workflow is very similar to the process by which a mathematical theory is developed inside a computer-supported theorem prover environment such as PVS. The programmer reduces a large verification problem with the aid of the tool into a set of smaller problems (lemmas), and can substantially improve the degree of proof automation by developing specialized background theories and proof strategies to support the specification and verification of a specific class of programs. We demonstrate this workflow by describing in detail the construction of a verified sorting algorithm. Tool-supported verification often has little to no presence in computer science (CS) curricula.
Furthermore, program verification is frequently introduced as an advanced and purely theoretical topic that is not connected to the workflow taught in the early and practically oriented programming courses. Our hypothesis is that verification could be introduced early in the CS education, and that verification tools could be used in the classroom to support the teaching of formal methods. A prototype of Socos has been used in a course at Åbo Akademi University targeted at first and second year undergraduate students. We evaluate the use of Socos in the course as part of a case study carried out in 2007.
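
The abstract above describes the invariant-based workflow only in prose. As a rough textual analogue (not the Socos diagram notation, and the function below is purely illustrative), the following Python sketch shows a loop built around an explicitly stated invariant; in the actual method the invariant conditions would be discharged by a theorem prover rather than checked at run time with assertions.

    def sum_of_first_n(n: int) -> int:
        """Compute 0 + 1 + ... + (n - 1), structured around an explicit loop invariant."""
        assert n >= 0                      # precondition
        i, total = 0, 0
        # Invariant: 0 <= i <= n and total == sum(range(i))
        while i < n:
            total += i
            i += 1
            assert 0 <= i <= n and total == sum(range(i))   # invariant maintained by the step
        assert i == n and total == sum(range(n))            # postcondition follows from invariant
        return total

In the invariant-based style, the invariant and the pre- and postconditions are written down first, and each code addition is accepted only once it is shown to preserve them.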

Relevance: 20.00%

Abstract:

The properties of the paper surface play a crucial role in ensuring suitable quality and runnability in various converting and finishing operations, such as printing. Plasma surface modification makes it possible to modify the surface chemistry of paper without altering the bulk material properties. This also makes it possible to investigate the role of the surface chemistry alone on printability, without influencing the porous structure of the pigment-coated paper. Since the porous structure of a pigment coating controls both ink setting and the optical properties, surface chemical changes created by a plasma modification have the potential to decouple these two effects and to permit a better optimization of both. The aim of this work was to understand the effects of plasma surface modification on paper properties and how it influences printability in the sheet-fed offset process. The objective was to broaden the fundamental understanding of the role of surface chemistry in offset printing. The effects of changing the hydrophilicity/hydrophobicity and the surface chemical composition by plasma activation and plasma coatings on the properties of coated paper, on ink-paper interactions, and on sheet-fed offset print quality were investigated. In addition, the durability of the plasma surface modification was studied. Nowadays, a typical sheet-fed offset press also contains units for surface finishing, for example UV-varnishing; the role of the surface chemistry in the absorption of UV-varnish into highly permeable and porous pigment-coated paper was therefore also investigated. With plasma activation it was possible to increase the surface energy and hydrophilicity of the paper. Both polar and dispersion interactions were found to increase, although the change was greater in the polar interactions due to induced oxygen-containing groups. The results indicated that plasma activation takes place particularly in high-molecular-weight components, such as the dispersion chemicals used to stabilize the pigment and latex particles. The surface composition, such as the pigment and binder type, was found to influence the response to the plasma activation. The general trend was that pilot-scale treatment modified the surface chemistry without altering the physical coating structure, whereas excessive laboratory-scale treatment increased the surface roughness and reduced the surface strength, which led to micro-picking in printing. It was shown that pilot-scale plasma activation in combination with appropriate ink oils makes it possible to adjust the ink-setting rate. The ink-setting rate decreased with linseed-oil-based inks, probably due to increased acid-base interactions between the polar groups in the oil and the plasma-treated paper surface. With mineral-oil-based inks, the ink setting accelerated due to plasma activation. Hydrophobic plasma coatings were able to reduce or even prevent the absorption of dampening water into pigment-coated paper, even when the dampening water was applied under the influence of nip pressure. A uniform hydrophobic plasma coating with sufficient chemical affinity with the ink gave an improved print quality in terms of higher print density and lower print mottle. It was also shown that a fluorocarbon plasma coating reduced the free wetting of the UV-varnish into the highly permeable and porous pigment coating. However, when the UV-varnish was applied under the influence of nip pressure, which leads to forced wetting, the role of the surface chemical composition appears to be much smaller.
A decay in surface energy and wettability occurred during the first weeks of storage after plasma activation, after which it leveled off. However, the oxygen/carbon elemental ratio did not decrease as a function of time, indicating that ageing could be caused by a re-orientation of polar groups or by a contamination of the surface. The plasma coatings appeared to be more stable when the hydrophobicity was higher, probably due to fewer interactions with oxygen and water vapor in the air.
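
The polar and dispersion contributions mentioned above are conventionally separated with a two-component surface energy model; a standard form is the Owens-Wendt relation shown below, given here only as background, since the abstract does not state which model was used in the thesis.

\[
\gamma_l\,(1 + \cos\theta) = 2\left(\sqrt{\gamma_s^{d}\,\gamma_l^{d}} + \sqrt{\gamma_s^{p}\,\gamma_l^{p}}\right),
\]

where θ is the contact angle of a probe liquid, γ_l its surface tension, and the superscripts d and p denote the dispersion and polar components of the liquid (l) and solid (s) surface energies. Measuring θ with at least two liquids of known components allows γ_s^d and γ_s^p to be solved.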

Relevance: 20.00%

Abstract:

This is a study of language and learning aspects in the interaction between pupils and teachers in classrooms where the majority of the pupils are bilingual. The aim of the dissertation is to develop the understanding of interactional learning possibilities and constraints in relation to a bilingual context. Language-related learning is used as an overall concept covering learning related to classroom discourse, language, and subject. The empirical study was made in a Swedish-speaking school in a strongly Finnish-dominated environment in the south of Finland. In the material, mainly consisting of video-recorded lessons in forms one to three, the interaction between the pupils and the teachers is analysed. Building on a social constructionist perspective, where learning is regarded as a social phenomenon, situated and visible in changing participation, sequences where pupils or teachers make the language relevant are emphasised. The sequences are analysed in line with the conversation analytic (CA) approach. A fundamental result is an understanding of a monolingual classroom discourse, jointly constructed by teachers and pupils and visible in the pupils' interactionally problematized code-switching. This means that the pupils are not victims of a top-driven language policy; they are active co-constructors of the monolingual discourse. Through different repair initiations the pupils do interactional work to position themselves correctly in the monolingual discourse, which they simultaneously maintain. This work has a price in terms of time, knowledge, and exactness. The pupils' problematized code-switching is often directly and briefly repaired by the teachers. This kind of repair promotes the pupils' participation and is not dispreferred in pupil-teacher talk, in contrast to the findings of research on everyday talk. When the pupils use the possibility to participate in a comparatively easy way, and thus express their knowledge through code-switching while simultaneously talking a monolingual discourse into being, the teacher can, through direct repair, show understanding of the content, facilitate language learning, and at the same time confirm the pupils as competent speakers and bilingual individuals. Furthermore, significant results show that the monolingual norm functions as a contrasting background that gives the pupils and the teachers the possibility to use language alternation as a functional and meaningful activity. The pupils use code-switching as a way of protesting or expressing non-participation in the classroom talk. By making the pupils' bilingualism relevant, the teachers express understanding and empathy and encourage the pupils' participation in the classroom talk. Bilingualism is a non-preferred, but functioning, resource in the interaction between pupils and teachers.

Relevance: 20.00%

Abstract:

The aim of this study was to simulate blood flow in the human thoracic aorta and to understand the role of flow dynamics in the initiation and localization of atherosclerotic plaque. Blood flow dynamics were numerically simulated in three idealized and two realistic models of the human thoracic aorta. The idealized models were reconstructed from measurements available in the literature, and the realistic models were constructed by processing Computed Tomography (CT) images made available by South Karelia Central Hospital in Lappeenranta. The reconstruction of the thoracic aorta consisted of operations such as contrast adjustment, image segmentation, and 3D surface rendering. Additional design operations were performed to make the aorta model compatible with the numerical computer code. The image processing and design operations were performed with specialized medical image processing software. Pulsatile pressure and velocity boundary conditions were applied at the inlet. The blood flow was assumed to be homogeneous and incompressible, and the blood was assumed to be a Newtonian fluid. The simulations with the idealized models were carried out with a Finite Element Method based computer code, while the simulations with the realistic models were carried out with a Finite Volume Method based computer code. Simulations were carried out for four cardiac cycles, and the distributions of flow, pressure, and Wall Shear Stress (WSS) during the fourth cardiac cycle were extensively analyzed. The aim of the simulations with the idealized models was to obtain an estimate of the flow dynamics in a realistic aorta model, and the motive behind the choice of three aorta models with distinct features was to understand the dependence of the flow dynamics on the aorta anatomy. A highly disturbed and non-uniform distribution of velocity and WSS was observed in the aortic arch, near the brachiocephalic, left common carotid, and left subclavian arteries. The WSS profiles at the roots of the branches showed significant differences with variations in the geometry of the aorta and its branches: the comparison of instantaneous WSS profiles revealed that the model with straight branching arteries had relatively lower WSS than the aorta model with curved branches. In addition, significant differences were observed in the spatial and temporal profiles of WSS, flow, and pressure. The study with the idealized models was extended to blood flow in the thoracic aorta under the effects of hypertension and hypotension; one of the idealized aorta models was modified, along with the boundary conditions, to mimic these conditions. The simulations with the realistic models extracted from CT scans demonstrated more realistic flow dynamics than the idealized models. During systole, the velocity in the ascending aorta was skewed towards the outer wall of the aortic arch, and the flow developed secondary flow patterns as it moved downstream towards the aortic arch. Unlike in the idealized models, the distribution of flow was non-planar and strongly guided by the arterial anatomy. Flow cavitation was observed in the aorta model whose imaging included longer branches; it could not be properly observed in the model whose imaging contained shorter aortic branches.
Flow circulation was also observed at the inner wall of the aortic arch. During diastole, however, the flow profiles were almost flat and regular due to the acceleration of flow at the inlet, and the flow profiles were weakly turbulent during flow reversal. The complex flow patterns caused a non-uniform distribution of WSS. High WSS occurred at the junctions of the branches and the aortic arch; low WSS occurred at the proximal part of the junction, while intermediate WSS occurred at the distal part. The pulsatile nature of the inflow caused oscillating WSS at the branch entry regions and at the inner curvature of the aortic arch. Based on the WSS distribution in the realistic model, one of the aorta models was altered to introduce artificial atherosclerotic plaque at the branch entry region and at the inner curvature of the aortic arch. Atherosclerotic plaque causing 50% blockage of the lumen was introduced in the brachiocephalic artery, the common carotid artery, the left subclavian artery, and the aortic arch. The aims of this part of the study were first to study the effect of stenosis on the flow and WSS distributions, then to understand the effect of the shape of the atherosclerotic plaque, and finally to investigate the effect of the severity of the lumen blockage. The results revealed that the distribution of WSS is significantly affected by a plaque with a mere 50% stenosis, and that an asymmetrically shaped stenosis causes higher WSS in the branching arteries than a symmetric plaque. The flow dynamics within the thoracic aorta models have thus been extensively studied and reported here, and the effects of pressure and arterial anatomy on the flow dynamics were investigated. The distribution of complex flow and WSS correlates with the localization of atherosclerosis. From the available results we can conclude that the thoracic aorta, with its complex anatomy, is the artery most vulnerable to the localization and development of atherosclerosis, and that flow dynamics and arterial anatomy play a role in this localization. Patient-specific, image-based models can be used to identify locations in the aorta vulnerable to the development of arterial diseases such as atherosclerosis.
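
For reference, the wall shear stress reported throughout this study takes, for the Newtonian fluid assumed here, the standard form

\[
\tau_w = \mu \left. \frac{\partial u_t}{\partial n} \right|_{\text{wall}},
\]

where μ is the dynamic viscosity of blood, u_t the velocity component tangential to the wall, and n the wall-normal coordinate. In practice the solvers evaluate this gradient from the near-wall velocity field on the computational mesh.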

Relevance: 20.00%

Abstract:

Drawing on strategic theory, this study investigates the strategic roles of commercial companies providing military services, frequently referred to as private military companies. Theoretically, the thesis analyzes how states organize their military capabilities in order to be able to wield power within the international system, while empirically it examines the character and role of commercial companies that provide military training services to the United States Government and partner nations. The reason for this instrumental and functional, rather than critical, approach is that the work is written within the discipline known as War Studies. Strategic theory is used first to organize the empirical findings logically in two case studies and then to develop an analytical framework with which the strategic roles of companies providing military services can be investigated. The analysis has been conducted using new and hitherto unknown sources in the form of interviews as well as previously classified telegrams, but also draws on previous research and other secondary sources. The main findings are that commercial companies have five typical strategic roles: first, they cloak the state by substituting for traditional uniformed troops; second, they act as trailblazers by securing US influence in new regions and by breaking new ground in the build-up of new partners; third, they act as scene setters by preparing the ground for military exit from a theater of operations or by facilitating interoperability between foreign militaries and the US military; fourth, they can be used to infiltrate the security structures of foreign countries; fifth and finally, they can provide offensive capabilities in the form of either kinetic or cyber warfare effects. A further finding is that military service contracting is an important part of US strategic culture.

Relevance: 20.00%

Abstract:

The objective of the thesis is to examine the current state of risk management and to determine an appropriate risk management policy for commercial-property-derived risks in the Russian branch of a Finnish retail trade company. The research methodologies employed are comparative in-depth interviews and empirical value-at-risk analysis, including portfolio risk decomposition to determine inter-currency characteristics. For a multinational retail trade company, commercial-property-derived risks unfold as a diverse combination of financial and non-financial risks involving four distinct interest groups. The research results indicate that geographical diversification across currency regimes provides diversification benefits. The Russian ruble is the most significant single risk component when considering the net investments outside the euro zone, and decreasing the Russian ruble and Swedish krona exposures is the most effective way to reduce translation-derived risk. Exchange rate volatility varies over time according to idiosyncratic currency regime characteristics, and cost-effective risk management requires comprehensive analysis of the business environment. Thorough and proactive risk management methods are found to be pivotal for companies with cross-border operations in order to succeed among international competitors.
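
As a rough illustration of the kind of value-at-risk analysis and portfolio risk decomposition referred to above, the Python sketch below computes a parametric (delta-normal) portfolio VaR and its per-currency components. The currencies, exposure figures, covariances, and confidence level are hypothetical and are not taken from the thesis.

    import numpy as np

    # Hypothetical net exposures per currency (EUR millions) and daily return covariance.
    currencies = ["RUB", "SEK", "EUR"]
    exposure = np.array([60.0, 25.0, 15.0])               # exposure vector w
    cov = np.array([[4.0e-4, 0.5e-4, 0.2e-4],             # daily covariance of currency returns
                    [0.5e-4, 1.0e-4, 0.1e-4],
                    [0.2e-4, 0.1e-4, 0.2e-4]])

    z = 1.645                                             # one-sided 95% normal quantile
    portfolio_sigma = np.sqrt(exposure @ cov @ exposure)  # portfolio standard deviation
    var_95 = z * portfolio_sigma                          # parametric (delta-normal) VaR

    # Component VaR: marginal contribution of each currency; components sum to total VaR.
    marginal = cov @ exposure / portfolio_sigma
    component_var = z * exposure * marginal

    for ccy, cvar in zip(currencies, component_var):
        print(f"{ccy}: component VaR = {cvar:.2f} MEUR")
    print(f"Total 95% VaR = {var_95:.2f} MEUR")

The per-currency components make visible which exposure dominates the translation risk, which is the kind of decomposition used to compare, for example, ruble and krona contributions.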