921 results for CFD, computer modelling, DEM, sugar processing


Relevance: 100.00%

Publisher:

Abstract:

The aim of this study was to contribute to current knowledge-based theory by addressing a research gap: the empirically proven determination of the simultaneous but distinguishable effects of intellectual capital (IC) assets and knowledge management (KM) practices on organisational performance (OP). The analysis built on past research and theorised interactions between latent constructs. IC and KM were specified using survey-based items measured from a sample of Finnish companies, while the dependent OP construct was determined from information available in financial databases. Two measures widely used and commonly recommended in the management science literature, the return on total assets (ROA) and the return on equity (ROE), were calculated for OP. The relationship between IC and KM impacting OP could therefore be investigated, in relation to the hypotheses put forward, using objectively derived performance indicators. Using financial OP measures also strengthened the dynamic features of the data needed for analysing simultaneous and causal dependences between the modelled constructs, which were specified using structural path models. Parameter estimates for the structural path models were obtained with a partial least squares-based regression estimator. Results showed that the path dependencies between IC and OP, or KM and OP, were always insignificant when analysed separately from other interactions or from indirect effects arising in simultaneous modelling, regardless of whether ROA or ROE was used as the OP measure. The dependency between the KM and IC constructs appeared to be very strong and was always significant when modelled simultaneously with the other possible interactions between the constructs, using either ROA or ROE to define OP.
This study, however, did not find statistically unambiguous evidence for the hypothesised causal mediation effects, for instance that the effects of KM practices on OP are mediated by IC assets. Since some indication of fluctuating causal effects was observed, it was concluded that further studies are needed to verify the fundamental, and likely hidden, causal effects between the constructs of interest. It was therefore also recommended that complementary modelling and data processing measures be conducted to elucidate whether mediation effects occur between IC, KM and OP; their verification requires further investigation of the measured items and can build on the findings of this study.
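The hypothesised mediation chain (KM practices driving IC assets, which in turn drive OP) can be sketched on simulated data. This is a purely illustrative ordinary-least-squares path estimate, not the study's PLS estimator; all variable names, coefficients, and data are invented.

```python
# Illustrative mediation sketch (hypothetical data, not the study's dataset):
# simulate a KM -> IC -> OP chain and estimate the direct and indirect
# (mediated) path coefficients with ordinary least squares.
import numpy as np

rng = np.random.default_rng(0)
n = 500
km = rng.normal(size=n)                                    # KM practices (latent score)
ic = 0.8 * km + rng.normal(scale=0.5, size=n)              # IC assets, driven by KM
op = 0.6 * ic + 0.1 * km + rng.normal(scale=0.5, size=n)   # OP measure (e.g. ROA)

def ols(y, *xs):
    """Least-squares coefficients of y on the given regressors plus a constant."""
    X = np.column_stack(xs + (np.ones_like(y),))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

a = ols(ic, km)[0]                  # KM -> IC path
b, c_direct = ols(op, ic, km)[:2]   # IC -> OP path and direct KM -> OP path
indirect = a * b                    # mediated effect of KM on OP via IC

print(f"a={a:.2f}, b={b:.2f}, direct={c_direct:.2f}, indirect={indirect:.2f}")
```

With enough data the estimated paths recover the simulated coefficients, which is the kind of evidence a mediation test looks for; the study's point is precisely that such unambiguous recovery was not achieved with its real data.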

Relevance: 100.00%

Publisher:

Abstract:

In Mark Weiser's vision of ubiquitous computing, computers disappear from the user's focus and interact seamlessly with other computers and users in order to provide information and services. This shift away from direct computer interaction requires applications to interact in a different way, without bothering the user. Context is the information that can be used to characterise the situation of persons, locations, or other objects relevant to an application. Context-aware applications are capable of monitoring and exploiting knowledge about external operating conditions; they can adapt their behaviour based on the retrieved information and thus replace, at least to a certain extent, the missing user interactions. Context awareness can therefore be assumed to be an important ingredient for applications in ubiquitous computing environments. However, context management in such environments must reflect their specific characteristics, for example distribution, mobility, resource-constrained devices, and heterogeneity of context sources. Modern mobile devices are equipped with fast processors, sufficient memory, and several sensors, such as a Global Positioning System (GPS) sensor, a light sensor, or an accelerometer. Since many applications in ubiquitous computing environments can exploit context information to enhance their service to the user, these devices are highly useful for context-aware applications. Additionally, context reasoners and external context providers can be incorporated. Several context sensors, reasoners, and context providers may offer the same type of information, but differ in quality levels (e.g. accuracy), representations of the offered information (e.g. a position represented as coordinates or as an address), and the costs (such as battery consumption) of providing it.
To simplify the development of context-aware applications, developers should be able to access context information transparently, without having to deal with the underlying context-access techniques and distribution aspects. They should instead be able to express which kind of information they require, which quality criteria this information should fulfil, and how much its provision may cost (not only monetary cost but also energy or performance usage). For this purpose, application developers and developers of context providers need a common language and vocabulary to specify which information they require or provide, respectively. These descriptions and criteria then have to be matched, and the matching will likely require a transformation of the provided information so that it fulfils the criteria of the context-aware application. As more than one provider may fulfil the criteria, a selection process is required, in which the system trades off the quality of context and the costs offered by each context provider against the quality of context requested by the context consumer. This selection allows context sources to be turned on only when required: explicitly selecting context services, and thereby dynamically activating and deactivating local context providers, reduces resource consumption because unused context sensors are deactivated. One promising solution is a middleware that provides appropriate support following the principles of service-oriented computing, such as loose coupling, abstraction, reusability, and discoverability of context providers. This allows context sensors, context reasoners, and external context providers to be abstracted uniformly as context services.
In this thesis we present our solution, consisting of a context model and ontology, a context offer and query language, a comprehensive matching and mediation process, and a selection service. The matching and mediation process and the selection service, in particular, differ from existing work. The matching and mediation process allows mediation chains to be established autonomously in order to transform information from an offered representation into a requested representation. In contrast to other approaches, the selection service does not select a single service per request; rather, it selects a set of services that fulfils all requests, which also facilitates the sharing of services. The approach is reviewed extensively against the different requirements, and a set of demonstrators shows its usability in real-world scenarios.
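The offer/query matching and cost-aware selection described above can be sketched as follows. This is a hypothetical illustration, not the thesis's actual offer and query language: the provider names, the mediator table, and the cheapest-matching-provider rule are all invented.

```python
# Hypothetical sketch of offer/query matching and selection: each context
# provider advertises a type, representation, accuracy, and cost, and the
# consumer's query is matched against these offers.
from dataclasses import dataclass

@dataclass
class Offer:
    provider: str
    info_type: str       # e.g. "position"
    representation: str  # e.g. "coordinates" or "address"
    accuracy: float      # higher is better
    cost: float          # e.g. battery drain, lower is better

@dataclass
class Query:
    info_type: str
    representation: str
    min_accuracy: float

# A mediator table records which representation conversions are available.
MEDIATORS = {("coordinates", "address")}  # e.g. reverse geocoding

def matches(offer: Offer, query: Query) -> bool:
    """An offer matches if the type fits, accuracy suffices, and the
    representation is either the requested one or mediatable into it."""
    return (offer.info_type == query.info_type
            and offer.accuracy >= query.min_accuracy
            and (offer.representation == query.representation
                 or (offer.representation, query.representation) in MEDIATORS))

def select(offers, query):
    """Among matching offers, pick the cheapest (the quality/cost trade-off)."""
    candidates = [o for o in offers if matches(o, query)]
    return min(candidates, key=lambda o: o.cost) if candidates else None

offers = [
    Offer("gps", "position", "coordinates", accuracy=0.95, cost=5.0),
    Offer("wifi", "position", "coordinates", accuracy=0.70, cost=1.0),
    Offer("directory", "position", "address", accuracy=0.90, cost=2.0),
]
best = select(offers, Query("position", "address", min_accuracy=0.8))
print(best.provider)  # "directory": wifi fails the accuracy bound, gps costs more
```

Unselected providers (here the GPS and Wi-Fi sensors) would then stay deactivated, which is the resource-saving behaviour the selection service aims for.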

Relevance: 100.00%

Publisher:

Abstract:

Eddy current testing by current deflection detects surface cracks and geometric features by sensing the re-routing of currents. Currents are diverted by cracks in two ways: down the walls, and along their length at the surface. Current deflection utilises the latter currents, detecting them via their tangential magnetic field. Results from 3-D finite element computer modelling, which show the two forms of deflection, are presented. Further results indicate that the current deflection technique is suitable for the detection of surface cracks in smooth materials with varying material properties.

Relevance: 100.00%

Publisher:

Abstract:

Summary: This work showed that, in addition to the oxytocin receptor, the other receptors of the neurohypophyseal hormone family, the vasopressin receptors, are influenced by cholesterol in the same way with respect to their binding properties. In contrast, the cholecystokinin receptor type B shows no direct interaction with cholesterol. By exchanging transmembrane helices 6 and 7 of the oxytocin receptor with the corresponding regions of the cholecystokinin receptor, a receptor was created that showed no differences from the wild-type oxytocin receptor in binding behaviour or cholesterol dependence. Using computer-aided modelling, a site between transmembrane helices 5 and 6 was proposed for the interaction of the oxytocin receptor with cholesterol. To investigate the distribution of cholesterol in the cell, a self-synthesised fluorescent cholesterol derivative (Fluochol) was employed. Complexation in cyclodextrins enabled the incorporation of Fluochol into the plasma membrane of cells. The influx of Fluochol into the endoplasmic reticulum (ER) occurred within minutes and was energy-independent. Finally, Fluochol was transported into lipid droplets, which in almost all cells serve to store excess intracellular lipids. These droplets are formed from the endoplasmic reticulum and contain, in addition to phospholipids, cholesterol that is esterified with long-chain fatty acids by the enzyme ACAT.

Relevance: 100.00%

Publisher:

Abstract:

The performance of parallel vector implementations of the one- and two-dimensional orthogonal transforms is evaluated. The orthogonal transforms are computed using actual or modified fast Fourier transform (FFT) kernels. The factors considered in comparing the speed-up of these vectorized digital signal processing algorithms are discussed. It is shown that the traditional way of comparing the execution speed of digital signal processing algorithms by the ratios of the numbers of multiplications and additions is no longer effective for vector implementation; the structure of the algorithm must also be considered when comparing the execution speed of vectorized algorithms. Simulation results on the Cray X-MP are presented for the following orthogonal transforms: the discrete Fourier transform (DFT), discrete cosine transform (DCT), discrete sine transform (DST), discrete Hartley transform (DHT), discrete Walsh transform (DWHT), and discrete Hadamard transform (DHDT). A comparison between the DHT and the fast Hartley transform is also included.
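The "FFT kernel" approach the abstract mentions can be illustrated with the Hartley transform: since the DFT is F[k] = Σ x[n] e^(-2πink/N) and the DHT uses the cas kernel cas(θ) = cos θ + sin θ, the identity DHT(x) = Re(F) − Im(F) computes the DHT directly from an FFT. A small NumPy sketch (illustrative only, unrelated to the paper's Cray code):

```python
# Compute the discrete Hartley transform (DHT) from an FFT kernel, and
# check it against the direct O(N^2) definition.
import numpy as np

def dht_via_fft(x):
    """DHT from a complex FFT: Re(F) - Im(F) equals the cas-kernel sum."""
    F = np.fft.fft(x)
    return F.real - F.imag

def dht_direct(x):
    """Direct O(N^2) definition using the cas kernel, for checking."""
    N = len(x)
    n = np.arange(N)
    angles = 2 * np.pi * np.outer(n, n) / N
    cas = np.cos(angles) + np.sin(angles)
    return cas @ np.asarray(x, dtype=float)

x = np.array([1.0, 2.0, 0.0, -1.0])
print(np.allclose(dht_via_fft(x), dht_direct(x)))  # True
```

This is one concrete sense in which FFT kernels can serve several orthogonal transforms at once, even though the operation count alone says little about vectorized performance.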

Relevance: 100.00%

Publisher:

Abstract:

For decades, distance transforms have proven useful for many image processing applications, and more recently they have begun to be used in computer graphics. The goal of this paper is to propose a new technique, based on distance transforms, for detecting mesh elements that are close to an object's external contour (from a given point of view), and for using this information to weight the approximation error tolerated during the mesh simplification process. The results are evaluated in two ways: visually, and using an objective metric that measures the geometric difference between two polygonal meshes.
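A distance transform labels each cell of a grid with its distance to the nearest feature cell (here, a contour pixel). The following is a minimal sketch of the classic two-pass city-block (Manhattan) algorithm on a binary grid; it is purely illustrative and unrelated to the paper's mesh-specific implementation.

```python
# Two-pass city-block distance transform on a binary grid.
INF = 10**9

def distance_transform(grid):
    """grid: 2-D list of 0/1, where 1 marks a feature (e.g. contour) pixel.
    Returns the city-block distance of every cell to the nearest 1."""
    h, w = len(grid), len(grid[0])
    d = [[0 if grid[y][x] else INF for x in range(w)] for y in range(h)]
    # Forward pass: propagate distances from top and left neighbours.
    for y in range(h):
        for x in range(w):
            if y > 0:
                d[y][x] = min(d[y][x], d[y - 1][x] + 1)
            if x > 0:
                d[y][x] = min(d[y][x], d[y][x - 1] + 1)
    # Backward pass: propagate distances from bottom and right neighbours.
    for y in range(h - 1, -1, -1):
        for x in range(w - 1, -1, -1):
            if y < h - 1:
                d[y][x] = min(d[y][x], d[y + 1][x] + 1)
            if x < w - 1:
                d[y][x] = min(d[y][x], d[y][x + 1] + 1)
    return d

grid = [[0, 0, 0],
        [0, 1, 0],
        [0, 0, 0]]
print(distance_transform(grid))  # [[2, 1, 2], [1, 0, 1], [2, 1, 2]]
```

In the paper's setting, low distance values would mark mesh elements near the silhouette, where the simplification error tolerance should be tightened.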

Relevance: 100.00%

Publisher:

Abstract:

Small and medium enterprises (SMEs) engaged in sugar processing in Myanmar appeared in the last decade of the socialist era. An acute sugar deficit, restricted trade in white sugar, and high demand from the conventional dairy business led to the growth of sugar SMEs: semi-finished products (syrup) were blended appropriately in the fields and then processed in vacuum pans and centrifugals to obtain white sugar. This became a tradable commodity, and sugar SMEs grew in clusters in big cities. They are family-owned businesses; however, they lack bagasse-based power generation. In recent years, large modern sugar factories operated by private and military companies have emerged as key players. The current shortage of fuel feedstock and competition for raw materials have become driving forces shifting sugar SMEs from market-oriented to raw-material-oriented locations. Internal competition among key players has also made the sugar price highly volatile. If placed on a level playing field, the whole industry should be upgraded in terms of price and quality to become export-oriented.

Relevance: 100.00%

Publisher:

Abstract:

Thesis (M.S.)--University of Illinois.

Relevance: 100.00%

Publisher:

Abstract:

"This report reproduces a thesis of the same title submitted to the Alfred P. Sloan School of Management, Massachusetts Institute of Technology, in partial fulfillment of the requirements for the degree of Doctor of Philosophy, June 1969."--p. 2.

Relevance: 100.00%

Publisher:

Abstract:

Includes bibliographical references (p. 48-49).

Relevance: 100.00%

Publisher:

Abstract:

"Contributed to the Federal Information Processing Standards Task Group 15 - Computer Systems Security"--t.p.

Relevance: 100.00%

Publisher:

Abstract:

Finding motifs that elucidate the rules governing peptide binding to medically important receptors is important for screening targets for drugs and vaccines. This paper focuses on the elucidation of peptide binding to the I-A(g7) molecule of the non-obese diabetic (NOD) mouse, an animal model for insulin-dependent diabetes mellitus (IDDM). A number of motifs describing peptide binding to I-A(g7) have been proposed, resulting from independent experimental studies carried out on small data sets. Testing with multiple data sets showed that each of these motifs at best describes only a subset of the solution space, and the motifs therefore lack generalization ability. This study seeks a motif with higher generalization ability, so that it can predict binders in all I-A(g7) data sets with high accuracy. A binding score matrix representing the peptide binding motif for I-A(g7) was derived using a genetic algorithm (GA). The evolved score matrix significantly outperformed previously reported motifs.
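The idea of evolving a binding score matrix with a GA can be sketched in miniature. Everything below is invented for illustration: a toy four-letter alphabet instead of the twenty amino acids, made-up binder/non-binder peptides, and a deliberately simple fitness and mutation scheme; it is not the paper's actual GA.

```python
# Toy GA: evolve a position-specific score matrix so that known binder
# peptides score higher than non-binders.
import random

random.seed(42)
ALPHABET = "ACDE"            # tiny toy alphabet instead of 20 amino acids
LENGTH = 4                   # peptide length
BINDERS = ["ACDA", "ACEA", "ACDC"]
NON_BINDERS = ["DEDE", "EEDD", "CDEE"]

def score(matrix, peptide):
    """Sum of per-position scores for the peptide under the matrix."""
    return sum(matrix[i][ALPHABET.index(aa)] for i, aa in enumerate(peptide))

def fitness(matrix):
    """Separation between mean binder score and mean non-binder score."""
    pos = sum(score(matrix, p) for p in BINDERS) / len(BINDERS)
    neg = sum(score(matrix, p) for p in NON_BINDERS) / len(NON_BINDERS)
    return pos - neg

def random_matrix():
    return [[random.uniform(-1, 1) for _ in ALPHABET] for _ in range(LENGTH)]

def mutate(matrix):
    """Perturb one randomly chosen matrix entry."""
    child = [row[:] for row in matrix]
    i = random.randrange(LENGTH)
    j = random.randrange(len(ALPHABET))
    child[i][j] += random.gauss(0, 0.3)
    return child

# Elitist loop: keep the best half, refill the population by mutation.
pop = [random_matrix() for _ in range(20)]
for _ in range(100):
    pop.sort(key=fitness, reverse=True)
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(10)]

best = max(pop, key=fitness)
print(fitness(best) > 0)  # the evolved matrix separates the two classes
```

A real version would use cross-validated classification accuracy over multiple data sets as the fitness, which is what gives the evolved matrix its generalization ability.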

Relevance: 100.00%

Publisher:

Abstract:

Computer modelling promises to be an important tool for analysing and predicting interactions between trees within mixed-species forest plantations. This study explored the use of an individual-based mechanistic model as a predictive tool for designing mixed-species plantations of Australian tropical trees. The 'spatially explicit individual-based forest simulator' (SeXI-FS) modelling system was used to describe the spatial interaction of individual tree crowns within a binary mixed-species experiment. The three-dimensional model was developed and verified with field data from three forest tree species grown in tropical Australia. The model predicted the interactions within monocultures and binary mixtures of Flindersia brayleyana, Eucalyptus pellita and Elaeocarpus grandis, accounting for an average of 42% of the growth variation exhibited by the species in the different treatments. The model requires only structural dimensions and shade tolerance as species parameters. By modelling interactions in existing tree mixtures, the model predicted both increases and reductions in the growth of mixtures (up to +/- 50% of stem volume at 7 years) compared to monocultures. This modelling approach may be useful for designing mixed tree plantations. (c) 2006 Published by Elsevier B.V.

Relevance: 100.00%

Publisher:

Abstract:

This thesis describes the computer modelling of radio-frequency capacitively coupled methane/hydrogen plasmas and the consequences for the reactive ion etching of (100) GaAs surfaces. In addition, a range of etching experiments was undertaken over a matrix of pressure, power, and methane concentration. The resulting surfaces were investigated using X-ray photoelectron spectroscopy, and the results were discussed in terms of physical and chemical models of particle/surface interactions, as well as the predicted energies, angles, and relative fluxes of the various plasma species arriving at the substrate. The model consisted of a Monte Carlo code that followed electrons and ions through the plasma and sheath potentials while taking account of collisions with background neutral gas molecules. The ionisation profile output by the electron module was used as input for the ionic module. Momentum-scattering interactions of ions with gas molecules were investigated via different models and compared against results given by a quantum mechanical code. The interactions were treated as central-potential scattering events, and the resulting neutral cascades were followed. The predictions for ion energies at the cathode compared well with experimental ion energy distributions, verifying the particular form of the electrical potentials used and their applicability to the geometry of the plasma cell used in the etching experiments. The final code was used to investigate the effect of external plasma parameters on the mass distribution, energies, and angles of all species impinging on the electrodes. Electron energies computed in the plasma also agreed favourably with measurements made using a Langmuir electric probe. The surface analysis showed all surfaces to be depleted in arsenic, due to its preferential removal, and the resulting Ga:As ratio in the surface was found to be directly linked to the etch rate, which in turn was determined by the methane flux predicted by the code.
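The core Monte Carlo step, following a particle through a background gas with collisions, can be sketched as free-path sampling. This is a minimal illustration of the general technique, not the thesis's plasma code: the sheath width, mean free path, and all numbers are invented.

```python
# Monte Carlo free-path sampling: an ion crossing a sheath suffers
# collisions with background neutrals at exponentially distributed
# intervals (mean free path lambda).
import math
import random

random.seed(1)

def transit_collisions(sheath_width, mean_free_path):
    """Count collisions an ion suffers while crossing the sheath, sampling
    free paths from p(s) = exp(-s/lambda)/lambda by inverse transform."""
    travelled, collisions = 0.0, 0
    while True:
        s = -mean_free_path * math.log(1.0 - random.random())
        travelled += s
        if travelled >= sheath_width:
            return collisions
        collisions += 1

# Averaged over many ions, the collision count tends to width / lambda;
# a collisional sheath broadens the ion energy distribution at the cathode.
N = 20000
mean = sum(transit_collisions(2.0, 0.5) for _ in range(N)) / N
print(mean)  # close to 2.0 / 0.5 = 4 collisions on average
```

In a full code each collision would also sample a scattering angle and energy loss (the central-potential scattering events mentioned above), and secondary neutrals would be pushed onto a cascade list and followed in turn.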