Abstract:
The present study is intended to provide a new scientific approach to the world's cost engineering problems as encountered in the chemical industries in our nation. The problem addressed is the cost estimation of equipment, especially pressure vessels, when setting up chemical industries. The present study attempts to develop a model for such cost estimation, which it is hoped will go a long way towards solving this and related problems in forecasting the cost of setting up chemical plants.
Abstract:
This thesis analyses certain problems in Inventories and Queues. There are many situations in real life where we encounter models such as those described in this thesis. It analyses in depth various models which can be applied to production, storage, telephone traffic, road traffic, economics, business administration, the serving of customers, the operation of particle counters and others. The models described here are not complete representations of the true situation in all its complexity, but simplified versions amenable to analysis. While discussing the models, we show how a dependence structure can be suitably introduced in some problems of Inventories and Queues. Continuous-review, single-commodity inventory systems with a Markov dependence structure introduced in the demand quantities, the replenishment quantities and the reordering levels are considered separately. Lead time is assumed to be zero in these models. An inventory model involving random lead time is also considered (Chapter 4). Further, finite-capacity single-server queueing systems with single/bulk arrivals and single/bulk services are also discussed. In some models the server is assumed to go on vacation (Chapters 7 and 8). In Chapters 5 and 6 a form of dependence is introduced into the service pattern of some queueing models.
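The thesis's specific models are not reproduced in the abstract. Purely as an illustrative sketch of the kind of system it describes (continuous review, zero lead time, with a Markov chain modulating the demand sizes), one might simulate an (s, S) policy as below; the policy parameters, transition probabilities and demand ranges are invented for the example and are not taken from the thesis.

```python
import random

def simulate_sS_inventory(steps, s=5, S=20, seed=0):
    """
    Illustrative continuous-review (s, S) inventory with zero lead time and a
    two-state Markov chain modulating the demand size (a toy version of the
    kind of dependence structure discussed in the thesis).
    """
    rng = random.Random(seed)
    level, state = S, 0
    # Markov chain on demand regimes: state 0 -> small demands, state 1 -> large demands
    transition = {0: (0.8, 0.2), 1: (0.4, 0.6)}   # P(next state | current state)
    demand_size = {0: (1, 3), 1: (3, 7)}          # uniform integer demand range per regime
    history = []
    for _ in range(steps):
        state = 0 if rng.random() < transition[state][0] else 1
        level -= rng.randint(*demand_size[state])
        if level <= s:      # reorder point reached:
            level = S       # zero lead time, so the order-up-to level S is restored at once
        history.append(level)
    return history

print(simulate_sS_inventory(10))
```

With zero lead time the replenishment arrives immediately, which is why the stock level is simply reset to S at the reorder point; the random-lead-time model of Chapter 4 would not allow this shortcut.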
Abstract:
In this thesis we attempt a probabilistic analysis of some physically realizable, though complex, storage and queueing models. It is essentially a mathematical study of the stochastic processes underlying these models. Our aim is to gain an improved understanding of the behaviour of such models, which may widen their applicability. Different inventory systems with random lead times, vacations for the server, bulk demands, varying ordering levels, etc. are considered. We also study some finite- and infinite-capacity queueing systems with bulk service and server vacations, and obtain the transient solution in certain cases. Each chapter of the thesis is provided with its own introduction and some important references.
Abstract:
The primary focus of this study was to assess the impact of selected antecedent variables, namely Psychological Empowerment at Work (PEW), Psychological Contract Violation (PCV), Work Life Balance (WLB), Job Satisfaction (JS) and Affective Organisational Commitment (AOC), on the Managerial Performance (MP) of middle-level managers in private-sector manufacturing and service-sector organisations in Kerala. The study brings out the significance of the job attitudes, namely Job Satisfaction and Affective Organisational Commitment, in meaningfully explaining the linkage between the remaining antecedent variables in the study and Managerial Performance. Interestingly, the study revealed that job attitudes play a mediating role in explaining the performance of managers, unlike what was visualised in the initial conceptual framework. The study points to the importance of taking care of job attitudes in the workplace to ensure the performance of managers. The results also bring out the significance of maintaining work-life balance, especially in service-sector organisations, because it has a more direct impact on the level of performance of managers than most of the other contextual factors. Hence, it is the responsibility of the HR department to initiate activities customised to the collective aspirations of the members of the respective organisations so as to ensure positive job attitudes. HR departments should advise and convince top management to provide resource support and endorsement for such initiatives.
Abstract:
In the first part of this paper we show a similarity between the principle of Structural Risk Minimization (SRM) (Vapnik, 1982) and the idea of Sparse Approximation, as defined by Chen, Donoho and Saunders (1995) and Olshausen and Field (1996). Then we focus on two specific (approximate) implementations of SRM and Sparse Approximation which have been used to solve the problem of function approximation. For SRM we consider the Support Vector Machine technique proposed by V. Vapnik and his team at AT&T Bell Labs, and for Sparse Approximation we consider a modification of the Basis Pursuit De-Noising algorithm proposed by Chen, Donoho and Saunders (1995). We show that, under certain conditions, these two techniques are equivalent: they give the same solution and they require the solution of the same quadratic programming problem.
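For orientation, and in our own notation rather than necessarily the authors', the two objective functions usually associated with these techniques are written below; the abstract's claim is that, under certain conditions, minimizing either one reduces to the same quadratic program.

```latex
% Basis Pursuit De-Noising (Chen, Donoho and Saunders, 1995): least-squares data fit
% plus an L1 penalty that enforces sparsity of the coefficients a_i.
\[
  \min_{a}\; \frac{1}{2}\,\Big\| y - \sum_{i} a_i \varphi_i(x) \Big\|_{2}^{2} \;+\; \lambda\,\| a \|_{1}
\]
% Support Vector Machine regression with the epsilon-insensitive loss (Vapnik):
% empirical risk plus a smoothness (norm) penalty on the regression function f.
\[
  \min_{f}\; C \sum_{i} \big| y_i - f(x_i) \big|_{\varepsilon} \;+\; \frac{1}{2}\,\| f \|_{K}^{2}
\]
```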
Abstract:
We take stock of the present position of compositional data analysis, of what has been achieved in the last 20 years, and then make suggestions as to what may be sensible avenues of future research. We take an uncompromisingly applied mathematical view: that the challenge of solving practical problems should motivate our theoretical research, and that any new theory should be thoroughly investigated to see whether it may provide answers to previously abandoned practical considerations. Indeed, a main theme of this lecture will be to demonstrate this applied mathematical approach through a number of challenging examples.
Abstract:
In any discipline where uncertainty and variability are present, it is important to have principles which are accepted as inviolate and which should therefore drive statistical modelling, statistical analysis of data and any inferences from such an analysis. Despite the fact that two such principles have existed for the last two decades, and that from these a sensible, meaningful methodology has been developed for the statistical analysis of compositional data, the application of inappropriate and/or meaningless methods persists in many areas of application. This paper identifies at least ten common fallacies and confusions in compositional data analysis with illustrative examples, and provides readers with necessary, and hopefully sufficient, arguments to persuade the culprits why and how they should amend their ways.
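The "sensible, meaningful methodology" referred to here is generally understood to be the log-ratio approach to compositional data. As a minimal sketch only (function name and illustrative composition are ours), the centred log-ratio transform that carries a composition into ordinary real space, where standard multivariate methods apply, can be written as:

```python
import numpy as np

def clr(x):
    """Centred log-ratio transform of a composition x (parts summing to a constant)."""
    x = np.asarray(x, dtype=float)
    g = np.exp(np.mean(np.log(x)))   # geometric mean of the parts
    return np.log(x / g)             # clr-coordinates live in unconstrained real space

# Illustrative 3-part composition (e.g. proportions of three components)
comp = np.array([0.60, 0.25, 0.15])
print(clr(comp))
```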
Abstract:
Abstract based on the author's.
Abstract:
The high level of realism and interaction in many computer graphics applications requires techniques for processing complex geometric models. First, we present a method that provides an accurate low-resolution approximation of a multi-chart textured model and guarantees geometric fidelity and correct preservation of the appearance attributes. Then, we introduce a mesh structure called Compact Model that approximates dense triangular meshes while preserving sharp features, allowing adaptive reconstructions and supporting textured models. Next, we design a new space deformation technique called *Cages, based on a multi-level system of cages, that preserves the smoothness of the mesh between neighbouring cages and is extremely versatile, allowing the use of heterogeneous sets of coordinates and different levels of deformation. Finally, we propose a hybrid method that allows any deformation technique to be applied to large models, obtaining high-quality results with a reduced memory footprint and high performance.
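The *Cages technique itself is not specified in the abstract; the following is only a generic sketch of the cage-based deformation idea it builds on, in which each model vertex is reconstructed from precomputed cage coordinates. All names and numbers below are illustrative, not taken from the thesis.

```python
import numpy as np

def deform_with_cage(weights, deformed_cage):
    """
    Generic cage-based deformation step: each model vertex is a weighted
    combination of the (deformed) cage vertices.

    weights       : (n_vertices, n_cage_vertices) generalized barycentric coordinates,
                    precomputed against the rest-pose cage
    deformed_cage : (n_cage_vertices, 3) cage vertex positions after the user edits the cage
    """
    return weights @ deformed_cage   # (n_vertices, 3) deformed model vertices

# Toy example: 2 model vertices expressed against a 4-vertex cage
w = np.array([[0.25, 0.25, 0.25, 0.25],
              [0.70, 0.10, 0.10, 0.10]])
cage = np.array([[0., 0., 0.], [1., 0., 0.], [1., 1., 0.], [0., 1., 0.]])
cage_moved = cage + np.array([0., 0., 0.5])   # translate the whole cage upwards
print(deform_with_cage(w, cage_moved))
```

A multi-level system of cages, as described above, layers several such mappings so that coarse cages drive broad edits while finer cages refine local detail.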
Abstract:
Interactions between electrons determine the structure and properties of matter from molecules to solids. Therefore, an understanding of the electronic structure of molecules enables us to extract relevant chemical information. In the first part of this thesis, we focus our attention on the analysis of chemical bonding by means of the Electron Localization Function (ELF) and the Domain-Averaged Fermi Hole (DAFH) analysis. In the second part, we assess the performance of some indicators of aromaticity by analyzing their advantages and drawbacks. We propose a series of tests based on well-known aromaticity trends that can be applied to evaluate current and future indicators of aromaticity in both organic and inorganic species. Moreover, we investigate the nature of electron delocalization in both aromatic and antiaromatic systems in the light of Hückel's (4n + 2) rule. Finally, we analyze the phenomenon of multiple aromaticity in all-metal clusters.
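As a small aside on the rule mentioned above (a toy check, not the thesis's methodology), Hückel's (4n + 2) criterion for a planar, cyclic, fully conjugated system can be stated mechanically:

```python
def satisfies_huckel_rule(pi_electrons):
    """Hückel's (4n + 2) rule: 4n + 2 pi electrons indicate aromaticity."""
    return pi_electrons >= 2 and (pi_electrons - 2) % 4 == 0

print(satisfies_huckel_rule(6))   # benzene, 6 pi electrons: True
print(satisfies_huckel_rule(4))   # cyclobutadiene, 4 pi electrons (4n): False, antiaromatic
```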
Abstract:
The interannual variability of the hydrological cycle is diagnosed from the Hadley Centre and Geophysical Fluid Dynamics Laboratory (GFDL) climate models, both of which are forced by observed sea surface temperatures. The models produce a similar sensitivity of clear-sky outgoing longwave radiation to surface temperature of ∼2 W m−2 K−1, indicating a consistent and positive clear-sky radiative feedback. However, differences between changes in the temperature lapse-rate and the height dependence of moisture fluctuations suggest that contrasting mechanisms bring about this result. The GFDL model appears to give a weaker water vapor feedback (i.e., changes in specific humidity). This is counteracted by a smaller upper tropospheric temperature response to surface warming, which implies a compensating positive lapse-rate feedback.
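The quoted sensitivity (~2 W m−2 K−1) is a slope-type diagnostic. A minimal sketch of how such a slope could be estimated from interannual anomaly time series follows; the function name and the synthetic data are ours and purely illustrative.

```python
import numpy as np

def clearsky_olr_sensitivity(ts_anomaly, olr_anomaly):
    """Least-squares slope of clear-sky OLR anomalies against surface-temperature anomalies (W m-2 K-1)."""
    slope, _intercept = np.polyfit(ts_anomaly, olr_anomaly, 1)
    return slope

# Synthetic illustration only: a 2 W m-2 K-1 relationship plus noise
rng = np.random.default_rng(0)
ts = rng.normal(0.0, 0.3, size=20)                  # interannual surface-temperature anomalies (K)
olr = 2.0 * ts + rng.normal(0.0, 0.2, size=20)      # clear-sky OLR anomalies (W m-2)
print(clearsky_olr_sensitivity(ts, olr))
```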
Abstract:
Excavations on the multi-period settlement at Old Scatness, Shetland have uncovered a number of Iron Age structures with compacted, floor-like layers. Thin section analysis was undertaken in order to investigate and compare the characteristics of these layers. The investigation also draws on earlier analyses of the Iron Age agricultural soil around the settlement and the midden deposits that accumulated within the settlement, to create a 'joined-up' analysis which considers the way material from the settlement was used and then recycled as fertiliser for the fields. Peat was collected from the nearby uplands and was used for fuel and possibly also for flooring. It is suggested that organic-rich floors from the structures were periodically removed and the material was spread onto the fields as fertilisers. More organic-rich material may have been used selectively for fertiliser, while the less organic peat ash was allowed to accumulate in middens. Several of the structures may have functioned as byres, which suggests a prehistoric plaggen system.
Abstract:
The elucidation of spatial variation in the landscape can indicate potential wildlife habitats or breeding sites for vectors, such as ticks or mosquitoes, which cause a range of diseases. Information from remotely sensed data could aid the delineation of vegetation distribution on the ground in areas where local knowledge is limited. The data from digital images are often difficult to interpret because of pixel-to-pixel variation, that is, noise, and complex variation at more than one spatial scale. Landsat Enhanced Thematic Mapper Plus (ETM+) and Satellite Pour l'Observation de la Terre (SPOT) image data were analyzed for an area close to Douna in Mali, West Africa. The variograms of the normalized difference vegetation index (NDVI) from both types of image data were nested. The parameters of the nested variogram function from the Landsat ETM+ data were used to design the sampling for a ground survey of soil and vegetation data. Variograms of the soil and vegetation data showed that their variation was anisotropic and that their scales of variation were similar to those of NDVI from the SPOT data. The short- and long-range components of variation in the SPOT data were filtered out separately by factorial kriging. The map of the short-range component appears to represent the patterns of vegetation and the associated shallow slopes and drainage channels of the tiger bush system. The map of the long-range component appears to relate to broader patterns in the tiger bush and to gentle undulations in the topography. The results suggest that the types of image data analyzed in this study could be used to identify areas with more moisture in semiarid regions that could support wildlife and could also be potential vector breeding sites.
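Two of the quantities used in the study are straightforward to compute. As a minimal sketch (the band values, sample coordinates, lag choices and function names below are ours, for illustration only), the NDVI and a classical empirical semivariogram could be obtained as follows; fitting a nested model to such a variogram and applying factorial kriging would be the subsequent steps described in the abstract.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index from near-infrared and red reflectance."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + 1e-12)   # small constant avoids division by zero

def empirical_semivariogram(values, coords, lags, tol):
    """Classical (Matheron) semivariance estimate at each lag distance."""
    d = np.sqrt(((coords[:, None, :] - coords[None, :, :]) ** 2).sum(axis=-1))
    half_sq_diff = 0.5 * (values[:, None] - values[None, :]) ** 2
    return np.array([half_sq_diff[(np.abs(d - h) <= tol) & (d > 0)].mean() for h in lags])

# Toy usage on a handful of sample points
coords = np.array([[0, 0], [1, 0], [0, 1], [2, 2], [3, 1]], dtype=float)
z = ndvi(np.array([0.50, 0.55, 0.60, 0.40, 0.45]), np.array([0.20, 0.25, 0.20, 0.30, 0.28]))
print(empirical_semivariogram(z, coords, lags=[1.0, 2.0, 3.0], tol=0.5))
```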