904 results for Policy Design, Analysis, and Evaluation


Relevance:

100.00%

Publisher:

Abstract:

To ensure the quality of machined products at minimum machining cost and maximum machining effectiveness, it is very important to select optimum parameters when metal-cutting machine tools are employed. Traditionally, the experience of the operator plays a major role in the selection of optimum metal-cutting conditions; however, attaining optimum values every time is difficult even for a skilled operator. The non-linear nature of the machining process has compelled engineers to search for more effective optimization methods. The design objective preceding most engineering design activities is simply to minimize the cost of production or to maximize production efficiency. The main aim of the research work reported here is to build robust optimization algorithms by exploiting ideas that nature has to offer and using them to solve real-world optimization problems in manufacturing processes. In this thesis, after an exhaustive literature review, several optimization techniques used in various manufacturing processes were identified. The selection of optimal cutting parameters, such as depth of cut, feed and speed, is a very important issue for every machining process. Experiments were designed using the Taguchi technique, and dry turning of SS420 was performed on a Kirloskar Turnmaster 35 lathe. S/N and ANOVA analyses were performed to find the optimum level and the percentage contribution of each parameter, and the optimum machining parameters were obtained from the experiments by S/N analysis. Optimization algorithms begin with one or more design solutions supplied by the user and then iteratively examine new design solutions in the relevant search space in order to reach the true optimum.
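The Taguchi smaller-the-better S/N ratio used in that analysis can be sketched as follows; the roughness readings are hypothetical values for illustration, not measurements from the thesis.

```python
import math

def sn_smaller_is_better(values):
    """Taguchi signal-to-noise ratio for a smaller-the-better quality
    characteristic such as surface roughness:
    S/N = -10 * log10(mean(y^2)). Higher S/N is better."""
    mean_sq = sum(y * y for y in values) / len(values)
    return -10.0 * math.log10(mean_sq)

# Hypothetical surface-roughness replicates (um) at two parameter levels:
level_1 = [1.8, 2.0, 1.9]
level_2 = [1.2, 1.3, 1.25]

# The parameter level with the higher S/N ratio is preferred.
print(sn_smaller_is_better(level_1))
print(sn_smaller_is_better(level_2))
```

In a full Taguchi study this ratio is computed for every row of the orthogonal array, and the level averages then identify the optimum setting of each factor.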
A mathematical model for surface roughness was developed using response surface analysis, and the model was validated against published results from the literature. Optimization methodologies such as Simulated Annealing (SA), Particle Swarm Optimization (PSO), the Conventional Genetic Algorithm (CGA) and an Improved Genetic Algorithm (IGA) were applied to optimize machining parameters for dry turning of SS420. All of the above algorithms were tested for efficiency, robustness and accuracy, and it was observed how often they outperform conventional optimization methods on difficult real-world problems. The SA, PSO, CGA and IGA codes were developed in MATLAB. For each evolutionary algorithm, optimum cutting conditions are provided to achieve a better surface finish. The computational results using SA clearly demonstrated that the proposed solution procedure is capable of solving such complicated problems effectively and efficiently. Particle Swarm Optimization is a relatively recent heuristic search method whose mechanics are inspired by the swarming, collaborative behaviour of biological populations; from the results it was observed that PSO provides better results and is also more computationally efficient. Based on the results obtained using CGA and IGA for the optimization of the machining process, the proposed IGA provides better results than the conventional GA. The improved genetic algorithm, incorporating a stochastic crossover technique and an artificial initial-population scheme, was developed to provide a faster search mechanism. Finally, a comparison among these algorithms was made for the specific example of dry turning of SS420, arriving at optimum machining parameters of feed, cutting speed, depth of cut and tool nose radius with minimum surface roughness as the criterion.
To summarize, the research work fills conspicuous gaps between research prototypes and industry requirements by simulating the evolutionary procedures nature uses to optimize its own systems.
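As an illustration of the kind of evolutionary optimizer described above, here is a minimal PSO sketch in Python (the thesis codes were written in MATLAB). The response-surface roughness model, its coefficients and the parameter bounds are assumptions for demonstration only, not fitted values from the thesis.

```python
import random

# Hypothetical response-surface model of surface roughness Ra as a
# function of cutting speed v (m/min), feed f (mm/rev) and depth of
# cut d (mm). Coefficients are illustrative only.
def roughness(v, f, d):
    return 2.5 - 0.01 * v + 8.0 * f + 0.4 * d + 0.00002 * v * v

BOUNDS = [(50.0, 200.0), (0.05, 0.4), (0.5, 2.0)]  # v, f, d

def pso(n_particles=20, n_iters=100, w=0.7, c1=1.5, c2=1.5, seed=1):
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for lo, hi in BOUNDS] for _ in range(n_particles)]
    vel = [[0.0] * 3 for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                     # personal bests
    pbest_val = [roughness(*p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]    # global best
    for _ in range(n_iters):
        for i in range(n_particles):
            for k in range(3):
                r1, r2 = rng.random(), rng.random()
                vel[i][k] = (w * vel[i][k]
                             + c1 * r1 * (pbest[i][k] - pos[i][k])
                             + c2 * r2 * (gbest[k] - pos[i][k]))
                lo, hi = BOUNDS[k]
                pos[i][k] = min(max(pos[i][k] + vel[i][k], lo), hi)
            val = roughness(*pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

(v, f, d), best = pso()
print(f"optimum: v={v:.1f} m/min, f={f:.3f} mm/rev, d={d:.2f} mm, Ra={best:.3f}")
```

For this assumed model the swarm is driven towards low feed and shallow cuts, which is the qualitative behaviour one expects when minimum surface roughness is the criterion.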

Relevance:

100.00%

Publisher:

Abstract:

Identification and control of non-linear dynamical systems are challenging problems for control engineers. The topic is equally relevant in communication, weather prediction, biomedical systems and even social systems, where nonlinearity is an integral part of system behaviour. Most real-world systems are nonlinear in nature, and nonlinear system identification/modelling has wide applications. The basic approach to analyzing a nonlinear system is to build a model from its known behaviour, manifest in the form of the system output. The modelling problem boils down to computing a suitably parameterized model representing the process; the parameters of the model are adjusted to optimize a performance function based on the error between the given process output and the identified model output. While linear system identification is well established, with many classical approaches, most of those methods cannot be directly applied to nonlinear system identification. The problem becomes more complex if the system is completely unknown and only the output time series is available; the blind recognition problem is the direct consequence of such a situation, and this thesis concentrates on such problems. The capability of artificial neural networks to approximate many nonlinear input-output maps makes them predominantly suitable for building a function for the identification of nonlinear systems where only the time series is available. The literature is rich with a variety of algorithms to train the neural network model. A comprehensive study of the computation of the model parameters using the different algorithms, and a comparison among them to choose the best technique, is still in demand from practical system designers and is not available in a concise form in the literature. The thesis is thus an attempt to develop and evaluate some of the well-known algorithms and to propose some new techniques in the context of blind recognition of nonlinear systems. It also attempts to establish the relative merits and demerits of the different approaches; comprehensiveness is achieved by utilizing well-known evaluation techniques from statistics. The study concludes by providing the results of implementing the currently available, modified and newly introduced techniques for nonlinear blind system modelling, followed by a comparison of their performance. It is expected that such a comprehensive study and comparison will be of great relevance in many fields, including chemical, electrical, biological, financial and weather data analysis. Further, the results reported will be of immense help to practical system designers and analysts in selecting the most appropriate method, based on the goodness of the model, for a particular context.
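A minimal sketch of the identification setting described above: a hypothetical nonlinear plant generates an input-output time series, and a small one-hidden-layer network is trained by stochastic gradient descent to reproduce the mapping. The plant, network size and learning rate are illustrative assumptions, not the thesis's models or algorithms.

```python
import math
import random

rng = random.Random(0)

# Hypothetical unknown nonlinear plant, used only to generate the
# input-output time series; the identifier sees the data, not the model.
def plant(y_prev, u):
    return 0.5 * y_prev + u ** 3

us = [rng.uniform(-1.0, 1.0) for _ in range(400)]
ys = [0.0]
for u in us:
    ys.append(plant(ys[-1], u))

# One-hidden-layer network y_hat = W2 . tanh(W1 . x + b1) + b2, trained
# by stochastic gradient descent on the squared prediction error.
H = 8
W1 = [[rng.uniform(-0.5, 0.5), rng.uniform(-0.5, 0.5)] for _ in range(H)]
b1 = [0.0] * H
W2 = [rng.uniform(-0.5, 0.5) for _ in range(H)]
b2 = 0.0

def forward(x):
    h = [math.tanh(W1[j][0] * x[0] + W1[j][1] * x[1] + b1[j]) for j in range(H)]
    return sum(W2[j] * h[j] for j in range(H)) + b2, h

def mse():
    return sum((forward((ys[k], us[k]))[0] - ys[k + 1]) ** 2
               for k in range(len(us))) / len(us)

before = mse()
lr = 0.02
for _ in range(50):            # epochs
    for k in range(len(us)):
        x, target = (ys[k], us[k]), ys[k + 1]
        out, h = forward(x)
        e = out - target       # prediction error
        for j in range(H):     # backpropagate through the hidden layer
            dh = e * W2[j] * (1.0 - h[j] ** 2)
            W2[j] -= lr * e * h[j]
            W1[j][0] -= lr * dh * x[0]
            W1[j][1] -= lr * dh * x[1]
            b1[j] -= lr * dh
        b2 -= lr * e
after = mse()
print(f"MSE before training: {before:.4f}, after: {after:.4f}")
```

This is the series-parallel configuration (the measured output feeds the predictor); the thesis's comparison of training algorithms would replace the plain SGD loop here with each candidate method.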

Relevance:

100.00%

Publisher:

Abstract:

This thesis is entitled 'Physicochemical and molecular characterization of bacteriophages ΦSP-1 and ΦSP-3, specific for pathogenic Salmonella, and evaluation of their potential as biocontrol agents'. Salmonella were screened using standard methodologies from various environmental samples, including chicken caecum. Salmonella strains previously isolated and stocked in the lab were also included in this study as hosts for screening Salmonella-specific lytic phages. The Salmonella strain designated S49, which supported phage propagation by acting as the host bacterium, was identified as Salmonella enterica subsp. enterica by 16S rRNA gene analysis and serotyping. A total of three Salmonella-specific phages, named ΦSP-1, ΦSP-2 and ΦSP-3, were isolated from chicken intestine samples via an enrichment protocol employing the double agar overlay method. ΦSP-1 and ΦSP-3, which showed a consistent lytic nature, were selected for further study and were purified by repeated plating after picking single isolated plaques from lawns of Salmonella S49. Both phages produced small, clear plaques, indicating their lytic nature. ΦSP-1 and ΦSP-3 were concentrated by the PEG-NaCl precipitation method before further characterization. The focus of the present study was to isolate, characterize and verify the efficacy of lytic bacteriophages against the robust pathogen Salmonella, which is capable of surviving under various hostile conditions. The two phages, ΦSP-1 and ΦSP-3, belong to two families, Podoviridae and Siphoviridae.

Relevance:

100.00%

Publisher:

Abstract:

The chemical composition of Indian squid (Loligo duvauceli) mantle, epidermal connective tissue and tentacle is investigated and evaluated in the present study. Squid mantle was observed to contain 22.2% total protein, of which 63.5% is myofibrillar protein; a unique property of squid myofibrillar protein is its water solubility. Squid mantle contains 12.0% total collagen, while epidermal connective tissue has the highest total collagen content (17.8%). SDS-PAGE of total collagen identified high-molecular-weight α-, β- and γ-sub-chains. Amino acid profile analysis indicates that mantle and tentacle contain essential amino acids. Arginine forms a major portion of mantle collagen (272.5 g/100 g N); isoleucine, glutamic acid and lysine are other amino acids found in significantly high amounts in the mantle, while sulphur-containing cystine is deficient in mantle collagen. The papain digests of mantle and epidermal connective tissue are rich in uronic acid, while the papain, collagenase and urea digests of epidermal connective tissue have significant amounts of sialic acid (25.2, 33.2 and 99.8 μmol/100 g, respectively). PAS staining of the papain, collagenase and urea digests also identifies the association of hexoses with low-molecular-weight collagen fragments. Histochemical sectioning emphasized the localized distribution of collagen in the epidermal and dermal regions, with only very sparse fibres traversing the myotome bundles.

Relevance:

100.00%

Publisher:

Abstract:

Land use is a crucial link between human activities and the natural environment and one of the main driving forces of global environmental change. Large parts of the terrestrial land surface are used for agriculture, forestry, settlements and infrastructure. Given the importance of land use, it is essential to understand the multitude of influential factors and the resulting land-use patterns. An essential methodology for studying and quantifying such interactions is the use of land-use models, through which it is possible to analyze the complex structure of linkages and feedbacks and to determine the relevance of the driving forces. Modeling land use and land-use change has a long tradition; in particular on the regional scale, a variety of models for different regions and research questions has been created. Modeling capabilities grow with steady advances in computer technology, driven on the one hand by increasing computing power and on the other by new methods in software development, e.g. object- and component-oriented architectures. In this thesis, SITE (Simulation of Terrestrial Environments), a novel framework for integrated regional land-use modeling, is introduced and discussed. Particular features of SITE are its notably extended capability to integrate models and its strict separation of application and implementation, which enable efficient development, testing and use of integrated land-use models. On the system side, SITE provides generic data structures (grids, grid cells, attributes, etc.) and takes over responsibility for their administration. By means of a scripting language (Python), extended with language features specific to land-use modeling, these data structures can be utilized and manipulated by modeling applications; the scripting-language interpreter is embedded in SITE.
The integration of sub-models can be achieved via the scripting language or through a generic interface provided by SITE. Furthermore, functionalities important for land-use modeling, such as model calibration, model tests and analysis support for simulation results, have been integrated into the generic framework. During the implementation of SITE, specific emphasis was laid on expandability, maintainability and usability. Along with the modeling framework, a land-use model for analyzing the stability of tropical rainforest margins was developed in the context of the collaborative research project STORMA (SFB 552). In a research area in Central Sulawesi, Indonesia, the socio-environmental impacts of land-use changes were examined. SITE was used to simulate land-use dynamics over the historical period 1981 to 2002; analogously, a scenario that did not consider migration in the population dynamics was analyzed. For the calculation of crop yields and trace-gas emissions, the DAYCENT agro-ecosystem model was integrated. This case study showed that land-use changes in the Indonesian research area were mainly characterized by the expansion of agricultural areas at the expense of natural forest. For this reason, the situation had to be interpreted as unsustainable, even though increased agricultural use implied economic improvements and higher farmers' incomes. Due to the importance of model calibration, it was explicitly addressed in the SITE architecture through a dedicated component. The calibration functionality can be used by all SITE applications and enables largely automated model calibration. Calibration in SITE is understood as a process that finds an optimal, or at least adequate, solution for a set of arbitrarily selectable model parameters with respect to an objective function. In SITE, an objective function is typically a map-comparison algorithm capable of comparing a simulation result to a reference map.
Several map-optimization and map-comparison methodologies are available and can be combined. The STORMA land-use model was calibrated using a genetic algorithm for optimization and the figure-of-merit map-comparison measure as the objective function. The calibration period ranged from 1981 to 2002, for which reference land-use maps were compiled. It could be shown that efficient automated model calibration with SITE is possible; nevertheless, the selection of the calibration parameters required detailed knowledge of the underlying land-use model and cannot be automated. In another case study, decreases in crop yields and the resulting losses in income from coffee cultivation were analyzed and quantified under four different deforestation scenarios. For this task, an empirical model describing the dependence of bee pollination, and the resulting coffee fruit set, on the distance to the closest natural forest was integrated. Land-use simulations showed that, depending on the magnitude and location of ongoing forest conversion, pollination services are expected to decline continuously, resulting in a reduction of coffee yields of up to 18% and a loss of net revenue per hectare of up to 14%. However, the study also showed that ecological and economic value can be preserved if patches of natural vegetation are conserved in the agricultural landscape.
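The figure-of-merit map-comparison measure used as the calibration objective can be sketched as follows for binary change/no-change grids; the example maps are illustrative, not STORMA data.

```python
# Figure of merit for comparing a simulated land-use change map with a
# reference map, here for binary change/no-change grids:
#   FoM = hits / (hits + misses + false alarms),
# ranging from 0 (no overlap) to 1 (perfect agreement).
def figure_of_merit(observed_change, simulated_change):
    hits = misses = false_alarms = 0
    for obs_row, sim_row in zip(observed_change, simulated_change):
        for obs, sim in zip(obs_row, sim_row):
            if obs and sim:
                hits += 1            # change observed and simulated
            elif obs and not sim:
                misses += 1          # change observed, not simulated
            elif sim and not obs:
                false_alarms += 1    # change simulated, not observed
    denom = hits + misses + false_alarms
    return hits / denom if denom else 1.0

observed = [[0, 1, 1],
            [0, 0, 1],
            [0, 0, 0]]
simulated = [[0, 1, 0],
             [0, 1, 1],
             [0, 0, 0]]
print(figure_of_merit(observed, simulated))  # 2 hits, 1 miss, 1 false alarm -> 0.5
```

In a calibration loop, a genetic algorithm would propose parameter sets, run the land-use model, and score each simulated map against the reference map with this function.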

Relevance:

100.00%

Publisher:

Abstract:

I present a novel design methodology for the synthesis of automatic controllers, together with a computational environment---the Control Engineer's Workbench---integrating a suite of programs that automatically analyze and design controllers for high-performance, global control of nonlinear systems. This work demonstrates that difficult control synthesis tasks can be automated, using programs that actively exploit and efficiently represent knowledge of nonlinear dynamics and phase space and effectively use the representation to guide and perform the control design. The Control Engineer's Workbench combines powerful numerical and symbolic computations with artificial intelligence reasoning techniques. As a demonstration, the Workbench automatically designed a high-quality maglev controller that outperforms a previous linear design by a factor of 20.

Relevance:

100.00%

Publisher:

Abstract:

This paper explores the concept of Value Stream Analysis and Mapping (VSA/M) as applied to Product Development (PD) efforts. Value Stream Analysis and Mapping is a method of business process improvement whose application began in the manufacturing community; PD efforts provide a different setting for its use. Site visits were made to nine major U.S. aerospace organizations. Interviews, discussions and participatory events were used to gather data on (1) the sophistication of the tools used in PD process-improvement efforts, (2) the lean context of the use of the tools and (3) the success of the efforts. All three factors were found to be strongly correlated, suggesting that success depends on both good tools and a lean context. Finally, a general VSA/M method for PD activities is proposed; the method uses modified process-mapping tools to analyze and improve processes.

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a tool for the analysis and regeneration of Web content, implemented with XML and Java. At present, Web content is delivered from server to clients without taking the clients' characteristics into account. Heterogeneous and diverse characteristics, such as user preferences, the differing capacities of client devices, different types of access, the state of the network and the current load on the server, directly affect the behavior of Web services. Moreover, the growing use of multimedia objects in the design of Web content ignores this diversity and heterogeneity, which further hinders appropriate content delivery. The objective of the presented tool is therefore to process Web pages with this heterogeneity in mind, adapting content in order to improve performance on the Web.
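The kind of capability-aware adaptation described above can be sketched as follows; the rendition catalogue, client profiles and selection thresholds are illustrative assumptions, not the paper's actual tool.

```python
# A minimal sketch of capability-driven content adaptation: the server
# keeps several renditions of each multimedia object and picks the one
# that fits the client's device and network. All names and numbers here
# are hypothetical.
RENDITIONS = {
    "hero-image": [
        {"format": "image/png",  "width": 1600, "bytes": 900_000},
        {"format": "image/jpeg", "width": 800,  "bytes": 120_000},
        {"format": "image/jpeg", "width": 320,  "bytes": 25_000},
    ]
}

def adapt(object_id, client):
    """Return the widest rendition that fits the client's screen and
    whose size respects a simple bandwidth/latency budget."""
    budget = client["bandwidth_bps"] * client["max_wait_s"] / 8  # bytes
    candidates = [r for r in RENDITIONS[object_id]
                  if r["width"] <= client["screen_width"] and r["bytes"] <= budget]
    if not candidates:  # fall back to the smallest rendition
        return min(RENDITIONS[object_id], key=lambda r: r["bytes"])
    return max(candidates, key=lambda r: r["width"])

phone = {"screen_width": 400, "bandwidth_bps": 100_000, "max_wait_s": 4}
desktop = {"screen_width": 1920, "bandwidth_bps": 10_000_000, "max_wait_s": 4}
print(adapt("hero-image", phone)["width"])    # 320
print(adapt("hero-image", desktop)["width"])  # 1600
```

A production version would derive the client profile from request headers or device databases rather than hard-coded dictionaries, and would transform XML-described pages rather than single objects.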

Relevance:

100.00%

Publisher:

Abstract:

Identifying and diagnosing back injuries during the transition from the Taylor model to the flexible model of production organization demands a parallel intervention of prevention actors at work. This study simultaneously uses three intervention models (structured action analysis, musculoskeletal symptom questionnaires and musculoskeletal assessment) for work activities in a packaging plant. Seventy-two (72) operative workers participated in the study (28 of them with musculoskeletal evaluation). Over an intervention period of 10 months, the physical, cognitive and organizational components and the dynamics of the production process were evaluated from the standpoint of musculoskeletal demands. The differences established between objective risk exposure, perceived back-injury risk, appraisal and a vertebral-spine evaluation, before and after the intervention, determine the structure of a musculoskeletal risk-management system. This study shows that back-injury symptoms can be reduced more efficiently among operative workers by combining the recorded measures with the adjustment between work dynamics, changes at work and the development of efficient gestures. Relevance: the results of this study can be used to prevent back injuries in workers in flexible production processes.

Relevance:

100.00%

Publisher:

Abstract:

The objective of this paper is to introduce a different approach, called the ecological-longitudinal, to carrying out pooled analysis in ecological time-series studies. Because it yields a larger number of data points and hence increases the statistical power of the analysis, this approach, unlike conventional ones, accommodates features such as random-effects models, lags, and interactions between pollutants and between pollutants and meteorological variables, which are hard to implement in conventional approaches. Design: the approach is illustrated by providing quantitative estimates of the short-term effects of air pollution on mortality in three Spanish cities, Barcelona, Valencia and Vigo, for the period 1992-1994. Because the dependent variable was a count, a Poisson generalised linear model was first specified. Several modelling issues are worth mentioning. Firstly, because the relations between mortality and the explanatory variables were nonlinear, cubic splines were used for covariate control, leading to a generalised additive model (GAM). Secondly, the effects of the predictors on the response were allowed to occur with some lag. Thirdly, the residual autocorrelation arising from imperfect control was controlled for by means of an autoregressive Poisson GAM. Finally, the longitudinal design demanded consideration of individual heterogeneity, requiring mixed models. Main results: the estimates of the relative risks obtained from the individual analyses varied across cities, particularly those associated with sulphur dioxide; the highest relative risks corresponded to black smoke in Valencia. These estimates were higher than those obtained from the ecological-longitudinal analysis.
The relative risks estimated from the latter analysis were practically identical across cities: 1.00638 (95% confidence interval 1.0002, 1.0011) for a black-smoke increase of 10 μg/m3 and 1.00415 (95% CI 1.0001, 1.0007) for an increase of 10 μg/m3 of sulphur dioxide. Because the statistical power is higher than in the individual analyses, more interactions were statistically significant, especially those among air pollutants and meteorological variables. Conclusions: air-pollutant levels were related to mortality in the three cities of the study, Barcelona, Valencia and Vigo. These results are consistent with similar studies in other cities and with other multicentric studies, and coherent both with the previous individual analyses for each city and with the multicentric analyses for all three cities.
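Relative risks of this kind come from exponentiating a Poisson log-linear coefficient scaled to the 10 μg/m3 increment; a minimal sketch, using an assumed coefficient and standard error rather than the study's fitted values:

```python
import math

def relative_risk(beta, se, delta=10.0, z=1.96):
    """Relative risk for a `delta`-unit pollutant increase, from a
    Poisson log-linear coefficient `beta` (per unit of pollutant) with
    standard error `se`, using a normal-approximation 95% CI:
    RR = exp(beta * delta), CI = exp((beta +/- z*se) * delta)."""
    rr = math.exp(beta * delta)
    lo = math.exp((beta - z * se) * delta)
    hi = math.exp((beta + z * se) * delta)
    return rr, lo, hi

# Assumed values for illustration only (per 1 ug/m3 of black smoke):
rr, lo, hi = relative_risk(beta=0.0006, se=0.0002)
print(f"RR per 10 ug/m3: {rr:.4f} (95% CI {lo:.4f}, {hi:.4f})")
```

The mixed-model and spline machinery of the actual analysis changes how `beta` and `se` are estimated, but the translation from coefficient to relative risk is the same.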

Relevance:

100.00%

Publisher:

Abstract:

This paper compares two language evaluation tests: Developmental Sentence Analysis and the CID Grammatical Analysis of Elicited Language: Simple Sentence Level.

Relevance:

100.00%

Publisher:

Abstract:

Conservation of crop wild relatives (CWRs) is a complex interdisciplinary process that is being addressed by various national and international initiatives, including two Global Environment Facility projects ('In situ Conservation of Crop Wild Relatives through Enhanced Information Management and Field Application' and 'Design, Testing and Evaluation of Best Practices for in situ Conservation of Economically Important Wild Species'), the European Community-funded project 'European Crop Wild Relative Diversity Assessment and Conservation Forum (PGR Forum)' and the European 'In situ and On Farm Network'. The key issues that have arisen are: (1) the definition of what constitutes a CWR; (2) the need for national and regional information systems and a global system; (3) the development and application of priority-determining mechanisms; (4) the incorporation of CWR conservation into existing national, regional and international PGR programmes; (5) assessment of the effectiveness of conservation actions; (6) awareness of the importance of CWRs in agricultural development at local, national and international levels, for both the scientific and lay communities; and (7) policy development and the legal framework. These issues are illustrated by work on the conservation of a group of legumes known as grasspeas, chicklings, vetchlings and horticultural ornamental peas (Lathyrus spp.) in their European and Mediterranean centre of diversity. (c) 2007 Published by Elsevier B.V.

Relevance:

100.00%

Publisher:

Abstract:

Novel 'tweezer-type' complexes that exploit the interactions between pi-electron-rich pyrenyl groups and pi-electron deficient diimide units have been designed and synthesised. The component molecules leading to complex formation were accessed readily from commercially available starting materials through short and efficient syntheses. Analysis of the resulting complexes, using the visible charge-transfer band, revealed association constants that increased sequentially from 130 to 11,000 M-1 as increasing numbers of pi-pi-stacking interactions were introduced into the systems. Computational modelling was used to analyse the structures of these complexes, revealing low-energy chain-folded conformations for both components, which readily allow close, multiple pi-pi-stacking and hydrogen bonding to be achieved. In this paper, we give details of our initial studies of these complexes and outline how their behaviour could provide a basis for designing self-healing polymer blends for use in adaptive coating systems. (C) 2008 Elsevier Ltd. All rights reserved.

Relevance:

100.00%

Publisher:

Abstract:

This paper summarizes the design, manufacturing, testing, and finite element analysis (FEA) of glass-fibre-reinforced polyester leaf springs for rail freight vehicles. FEA predictions of load-deflection curves under static loading are presented, together with comparisons with test results. Bending stress distribution at typical load conditions is plotted for the springs. The springs have been mounted on a real wagon and drop tests at tare and full load have been carried out on a purpose-built shaker rig. The transient response of the springs from tests and FEA is presented and discussed.

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a novel design of a virtual dental training system (hapTEL) using haptic technology. The system allows dental students to learn and practice procedures such as dental drilling, caries removal and cavity preparation for tooth restoration. This paper focuses on the hardware design, development and evaluation aspects in relation to the dental training and educational requirements. Detailed discussions on how the system offers dental students a natural operational position are documented. An innovative design of measuring and connecting the dental tools to the haptic device is also shown. Evaluation of the impact on teaching and learning is discussed.