820 results for Wheeled tool carrier


Relevance: 20.00%

Publisher:

Abstract:

STUDY DESIGN: Concurrent validity between postural indices obtained from digital photographs (two-dimensional [2D]), surface topography imaging (three-dimensional [3D]), and radiographs. OBJECTIVE: To assess the validity of a quantitative clinical postural assessment tool of the trunk based on photographs (2D) as compared to a surface topography system (3D) as well as indices calculated from radiographs. SUMMARY OF BACKGROUND DATA: To monitor progression of scoliosis or change in posture over time in young persons with idiopathic scoliosis (IS), noninvasive and nonionizing methods are recommended. In a clinical setting, posture can be quite easily assessed by calculating key postural indices from photographs. METHODS: Quantitative postural indices of 70 subjects aged 10 to 20 years old with IS (Cobb angle, 15°–60°) were measured from photographs and from 3D trunk surface images taken in the standing position. Shoulder, scapula, trunk list, pelvis, scoliosis, and waist angle indices were calculated with specially designed software. Frontal and sagittal Cobb angles and trunk list were also calculated on radiographs. The Pearson correlation coefficient (r) was used to estimate concurrent validity of the 2D clinical postural tool of the trunk with indices extracted from the 3D system and with those obtained from radiographs. RESULTS: The correlation between 2D and 3D indices was good to excellent for shoulder, pelvis, trunk list, and thoracic scoliosis (0.81 < r < 0.97; P < 0.01) but fair to moderate for thoracic kyphosis, lumbar lordosis, and thoracolumbar or lumbar scoliosis (0.30 < r < 0.56; P < 0.05). The correlation between 2D and radiographic spinal indices was fair to good (-0.33 to -0.80 with Cobb angles and 0.76 for trunk list; P < 0.05). CONCLUSION: This tool will facilitate clinical practice by monitoring trunk posture among persons with IS. Further, it may contribute to a reduction in the use of radiographs to monitor scoliosis progression.
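The validity analysis above rests on the Pearson product-moment correlation between paired 2D and 3D indices. A minimal sketch of that computation, using hypothetical shoulder-angle values (the numbers below are illustrative, not data from the study):

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical paired shoulder-angle indices (degrees) from the 2D
# photographs and the 3D surface topography system -- illustrative only.
angles_2d = [2.1, 3.4, 1.8, 4.0, 2.9, 3.1]
angles_3d = [2.3, 3.6, 1.5, 4.2, 3.0, 3.3]
r = pearson_r(angles_2d, angles_3d)
```

A value of r close to 1 for such pairs is what the study reports as "good to excellent" concurrent validity for the shoulder index.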

Abstract:

The optical and carrier transport properties of amorphous transparent zinc indium tin oxide (ZITO) (a-ZITO) thin films and the characteristics of the thin-film transistors (TFTs) were examined as a function of chemical composition. The as-deposited films were very conductive and showed clear free carrier absorption (FCA). The analysis of the FCA gave an effective mass value of 0.53 mₑ and a momentum relaxation time of 3.9 fs for an a-ZITO film with Zn:In:Sn = 0.35:0.35:0.3. TFTs with the as-deposited channels did not show current modulation due to the high carrier density in the channels. Thermal annealing at 300°C decreased the carrier density, and TFTs fabricated with the annealed channels operated with positive threshold voltages (VT) when Zn contents were 25 atom % or larger. VT shifted to larger negative values, and the subthreshold voltage swing increased with decreasing Zn content, while large on–off current ratios (10⁷–10⁸) were kept for all the Zn contents. The field effect mobilities ranged from 12.4 to 3.4 cm² V⁻¹ s⁻¹ for the TFTs with Zn contents varying from 5 to 48 atom %. The role of Zn content is also discussed in relation to the carrier transport properties and amorphous structures.
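The FCA-derived effective mass and relaxation time imply a carrier mobility through the Drude relation μ = eτ/m*. A back-of-the-envelope check with the reported values (this is my own sanity check, not part of the authors' analysis):

```python
# Drude estimate of carrier mobility from the FCA-derived parameters:
# mu = e * tau / m*
E_CHARGE = 1.602e-19       # elementary charge, C
M_ELECTRON = 9.109e-31     # free-electron mass, kg

m_eff = 0.53 * M_ELECTRON  # effective mass reported for Zn:In:Sn = 0.35:0.35:0.3
tau = 3.9e-15              # momentum relaxation time reported from FCA, s

mu_si = E_CHARGE * tau / m_eff  # mobility in m^2 V^-1 s^-1
mu_cm2 = mu_si * 1e4            # convert to cm^2 V^-1 s^-1
```

The estimate lands near 13 cm² V⁻¹ s⁻¹, close to the upper end of the field-effect mobilities quoted above, which is a reassuring consistency check between the optical and transistor measurements.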

Abstract:

Department of Instrumentation, Cochin University of Science and Technology

Abstract:

In this thesis, the applications of recurrence quantification analysis (RQA) to metal cutting on a lathe, with the specific objective of detecting tool wear and chatter, are presented. This study is based on the finding that the process dynamics in a lathe are low-dimensional chaotic, which implies that the machine dynamics can be controlled using principles of chaos theory. This understanding can revolutionize the feature extraction methodologies used in condition monitoring systems, since conventional linear methods and models are incapable of capturing the critical and strange behaviours associated with the metal cutting process. As sensor-based approaches provide an automated and cost-effective way to monitor and control, an efficient feature extraction methodology based on nonlinear time series analysis is in great demand. The task is more complex when the information has to be deduced solely from sensor signals, since traditional methods do not address how to treat the noise present in real-world processes or their non-stationarity. To overcome these two issues as far as possible, this thesis adopts the recurrence quantification analysis methodology, since this feature extraction technique is found to be robust against noise and non-stationarity in the signals. The work consists of two different sets of experiments on a lathe: set-1 and set-2. The set-1 experiments study the influence of tool wear on the RQA variables, whereas set-2 is carried out to identify the RQA variables sensitive to machine tool chatter, followed by validation in actual cutting. To obtain the bounds of the spectrum of the significant RQA variable values in set-1, a fresh tool and a worn tool are used for cutting. The first part of the set-2 experiments uses a stepped shaft in order to create chatter at a known location, and the second part uses a conical section with a uniform taper along the axis, so that chatter onsets at some distance from the smaller end as the depth of cut is gradually increased while the spindle speed and feed rate are kept constant. The study concludes by unambiguously revealing the dependence of certain RQA variables (percent determinism, percent recurrence, and entropy) on tool wear and chatter. The results establish this methodology as viable for the detection of tool wear and chatter in metal cutting on a lathe. The key reason is that the dynamics of the system under study are nonlinear, and recurrence quantification analysis can characterize them adequately. This work establishes that the principles and practice of machining can benefit considerably from nonlinear dynamics and chaos theory.
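Two of the RQA variables named above, percent recurrence and percent determinism, can be sketched from first principles: threshold the pairwise distances of a scalar series to build a recurrence plot, then count recurrent points and the fraction of them lying on diagonal lines. A minimal sketch (no embedding, main diagonal included, all parameter choices my own, not the thesis' settings):

```python
import numpy as np

def rqa_measures(x, eps, lmin=2):
    """Recurrence rate (%REC) and determinism (%DET) of a 1-D series.
    R[i, j] = 1 where |x_i - x_j| < eps; %DET is the fraction of
    recurrent points lying on diagonal lines of length >= lmin."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    R = (np.abs(x[:, None] - x[None, :]) < eps).astype(int)
    rec = R.sum() / (n * n)
    det_points = 0
    for k in range(-(n - 1), n):               # every diagonal of R
        run = 0
        for v in list(np.diagonal(R, offset=k)) + [0]:  # sentinel flushes run
            if v:
                run += 1
            else:
                if run >= lmin:
                    det_points += run          # points on a long diagonal line
                run = 0
    det = det_points / R.sum() if R.sum() else 0.0
    return rec, det

# A noisy periodic signal (a stand-in for stable cutting) should
# produce a high determinism value.
t = np.linspace(0, 8 * np.pi, 200)
signal = np.sin(t) + 0.05 * np.random.default_rng(0).normal(size=t.size)
rec, det = rqa_measures(signal, eps=0.2)
```

The thesis' claim is that these quantities shift measurably with tool wear and at chatter onset, which is what makes them usable as monitoring features.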

Abstract:

Plasma polymerization is found to be an excellent technique for the preparation of good quality, pinhole-free polymer thin films from different monomer precursors. The present work describes the preparation and characterization of polypyrrole (PPy) thin films by the ac plasma polymerization technique in their pristine and in situ iodine-doped forms. The electrical conductivity studies of the aluminium–polymer–aluminium (Al–polymer–Al) structures have been carried out, and a space charge limited conduction (SCLC) mechanism is identified as the most probable mechanism of carrier transport in these polymer films. The electrical conductivity shows an enhanced value in the iodine-doped sample. The reduction of the optical band gap by iodine doping is correlated with the observed conductivity results.

Abstract:

Embedded systems are usually designed for a single task or a specified set of tasks. This specificity means the system design, as well as its hardware/software development, can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints, and rapid development. This necessitates the adoption of static machine code analysis tools, running on a host machine, for the validation and optimization of embedded system code, which can help meet all of these goals. This could significantly augment software quality and is still a challenging field.

This dissertation contributes an architecture-oriented code validation, error localization, and optimization technique that assists the embedded system designer in software debugging, making early detection of software bugs that are otherwise hard to detect more effective, using static analysis of machine code. The focus of this work is to develop methods that automatically localize faults as well as optimize the code, and thus improve both the debugging process and the quality of the code. Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing the computational and integrated peripheral functions. The stipulated rules are encoded in propositional logic formulae, and their compliance is tested individually in all possible execution paths of the application programs. An incorrect sequence of machine code is identified using slicing techniques on the control flow graph generated from the machine code. An algorithm is proposed to assist the compiler in eliminating redundant bank switching code and deciding on the optimum data allocation to banked memory, resulting in the minimum number of bank switching instructions in embedded system software. A relation matrix and a state transition diagram, formed for the active memory bank state transitions corresponding to each bank selection instruction, are used for the detection of redundant code. Instances of code redundancy based on the stipulated rules for the target processor are identified. This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of compiler/assembler, applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly by machine code pattern, which drastically reduces the state space created, contributing to improved state-of-the-art model checking. Though the technique described is general, the implementation is architecture oriented, and hence the feasibility study is conducted on PIC16F87X microcontrollers. The proposed tool will be very useful in steering novices towards the correct use of difficult microcontroller features in developing embedded systems.
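The redundant bank-switching detection can be illustrated by tracking the active-bank state through straight-line code: a bank selection that reselects the already-active bank is redundant. The sketch below is a heavy simplification of the approach, using a hypothetical `BANKSEL <n>` mnemonic in place of the actual STATUS RP0/RP1 bit manipulation on PIC16F87X, and ignoring branches (which the dissertation handles via the control flow graph and a state transition diagram):

```python
def redundant_banksels(instructions):
    """Return indices of bank-select instructions that reselect the
    bank that is already active. `instructions` is a list of strings;
    bank selection is modelled by the hypothetical mnemonic
    'BANKSEL <n>'. Straight-line code only -- no branches."""
    active = None            # bank state unknown at program entry
    redundant = []
    for idx, ins in enumerate(instructions):
        parts = ins.split()
        if parts and parts[0] == "BANKSEL":
            bank = int(parts[1])
            if bank == active:
                redundant.append(idx)  # same bank already selected
            active = bank
        # all other instructions leave the active-bank state unchanged
    return redundant

code = ["BANKSEL 1", "MOVWF TRISB", "BANKSEL 1",   # second select is redundant
        "MOVWF TRISA", "BANKSEL 0", "MOVWF PORTB"]
idxs = redundant_banksels(code)
```

Extending this per-path state tracking across all execution paths of the control flow graph is what lets the tool also decide an optimum data allocation to banked memory.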

Abstract:

Learning Disability (LD) is a neurological condition that affects a child's brain and impairs his or her ability to carry out one or many specific tasks. LD affects about 15% of children enrolled in schools. The prediction of LD is a vital and intricate job. The aim of this paper is to design an effective and powerful tool, using two intelligent methods, viz. Artificial Neural Network and Adaptive Neuro-Fuzzy Inference System, for measuring the percentage of LD affecting school-age children. In this study, we propose some soft computing methods in data preprocessing to improve the accuracy of the tool as well as the classifier. The data preprocessing is performed through Principal Component Analysis for attribute reduction, and a closest-fit algorithm is used for imputing missing values. The main idea in developing the LD prediction tool is not only to predict the LD present in children but also to measure its percentage along with its class (low, minor, or major). The system is implemented in MathWorks MATLAB 7.10. The results obtained from this study illustrate that the designed prediction system or tool is capable of measuring LD effectively.
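The attribute-reduction step the paper describes is standard PCA: centre the attributes, eigendecompose the covariance matrix, and project onto the leading components. A generic sketch (the paper's tool is in MATLAB; the data here are random stand-ins, not the checklist data used in the study):

```python
import numpy as np

def pca_reduce(X, k):
    """Project the rows of X onto the top-k principal components.
    Plain covariance-eigendecomposition PCA, as used for attribute
    reduction in preprocessing (a generic sketch, not the authors' code)."""
    Xc = X - X.mean(axis=0)                    # centre each attribute
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)           # eigenvalues in ascending order
    top = vecs[:, np.argsort(vals)[::-1][:k]]  # k leading components
    return Xc @ top

# Hypothetical assessment scores: 5 children, 4 attributes each.
rng = np.random.default_rng(1)
X = rng.normal(size=(5, 4))
Z = pca_reduce(X, k=2)  # reduced to 2 attributes per child
```

Reducing the attribute count this way shrinks the input layer of the ANN/ANFIS classifiers, which is the accuracy and efficiency gain the preprocessing is after.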

Abstract:

Polystyrene beads, impregnated with mineral salts/glutamine medium as an inert support, were used to produce L-glutaminase from Vibrio costicola by solid-state fermentation. The maximum enzyme yield, 88 U/g substrate, was reached after 36 h. Glucose at 10 g/kg enhanced the enzyme yield by 66%. The support system allowed glutaminase to be recovered with a higher specific activity and lower viscosity than when a wheat-bran system was used.

Abstract:

Tourism is an industry that is heavily dependent on marketing. Word-of-mouth communication has played a major role in shaping a number of destinations, and this is particularly true in modern parlance. The social networking phenomenon is fast spreading over the internet, and many sites give visitors a great deal of freedom to express their views. Promotion of a destination depends a lot on conversation and exchange of information over these social networks. This paper analyses social networking sites and their contribution to marketing tourism and hospitality. The negative impacts of the phenomenon are also discussed.

Abstract:

The carrier transport mechanism of polyaniline (PA) thin films prepared by radio frequency plasma polymerization is described in this paper. The mechanism of electrical conduction and the carrier mobility of PA thin films at different temperatures were examined using the aluminium–PA–aluminium (Al–PA–Al) structure. It is found that the mechanism of carrier transport in these thin films is space charge limited conduction. J–V studies on an asymmetric electrode configuration using indium tin oxide (ITO) as the base electrode and Al as the upper electrode (ITO–PA–Al structure) show a diode-like behaviour with a considerable rectification ratio.
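Space charge limited conduction is commonly identified from J–V data by the slope of log J versus log V: trap-free SCLC follows the Mott–Gurney law J ∝ V², so the slope approaches 2. A minimal sketch of that slope check on synthetic, illustrative data (not the measured Al–PA–Al curves):

```python
import math

def loglog_slope(V, J):
    """Least-squares slope of log10(J) versus log10(V). A slope near 2
    is the usual signature of trap-free space-charge-limited conduction
    (Mott-Gurney law, J proportional to V^2)."""
    lv = [math.log10(v) for v in V]
    lj = [math.log10(j) for j in J]
    n = len(lv)
    mv, mj = sum(lv) / n, sum(lj) / n
    num = sum((a - mv) * (b - mj) for a, b in zip(lv, lj))
    den = sum((a - mv) ** 2 for a in lv)
    return num / den

# Synthetic SCLC-like data: J proportional to V^2 (illustrative units).
V = [1.0, 2.0, 4.0, 8.0, 16.0]
J = [3e-8 * v ** 2 for v in V]
slope = loglog_slope(V, J)
```

Measured data would also show an ohmic (slope ≈ 1) region at low voltage before the SCLC region takes over; the fit is applied to the higher-voltage segment.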

Abstract:

The need to structure knowledge is as important now as it ever has been. This paper studies the ISP knowledge portal to explore how knowledge on various resources and topics in photonics and related areas is organized in the knowledge portal of the International School of Photonics, CUSAT. The study revealed that the ISP knowledge portal is one of the best portals in the field. It provides a model for building an effective knowledge portal in other fields.

Abstract:

This paper introduces a simple and efficient method, and its implementation in an FPGA, for reducing the odometric localization errors caused by over-count readings of an optical encoder based odometric system in a mobile robot due to wheel slippage and terrain irregularities. The detection and correction are based on redundant encoder measurements. The method suggested relies on the fact that wheel slippage or terrain irregularities cause more count readings from the encoder than correspond to the actual distance travelled by the vehicle. The standard quadrature technique is used to obtain four counts in each encoder period. In this work a three-wheeled mobile robot with one driving-steering wheel and two fixed rear wheels in-axis, fitted with incremental optical encoders, is considered. The CORDIC algorithm has been used for the computation of the sine and cosine terms in the update equations. The results presented demonstrate the effectiveness of the technique.
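CORDIC is used in the FPGA because it reduces sine and cosine to iterated shift-and-add micro-rotations, with one multiplication by a precomputed gain at the end. A floating-point sketch of rotation-mode CORDIC (the hardware version would use fixed-point arithmetic and actual bit shifts; this is only a behavioural model):

```python
import math

def cordic_sincos(theta, iterations=24):
    """Rotation-mode CORDIC: returns (cos theta, sin theta) for
    |theta| <= pi/2 using only add/shift-style micro-rotations, as is
    done in hardware for the sine/cosine terms of the update equations."""
    # Precomputed rotation angles atan(2^-i) and the combined gain K.
    angles = [math.atan(2.0 ** -i) for i in range(iterations)]
    K = 1.0
    for i in range(iterations):
        K *= 1.0 / math.sqrt(1.0 + 2.0 ** (-2 * i))
    x, y, z = 1.0, 0.0, theta
    for i in range(iterations):
        d = 1.0 if z >= 0 else -1.0        # rotate towards z = 0
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * angles[i]
    return x * K, y * K                    # scale out the CORDIC gain

c, s = cordic_sincos(math.pi / 6)
```

Each iteration adds roughly one bit of precision, so the iteration count is chosen to match the fixed-point word width of the FPGA datapath.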

Abstract:

Solid waste management is nowadays an important environmental issue in a country like India. Statistics show that there has been a substantial increase in solid waste generation, especially in urban areas. This trend can be ascribed to rapid population growth, changing lifestyles, food habits, and living standards, lack of financial resources, institutional weaknesses, improper choice of technology, and public apathy towards municipal solid waste. Waste is directly related to the consumption of resources and to dumping on the land. Ecological footprint analysis, an impact-assessment environmental management tool, relates these two factors through the amount of land required to dispose of the per capita waste generated. Ecological footprint analysis is a quantitative tool that represents the ecological load imposed on the earth by humans in spatial terms. By quantifying the ecological footprint, we can formulate strategies to reduce the footprint and thereby achieve sustainable living. In this paper, an attempt is made to explore the tool of Ecological Footprint Analysis with special emphasis on waste generation. The paper also discusses and analyses the waste footprint of Kochi city, India. An attempt is also made to suggest strategies to reduce the waste footprint, thereby making the city more sustainable, greener, and cleaner.
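The waste-footprint idea reduces to simple accounting: annual waste tonnage divided by the land's assimilation capacity gives the hectares appropriated. A sketch of that arithmetic; every number below is an illustrative assumption, not data from the Kochi study:

```python
def waste_footprint_ha(per_capita_kg_day, population, assimilation_t_ha_yr):
    """Hectares of land needed to assimilate a city's annual solid waste:
    footprint = annual waste (tonnes) / assimilation capacity (t/ha/yr).
    All parameter values used below are illustrative assumptions."""
    annual_tonnes = per_capita_kg_day * population * 365 / 1000.0
    return annual_tonnes / assimilation_t_ha_yr

# e.g. 0.5 kg/person/day, a population of 600,000, and an assumed
# assimilation capacity of 100 t/ha/yr.
ha = waste_footprint_ha(0.5, 600_000, 100.0)
```

Comparing such a figure with the land actually available is what turns the footprint into a sustainability indicator for the city.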

Abstract:

Kochi, the commercial capital of Kerala and the second most important city next to Mumbai on the western coast of India, is a land having a wide variety of residential environments. The present pattern of the city can be classified as that of haphazard growth, with typical problems characteristic of unplanned urban development. This trend can be ascribed to rapid population growth, our changing lifestyles, food habits, and living standards, institutional weaknesses, improper choice of technology, and public apathy. Ecological footprint analysis (EFA) is a quantitative tool that represents the ecological load imposed on the earth by humans in spatial terms. This paper analyses the scope of EFA as a sustainable environmental management tool for Kochi city.

Abstract:

In the past, natural resources were plentiful and people were scarce. But the situation is rapidly reversing. Our challenge is to find a way to balance human consumption and nature's limited productivity in order to ensure that our communities are sustainable locally, regionally, and globally. Kochi, the commercial capital of Kerala, South India, and the second most important city next to Mumbai on the western coast, is a land having a wide variety of residential environments. Due to rapid population growth, changing lifestyles, food habits and living standards, institutional weaknesses, improper choice of technology, and public apathy, the present pattern of the city can be classified as that of haphazard growth, with typical problems characteristic of unplanned urban development. Ecological Footprint Analysis (EFA) is a physical accounting method, developed by William Rees and M. Wackernagel, focusing on land appropriation using land as its "currency". It provides a means for measuring and communicating human-induced environmental impacts upon the planet. The aim of applying EFA to Kochi city is to quantify the consumption and waste generation of a population and to compare it with the existing biocapacity. By quantifying the ecological footprint, we can formulate strategies to reduce the footprint and thereby achieve sustainable living. In this paper, an attempt is made to explore the tool of Ecological Footprint Analysis and to calculate and analyse the ecological footprint of the residential areas of Kochi city. The paper also discusses and analyses the waste footprint of the city. An attempt is also made to suggest strategies to reduce the footprint, thereby making the city sustainable.