884 results for Machine of 360°
Abstract:
In an effort to develop a fully computerized approach for structural synthesis of kinematic chains, the steps involved in the method of structural synthesis based on transformation of binary chains [38] have been recast in a format suitable for implementation on a digital computer. The methodology thus evolved has been combined with the algebraic procedures for structural analysis [44] to develop a unified computer program for structural synthesis and analysis of simple-jointed kinematic chains with degree of freedom ≥ 0. Applications of this program are presented in the succeeding parts of the paper.
Abstract:
The reliability of the computer program for structural synthesis and analysis of simple-jointed kinematic chains developed in Part 1 has been established by applying it to several cases for which solutions are either fully or partially available in the literature, such as 7-link, zero-freedom chains; 8- and 10-link, single-freedom chains; 12-link, single-freedom binary chains; and 9-link, two-freedom chains. In the process, some discrepancies in the results reported in previous literature have been brought to light.
Abstract:
The unified computer program for structural synthesis and analysis developed in Part 1 has been employed to derive the new and complete collection of 97 10-link, three-freedom simple-jointed kinematic chains. The program shows that of these chains, 3 have total freedom, 70 have partial freedom and the remaining 24 have fractionated freedom and that the 97 chains yield a total of 676 distinct mechanisms.
Abstract:
Absenteeism is one of the major problems of Indian industries. It necessitates the employment of more manpower than the jobs require, increasing manpower costs, and it lowers the efficiency of plant operation through reduced performance and higher rejects. If extra manpower is not hired, it also causes machine idleness, resulting in disrupted work schedules and assignments. Several studies have investigated the causes of absenteeism and their remedies (for example, Vaid 1967), as well as the relationship between absenteeism and turnover, with a suggested model for diagnosis and treatment (Hawk 1976). However, production foremen and supervisors face the operating task of determining how many extra operatives to hire in order to stave off the adverse effects of absenteeism on the man-machine system. This paper deals with a class of reserve manpower models based on the reject allowance model familiar in the quality control literature. The present study considers, in addition to absenteeism, machine failures and the graded nature of manpower met with in production systems, and seeks to find optimal reserve manpower through computer simulation.
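To make the idea of choosing reserve manpower by simulation concrete, the following is a minimal Monte Carlo sketch, not the models developed in the paper: it trades the cost of machines idled by absenteeism and breakdowns against the cost of carrying reserve operatives. All parameter names and values are hypothetical assumptions.

```python
# Illustrative Monte Carlo sketch (not the paper's model): pick the reserve
# crew size that minimizes expected daily cost when operatives may be absent
# and machines may fail. All parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

N_MACHINES = 20          # machines that each need one operative
P_ABSENT = 0.10          # probability an operative is absent on a given day
P_FAILURE = 0.05         # probability a machine is down on a given day
COST_IDLE = 500.0        # daily cost of a machine idled for lack of an operative
COST_RESERVE = 120.0     # daily cost of keeping one reserve operative
DAYS = 10_000            # simulated days per candidate reserve size

def expected_daily_cost(reserve: int) -> float:
    staff = N_MACHINES + reserve
    absent = rng.binomial(staff, P_ABSENT, size=DAYS)      # absentees per day
    down = rng.binomial(N_MACHINES, P_FAILURE, size=DAYS)   # failed machines per day
    runnable = N_MACHINES - down                             # machines able to run
    present = staff - absent                                 # operatives available
    idle = np.maximum(runnable - present, 0)                 # machines idle for lack of staff
    return float(np.mean(idle) * COST_IDLE + reserve * COST_RESERVE)

costs = {r: expected_daily_cost(r) for r in range(0, 8)}
best = min(costs, key=costs.get)
print(f"best reserve size: {best}, expected daily cost: {costs[best]:.1f}")
```

Sweeping the reserve size and comparing expected costs is the same reject-allowance logic the abstract alludes to; the paper's models additionally account for graded manpower, which this sketch omits.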
Abstract:
Content delivery networks (CDNs) are an essential component of modern website infrastructures: edge servers located closer to users cache content, increasing robustness and capacity while decreasing latency. However, this situation becomes complicated for HTTPS content that is to be delivered using the Transport Layer Security (TLS) protocol: the edge server must be able to carry out TLS handshakes for the cached domain. Most commercial CDNs require that the domain owner give their certificate's private key to the CDN's edge server or abandon caching of HTTPS content entirely. We examine the security and performance of a recently commercialized delegation technique in which the domain owner retains possession of their private key and splits the TLS state machine geographically with the edge server using a private key proxy service. This allows the domain owner to limit the amount of trust given to the edge server while maintaining the benefits of CDN caching. On the performance front, we find that latency is slightly worse compared to the insecure approach, but still significantly better than the domain owner serving the content directly. On the security front, we enumerate the security goals for TLS handshake proxying and identify a subtle difference between the security of RSA key transport and signed Diffie-Hellman in TLS handshake proxying; we also discuss timing side-channel resistance of the key server and the effect of TLS session resumption.
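As an illustration of the trust split described above, the sketch below shows the core of handshake proxying for a signed ephemeral Diffie-Hellman exchange: the edge server never holds the domain's private key and instead forwards the to-be-signed handshake parameters to a key server, which returns only a signature. This is a simplified assumption-laden sketch using the Python cryptography package, not the commercial protocol's wire format; all function names are illustrative.

```python
# Sketch of TLS handshake proxying: the CDN edge assembles the handshake
# parameters but must ask the domain owner's key server for the signature,
# because only the key server holds the private key.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# --- key server (run by the domain owner) --------------------------------
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

def key_server_sign(to_be_signed: bytes) -> bytes:
    """Sign the handshake parameters; the private key never leaves here."""
    return private_key.sign(to_be_signed, padding.PKCS1v15(), hashes.SHA256())

# --- edge server (run by the CDN) -----------------------------------------
def edge_handshake(client_random: bytes, server_random: bytes,
                   ecdhe_public: bytes) -> bytes:
    # The edge builds the ServerKeyExchange-style payload and requests the
    # signature it cannot produce itself from the remote key server.
    to_be_signed = client_random + server_random + ecdhe_public
    return key_server_sign(to_be_signed)

# --- client-side check against the certificate's public key ---------------
payload = b"\x01" * 32 + b"\x02" * 32 + b"\x03" * 65
signature = edge_handshake(b"\x01" * 32, b"\x02" * 32, b"\x03" * 65)
private_key.public_key().verify(signature, payload,
                                padding.PKCS1v15(), hashes.SHA256())
print("signature verifies against the domain certificate's public key")
```

The remote signing call is the only step that requires the key server, which is why the latency penalty the abstract reports is limited to the handshake rather than the bulk content transfer.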
Abstract:
Physical and psychological decline is common in the post-treatment breast cancer population, yet the efficacy of concurrent interventions to meet both physical and psychosocial needs in this population has not been extensively examined. PURPOSE: This study explores the effects of a combined exercise and psychosocial intervention model on selected physiological and psychological parameters in post-treatment breast cancer survivors. METHODS: Forty-one breast cancer survivors were randomly assigned to one of four groups for an 8-week intervention: exercise only [EX, n=13] (aerobic and resistance training), psychosocial therapy only [PS, n=11] (biofeedback), combined EX and PS [EX+PS, n=11], or control conditions [CO, n=6]. Mean delta scores (post-intervention minus baseline) were calculated for each of the following: body weight, % body fat (skinfolds), predicted VO2max (Modified Bruce Protocol), overall dynamic muscular endurance [OME] (RMCRI protocol), static balance (single-leg stance test), dynamic balance (360° turn and 4-square step test), fatigue (Revised Piper Scale), and quality of life (FACT-B). A one-way ANOVA was used to analyze the preliminary results of this ongoing randomized trial. RESULTS: Overall, there were significant differences in the delta scores for predicted VO2max, OME, and dynamic balance among the 4 groups (p<0.05). The EX+PS group showed a significant improvement in VO2max compared with the PS group (4.2 ± 3.8 vs. -0.9 ± 4.2 mL/kg/min; p<0.05). Both the EX+PS and EX groups showed significant improvements in OME compared with the PS and CO groups (44.5 ± 23.5 and 43.4 ± 22.1 vs. -3.9 ± 15.2 and 2.7 ± 13.7 repetitions; p<0.05). All 3 intervention groups showed significant improvements in dynamic balance compared with the CO group (-0.8 ± 0.6, -0.6 ± 0.8, and -0.6 ± 1.0 vs. 0.6 ± 0.6 seconds; p<0.05). Overall, changes in fatigue tended towards significance among the 4 groups (p = 0.08), with decreased fatigue in the intervention groups and increased fatigue in the CO group. CONCLUSIONS: Our preliminary findings suggest that EX and PS seem to produce greater positive changes in the outcome measures than CO. However, at this point no definite conclusions can be made about the additive effects of combining the EX and PS interventions.
Abstract:
Objective Vast amounts of injury narratives are collected daily and are available electronically in real time, offering great potential for use in injury surveillance and evaluation. Machine learning algorithms have been developed to assist in identifying cases and classifying mechanisms leading to injury in a much timelier manner than is possible when relying on manual coding of narratives. The aim of this paper is to describe the background, growth, value, challenges and future directions of machine learning as applied to injury surveillance. Methods This paper reviews key aspects of machine learning using injury narratives, providing a case study to demonstrate an application of an established human-machine learning approach. Results The range of applications and utility of narrative text has increased greatly with advancements in computing techniques over time. Practical and feasible methods exist for semi-automatic classification of injury narratives which are accurate, efficient and meaningful. The human-machine learning approach described in the case study achieved high sensitivity and positive predictive value and reduced the need for human coding to less than one-third of cases in one large occupational injury database. Conclusion The last 20 years have seen a dramatic change in the potential for technological advancements in injury surveillance. Machine learning of ‘big injury narrative data’ opens up many possibilities for expanded sources of data which can provide more comprehensive, ongoing and timely surveillance to inform future injury prevention policy and practice.
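One common way such a semi-automatic, human-machine loop is realised is by auto-coding only the narratives a probabilistic text classifier predicts with high confidence and routing the rest to human coders. The sketch below illustrates that triage pattern with scikit-learn; the training narratives, mechanism codes and confidence threshold are hypothetical, and this is not presented as the exact method of the cited case study.

```python
# Semi-automatic (human-machine) coding sketch: high-confidence predictions
# are auto-coded, low-confidence narratives are flagged for human review.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_narratives = ["worker slipped on wet floor and fell",
                    "hand caught in press machine",
                    "fell from ladder while painting",
                    "finger crushed between rollers"]
train_codes = ["fall", "machinery", "fall", "machinery"]   # mechanism-of-injury codes

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                      LogisticRegression(max_iter=1000))
model.fit(train_narratives, train_codes)

def triage(narrative: str, threshold: float = 0.9):
    """Return (code, 'auto') when confident, else (None, 'human review')."""
    probs = model.predict_proba([narrative])[0]
    best = probs.argmax()
    if probs[best] >= threshold:
        return model.classes_[best], "auto"
    return None, "human review"

print(triage("employee fell down the stairs"))
```

Raising the threshold increases precision of the auto-coded portion at the cost of sending more narratives to human coders, which is the trade-off behind the "less than one-third of cases" figure reported above.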
Abstract:
The test based on comparison of the characteristic coefficients of the adjacency matrices of the corresponding graphs for detection of isomorphism in kinematic chains has been shown to fail in the case of two pairs of ten-link, simple-jointed chains, one pair corresponding to single-freedom chains and the other pair corresponding to three-freedom chains. An assessment of the merits and demerits of available methods for detection of isomorphism in graphs and kinematic chains is presented, keeping in view the suitability of the methods for use in computerized structural synthesis of kinematic chains. A new test based on the characteristic coefficients of the “degree” matrix of the corresponding graph is proposed for detection of isomorphism in kinematic chains. The new test is found to be successful in a number of examples of graphs where the test based on characteristic coefficients of the adjacency matrix fails. It has also been found to be successful in distinguishing the structures of all known simple-jointed kinematic chains in the categories of (a) single-freedom chains with up to 10 links, (b) two-freedom chains with up to 9 links and (c) three-freedom chains with up to 10 links.
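The mechanics of a characteristic-coefficient test are easy to sketch: if two chains are isomorphic, any matrix built the same way from their link connectivity must have identical characteristic polynomial coefficients, so differing coefficients prove non-isomorphism (equal coefficients prove nothing). The Python sketch below applies this to the six-link Watt and Stephenson chains; the degree-weighted matrix used here is an illustrative stand-in and not necessarily the exact "degree" matrix defined in the paper.

```python
# Characteristic-coefficient test for (non-)isomorphism of kinematic chain graphs.
import numpy as np

def adjacency(n_links, joints):
    A = np.zeros((n_links, n_links))
    for i, j in joints:
        A[i, j] = A[j, i] = 1
    return A

def char_coeffs(M):
    # numpy.poly returns the characteristic polynomial coefficients of a square matrix
    return np.round(np.poly(M), 6)

def degree_weighted(A):
    # Assumed illustrative weighting: entry (i, j) = deg(i) + deg(j) for joined links.
    deg = A.sum(axis=1)
    D = np.zeros_like(A)
    rows, cols = np.nonzero(A)
    D[rows, cols] = deg[rows] + deg[cols]
    return D

# Six-link Watt chain (ternary links 0 and 1 jointed to each other) and
# Stephenson chain (ternary links 0 and 1 not jointed to each other).
watt = adjacency(6, [(0, 1), (0, 2), (2, 3), (3, 1), (0, 4), (4, 5), (5, 1)])
stephenson = adjacency(6, [(0, 2), (2, 1), (0, 3), (3, 1), (0, 4), (4, 5), (5, 1)])

# Differing coefficients prove the two chains are structurally distinct.
print(np.array_equal(char_coeffs(watt), char_coeffs(stephenson)))                # False
print(np.array_equal(char_coeffs(degree_weighted(watt)),
                     char_coeffs(degree_weighted(stephenson))))                  # False
```

For this easy pair both matrices succeed; the point of the abstract is that for certain ten-link pairs the adjacency-based coefficients coincide while the degree-matrix coefficients still differ.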
Abstract:
The finite element method (FEM) is used to determine, for pitch-point, mid-point and tip loading, the deflection curve of a standard spur gear tooth of given diametral pitch (DP) for numbers of teeth of 14, 21, 26 and 34. In all these cases the deflection of the gear tooth at the point of loading obtained by FEM is in good agreement with the experimental value. The contraflexure in the deflection curve at the point of loading, observed experimentally in the cases of pitch-point and mid-point loading, is predicted correctly by the FEM analysis.
Abstract:
The combination of dwindling petroleum reserves and population growth makes the development of renewable energy and chemical resources more pressing than ever before. Plant biomass is the most abundant renewable source for energy and chemicals. Enzymes can selectively convert the polysaccharides in plant biomass into simple sugars which can then be upgraded to liquid fuels and platform chemicals using biological and/or chemical processes. Pretreatment is essential for efficient enzymatic saccharification of plant biomass and this article provides an overview of how organic solvent (organosolv) pretreatments affect the structure and chemistry of plant biomass, and how these changes enhance enzymatic saccharification. A comparison between organosolv pretreatments utilizing broadly different classes of solvents (i.e., low boiling point, high boiling point, and biphasic) is presented, with a focus on solvent recovery and formation of by-products. The reaction mechanisms that give rise to these by-products are investigated and strategies to minimize by-product formation are suggested. Finally, process simulations of organosolv pretreatments are compared and contrasted, and discussed in the context of an industrial-scale plant biomass to fermentable sugar process.
Abstract:
Objective Death certificates provide an invaluable source for cancer mortality statistics; however, this value can only be realised if accurate, quantitative data can be extracted from certificates – an aim hampered by both the volume and variable nature of certificates written in natural language. This paper proposes an automatic classification system for identifying cancer related causes of death from death certificates. Methods Detailed features, including terms, n-grams and SNOMED CT concepts were extracted from a collection of 447,336 death certificates. These features were used to train Support Vector Machine classifiers (one classifier for each cancer type). The classifiers were deployed in a cascaded architecture: the first level identified the presence of cancer (i.e., binary cancer/no-cancer) and the second level identified the type of cancer (according to the ICD-10 classification system). A held-out test set was used to evaluate the effectiveness of the classifiers according to precision, recall and F-measure. In addition, detailed feature analysis was performed to reveal the characteristics of a successful cancer classification model. Results The system was highly effective at identifying cancer as the underlying cause of death (F-measure 0.94). The system was also effective at determining the type of cancer for common cancers (F-measure 0.7). Rare cancers, for which there was little training data, were difficult to classify accurately (F-measure 0.12). Factors influencing performance were the amount of training data and certain ambiguous cancers (e.g., those in the stomach region). The feature analysis revealed that a combination of features was important for cancer type classification, with SNOMED CT concept and oncology specific morphology features proving the most valuable. Conclusion The system proposed in this study provides automatic identification and characterisation of cancers from large collections of free-text death certificates. This allows organisations such as Cancer Registries to monitor and report on cancer mortality in a timely and accurate manner. In addition, the methods and findings are generally applicable beyond cancer classification and to other sources of medical text besides death certificates.
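The cascaded architecture described above is straightforward to sketch: a first classifier decides cancer versus no-cancer, and a second classifier assigns a cancer type only to certificates the first level flags. The sketch below uses TF-IDF terms and n-grams with linear SVMs as stand-ins for the paper's richer feature set (which also included SNOMED CT concepts); the example certificates and codes are invented.

```python
# Two-level cascade: level 1 = cancer/no-cancer, level 2 = cancer type.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

certificates = ["metastatic adenocarcinoma of the stomach",
                "acute myocardial infarction",
                "malignant neoplasm of the lung",
                "pneumonia and chronic obstructive pulmonary disease",
                "carcinoma of the breast with liver metastases",
                "congestive cardiac failure"]
is_cancer = [1, 0, 1, 0, 1, 0]
cancer_type = ["C16", None, "C34", None, "C50", None]   # ICD-10-style codes where applicable

level1 = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
level1.fit(certificates, is_cancer)

cancer_only = [c for c, y in zip(certificates, is_cancer) if y == 1]
cancer_labels = [t for t in cancer_type if t is not None]
level2 = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
level2.fit(cancer_only, cancer_labels)

def classify(certificate: str):
    if level1.predict([certificate])[0] == 0:
        return "no cancer"
    return level2.predict([certificate])[0]

print(classify("malignant neoplasm of the bronchus and lung"))
```

Splitting the problem this way lets the abundant cancer/no-cancer signal be learned separately from the much scarcer per-type signal, which is consistent with the gap the abstract reports between common and rare cancer performance.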
Abstract:
In the Queensland, Australia, scallop fishery, the scallop catch is graded at sea using a specially designed grading machine called a "tumbler." Experiments were conducted to determine the effect of repeated trawl capture, grading, and discarding on the survival of sublegal saucer scallops Amusium balloti. Scallops were caught within an area closed to commercial fishing and known to contain dense scallop beds. The trawled scallops were randomly divided into 2 groups, tumbled and control, and subjected to up to 4 tumbles and/or trawls before being caged for 2.5 days adjacent to the trawl grounds. Increased levels of both trawling and tumbling were found to decrease significantly the survival of sublegal scallops. Although 83% of scallops survived repeated intensive trawling (4 consecutive tows), survival fell to 64% when scallops were also graded using a commercial tumbler. Survival was high for both tumbled and control sublegal scallops after 1 trawl (97% and 98%, respectively).
Abstract:
Al-Si-graphite particle composite alloy pistons containing different percentages of about 80 μm uncoated graphite particles were successfully cast by foundry techniques. Tests with a 5 hp single-cylinder diesel engine show that Al-Si-graphite particle composite pistons can withstand an endurance test of 500 h without any apparent deterioration and do not seize during the running-in period. The use of the Al-Si-3% graphite particle composite piston also results in (a) up to 3% reduction in the specific fuel consumption, (b) considerable reduction in the wear of all four piston rings, (c) a reduction in piston wear, (d) a 9% reduction in the frictional horsepower losses of the engine as determined by the motoring test and (e) a slight increase in the exhaust gas temperature. These reductions (a)–(d) appear to be due to increased lubrication from the graphite particles which are smeared on the bearing surface, the higher damping capacity of the composite pistons and the reduced coefficient of thermal expansion of the composite pistons. Preliminary results indicate that aluminum-graphite particle composite alloy is a promising material for automotive pistons.
Abstract:
This paper presents an inverse dynamic formulation by the Newton–Euler approach for the Stewart platform manipulator of the most general architecture and models all the dynamic and gravity effects as well as the viscous friction at the joints. It is shown that a proper elimination procedure results in a remarkably economical and fast algorithm for the solution of actuator forces, which makes the method quite suitable for on-line control purposes. In addition, the parallelism inherent in the manipulator and in the modelling makes the algorithm quite efficient in a parallel computing environment, where it can be made as fast as the corresponding formulation for the 6-dof serial manipulator. The formulation has been implemented in a program and has been used for a few trajectories planned for a test manipulator. Results of simulation presented in the paper reveal the nature of the variation of actuator forces in the Stewart platform and justify the dynamic modelling for control.
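For orientation, the core relations such a Newton-Euler formulation builds on can be stated compactly (this is the standard quasi-static backbone, not the paper's full derivation with leg inertia and joint friction). For the moving platform of mass $m$, centroid acceleration $\ddot{\mathbf{c}}$, inertia tensor $\mathbf{I}$ and angular velocity $\boldsymbol{\omega}$,
\[
\mathbf{f} = m\,(\ddot{\mathbf{c}} - \mathbf{g}), \qquad
\mathbf{n} = \mathbf{I}\dot{\boldsymbol{\omega}} + \boldsymbol{\omega} \times (\mathbf{I}\boldsymbol{\omega}),
\]
and, neglecting the leg dynamics for the moment, the six actuator forces $f_i$ acting along the unit leg directions $\hat{\mathbf{s}}_i$ at platform attachment points $\mathbf{p}_i$ must balance this wrench:
\[
\sum_{i=1}^{6} f_i\,\hat{\mathbf{s}}_i = \mathbf{f}, \qquad
\sum_{i=1}^{6} (\mathbf{p}_i - \mathbf{c}) \times f_i\,\hat{\mathbf{s}}_i = \mathbf{n}.
\]
This is a 6×6 linear system in the actuator forces; the economical elimination procedure referred to in the abstract extends this picture with the leg dynamics, gravity and viscous friction terms while keeping the per-leg computations parallelizable.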
Abstract:
Separation of Mussorie rock phosphate (P2O5 = 20%) from Uttar Pradesh, India, containing pyrite, calcite and other carbonaceous impurities, by flotation has been successfully attempted to upgrade the phosphate values. Based on Hallimond cell flotation results for single minerals and synthetic mineral mixtures of calcite and apatite using oleic acid and potassium phosphate, conditions were obtained for the separation of calcite from apatite, which is considered to be the most difficult step in the beneficiation of calcareous phosphates. Further studies using 250 g of the mineral (−60 +150 and −150 mesh fractions, deslimed) in a laboratory-size Fagergren subaeration machine employed stagewise flotation, viz. of carbonaceous materials using terpineol, pyrite using potassium ethyl xanthate and calcite using oleic acid, respectively. Separation was, however, found to be unsatisfactory in the absence of a depressant. Among starch, hydrofluosilicic acid and dipotassium hydrogen phosphate, which were tried as depressants for apatite in the final flotation stage, dipotassium hydrogen phosphate proved superior to the others. However, the tests with the above fractions did not yield the required grade. This was possibly due to insufficient liberation of the phosphate mineral from the ore body and to different experimental conditions arising from scale-up operations. Experiments conducted using −200 mesh deslimed fractions have yielded an acceptable grade of 27.6% P2O5 with a recovery of about 60%. The results have been explained in terms of the specific adsorption characteristics of phosphate ions on apatite and the liberation size of the mineral.