746 results for parallel kinematic machine


Relevance: 20.00%

Abstract:

Electrical machines have significant improvement potential. Nevertheless, the field is characterized by incremental innovation: steady improvement has been achieved, but no breakthrough development. Radical development in the field would require the introduction of new elements, ones that could change the whole electrical machine industry system. Recent technological advances in nanomaterials have opened up new horizons for the macroscopic application of carbon nanotube (CNT) fibres. With values of 100 MS/m measured on individual CNTs, CNT fibre materials hold promise for conductivities far beyond those of metals. Highly conductive, lightweight and strong CNT yarn is finally within reach; it could replace copper as a potentially better winding material. Although it does not yet provide low resistivity, the newest CNT yarn offers attractive prospects for accelerated efficiency improvement of electrical machines. In this article, the potential for using new CNT materials to replace copper in machine windings is introduced. It does so, firstly, by describing the environment for a change that could revolutionize the industry and, secondly, by presenting the breakthrough results of a prototype construction. In the test motor, which is to our knowledge the first of its kind, the presently most electrically conductive carbon nanotube yarn replaces the usual copper in the windings.
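As a rough illustration of why conductivity matters for windings, the DC resistance of a conductor follows R = L/(σA). The sketch below compares standard copper with a hypothetical CNT conductor reaching the 100 MS/m figure quoted above for individual CNTs; the wire length and cross-section are assumed example values, not data from the article.

```python
# Illustrative DC winding-resistance comparison: copper vs. a hypothetical
# CNT conductor. Conductivities, length and cross-section are assumptions.

def winding_resistance(length_m, area_m2, conductivity_S_per_m):
    """DC resistance R = L / (sigma * A)."""
    return length_m / (conductivity_S_per_m * area_m2)

COPPER_SIGMA = 58e6      # S/m, standard annealed copper
CNT_IDEAL_SIGMA = 100e6  # S/m, value measured on individual CNTs

L = 100.0   # m of winding wire (assumed)
A = 1e-6    # m^2 cross-section (assumed, 1 mm^2)

r_cu = winding_resistance(L, A, COPPER_SIGMA)    # ~1.72 ohm
r_cnt = winding_resistance(L, A, CNT_IDEAL_SIGMA)  # 1.00 ohm
```

At the ideal figure the CNT winding would cut DC copper loss by roughly 40% for the same geometry; present-day CNT yarn, as the abstract notes, is still far from this conductivity.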

Relevance: 20.00%

Abstract:

A direct-driven permanent magnet synchronous machine for a small urban-use electric vehicle is presented. The measured performance of the machine at the test bench, as well as its performance over the modified New European Drive Cycle, is given. The effect of optimal current components, which maximize efficiency while taking iron losses into account, is compared with simple id = 0 control. The machine currents and losses during the drive cycle are calculated and compared with each other.
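The comparison between loss-minimizing current components and simple id = 0 control can be sketched as a small numerical search. All machine parameters below are illustrative assumptions, not values from the paper, and the iron-loss model is deliberately crude (proportional to flux-linkage magnitude squared).

```python
# Toy loss-minimization sketch for a salient PMSM: for a required torque,
# search for the d-axis current id that minimizes copper + iron loss and
# compare with the id = 0 strategy. All parameters are assumed examples.
import math

p = 4                  # pole pairs
psi_m = 0.1            # PM flux linkage, Vs
Ld, Lq = 2e-3, 4e-3    # d/q inductances, H (Lq > Ld: reluctance torque)
R = 0.05               # stator resistance, ohm
k_fe = 50.0            # crude iron-loss coefficient, W/(Vs)^2

def iq_for_torque(T, id_):
    """Solve T = 1.5*p*(psi_m + (Ld-Lq)*id)*iq for iq."""
    return T / (1.5 * p * (psi_m + (Ld - Lq) * id_))

def losses(id_, iq):
    p_cu = 1.5 * R * (id_**2 + iq**2)                 # copper loss
    flux = math.hypot(psi_m + Ld * id_, Lq * iq)      # stator flux linkage
    return p_cu + k_fe * flux**2                      # + crude iron loss

def best_id(T):
    candidates = [-i * 0.5 for i in range(1, 201)]    # id in [-100, -0.5] A
    candidates.append(0.0)
    return min(candidates, key=lambda id_: losses(id_, iq_for_torque(T, id_)))

T = 20.0  # Nm, example operating point
id_opt = best_id(T)
loss_opt = losses(id_opt, iq_for_torque(T, id_opt))
loss_id0 = losses(0.0, iq_for_torque(T, 0.0))
```

For a salient machine the optimum lands at a negative id, trading a slightly larger current magnitude for reluctance torque and reduced flux, which is why the optimal strategy beats id = 0 in total loss.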

Relevance: 20.00%

Abstract:

Feature extraction is the part of pattern recognition where the sensor data are transformed into a form more suitable for the machine to interpret. The purpose of this step is also to reduce the amount of information passed to the later stages of the system while preserving the information essential for discriminating the data into different classes. For instance, in image analysis the raw image intensities are vulnerable to various environmental effects, such as lighting changes, and feature extraction can serve as a means of detecting features that are invariant to certain types of illumination change. Finally, classification makes decisions based on the transformed data. The main focus of this thesis is on developing new methods for embedded feature extraction based on local non-parametric image descriptors. Feature analysis is also carried out for the selected image features; low-level Local Binary Pattern (LBP) based features play the main role in the analysis. In the embedded domain, a pattern recognition system must usually meet strict performance constraints, such as high speed, compact size and low power consumption. The characteristics of the final system can be seen as a trade-off between these metrics, largely determined by decisions made during the implementation phase. The implementation alternatives for LBP-based feature extraction are explored in the embedded domain in the context of focal-plane vision processors. In particular, the thesis demonstrates LBP extraction with the MIPA4k massively parallel focal-plane processor IC. Higher-level processing is also incorporated, by means of a framework for implementing a single-chip face recognition system. Furthermore, a new method for determining optical flow based on LBPs, designed in particular for the embedded domain, is presented.
Inspired by some of the principles observed through the feature analysis of Local Binary Patterns, an extension to the well-known non-parametric rank transform is proposed, and its performance is evaluated in face recognition experiments with a standard dataset. Finally, an a priori model in which LBPs are seen as combinations of n-tuples is also presented.
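A minimal sketch of the basic 3x3 LBP operator discussed above: each of the eight neighbors is thresholded against the center pixel and the comparison results form an 8-bit code. The neighbor ordering below is one common convention, not necessarily the one used in the thesis.

```python
# Basic 3x3 Local Binary Pattern: threshold the 8 neighbors against the
# center pixel and pack the results into an 8-bit code.

def lbp_3x3(patch):
    """patch: 3x3 nested list of intensities; returns the LBP code (0-255)."""
    c = patch[1][1]
    neighbors = [patch[0][0], patch[0][1], patch[0][2],   # clockwise from
                 patch[1][2], patch[2][2], patch[2][1],   # the top-left
                 patch[2][0], patch[1][0]]                # neighbor
    code = 0
    for bit, n in enumerate(neighbors):
        if n >= c:
            code |= 1 << bit
    return code

flat = [[5, 5, 5], [5, 5, 5], [5, 5, 5]]   # uniform patch -> all bits set
peak = [[0, 0, 0], [0, 9, 0], [0, 0, 0]]   # bright center -> no bits set
```

Because the operator depends only on comparisons with the center pixel, adding a constant to every intensity (or any monotonic gray-level change) leaves the code unchanged, which is exactly the kind of illumination invariance the passage above refers to.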

Relevance: 20.00%

Abstract:

The aim of this study was to analyze the alterations of arm and leg movements of patients during stroke gait. Joint angles of upper and lower limbs and spatiotemporal variables were evaluated in two groups: hemiparetic group (HG, 14 hemiparetic men, 53 ± 10 years) and control group (CG, 7 able-bodied men, 50 ± 4 years). The statistical analysis was based on the following comparisons (P ≤ 0.05): 1) right versus left sides of CG; 2) affected (AF) versus unaffected (UF) sides of HG; 3) CG versus both the affected and unaffected sides of HG, and 4) an intracycle comparison of the kinematic continuous angular variables between HG and CG. This study showed that the affected upper limb motion in stroke gait was characterized by a decreased range of motion of the glenohumeral (HG: 6.3 ± 4.5, CG: 20.1 ± 8.2) and elbow joints (AF: 8.4 ± 4.4, UF: 15.6 ± 7.6) on the sagittal plane and elbow joint flexion throughout the cycle (AF: 68.2 ± 0.4, CG: 46.8 ± 2.7). The glenohumeral joint presented a higher abduction angle (AF: 14.2 ± 1.6, CG: 11.5 ± 4.0) and a lower external rotation throughout the cycle (AF: 4.6 ± 1.2, CG: 22.0 ± 3.0). The lower limbs showed typical alterations of the stroke gait patterns. Thus, the changes in upper and lower limb motion of stroke gait were identified. The description of upper limb motion in stroke gait is new and complements gait analysis.

Relevance: 20.00%

Abstract:

Myocardial ischemia, as well as the induction agents used in anesthesia, may cause corrected QT interval (QTc) prolongation. The objective of this randomized, double-blind trial was to determine the effects of high- vs conventional-dose bolus rocuronium on QTc duration and the incidence of dysrhythmias following anesthesia induction and intubation. Fifty patients about to undergo coronary artery surgery were randomly allocated to receive conventional-dose (0.6 mg/kg, group C, n=25) or high-dose (1.2 mg/kg, group H, n=25) rocuronium after induction with etomidate and fentanyl. QTc, heart rate, and mean arterial pressure were recorded before induction (T0), after induction (T1), after rocuronium (just before laryngoscopy; T2), 2 min after intubation (T3), and 5 min after intubation (T4). The occurrence of dysrhythmias was recorded. In both groups, QTc was significantly longer at T3 than at baseline [475 vs 429 ms in group C (P=0.001), and 459 vs 434 ms in group H (P=0.005)]. The incidence of dysrhythmias in group C (28%) and in group H (24%) was similar. The QTc after high-dose rocuronium was not significantly longer than after conventional-dose rocuronium in patients about to undergo coronary artery surgery who were induced with etomidate and fentanyl. In both groups, compared with baseline, QTc was most prolonged at 2 min after intubation, suggesting that QTc prolongation may be due to the nociceptive stimulus of intubation.

Relevance: 20.00%

Abstract:

Neoadjuvant chemotherapy has practical and theoretical advantages over adjuvant chemotherapy strategy in breast cancer (BC) management. Moreover, metronomic delivery has a more favorable toxicity profile. The present study examined the feasibility of neoadjuvant metronomic chemotherapy in two cohorts [HER2+ (TraQme) and HER2− (TAME)] of locally advanced BC. Twenty patients were prospectively enrolled (TraQme, n=9; TAME, n=11). Both cohorts received weekly paclitaxel at 100 mg/m2 during 8 weeks followed by weekly doxorubicin at 24 mg/m2 for 9 weeks in combination with oral cyclophosphamide at 100 mg/day (fixed dose). The HER2+ cohort received weekly trastuzumab. The study was interrupted because of safety issues. Thirty-six percent of patients in the TAME cohort and all patients from the TraQme cohort had stage III BC. Of note, 33% from the TraQme cohort and 66% from the TAME cohort displayed hormone receptor positivity in tumor tissue. The pathological complete response rates were 55% and 18% among patients enrolled in the TraQme and TAME cohorts, respectively. Patients in the TraQme cohort had more advanced BC stages at diagnosis, higher-grade pathological classification, and more tumors lacking hormone receptor expression, compared to the TAME cohort. The toxicity profile was also different. Two patients in the TraQme cohort developed pneumonitis, and in the TAME cohort we observed more hematological toxicity and hand-foot syndrome. The neoadjuvant metronomic chemotherapy regimen evaluated in this trial was highly effective in achieving a tumor response, especially in the HER2+ cohort. Pneumonitis was a serious, unexpected adverse event observed in this group. Further larger and randomized trials are warranted to evaluate the association between metronomic chemotherapy and trastuzumab treatment.

Relevance: 20.00%

Abstract:

The research in this Master's thesis project concerns Big Data transfer over parallel data links; my main objective was to assist the Saint-Petersburg National Research University ITMO research team in accomplishing this project and to apply Green IT methods to the data transfer system. The goal of the team is to transfer Big Data using parallel data links with an SDN OpenFlow approach. My task as a team member was to compare existing data transfer applications in order to determine which achieves the highest transfer speed under which conditions, and to explain the reasons. In this thesis work, five utilities were compared: Fast Data Transfer (FDT), BBCP, BBFTP, GridFTP, and FTS3. A number of scripts were developed that generate random binary data (which is incompressible, ensuring a fair comparison between utilities), execute the utilities with specified parameters, create log files with the results and system parameters, and plot graphs to compare the results. Transferring such enormous volumes of data can take a long time; hence the need to reduce energy consumption and make the transfers greener. In the context of the Green IT approach, our team used a cloud computing infrastructure called OpenStack. It is more efficient to allocate a specific amount of hardware resources to test different scenarios than to use all the resources of our testbed. Testing our implementation on the OpenStack infrastructure ensures that the virtual channel carries no other traffic, so the highest possible throughput can be achieved. With the final results, we are in a position to identify which utilities produce faster data transfers in different scenarios with specific TCP parameters, so that they can be used on real network data links.
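The benchmarking setup described above hinges on using random, effectively incompressible test data so that no utility gains an unfair advantage from on-the-fly compression. A minimal sketch of that preparation step, with an example file name and size:

```python
# Generate a file of random bytes for transfer benchmarking. Random data
# is effectively incompressible, which keeps the comparison between
# transfer utilities fair. File name and size are illustrative.
import os
import zlib

def make_random_file(path, size_bytes):
    with open(path, "wb") as f:
        f.write(os.urandom(size_bytes))

path = "testdata.bin"
make_random_file(path, 1 << 20)  # 1 MiB example payload

# Sanity check: random data should barely compress at all.
with open(path, "rb") as f:
    raw = f.read()
ratio = len(zlib.compress(raw)) / len(raw)  # expect close to (or above) 1.0

os.remove(path)  # clean up the example file
```

In a real run the same seed data would be scaled up to the gigabyte range and fed to each utility (FDT, BBCP, BBFTP, GridFTP, FTS3) with identical TCP parameters.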

Relevance: 20.00%

Abstract:

This thesis studies the use of machine vision in RDF quality assurance and manufacturing. Machine vision is currently used in recycling and material detection, and some commercial products are available on the market. In this thesis, an on-line machine vision system is proposed for characterizing particle size. The proposed system is based on a mapping between image segmentation and the ground truth of the particle size. The results show that the implementation of such a machine vision system is feasible.
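The core idea above, mapping a segmented image to a particle-size estimate, can be sketched in a few lines: label the connected foreground regions of a binary segmentation and convert each region's pixel area to an equivalent circular diameter. This is a toy pure-Python illustration, not the thesis's actual pipeline, and the pixel scale is an assumed parameter.

```python
# Toy particle-size characterization from a binary segmentation: label
# 4-connected components with a flood fill, then map pixel areas to
# equivalent circular diameters.
import math

def label_components(grid):
    """Return the pixel areas of 4-connected foreground components."""
    h, w = len(grid), len(grid[0])
    seen = [[False] * w for _ in range(h)]
    areas = []
    for y in range(h):
        for x in range(w):
            if grid[y][x] and not seen[y][x]:
                stack, area = [(y, x)], 0
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    area += 1
                    for ny, nx in ((cy-1, cx), (cy+1, cx),
                                   (cy, cx-1), (cy, cx+1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and grid[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                areas.append(area)
    return areas

def equivalent_diameter(area_px, px_size_mm=1.0):
    """Diameter of a circle with the same area as the region."""
    return 2.0 * math.sqrt(area_px / math.pi) * px_size_mm

seg = [[1, 1, 0, 0],
       [1, 1, 0, 1],
       [0, 0, 0, 1],
       [0, 1, 0, 1]]
areas = sorted(label_components(seg))  # three "particles"
```

A production system would replace the flood fill with an optimized labeling routine and calibrate `px_size_mm` against the ground-truth sieve data.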

Relevance: 20.00%

Abstract:

Personalized medicine will revolutionize our capabilities to combat disease. Working toward this goal, a fundamental task is the deciphering of genetic variants that are predictive of complex diseases. Modern studies, in the form of genome-wide association studies (GWAS), have afforded researchers the opportunity to reveal new genotype-phenotype relationships through the extensive scanning of genetic variants. These studies typically contain over half a million genetic features for thousands of individuals. Examining such data with methods other than univariate statistics is a challenging task requiring advanced algorithms that scale to the genome-wide level. In the future, next-generation sequencing (NGS) studies will contain an even larger number of common and rare variants. Machine learning-based feature selection algorithms have been shown to be capable of building effective predictive models for various genotype-phenotype relationships. This work explores the problem of selecting the genetic variant subsets that are most predictive of complex disease phenotypes through various feature selection methodologies, including filter, wrapper and embedded algorithms. The examined machine learning algorithms were demonstrated not only to be effective at predicting the disease phenotypes, but also to do so efficiently through the use of computational shortcuts. While much of the work could be run on high-end desktops, some of it was extended so that it could be implemented on parallel computers, helping to ensure that the methods will also scale to NGS data sets. Further, these studies analyzed the relationships between the various feature selection methods and demonstrated the need for careful testing when selecting an algorithm.
It was shown that there is no universally optimal algorithm for variant selection in GWAS; rather, the methodology needs to be selected based on the desired outcome, such as the number of features to be included in the prediction model. It was also demonstrated that without proper model validation, for example using nested cross-validation, models can yield overly optimistic prediction accuracies and decreased generalization ability. It is through the implementation and application of machine learning methods that one can extract predictive genotype-phenotype relationships and biological insights from genetic data sets.
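Nested cross-validation, mentioned above as a guard against overly optimistic accuracies, can be sketched on synthetic data: the inner loop selects a "hyperparameter" (here simply which feature to threshold on) using only the training fold, and the outer loop estimates generalization accuracy on data never seen during selection. The data and classifier are deliberately trivial stand-ins.

```python
# Minimal nested cross-validation sketch on synthetic data. Feature 0 is
# informative (its sign matches the label); feature 1 is pure noise.
import random

random.seed(0)
data = [([1 if y > 0 else -1, random.choice([-1, 1])], y)
        for y in [1, -1] * 20]          # 40 labeled samples
random.shuffle(data)

def accuracy(rows, feat):
    """Classify by the sign of the chosen feature."""
    return sum((x[feat] > 0) == (y > 0) for x, y in rows) / len(rows)

def k_folds(rows, k):
    """Split rows into k (train, validation) fold pairs."""
    size = len(rows) // k
    return [(rows[:i] + rows[i + size:], rows[i:i + size])
            for i in range(0, len(rows), size)]

outer_scores = []
for train, test in k_folds(data, 5):
    # Inner CV: select the feature using the training fold only.
    best_feat = max((0, 1), key=lambda f: sum(
        accuracy(val, f) for _, val in k_folds(train, 4)))
    # Outer evaluation: score on held-out data not used for selection.
    outer_scores.append(accuracy(test, best_feat))

mean_acc = sum(outer_scores) / len(outer_scores)
```

Scoring the selected model only on the outer held-out folds is what prevents the selection step from leaking into the accuracy estimate; skipping the outer loop is exactly how the overly optimistic numbers described above arise.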

Relevance: 20.00%

Abstract:

The possibilities of implementing laser cutting in a paper making machine were studied as the main objective of this work. Laser cutting technology was considered as a replacement for the conventional methods used in paper making machines for longitudinal cutting, such as edge trimming at different stages of the paper making process and tambour roll slitting. Laser cutting of paper was first tested in the 1970s. Since then, laser cutting and processing have been applied to paper materials in industry with varying degrees of success. Laser cutting can be employed for longitudinal cutting of the paper web in the machine direction. The most common conventional methods in paper making machines are water jet cutting and rotating slitting blades. Cutting with a CO2 laser fulfils the basic requirements for cutting quality, applicability to the material and cutting speed in all locations where longitudinal cutting is needed. The literature review describes the advantages, disadvantages and challenges of laser technology applied to cutting paper material, with particular attention to cutting a moving paper web. Based on the studied laser cutting capabilities and the problems of the conventional cutting technologies, a preliminary selection of the most promising application area was carried out. Laser cutting (trimming) of the paper web edges in the wet end was estimated to be the most promising area for implementation. This assumption was based on the rate of web breaks: up to 64% of all web breaks were found to occur in the wet end, particularly at the so-called open draws where the paper web is transferred unsupported by wire or felt. The distribution of web breaks in the machine cross direction revealed that defects at the paper web edge were the main cause of tearing initiation and the consequent web break.
It was assumed that laser cutting could improve the tensile strength of the cut edge owing to the high cutting quality and the sealing effect of the edge after cutting; studies of laser ablation of cellulose support this claim. The linear energy needed for cutting was calculated with regard to the paper web properties at the intended cutting location, and the calculated value was verified with a series of laser cutting trials. The laser energy needed for cutting in practice deviated from the calculated values; this can be explained by differences in radiative heat transfer during laser cutting and by the different absorption characteristics of dry and moist paper. Laser-cut samples (both dry and moist, with a dry matter content of about 25-40%) were tested for strength properties. The tensile strength and strain at break of laser-cut samples proved similar to the corresponding values of non-laser-cut samples. The chosen method, however, did not address the tensile strength of the laser-cut edge in particular, so the assumption that laser cutting improves strength properties was not fully proved. The effect of laser cutting on possible contamination of the mill broke (recycling of the trimmed edge) was also studied: laser-cut samples (both dry and moist) were tested for dirt particle content. The tests revealed that dust particles can accumulate on the surface of moist samples; this must be taken into account to prevent contamination of the pulp suspension when trim waste is recycled. The material loss due to evaporation during laser cutting and the amount of solid residue after cutting were evaluated: edge trimming with a laser would produce 0.25 kg/h of solid residue and 2.5 kg/h of material lost to evaporation. Schemes for implementing laser cutting and the required equipment were discussed. In general, a laser cutting system would require two laser sources (one per cutting zone), a set of beam transfer and focusing optics, and cutting heads.
To increase the reliability of the system, it was suggested that each laser source have double capacity, which would allow cutting to continue with a single source working at full capacity for both cutting zones. Laser technology is already at the required level and needs no additional development. Moreover, the potential for speed increases is high thanks to the availability of high-power laser sources, which supports the trend of increasing paper machine speeds. The laser cutting system would require a special roll to support cutting; a scheme for such a roll was proposed, together with its integration into the paper making machine. Laser cutting could be located at the central roll of the press section, before the so-called open draw where many web breaks occur, where it has the potential to improve the runnability of the paper making machine. The economic performance of laser cutting was assessed by comparing a laser cutting system with water jet cutting operating under the same conditions. Laser cutting would still be about twice as expensive as water jet cutting, mainly because of the high investment cost of laser equipment and the poor energy efficiency of CO2 lasers. Another factor is that laser cutting causes material loss through evaporation, whereas water jet cutting causes almost none. Despite the difficulties of implementing laser cutting in a paper making machine, it can be beneficial, above all through the possibility of improving the strength properties of the cut edge and consequently reducing the number of web breaks. The ability of laser cutting to maintain cutting speeds exceeding the current speeds of paper making machines is another argument for considering laser cutting technology in the design of new high-speed paper machines.
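The linear-cutting-energy calculation described above reduces, at back-of-the-envelope level, to multiplying the linear energy needed to sever the web (J/m) by the web speed (m/s) to obtain the required laser power per cutting zone. The sketch below uses purely illustrative numbers, not values from the study.

```python
# Back-of-the-envelope laser power from linear cutting energy and web
# speed. E_lin and v_web are assumed example values.

def required_power(linear_energy_J_per_m, web_speed_m_per_min):
    """Laser power P = E_lin * v, with v converted to m/s."""
    v = web_speed_m_per_min / 60.0
    return linear_energy_J_per_m * v  # watts

E_lin = 20.0    # J/m, assumed energy to cut through the moist web
v_web = 1500.0  # m/min, a typical modern paper machine speed

p_per_cut = required_power(E_lin, v_web)  # power per cutting zone, W
p_total = 2 * p_per_cut                   # two trimmed edges -> two zones
```

This also makes the two-laser-source scheme above concrete: each source must be sized for one cutting zone, with double capacity if one source is to cover both zones during a failure.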

Relevance: 20.00%

Abstract:

The subject of this thesis is automatic sentence compression with machine learning, such that the compressed sentences remain grammatical and retain their essential meaning. There are multiple possible uses for the compression of natural language sentences; this thesis focuses on the generation of television program subtitles, which are often compressed versions of the program's original script. The main part of the thesis consists of machine learning experiments on automatic sentence compression using different approaches to the problem. The machine learning methods used in this work are linear-chain conditional random fields and support vector machines. We also examine which automatic text analysis methods provide useful features for the task. The data used for machine learning were supplied by Lingsoft Inc. and consist of subtitles in both compressed and uncompressed form. The models are compared to a baseline system, both automatically and through human evaluation, because of the potentially subjective nature of the output. The best result is achieved using a CRF sequence classifier with a rich feature set. All text analysis methods help classification, and the most useful is morphological analysis.
[Translated from the Finnish abstract:] The topic of the thesis is the automatic compression of Finnish-language sentences so that the shortened sentences retain their essential information and remain grammatical. Compressing natural language sentences has many uses, but in this thesis the topic is approached through television program subtitling, which in practice involves shortening the original text to fit better on the television screen. The thesis experiments with different machine learning methods for automatic text compression and examines how well different natural language analysis methods produce information that helps these methods shorten sentences, as well as which approach produces the best result. The machine learning methods used are a support vector machine and a linear-chain CRF. The machine learning draws on subtitles at their various processing stages, obtained from Lingsoft Oy. The resulting models are compared, and their outputs are evaluated automatically and, because the output is somewhat subjective, also by human evaluation; a method taken from the literature serves as the baseline. The best result is obtained with a CRF sequence classifier using a broad feature set. All of the text analysis methods tried help classification, with the most important contribution coming from morphological analysis.
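Sentence compression with a linear-chain CRF is typically framed as per-token sequence labeling: each token receives a keep/drop label and the kept tokens form the compressed sentence. The sketch below shows only that framing; a trivial stoplist stands in for the learned model, and the word list and example sentence are illustrative, not from the thesis data.

```python
# Sentence compression as per-token keep/drop labeling. A stoplist of
# assumed low-content words stands in for a trained sequence model.

DROPPABLE = {"very", "really", "quite", "just"}  # illustrative stoplist

def compress(tokens):
    """Label each token KEEP/DROP, then return (kept tokens, labels)."""
    labels = ["DROP" if t.lower() in DROPPABLE else "KEEP" for t in tokens]
    kept = [t for t, label in zip(tokens, labels) if label == "KEEP"]
    return kept, labels

tokens = "The program is really very good".split()
kept, labels = compress(tokens)
```

In the actual experiments the labeling decision would come from the CRF's learned feature weights (drawing on, e.g., morphological analysis) rather than a fixed list, but the output format is the same label sequence.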

Relevance: 20.00%

Abstract:

The quality of flat breads depends in part on their textural properties during storage, which are largely affected by flour protein quality and quantity. The present study aimed to examine differences between the sensory properties, texture and staling of Tandoori breads made from flours of different protein quality and quantity. Three flours were used, with protein contents of 9.4, 11.5 and 13.5% and different protein qualities, indicated by Zeleny sedimentation volumes of 16.25, 22.75 and 23.25 mL respectively. Bread strips were subjected to uniaxial compression between two parallel plates on an Instron universal testing machine, and the firmness of the breads was determined. The results indicated differences in the sensory attributes of breads produced from flours of different protein content and quality, demonstrating that high-protein, high-quality flours are not able to sheet and expand under the high-temperature, short-time conditions employed in Taftoon bread production and are therefore not suitable for this kind of bread. The flour with 11.5% protein content produced bread with better sensory characteristics and an acceptable storage time.

Relevance: 20.00%

Abstract:

Global energy consumption has been increasing yearly, and a large portion of it is used in rotating electrical machinery, so it is clear that these machines should use energy efficiently. The aim of this dissertation is to improve the design process of high-speed electrical machines, especially from the mechanical engineering perspective, in order to achieve more reliable and efficient machines. The design process of high-speed machines is challenging owing to high demands and numerous interactions between engineering disciplines such as mechanical, electrical and energy engineering. A multidisciplinary design flow chart utilizing computer simulation is proposed for a specific type of high-speed machine. In addition to using simulation in parallel with the design process, two simulation studies are presented: the first explores the limits of two ball bearing models, and the second studies how the load capacity of a machine in a compressor application can be improved beyond the limits of current machinery. The proposed flow chart and simulation studies show clearly that improvements in the high-speed machinery design process can be achieved. Engineers designing high-speed machines can use the flow chart and simulation results as a guideline during the design phase to achieve more reliable and efficient machines that use energy efficiently under the required operating conditions.