941 results for Multi-Criteria Optimisation


Relevance: 30.00%

Abstract:

20 years after the discovery of the first planets outside our solar system, the current exoplanetary population includes more than 700 confirmed planets around main-sequence stars. Approximately 50% belong to multiple-planet systems in very diverse dynamical configurations, from two-planet hierarchical systems to multiple resonances that could only have been attained as the consequence of a smooth large-scale orbital migration. The first part of this paper reviews the main techniques employed for the detection and orbital characterization of multiple-planet systems, from the (now) classical radial velocity (RV) method to the use of transit timing variations (TTV) for the identification of additional planetary bodies orbiting the same star. In the second part we discuss the dynamical evolution of multi-planet systems due to their mutual gravitational interactions. We analyze possible modes of motion for hierarchical, secular or resonant configurations, and what stability criteria can be defined in each case. In some cases the dynamics can be well approximated by simple analytical expressions for the Hamiltonian function, while other configurations can only be studied with semi-analytical or numerical tools. In particular, we show how mean-motion resonances can generate complex structures in phase space, where different libration islands and circulation domains are separated by chaotic layers. In all cases we use real exoplanetary systems as working examples.
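The simple analytical approximations mentioned above can be illustrated schematically: near an isolated mean-motion resonance, the averaged dynamics is often reduced to a one-degree-of-freedom pendulum-like Hamiltonian (a sketch only; the coefficients α and ε encode the masses and orbital elements of the particular system):

```latex
% Schematic one-degree-of-freedom approximation near a mean-motion
% resonance: \sigma is the resonant (critical) angle and p its
% conjugate momentum; \alpha and \varepsilon are system-dependent
% coefficients.
H(p,\sigma) \approx \frac{\alpha}{2}\,p^{2} - \varepsilon\,\cos\sigma
```

Libration of σ about the stable equilibrium corresponds to resonant motion, circulation to non-resonant orbits, and the chaotic layers mentioned in the abstract develop around the separatrix between the two regimes, especially where neighbouring resonances overlap.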

Relevance: 30.00%

Abstract:

The hierarchical organisation of biological systems plays a crucial role in the pattern formation of gene expression resulting from morphogenetic processes, where the autonomous internal dynamics of cells, as well as cell-to-cell interactions through membranes, are responsible for the emergence of the peculiar structures of the individual phenotype. Being able to reproduce the system's dynamics at the different levels of such a hierarchy can be very useful for studying this complex phenomenon of self-organisation. The idea is to model the phenomenon in terms of a large and dynamic network of compartments, where the interplay between inter-compartment and intra-compartment events determines the emergent behaviour resulting in the formation of spatial patterns. On these premises, the thesis reviews the different approaches already developed for modelling problems in developmental biology, as well as the main models and infrastructures available in the literature for modelling biological systems, analysing their capabilities in tackling multi-compartment/multi-level models. The thesis then introduces a practical framework, MS-BioNET, for modelling and simulating these scenarios by exploiting the potential of multi-level dynamics. It is based on (i) a computational model featuring networks of compartments and an enhanced model of chemical reactions that addresses molecule transfer, (ii) a logic-oriented language to flexibly specify complex simulation scenarios, and (iii) a simulation engine based on the many-species/many-channels optimised version of Gillespie's direct method. The thesis finally proposes the adoption of the agent-based model as an approach capable of capturing multi-level dynamics. To overcome the problem of parameter tuning in the model, the simulators are supplied with a module for parameter optimisation.
The task is defined as an optimisation problem over the parameter space, in which the objective function to be minimised is the distance between the output of the simulator and a target output. The problem is tackled with a metaheuristic algorithm. As an example of application of the MS-BioNET framework and of the agent-based model, a model of the first stages of Drosophila melanogaster development is realised. Its goal is to generate the early spatial pattern of gap gene expression. The correctness of the models is shown by comparing the simulation results with real gene-expression data with spatial and temporal resolution, acquired from free online sources.
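Gillespie's direct method, whose optimised variant drives the simulation engine described above, can be sketched in a few lines (a minimal single-compartment version; the reaction set and rate constant below are invented for illustration, not taken from MS-BioNET):

```python
import math
import random

def gillespie_direct(state, reactions, t_end, seed=0):
    """Gillespie's direct method (SSA). `state` maps species to counts;
    `reactions` is a list of (propensity_fn, state_change) pairs."""
    rng = random.Random(seed)
    t = 0.0
    while t < t_end:
        props = [a(state) for a, _ in reactions]
        a0 = sum(props)
        if a0 == 0.0:
            break  # no reaction can fire any more
        # time to the next reaction: exponential with total rate a0
        t += -math.log(1.0 - rng.random()) / a0
        if t >= t_end:
            break
        # choose which reaction fires, proportionally to its propensity
        r, acc = rng.random() * a0, 0.0
        for (_, change), a in zip(reactions, props):
            acc += a
            if r < acc:
                for species, delta in change.items():
                    state[species] += delta
                break
    return state

# toy example: first-order degradation A -> 0 with propensity 0.5 * #A
state = {"A": 100}
rxns = [(lambda s: 0.5 * s["A"], {"A": -1})]
final = gillespie_direct(state, rxns, t_end=1000.0)
```

Over this horizon the degradation reaction exhausts the population, so the simulation stops when the total propensity reaches zero.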

Relevance: 30.00%

Abstract:

The Gaia space mission is a major project for the European astronomical community. As challenging as it is, the processing and analysis of the huge data flow incoming from Gaia is the subject of thorough study and preparatory work by the DPAC (Data Processing and Analysis Consortium), in charge of all aspects of the Gaia data reduction. This PhD thesis was carried out in the framework of the DPAC, within the team based in Bologna. The task of the Bologna team is to define the calibration model and to build a grid of spectro-photometric standard stars (SPSS) suitable for the absolute flux calibration of the Gaia G-band photometry and the BP/RP spectrophotometry. Such a flux calibration can be performed by repeatedly observing each SPSS during the lifetime of the Gaia mission and by comparing the observed Gaia spectra to the spectra obtained by our ground-based observations. Because of both the different observing sites involved and the huge number of frames expected (≃100000), it is essential to maintain the maximum homogeneity in data quality, acquisition and treatment, and particular care has to be taken to test the capabilities of each telescope/instrument combination (through the "instrument familiarization plan") and to devise methods to keep under control, and where necessary correct for, the typical instrumental effects that can affect the high precision required for the Gaia SPSS grid (a few % with respect to Vega). I contributed to the ground-based survey of Gaia SPSS in many respects: the observations, the instrument familiarization plan, the data reduction and analysis activities (both photometry and spectroscopy), and the maintenance of the data archives. However, the field I was personally responsible for was photometry, and in particular relative photometry for the production of short-term light curves.
In this context I defined and tested a semi-automated pipeline which allows for the pre-reduction of SPSS imaging data and the production of aperture-photometry catalogues ready to be used for further analysis. A series of semi-automated quality-control criteria are included in the pipeline at various levels, from pre-reduction, to aperture photometry, to light-curve production and analysis.
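The aperture-photometry step of such a pipeline reduces, at its core, to summing the sky-subtracted flux inside a circular aperture, with the background estimated from a surrounding annulus (a minimal NumPy sketch, not the actual pipeline; the synthetic image and radii below are invented for the example):

```python
import numpy as np

def aperture_photometry(image, x0, y0, r_ap, r_in, r_out):
    """Sky-subtracted flux inside a circular aperture of radius r_ap
    centred on (x0, y0); the background per pixel is estimated as the
    median of the annulus r_in <= r < r_out."""
    yy, xx = np.indices(image.shape)
    r = np.hypot(xx - x0, yy - y0)
    aperture = r < r_ap
    annulus = (r >= r_in) & (r < r_out)
    sky = np.median(image[annulus])                 # background per pixel
    return image[aperture].sum() - sky * aperture.sum()

# synthetic frame: flat sky of 10 counts/pixel plus a 500-count point source
img = np.full((64, 64), 10.0)
img[32, 32] += 500.0
flux = aperture_photometry(img, 32, 32, r_ap=5, r_in=8, r_out=12)
```

On this synthetic frame the sky estimate is exact, so the recovered flux equals the injected 500 counts.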

Relevance: 30.00%

Abstract:

The optical properties and the surface-enhancement effect of rough metal surfaces and nanoparticles have been discussed extensively in the literature for the infrared region of the spectrum. There are in principle two strategies for preparing such surfaces: the nanoparticles can first be synthesized ex situ, or, in the second approach, the nanoparticles are produced and grown in situ. Both approaches were tested here, and it turned out that only the in-situ synthesis of the gold nanoparticles yields nanostructured surfaces that are electronically conductive, not too rough to permit membrane formation, and at the same time show an optimal surface-enhancement effect. Although no ideal nanoparticle shape can be obtained by in-situ synthesis, the particles nevertheless behave according to the theory of the surface-enhancement effect. Optimization of the shape and size of the nanoparticles led in this work to an optimization of the enhancement effect. Such optimized surfaces could easily be reproduced and are distinguished by high stability. The surface-enhancement factor obtained is 128 in absolute terms compared with the coated ATR crystal without nanoparticles, or about a factor of 6 compared with the surface used in our group until now. Spectra can therefore now be obtained with a markedly better signal-to-noise ratio (SNR), which considerably simplifies and shortens the evaluation and processing of the acquired spectra.
After optimizing the metal surface and the measurement parameters using cytochrome c as an example, work turned to the surface immobilization of the much larger cytochrome c oxidase (CcO). For this purpose the DTNTA linker was synthesized ex situ. Mixed self-assembled monolayers of DTNTA and DTP were then prepared. The NTA functionality is responsible for binding the CcO via his-tag technology. The criteria for an optimal linker concentration were the electrical parameters of the layer before and after reconstitution into a lipid membrane, as well as electron-transfer rates determined by electrochemical measurements. Only with this optimized system, which works reliably and reproducibly, could further measurements on the CcO be started. From electrochemical measurements it was known that the CcO can be driven into an activated state by direct electron transfer under oxygen saturation. This activated state is characterized by a shift of the redox potentials by about 400 mV with respect to the redox potential known from equilibrium titrations. SEIRAS showed that the reduction and oxidation of all redox centres indeed occur at the potentials measured by cyclic voltammetry. Moreover, the SEIRA spectra revealed that direct electron transfer causes substantial conformational changes within the protein. Until now, based on electron transfer via mediators, it had been assumed that only minimal conformational changes are involved. Above all, the activated and non-activated states of cytochrome c oxidase could be detected spectroscopically for the first time.

Relevance: 30.00%

Abstract:

Introduction Acute hemodynamic instability increases morbidity and mortality. We investigated whether early non-invasive cardiac output monitoring enhances hemodynamic stabilization and improves outcome. Methods A multicenter, randomized controlled trial was conducted in three European university hospital intensive care units in 2006 and 2007. A total of 388 hemodynamically unstable patients identified during their first six hours in the intensive care unit (ICU) were randomized to receive either non-invasive cardiac output monitoring for 24 hrs (minimally invasive cardiac output/MICO group; n = 201) or usual care (control group; n = 187). The main outcome measure was the proportion of patients achieving hemodynamic stability within six hours of starting the study. Results The number of hemodynamic instability criteria at baseline (MICO group mean 2.0 (SD 1.0), control group 1.8 (1.0); P = .06) and severity of illness (SAPS II score; MICO group 48 (18), control group 48 (15); P = .86) were similar. At 6 hrs, 45 patients (22%) in the MICO group and 52 patients (28%) in the control group were hemodynamically stable (mean difference 5%; 95% confidence interval of the difference -3 to 14%; P = .24). Hemodynamic support with fluids and vasoactive drugs, and pulmonary artery catheter use (MICO group: 19%, control group: 26%; P = .11) were similar in the two groups. The median length of ICU stay was 2.0 (interquartile range 1.2 to 4.6) days in the MICO group and 2.5 (1.1 to 5.0) days in the control group (P = .38). Hospital mortality was 26% in the MICO group and 21% in the control group (P = .34). Conclusions Minimally invasive cardiac output monitoring added to usual care does not facilitate early hemodynamic stabilization in the ICU, nor does it alter the hemodynamic support or outcome.
Our results emphasize the need to evaluate technologies used to measure stroke volume and cardiac output, especially their impact on the process of care, before any large-scale outcome studies are attempted.
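The reported effect estimate can be reconstructed from the counts in the abstract using the standard normal (Wald) approximation for a difference of two proportions (a sketch for checking the arithmetic, not the trial's actual analysis code):

```python
import math

def diff_of_proportions_ci(k1, n1, k2, n2, z=1.96):
    """95% CI for p2 - p1 using the normal (Wald) approximation."""
    p1, p2 = k1 / n1, k2 / n2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    d = p2 - p1
    return d, d - z * se, d + z * se

# MICO group: 45/201 stable at 6 h; control group: 52/187
d, lo, hi = diff_of_proportions_ci(45, 201, 52, 187)
# d is about 0.05 with CI about (-0.03, 0.14), matching the reported
# mean difference of 5% (95% CI -3 to 14%)
```

The agreement with the published interval confirms that a simple unadjusted comparison of proportions underlies the reported result.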

Relevance: 30.00%

Abstract:

Previous studies have shown that collective property rights offer higher flexibility than individual property and improve sustainable community-based forest management. Our case study, carried out in the Beni department of Bolivia, does not contradict this assertion, but shows that collective rights have been granted in areas where ecological contexts and market access were less favourable to intensive land use. Previous experiences suggest investigating political processes in order to understand the criteria according to which access rights were distributed. Based on remote sensing and on a multi-level land governance framework, our research confirms that land placed under collective rights, compared to individual property, is less affected by deforestation among Andean settlements. However, analysis of the historical process of land distribution in the area shows that the distribution of property rights is the result of a political process based on economic, spatial, and environmental strategies that are defined by multiple stakeholders. Collective titles were established in the more remote areas and distributed to communities with lower productive potential. Land rights are thus a secondary factor of forest-cover change, which results from diverse political compromises based on population distribution, accessibility, environmental perceptions, and expected production or extraction incomes.

Relevance: 30.00%

Abstract:

BACKGROUND Retinal optical coherence tomography (OCT) permits quantification of retinal layer atrophy relevant to the assessment of neurodegeneration in multiple sclerosis (MS). Measurement artefacts may limit the use of OCT in MS research. OBJECTIVE An expert task force convened with the aim of providing guidance on the use of validated quality control (QC) criteria for the use of OCT in MS research and clinical trials. METHODS A prospective multi-centre (n = 13) study. Peripapillary ring scan QC rating of an OCT training set (n = 50) was followed by a test set (n = 50). Inter-rater agreement was calculated using kappa statistics. Results were discussed at a round table after the assessment had taken place. RESULTS The inter-rater QC agreement was substantial (kappa = 0.7). Disagreement was highest for judging signal strength (kappa = 0.40). Future steps to resolve these issues were discussed. CONCLUSION Substantial agreement for QC assessment was achieved with the aid of the OSCAR-IB criteria. The task force has developed a website for free online training and QC certification. The criteria may prove useful for future research and trials in MS using OCT as a secondary outcome measure in a multi-centre setting.
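Inter-rater agreement of the kind reported above is commonly computed as Cohen's kappa: the observed agreement corrected for the agreement expected by chance (a minimal two-rater sketch; the QC ratings shown are invented, not the study's data):

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters scoring the same items."""
    n = len(ratings_a)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    freq_a = Counter(ratings_a)
    freq_b = Counter(ratings_b)
    # chance agreement: probability both raters pick the same category
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# toy QC ratings ("pass"/"fail") for ten scans; the raters agree on 9 of 10
a = ["pass"] * 7 + ["fail"] * 3
b = ["pass"] * 6 + ["fail"] * 4
kappa = cohens_kappa(a, b)
```

Values above roughly 0.6 are conventionally read as "substantial" agreement, which is the interpretation the abstract applies to its kappa of 0.7.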

Relevance: 30.00%

Abstract:

Introduction: In professional soccer, talent selection relies on the subjective judgment of scouts and coaches. To date, little is known about coaches' "eye for talent" (Christensen, 2009, p. 379) and the nature of the subjective criteria they use to identify those players with the greatest potential to achieve peak performance in adulthood (Williams & Reilly, 2000). Drawing on a constructivist approach (Kelly, 1991), this study explores coaches' subjective talent criteria. It is assumed that coaches are able to verbalise and specify their talent criteria, and that these are related to their instinct-based talent selection decisions. Methods: Participants and generation of data. Five national youth soccer coaches (mean age = 55.6 years; SD = 5.03) were investigated at three appointments: (1) talent selection decision based on instinct, (2) semi-structured inductive interview to elicit each coach's talent criteria in detail, (3) communicative validation and evaluation of the players by each coach using the repertory grid technique (Fromm, 2004). Data analysis: Interviews were transcribed and summarized with regard to each specified talent criterion. Each talent criterion was categorized using a bottom-up approach (meaning categorization; Kvale, 1996). The repertory grid data were analysed using descriptive statistics and correlation analysis. Results and discussion: For each coach, six to nine talent criteria were elicited and specified. The subjective talent criteria include aspects of personality, cognitive-perceptual skills, motor abilities, development, technique, social environment and physical constitution, which shows that the coaches use a multi-dimensional concept of talent. However, more than half of all criteria describe personality characteristics, in particular achievement motivation, volition and self-confidence.
In contrast to Morris (2000), this result shows that coaches have a differentiated view of the personality characteristics required to achieve peak performance. As an indication of criterion validity, moderate to high correlations (.57 ≤ r ≤ .81) are found between the evaluations of the players according to the coaches' talent criteria and their talent selection decision. The study shows that coaches are able to specify their subjective talent criteria and that those criteria are strongly related to their instinctive selection decisions. References: Christensen, M. K. (2009). "An eye for talent": Talent identification and the "practical sense" of top-level soccer coaches. Sociology of Sport Journal, 26, 365-382. Fromm, M. (2004). Introduction to the Repertory Grid Interview. Münster: Waxmann. Kelly, G. A. (1991). The Psychology of Personal Constructs: Volume One: Theory and Personality. London: Routledge. Kvale, S. (1996). InterViews: An Introduction to Qualitative Research Interviewing. Thousand Oaks: Sage. Morris, T. (2000). Psychological characteristics and talent identification in soccer. Journal of Sports Sciences, 18, 715-726. Williams, A. M., & Reilly, T. (2000). Talent identification and development in soccer. Journal of Sports Sciences, 18, 657-667.

Relevance: 30.00%

Abstract:

The purpose of this study is to examine the stages of program realization of the interventions that the Bronx Health REACH program initiated at various levels to improve nutrition as a means of reducing racial and ethnic disparities in diabetes. This study was based on secondary analyses of qualitative data collected through the Bronx Health REACH Nutrition Project, a project conducted under the auspices of the Institute on Urban Family Health, with support from the Centers for Disease Control and Prevention (CDC). Local human subjects' review and approval through the Institute on Urban Family Health was required and obtained in order to conduct the Bronx Health REACH Nutrition Project. The study drew on two theoretical models: Glanz and colleagues' nutrition environments model and Shediac-Rizkallah and Bone's sustainability model. The specific study objectives were two-fold: (1) to assign each nutrition activity to a specific dimension (i.e. the consumer, organizational or community nutrition environment); and (2) to evaluate the stage at which the program has been realized (i.e. development, implementation or sustainability). A case study approach was applied and a constant comparative method was used to analyze the data. Triangulation of the data was also conducted.
Qualitative data from this study revealed the following principal findings: (1) communities of color are disproportionately experiencing numerous individual and environmental factors contributing to the disparities in diabetes; (2) multi-level strategies that target the individual, organizational and community nutrition environments can appropriately address these contributing factors; (3) the nutrition strategies varied greatly in their ability to meet the criteria for the three program stages; and (4) those nutrition strategies most likely to succeed (a) conveyed consistent and culturally relevant messages, (b) had continued involvement from program staff and partners, (c) were able to adapt over time or setting, (d) had a program champion and a training component, (e) were integrated into partnering organizations, and (f) were perceived to be successful by program staff and partners in their efforts to create individual, organizational and community/policy change. As a result of the criteria-based assessment and the qualitative findings, an ecological framework elaborating on Glanz and colleagues' model was developed. The qualitative findings and the resulting ecological framework will help public health professionals and community leaders to develop and implement sustainable multi-level nutrition strategies for addressing racial and ethnic disparities in diabetes.

Relevance: 30.00%

Abstract:

Detailed analyses of the Lake Van pollen, Ca/K ratio and stable oxygen isotope records allow the identification of millennial-scale vegetation and environmental changes in eastern Anatolia throughout the last glacial (~75-15 ka BP). The climate within the last glacial was cold and dry, with low arboreal pollen (AP) levels. The driest and coldest period corresponds to Marine Isotope Stage (MIS) 2 (~28-14.5 ka BP), dominated by the highest values of xerophytic steppe vegetation. Our high-resolution multi-proxy record shows rapid expansions and contractions of tree populations that reflect variability in temperature and moisture availability. These rapid vegetation and environmental changes can be linked to the stadial-interstadial pattern of the Dansgaard-Oeschger (DO) events as recorded in the Greenland ice cores. Periods of reduced moisture availability were characterized by enhanced xerophytic species and high terrigenous input from the Lake Van catchment area. Furthermore, comparison with the marine realm reveals that the complex atmosphere-ocean interaction can be explained by the strength and position of the westerlies, which are responsible for the supply of humidity to eastern Anatolia. Influenced by the diverse topography of the Lake Van catchment, the larger DO interstadials (e.g. DO 19, 17-16, 14, 12 and 8) show the greatest expansion of temperate species within the last glacial. However, Heinrich events (HE), characterized by the highest concentrations of ice-rafted debris (IRD) in marine sediments, are identified in eastern Anatolia by AP values that are no lower, and steppe components that are no more abundant, than during DO stadials. In addition, this work is a first attempt to establish a continuous microscopic charcoal record over the last glacial in the Near East, which documents an immediate initial response to millennial-scale climate and environmental variability and enables us to shed light on the history of fire activity during the last glacial.

Relevance: 30.00%

Abstract:

A high-resolution multi-proxy record from Lake Van, eastern Anatolia, derived from a lacustrine sequence cored at the 357 m deep Ahlat Ridge (AR), allows a comprehensive view of the paleoclimate and environmental history of the continental Near East during the last interglacial (LI). We combined paleovegetation (pollen), stable oxygen isotope (δ18Obulk) and XRF data from the same sedimentary sequence, showing distinct variations during the period from 135 to 110 ka ago leading into and out of full interglacial conditions. The last interglacial plateau, as defined by the presence of thermophilous steppe-forest communities, lasted ca. 13.5 ka, from ~129.1-115.6 ka BP. The detailed palynological sequence at Lake Van documents a vegetation succession with several climatic phases: (I) the Pistacia zone (ca. 131.2-129.1 ka BP) indicates summer dryness and mild winter conditions during the initial warming, (II) the Quercus-Ulmus zone (ca. 129.1-127.2 ka BP) occurred during warm and humid climate conditions with enhanced evaporation, (III) the Carpinus zone (ca. 127.2-124.1 ka BP) suggests increasingly cooler and wetter conditions, and (IV) the expansion of Pinus at ~124.1 ka BP marks the onset of a colder/drier environment that extended into the interval of global ice growth. Pollen data suggest migration of thermophilous trees from refugial areas at the beginning of the last interglacial. Analogous to the current interglacial, this migration documents a time lag of 2.1 ka between the onset of climatic amelioration and the establishment of an oak steppe-forest. Hence, the major difference between the last interglacial and the current interglacial (Holocene) is the abundance of Pinus as well as the decrease of deciduous broad-leaved trees, indicating higher continentality during the last interglacial.
Finally, our results demonstrate intra-interglacial variability in the low mid-latitudes and suggest a close connection with the high-frequency climate variability recorded in Greenland ice cores.

Relevance: 30.00%

Abstract:

An important objective of the INTEGRATE project is to build tools that support the efficient execution of post-genomic multi-centric clinical trials in breast cancer, which includes the automatic assessment of the eligibility of patients for available trials. The population suited to be enrolled in a trial is described by a set of free-text eligibility criteria that are both syntactically and semantically complex. At the same time, the assessment of the eligibility of a patient for a trial requires a machine-processable understanding of the semantics of the eligibility criteria, in order to further evaluate whether the patient data available, for example, in the hospital EHR satisfies these criteria. This paper presents an analysis of the semantics of clinical trial eligibility criteria based on relevant medical ontologies in the clinical research domain: SNOMED-CT, LOINC, and MedDRA. We detect subsets of these widely adopted ontologies that characterize the semantics of the eligibility criteria of trials in various clinical domains and compare these sets. Next, we evaluate the occurrence frequency of the concepts in the concrete case of breast cancer (our first application domain) in order to provide meaningful priorities for the task of binding/mapping these ontology concepts to the actual patient data. We further assess the effort required to extend our approach to new domains in terms of the additional semantic mappings that need to be developed.
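The frequency-based prioritisation described above amounts to counting, over a corpus of criteria already annotated with ontology concepts, how often each concept occurs, and binding the most frequent concepts to patient data first (a sketch; the concept identifiers and criteria below are invented placeholders, not real SNOMED-CT/LOINC/MedDRA content):

```python
from collections import Counter

# each eligibility criterion, annotated with the ontology concepts it uses
# (hypothetical concept identifiers, for illustration only)
annotated_criteria = [
    {"concepts": ["C:breast_carcinoma", "C:age", "C:performance_status"]},
    {"concepts": ["C:breast_carcinoma", "C:her2_status"]},
    {"concepts": ["C:age", "C:breast_carcinoma"]},
]

def concept_priorities(criteria):
    """Rank concepts by how many criteria mention them, so that the
    most frequent ones are mapped to patient data first."""
    counts = Counter(c for crit in criteria for c in crit["concepts"])
    return [concept for concept, _ in counts.most_common()]

priorities = concept_priorities(annotated_criteria)
```

Mapping in this order front-loads the concepts that unlock the largest number of criteria, which is the rationale for the frequency analysis in the paper.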

Relevance: 30.00%

Abstract:

Among the different optical modulator technologies available, such as polymers, III-V semiconductors, and silicon, the well-known lithium niobate (LN) offers the best trade-off in terms of performance, ease of use, and power-handling capability [1-9]. LN technology is still widely deployed within current high-data-rate fibre-optic communication networks. This technology is also the most mature and guarantees the reliability required for space applications [9]. In order to fulfil the target specifications of opto-microwave payloads, the design of a Mach-Zehnder (MZ) modulator working at the 1500 nm telecom wavelength was optimized within the framework of the ESA-ARTES "Multi GigaHertz Optical Modulator" (MGOM) project, in order to reach ultra-low optical insertion loss and a low effective driving voltage in the Ka band. The selected modulator configuration was the X-cut crystal orientation, associated with a high-stability titanium in-diffusion process for the optical waveguide. Starting from an initial modulator configuration exhibiting a 9 V drive voltage at 30 GHz, a complete redesign of the coplanar microwave electrodes was carried out in order to reach a 6 V drive voltage at 30 GHz, together with an optimization of the interaction between the optical waveguide and the electrodes. Following these optimisation steps, an evaluation program was applied to a batch of 8 identical modulators. A full characterisation was carried out to compare performance, showing small variations between the initial and final functional characteristics. In parallel, two similar modulators were submitted to both gamma (10-100 krad) and proton irradiation (10×10^9 p/cm²), with minor performance degradation.
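The electrode redesign described above exploits the usual scaling of the half-wave (drive) voltage of an electro-optic Mach-Zehnder modulator (a schematic proportionality from standard electro-optics; the exact prefactor depends on the electrode configuration and is not given in the abstract):

```latex
% lambda: optical wavelength, g: electrode gap, L: interaction length,
% n_e: extraordinary refractive index, r_33: electro-optic coefficient,
% Gamma: overlap between the optical mode and the applied RF field.
V_{\pi} \propto \frac{\lambda\, g}{n_{e}^{3}\, r_{33}\, \Gamma\, L}
```

Narrowing the gap, lengthening the electrodes, or improving the field-mode overlap Γ all lower the drive voltage, which is consistent with the reported reduction from 9 V to 6 V at 30 GHz after the electrode redesign and the waveguide-electrode interaction optimization.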

Relevance: 30.00%

Abstract:

Semantic interoperability is essential to facilitate efficient collaboration in heterogeneous multi-site healthcare environments. The deployment of a semantic interoperability solution has the potential to enable a wide range of informatics-supported applications in clinical care and research, both within a single healthcare organization and in a network of organizations. At the same time, building and deploying a semantic interoperability solution may require significant effort to carry out data transformation and to harmonize the semantics of the information in the different systems. Our approach to semantic interoperability leverages existing healthcare standards and ontologies, focusing first on specific clinical domains and key applications, and gradually expanding the solution when needed. An important objective of this work is to create a semantic link between clinical research and care environments to enable applications such as streamlining the execution of multi-centric clinical trials, including the identification of eligible patients for the trials. This paper presents an analysis of the suitability of several widely used medical ontologies in the clinical domain (SNOMED-CT, LOINC, MedDRA) to capture the semantics of the clinical trial eligibility criteria, of the clinical trial data (e.g., case report forms), and of the corresponding patient record data that would enable the automatic identification of eligible patients. In addition to the coverage provided by the ontologies, we evaluate and compare the sizes of the sets of relevant concepts and their relative frequency, in order to estimate the cost of data transformation, of building the necessary semantic mappings, and of extending the solution to new domains. This analysis shows that our approach is both feasible and scalable.

Relevance: 30.00%

Abstract:

Tool path generation is one of the most complex problems in Computer Aided Manufacturing. Although some efficient strategies have been developed, most of them are only useful for standard machining. However, the algorithms used for tool path computation demand high computational performance, which makes their implementation on many existing systems very slow or even impractical. Hardware acceleration is an incremental solution that can be cleanly added to these systems while keeping everything else intact: it is completely transparent to the user, its cost is much lower, and the development time is much shorter than replacing the computers with faster ones. This paper presents an optimisation that uses a specific graphics-hardware approach, exploiting the power of multi-core Graphics Processing Units (GPUs), in order to improve tool path computation. The improvement is applied to a highly accurate and robust tool path generation algorithm. As a case study, the paper presents a fully implemented algorithm used for turning-lathe machining of shoe lasts, together with a comparative study of the gain achieved in terms of total computing time: execution is almost two orders of magnitude faster than on modern PCs.
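The data-parallel speed-up described above can be illustrated in miniature: over a height-map model of the part, the safe height of a flat tool at each (x, y) position is an independent max-reduction over the tool footprint, which is exactly the kind of per-cell-independent computation that maps well onto many-core GPUs. A NumPy sketch of the per-position computation (the surface and the square tool footprint are invented for the example, and NumPy stands in for the GPU kernels):

```python
import numpy as np

def tool_heights(surface, footprint):
    """For every (x, y) tool position, the lowest height at which a flat
    tool with the given square footprint clears the surface, i.e. the
    maximum of the surface over the footprint window.  Every output cell
    is independent of the others, so the computation parallelises
    trivially across cores or GPU threads."""
    h, w = surface.shape
    f = footprint
    out = np.full((h - f + 1, w - f + 1), -np.inf)
    for dy in range(f):
        for dx in range(f):
            # shifted views of the surface replace the per-cell inner loops
            view = surface[dy:dy + h - f + 1, dx:dx + w - f + 1]
            np.maximum(out, view, out=out)
    return out

# flat 8x8 surface with a single 2.0-high bump at (4, 4)
surface = np.zeros((8, 8))
surface[4, 4] = 2.0
heights = tool_heights(surface, footprint=3)
```

Each output cell only reads the input, so the same reduction can be issued as one GPU thread per tool position, which is the essence of the acceleration strategy the paper describes.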