966 results for Observational techniques and algorithms
Abstract:
OBJECTIVE To evaluate immediate transcatheter aortic valve implantation (TAVI) results and medium-term follow-up in very elderly patients with severe, symptomatic aortic stenosis (AS). METHODS This multicenter, observational and prospective study was carried out in three hospitals. We included consecutive very elderly (> 85 years) patients with severe AS treated by TAVI. The primary endpoint was the death rate from any cause at two years. RESULTS The study included 160 consecutive patients with a mean age of 87 ± 2.1 years (range, 85 to 94 years) and a mean logistic EuroSCORE of 18.8% ± 11.2%, with 57 (35.6%) patients scoring ≥ 20%. The procedural success rate was 97.5%; 25 (15.6%) patients experienced acute complications, major bleeding being the most frequent. The global in-hospital mortality rate was 8.8% (n = 14) and the 30-day mortality rate was 10% (n = 16). The median follow-up period was 252.24 ± 232.17 days. During follow-up, 28 (17.5%) patients died (17 of them from cardiac causes). The estimated two-year overall and cardiac survival rates using the Kaplan-Meier method were 71% and 86.4%, respectively. Cox proportional hazards regression showed that EuroSCORE ≥ 20 was the only variable associated with overall mortality. CONCLUSIONS TAVI is safe and effective in a selected population of very elderly patients. Our findings support the adoption of this new procedure in this complex group of patients.
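The survival figures above follow the standard workflow of a Kaplan-Meier estimate plus Cox proportional hazards regression; a minimal sketch of that workflow in Python using the lifelines library (the data file and column names are hypothetical, not the study's):

```python
# Sketch of the survival workflow described above (Kaplan-Meier estimate
# plus Cox proportional hazards regression). Column names are hypothetical.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.read_csv("tavi_cohort.csv")  # hypothetical file: one row per patient

# Overall survival: Kaplan-Meier estimate over follow-up days.
kmf = KaplanMeierFitter()
kmf.fit(durations=df["followup_days"], event_observed=df["died"])
print(kmf.survival_function_at_times(730))  # estimated two-year survival

# Cox regression: test whether EuroSCORE >= 20 predicts overall mortality.
df["euroscore_ge20"] = (df["logistic_euroscore"] >= 20).astype(int)
cph = CoxPHFitter()
cph.fit(df[["followup_days", "died", "euroscore_ge20"]],
        duration_col="followup_days", event_col="died")
cph.print_summary()
```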
Abstract:
OBJECTIVE To assess Spanish and Portuguese patients' and physicians' preferences regarding type 2 diabetes mellitus (T2DM) treatments and the monthly willingness to pay (WTP) to gain benefits or avoid side effects. METHODS An observational, multicenter, exploratory study focused on routine clinical practice in Spain and Portugal. Physicians were recruited from multiple hospitals and outpatient clinics, while patients were recruited from eleven centers operating in the public health care system in different autonomous communities in Spain and Portugal. Preferences were measured via a discrete choice experiment by rating multiple T2DM medication attributes. Data were analyzed using the conditional logit model. RESULTS Three hundred and thirty (n = 330) patients (49.7% female; mean age 62.4 [SD: 10.3] years, mean T2DM duration 13.9 [8.2] years, mean body mass index 32.5 [6.8] kg/m²; 41.8% received oral + injected medication, 40.3% oral, and 17.6% injected treatments) and 221 physicians from Spain and Portugal (62% female; mean age 41.9 [SD: 10.5] years; 33.5% endocrinologists, 66.5% primary-care doctors) participated. Patients valued avoiding a gain in body weight of 3 kg/6 months (WTP: €68.14 [95% confidence interval: 54.55-85.08]) the most, followed by avoiding one hypoglycemic event/month (WTP: €54.80 [23.29-82.26]). Physicians valued avoiding one hypoglycemia/week (WTP: €287.18 [95% confidence interval: 160.31-1,387.21]) the most, followed by avoiding a 3 kg/6 months gain in body weight and decreasing cardiovascular risk (WTP: €166.87 [88.63-843.09] and €154.30 [98.13-434.19], respectively). Physicians and patients were willing to pay €125.92 (73.30-622.75) and €24.28 (18.41-30.31), respectively, to avoid a 1% increase in glycated hemoglobin, and €143.30 (73.39-543.62) and €42.74 (23.89-61.77) to avoid nausea. CONCLUSION Both patients and physicians in Spain and Portugal are willing to pay for the health benefits associated with improved diabetes treatment, the most important being avoiding hypoglycemia and weight gain. Decreased cardiovascular risk and weight reduction were the third most valued attributes for physicians and patients, respectively.
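In a conditional logit analysis of a discrete choice experiment, WTP figures like these are typically derived as the negative ratio of an attribute coefficient to the cost coefficient; a minimal sketch under that assumption (the coefficient values below are invented for illustration, not the study's estimates):

```python
# Willingness-to-pay from conditional logit estimates: WTP = -beta_attr / beta_cost.
# The coefficient values below are invented for illustration only.
coefficients = {
    "cost_per_month_eur": -0.012,  # disutility of one extra euro per month
    "weight_gain_3kg":    -0.82,   # disutility of gaining 3 kg / 6 months
    "hypo_event_month":   -0.66,   # disutility of one hypoglycemic event/month
}

beta_cost = coefficients["cost_per_month_eur"]
for attr, beta in coefficients.items():
    if attr == "cost_per_month_eur":
        continue
    wtp = -beta / beta_cost  # euros/month a respondent would pay to avoid it
    print(f"WTP to avoid {attr}: EUR {wtp:.2f}/month")
```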
Abstract:
Globalization gives rise to several facility location problems that must be handled at large scale. Location Allocation (LA) is a combinatorial problem in which the distances among points in the data space matter. Taking advantage of this distance property of the domain, we exploit the capability of clustering techniques to partition the data space, converting an initial large LA problem into several simpler LA problems. In particular, our motivating problem involves a huge geographical area that can be partitioned under overall conditions. We present different types of clustering techniques and then perform a cluster analysis over our dataset in order to partition it. We then solve the LA problem by applying a simulated annealing algorithm to the clustered and non-clustered data, in order to work out how profitable the clustering is and which of the presented methods is the most suitable.
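A minimal sketch of this cluster-then-solve idea, assuming k-means for the partition and a simple simulated-annealing loop that places one facility per cluster (the cost function and all names are illustrative, not the paper's implementation):

```python
# Sketch: partition a large location-allocation instance with k-means,
# then run simulated annealing independently inside each cluster.
import math, random
import numpy as np
from sklearn.cluster import KMeans

def total_distance(facility, points):
    """Cost: summed Euclidean distance from every demand point to the facility."""
    return float(np.linalg.norm(points - facility, axis=1).sum())

def anneal_single_facility(points, steps=2000, t0=1.0, cooling=0.995):
    """Simulated annealing for a one-facility location problem in one cluster."""
    current = points[random.randrange(len(points))].copy()
    cost, t = total_distance(current, points), t0
    for _ in range(steps):
        candidate = current + np.random.normal(scale=0.1, size=2)  # random move
        c = total_distance(candidate, points)
        if c < cost or random.random() < math.exp((cost - c) / t):
            current, cost = candidate, c  # accept better (or sometimes worse) moves
        t *= cooling
    return current, cost

demand = np.random.rand(10_000, 2)              # stand-in for the real dataset
labels = KMeans(n_clusters=8, n_init=10).fit_predict(demand)
solutions = [anneal_single_facility(demand[labels == k]) for k in range(8)]
```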
Abstract:
Much medical research is observational. The reporting of observational studies is often of insufficient quality. Poor reporting hampers the assessment of the strengths and weaknesses of a study and the generalisability of its results. Taking into account empirical evidence and theoretical considerations, a group of methodologists, researchers, and editors developed the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) recommendations to improve the quality of reporting of observational studies. The STROBE Statement consists of a checklist of 22 items, which relate to the title, abstract, introduction, methods, results and discussion sections of articles. Eighteen items are common to cohort studies, case-control studies and cross-sectional studies and four are specific to each of the three study designs. The STROBE Statement provides guidance to authors about how to improve the reporting of observational studies and facilitates critical appraisal and interpretation of studies by reviewers, journal editors and readers. This explanatory and elaboration document is intended to enhance the use, understanding, and dissemination of the STROBE Statement. The meaning and rationale for each checklist item are presented. For each item, one or several published examples and, where possible, references to relevant empirical studies and methodological literature are provided. Examples of useful flow diagrams are also included. The STROBE Statement, this document, and the associated Web site (http://www.strobe-statement.org/) should be helpful resources to improve reporting of observational research.
Abstract:
Voting Advice Applications (VAAs) have become a central component of election campaigns worldwide. By matching the political preferences of voters to parties and candidates, the web application grants voters a look into their political mirror and reveals the most suitable political choices to them in terms of policy congruence. Both the dense, concise information on the electoral offer and the comparative nature of the application make VAAs an unprecedented information source for electoral decision making. In times where electoral choices are found to be highly individualized and driven by political issue positions, an ever increasing number of voters turn to VAAs before casting their ballots. With VAAs in high demand, the question of their effects on voters has become a pressing research topic. In various countries, survey research has been used to proclaim an impact of VAAs on electoral behavior, yet practically all studies fail to provide the scientific evidence that would allow for making such claims. In this thesis, I set out to systematically establish the causal link between VAA use and electoral behavior, using various data sources and appropriate statistical techniques. The focus lies on the Swiss VAA smartvote, introduced in the run-up to the 2003 Swiss federal elections and meanwhile an integral part of the national election campaign. smartvote produced over a million voting recommendations in the last Swiss federal elections for an active electorate of two million, potentially guiding a vast number of voters in their choices on the ballot. In order to determine the effect of the VAA on electoral behavior, I analyze both voting preferences and choice among Swiss voters during two consecutive election periods. First, I introduce statistical techniques to adequately examine VAA effects in observational studies and use them to demonstrate that voters who used smartvote prior to the 2007 Swiss federal elections were significantly more likely to swing vote in the elections than non-users. Second, I analyze preference voting during the same election and show that the smartvote voting recommendation inclines politically knowledgeable voters to modify their ballots and cast candidate-specific preference votes. Third, to further probe the indication that smartvote use affects the preference structure of voters, I employ an experimental research design to demonstrate that voters who use the application tend to strengthen their vote propensities for their most preferred party and adapt their overall party preferences such that they consider more than one party an eligible vote option after engaging with the application. Finally, vote choice is examined for the 2011 Swiss federal election, showing once more that the VAA initiated a change of party choice among voters. In sum, this thesis presents empirical evidence for the transformative effect of the Swiss VAA smartvote on electoral behavior.
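The abstract does not spell out the techniques; one standard way to examine such effects in observational survey data is propensity-score matching of VAA users to comparable non-users, sketched minimally below (all file and variable names are invented for illustration):

```python
# Illustrative sketch: estimate the effect of VAA use on swing voting by
# matching users to non-users on propensity scores. Variables are invented.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

survey = pd.read_csv("election_survey.csv")     # hypothetical survey file
covariates = ["age", "education", "political_interest", "left_right"]

# 1. Propensity: probability of using the VAA given observed covariates.
ps_model = LogisticRegression(max_iter=1000).fit(
    survey[covariates], survey["used_smartvote"])
survey["pscore"] = ps_model.predict_proba(survey[covariates])[:, 1]

# 2. Match each user to the nearest non-user on the propensity score.
users = survey[survey["used_smartvote"] == 1]
controls = survey[survey["used_smartvote"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(controls[["pscore"]])
_, idx = nn.kneighbors(users[["pscore"]])

# 3. Average treatment effect on the treated: difference in swing-vote rates.
att = users["swing_vote"].mean() - controls.iloc[idx.ravel()]["swing_vote"].mean()
print(f"Estimated effect of VAA use on swing voting: {att:+.3f}")
```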
Abstract:
PURPOSE: To determine the lower limit of dose reduction with hybrid and fully iterative reconstruction algorithms in the detection of endoleaks and in-stent thrombus of the thoracic aorta with computed tomographic (CT) angiography, by applying protocols with different tube energies and automated tube current modulation. MATERIALS AND METHODS: The calcification insert of an anthropomorphic cardiac phantom was replaced with an aortic aneurysm model containing a stent, simulated endoleaks, and an intraluminal thrombus. CT was performed at tube energies of 120, 100, and 80 kVp with incrementally increasing noise indexes (NIs) of 16, 25, 34, 43, 52, 61, and 70 and a 2.5-mm section thickness. NI directly controls radiation exposure; a higher NI allows for greater image noise and decreases radiation. Images were reconstructed with filtered back projection (FBP) and with hybrid and fully iterative algorithms. Five radiologists independently analyzed lesion conspicuity to assess sensitivity and specificity. Mean attenuation (in Hounsfield units) and standard deviation were measured in the aorta to calculate the signal-to-noise ratio (SNR). Attenuation and SNR of the different protocols and algorithms were analyzed with analysis of variance or the Welch test, depending on the data distribution. RESULTS: Both sensitivity and specificity were 100% for simulated lesions on images with 2.5-mm section thickness and an NI of 25 (3.45 mGy), 34 (1.83 mGy), or 43 (1.16 mGy) at 120 kVp; an NI of 34 (1.98 mGy), 43 (1.23 mGy), or 61 (0.61 mGy) at 100 kVp; and an NI of 43 (1.46 mGy) or 70 (0.54 mGy) at 80 kVp. SNR values showed similar results. With the fully iterative algorithm, mean attenuation of the aorta decreased significantly in reduced-dose protocols in comparison with control protocols at 100 kVp (311 HU at NI 16 vs 290 HU at NI 70, P ≤ .0011) and 80 kVp (400 HU at NI 16 vs 369 HU at NI 70, P ≤ .0007). CONCLUSION: Endoleaks and in-stent thrombus of the thoracic aorta were detectable down to 1.46 mGy (80 kVp) with FBP, 1.23 mGy (100 kVp) with the hybrid algorithm, and 0.54 mGy (80 kVp) with the fully iterative algorithm.
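The SNR figure used here is the standard ROI-based definition stated in the methods (mean attenuation divided by its standard deviation); a minimal sketch of that computation, assuming the aortic ROI is available as an array of Hounsfield-unit pixel values:

```python
# Sketch: ROI-based signal-to-noise ratio as used in CT protocol comparisons.
# `roi_hu` stands in for the Hounsfield-unit pixels of an aortic ROI.
import numpy as np

def roi_snr(roi_hu: np.ndarray) -> float:
    """SNR = mean attenuation (HU) / standard deviation (image noise)."""
    return float(roi_hu.mean() / roi_hu.std())

roi_hu = np.random.normal(loc=311, scale=25, size=500)  # simulated 100-kVp ROI
print(f"Mean attenuation: {roi_hu.mean():.0f} HU, SNR: {roi_snr(roi_hu):.1f}")
```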
Abstract:
The activated sludge process - the main biological technology usually applied in wastewater treatment plants (WWTP) - depends directly on living organisms (microorganisms), and therefore on the unforeseen changes they produce. Good plant operation can be achieved if the supervisory control system is able to react to changes and deviations in the system and can take the necessary actions to restore the system's performance. These decisions are often based both on physical, chemical and microbiological principles (suitable for modelling with conventional control algorithms) and on knowledge (suitable for modelling with knowledge-based systems). But one of the key problems in the design of knowledge-based control systems is the development of an architecture able to manage the different elements of the process efficiently (integrated architecture), to learn from previous cases (specific experimental knowledge) and to acquire the domain knowledge (general expert knowledge). These problems grow when the process belongs to an ill-structured domain and is composed of several complex operational units. Therefore, an integrated and distributed AI architecture seems to be a good choice. This paper proposes an integrated and distributed multi-level supervisory architecture for the supervision of WWTP that overcomes some of the main shortcomings of classical control techniques and of knowledge-based systems applied to real-world systems.
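As a toy illustration of the knowledge-based layer that such an architecture combines with conventional numerical control, process symptoms can be mapped to supervisory actions in rule form; the rules and thresholds below are invented, not taken from the paper:

```python
# Toy illustration of a knowledge-based supervisory rule layer for a WWTP.
# Rules and thresholds are invented for illustration only.
from dataclasses import dataclass

@dataclass
class PlantState:
    effluent_ammonia: float      # mg/L
    sludge_volume_index: float   # mL/g; high values suggest bulking
    dissolved_oxygen: float      # mg/L in the aeration tank

def supervisory_actions(state: PlantState) -> list[str]:
    """Map symptoms to actions, mimicking expert knowledge in rule form."""
    actions = []
    if state.effluent_ammonia > 4.0 and state.dissolved_oxygen < 1.5:
        actions.append("increase aeration setpoint")  # nitrification limited by O2
    if state.sludge_volume_index > 150:
        actions.append("check for filamentous bulking; adjust sludge wasting")
    return actions or ["no action: operation within normal ranges"]

print(supervisory_actions(PlantState(5.2, 180, 1.1)))
```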
Abstract:
The objective of this Master's Thesis was to determine VOC emissions from veneer drying in softwood plywood manufacturing. Emissions from the plywood industry have become an important factor because of tightened regulations worldwide. This Thesis investigates the quality and quantity of the VOCs from softwood veneer drying. One of the main objectives was to identify suitable cleaning techniques for softwood VOC emissions. The introductory part presents veneer drying machines and the mechanical and chemical properties of wood; VOC control techniques and specified VOC limits are also introduced there. Plywood mills previously took little interest in VOC emissions, but nowadays mills worldwide must consider emission reduction. This Thesis includes measuring emissions from a softwood veneer dryer, analyzing the measured test results, and reviewing the results. The different air conditions inside the dryer were considered when planning the measurements, and the measured emissions were compared to the established legal limits. The results of this Thesis are the emissions of a softwood veneer dryer under different air conditions. Emission control techniques for softwood veneer dryer emissions were also reviewed as a basis for further, more specific research.
Abstract:
Selling is a much-maligned, often undervalued subject whose inadequate showing in business schools is in inverse proportion to the many job opportunities it offers and to the importance of salespeople in bringing income to companies. The purpose of this research is to increase the understanding of customer-oriented selling and to examine the influence of a customer-oriented philosophy on the selling process, the applicability of selling techniques to this philosophy, and their importance to salespeople. The empirical section of the study is twofold. First, the data for the qualitative part were collected through five thematic interviews with sales consultants and case company representatives. The findings of the study indicate that customer-oriented selling requires activity from salespeople. In the customer-oriented personal selling process, salespeople invest time in the preplanning, need analysis and benefit demonstration stages. However, the findings suggest that salespeople today must also have the basic capabilities for executing the traditional sales process, and that the balance between the traditional and consultative selling processes shifts as the duration of the relationship between salesperson and customer increases. The study also proposes that selling techniques still belong to the customer-oriented selling process, although their role may be modest. This thesis mapped 75 selling techniques, and the quantitative part of the study explored which selling techniques salespeople in the direct selling industry consider important when selling to new and existing customers. The response rate of the survey was 69.5%.
Abstract:
Statistical analyses of measurements that can be described by statistical models are of the essence in astronomy and in scientific inquiry in general. The sensitivity of such analyses, of modelling approaches, and of the consequent predictions is sometimes highly dependent on the exact techniques applied, and improvements therein can result in significantly better understanding of the observed system of interest. In particular, optimising the sensitivity of statistical techniques in detecting the faint signatures of low-mass planets orbiting nearby stars is, together with improvements in instrumentation, essential in estimating the properties of the population of such planets, and in the race to detect Earth-analogs, i.e. planets that could support liquid water and, perhaps, life on their surfaces. We review the developments in Bayesian statistical techniques applicable to the detection of planets orbiting nearby stars and to astronomical data analysis problems in general. We also discuss these techniques and demonstrate their usefulness by using various examples and detailed descriptions of the respective mathematics involved. We demonstrate the practical aspects of Bayesian statistical techniques by describing several algorithms and numerical techniques, as well as theoretical constructions, for the estimation of model parameters and for hypothesis testing. We also apply these algorithms to Doppler measurements of nearby stars to show how they can be used in practice to obtain as much information from the noisy data as possible. Bayesian statistical techniques are powerful tools for analysing and interpreting noisy data and should be preferred in practice whenever computational limitations are not too restrictive.
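As a concrete instance of the posterior-sampling algorithms such work relies on, here is a minimal Metropolis-Hastings sketch for a one-planet, circular-orbit radial-velocity model; the data, priors and proposal scales are simulated stand-ins, not the thesis's setup:

```python
# Sketch: Metropolis-Hastings posterior sampling for a one-planet,
# circular-orbit radial-velocity model. Data are simulated stand-ins.
import numpy as np

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 500, 80))                    # observation epochs [days]
true_K, true_P, true_phi, sigma = 3.0, 61.0, 1.2, 1.5   # m/s, days, rad, m/s
rv = true_K * np.sin(2 * np.pi * t / true_P + true_phi) + rng.normal(0, sigma, t.size)

def log_posterior(theta):
    K, P, phi = theta
    if not (0 < K < 50 and 1 < P < 1000):               # flat priors (assumed)
        return -np.inf
    model = K * np.sin(2 * np.pi * t / P + phi)
    return -0.5 * np.sum((rv - model) ** 2) / sigma**2  # Gaussian likelihood

theta = np.array([2.0, 60.0, 1.0])
logp, chain = log_posterior(theta), []
for _ in range(20_000):
    proposal = theta + rng.normal(0, [0.1, 0.05, 0.05])  # random-walk proposal
    logp_new = log_posterior(proposal)
    if np.log(rng.uniform()) < logp_new - logp:          # accept/reject step
        theta, logp = proposal, logp_new
    chain.append(theta.copy())
print("Posterior means (K, P, phi):", np.mean(chain[5000:], axis=0))
```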
Abstract:
Graphene is a material with extraordinary properties. Its mechanical and electrical properties are unparalleled, but the difficulties in its production are hindering its breakthrough in applications. Graphene is a two-dimensional material made entirely of carbon atoms, and it is only a single atom thick. In this work, the properties of graphene and graphene-based materials are described, together with their common preparation techniques and the related challenges. This Thesis concentrates on top-down techniques, in which natural graphite is used as a precursor for graphene production. Graphite consists of graphene sheets stacked tightly together. In the top-down techniques, various physical or chemical routes are used to overcome the forces keeping the graphene sheets together, and many of them are described in the Thesis. The most common chemical method is the oxidation of graphite with strong oxidants, which creates water-soluble graphene oxide. The properties of graphene oxide differ significantly from those of pristine graphene and, therefore, graphene oxide is often reduced to form materials collectively known as reduced graphene oxide. In the experimental part, the main focus is on the chemical and electrochemical reduction of graphene oxide. A novel chemical route using vanadium is introduced and compared to other common chemical graphene oxide reduction methods. A strong emphasis is placed on the electrochemical reduction of graphene oxide in various solvents. Raman and infrared spectroscopy are both used in in situ spectroelectrochemistry to closely monitor the spectral changes during the reduction process. These in situ techniques allow precise control over the reduction process, and even small changes in the material can be detected. Graphene and few-layer graphene were also prepared using physical force to separate these materials from graphite. Special adsorbate molecules in aqueous solutions, together with sonic treatment, produce stable dispersions of graphene and few-layer graphene sheets in water. This mechanical exfoliation method damages the graphene sheets considerably less than the chemical methods, although it suffers from a lower yield.
Abstract:
Bioinformatics applies computers to problems in molecular biology. Previous research has not addressed edit metric decoders. Decoders for quaternary edit metric codes are finding use in bioinformatics problems with applications to DNA. By using side effect machines we hope to be able to provide efficient decoding algorithms for this open problem. Two ideas for decoding algorithms are presented and examined. Both decoders use Side Effect Machines (SEMs), which are generalizations of finite state automata. Single Classifier Machines (SCMs) use a single side effect machine to classify all words within a code. Locking Side Effect Machines (LSEMs) use multiple side effect machines to create a tree structure of subclassification. The goal is to examine these techniques and provide new decoders for existing codes. Ideas for best practices for the creation of these two types of new edit metric decoders are presented.
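For context, the baseline such decoders are measured against is exhaustive nearest-codeword search under the edit (Levenshtein) metric; a minimal sketch of that baseline over a quaternary DNA alphabet (the toy codewords are invented):

```python
# Sketch: brute-force nearest-codeword decoding under the edit (Levenshtein)
# metric, the baseline that side-effect-machine decoders aim to improve on.

def edit_distance(a: str, b: str) -> int:
    """Classic dynamic-programming Levenshtein distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def decode(received: str, code: list[str]) -> str:
    """Return the codeword closest to the received word in edit distance."""
    return min(code, key=lambda w: edit_distance(received, w))

code = ["ACGTACGT", "TTGGCCAA", "GATTACAA"]   # toy quaternary codewords
print(decode("ACGTACT", code))                # -> "ACGTACGT"
```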
Abstract:
The spatial and temporal variability of river flow contributes to creating a dynamic habitat mosaic that supports ecological diversity. One of the fundamental questions in ecohydraulics is determining which spatial and temporal scales of habitat variation are most important for organisms at various life stages. The general objective of this thesis is to examine the links between habitat variability and the behaviour of juvenile Atlantic salmon. More specifically, three themes are addressed: turbulence as a fish habitat variable, the spatial and temporal scales of habitat selection, and individual variability in fish behaviour. Using detailed empirical data and a variety of statistical analyses, our objectives were to 1) quantify the causal links between the "usual" fish habitat variables and turbulence properties at multiple scales; 2) test the use of a portable flume to analyze the effect of turbulence properties on prey-capture probabilities and the feeding behaviour of juvenile salmon; 3) analyze the spatial and temporal scales of habitat selection in a river reach in summer and autumn; 4) examine seasonal and daily individual variation in activity patterns, habitat use and their interaction; 5) investigate individual variation in spatial behaviour in relation to environmental fluctuations. The thesis provides a detailed characterization of turbulence in pools and riffles and shows that the capacity of the usual fish habitat variables to explain turbulence properties is relatively low, especially at small scales, but varies considerably among morphological units. From a practical standpoint, this level of complexity suggests that turbulence should be considered a distinct ecological variable. In a second experiment, using a portable flume in situ, we neither conclusively confirmed nor ruled out an effect of turbulence on prey-capture probability, but we observed preferential selection of locations where turbulence was relatively low. Selection of low-turbulence habitats was also observed under natural conditions in an observational study in which 66 fish were tagged with passive transponders and tracked for three months in a river reach using a network of antennas buried in the riverbed. Habitat selection depended on the scale of observation. Fish were associated with moderate depths at the micro-scale, but also with greater depths at the patch scale. Moreover, the range of habitats used increased asymptotically with the temporal scale. A one-hour scale was considered optimal for describing the habitat used within a day, and a three-day scale for describing the habitat used within a month. Individual tracking revealed strong inter-individual variability in activity patterns, with some individuals being mainly nocturnal while others frequently switched activity patterns.
Changes in activity patterns were linked to environmental variables, but also to individuals' habitat use, which could mean that using suboptimal habitats creates a need to increase diurnal activity, when food intake and predation risk are both higher. High inter-individual variability was also observed in spatial behaviour. Most fish showed low mobility on most days but occasionally made large-amplitude movements. In fact, inter-individual variability accounted for only 12-17% of the total variability in fish mobility. These results question the premise that the population is composed of sedentary and mobile fractions of individuals. Daily individual variation suggests that mobility is a response to changing conditions rather than an individual behavioural trait.
Abstract:
Among the methods for estimating probability-distribution parameters in statistics, maximum likelihood is one of the most popular techniques, since, under mild conditions, the resulting estimators are consistent and asymptotically efficient. Maximum likelihood problems can be treated as nonlinear, possibly nonconvex, programming problems, for which two major classes of solution methods are trust-region techniques and line-search methods. Moreover, it is possible to exploit the structure of these problems to try to accelerate the convergence of these methods, under certain assumptions. In this work, we revisit some classical or recently developed approaches in nonlinear optimization in the particular context of maximum likelihood estimation. We also develop new algorithms to solve this problem, reconsidering various Hessian approximation techniques, and propose new step-size computation methods, in particular within line-search algorithms. These include algorithms that allow us to switch between Hessian approximations and to adapt the step length along a fixed search direction. Finally, we evaluate the numerical efficiency of the proposed methods in the context of discrete choice model estimation, in particular mixed logit models.
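As a concrete baseline for the line-search methods discussed, a minimal sketch of maximum likelihood estimation by minimizing the negative log-likelihood with scipy's BFGS solver (a quasi-Newton method that maintains a Hessian approximation); the Gaussian data are a stand-in, not the thesis's discrete choice models:

```python
# Sketch: maximum likelihood by line-search quasi-Newton optimization.
# BFGS maintains a Hessian approximation; the Gaussian sample is a stand-in.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
x = rng.normal(loc=2.0, scale=1.5, size=1000)    # observed data

def neg_log_likelihood(theta):
    mu, log_sigma = theta                        # log-sigma keeps sigma > 0
    sigma = np.exp(log_sigma)
    # Gaussian NLL up to an additive constant: n*log(sigma) + sum((x-mu)^2)/(2 sigma^2)
    return 0.5 * np.sum(((x - mu) / sigma) ** 2) + x.size * log_sigma

result = minimize(neg_log_likelihood, x0=[0.0, 0.0], method="BFGS")
mu_hat, sigma_hat = result.x[0], np.exp(result.x[1])
print(f"MLE: mu = {mu_hat:.3f}, sigma = {sigma_hat:.3f}")
```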
Abstract:
Adolescent idiopathic scoliosis (AIS) is a three-dimensional deformity of the spine. Its treatment comprises observation, bracing to limit progression, or surgery to correct the skeletal deformity and halt its progression. Surgical treatment remains controversial, both in its indications and in the surgery to be performed. Despite the existence of classifications to guide the treatment of AIS, intra- and inter-observer variability in operative strategy has been described in the literature. This variability is further accentuated by the evolution of surgical techniques and the available instrumentation. Advances in technology and its integration into the medical field have led to the use of computer-based artificial intelligence algorithms to assist the classification and three-dimensional assessment of scoliosis. Some algorithms have been shown to be effective in reducing variability in scoliosis classification and in guiding treatment. The general objective of this thesis is to develop an application that uses artificial intelligence tools to integrate a new patient's data with the evidence available in the literature in order to guide the surgical treatment of AIS. To this end, a literature review of existing applications for AIS assessment was undertaken to gather the elements that would enable an effective, clinically accepted application. This review made us realize that the presence of a "black box" in the developed applications is a limitation for clinical integration, where evidence-based justification is essential. In a first study, we developed a decision tree for classifying idiopathic scoliosis based on the Lenke classification, which is the most commonly used today but has been criticized for its complexity and its inter- and intra-observer variability. This decision tree was shown to increase classification accuracy in proportion to the time spent classifying, independently of the level of knowledge about AIS. In a second study, a surgical-strategy algorithm based on rules extracted from the literature was developed to guide surgeons in selecting the approach and fusion levels for AIS. When applied to a large database of 1,556 AIS cases, this algorithm was able to propose an operative strategy similar to that of an expert surgeon in nearly 70% of cases. This study confirmed the possibility of extracting valid operative strategies using a decision tree built from rules taken from the literature. In a third study, the classification of 1,776 AIS patients using a Kohonen map, a type of neural network, demonstrated that there are typical scolioses (single-curve or double thoracic curves) for which variability in surgical treatment deviates little from the recommendations of the Lenke classification, whereas scolioses with multiple curves, or tangential to two typical curve groups, showed the most variation in operative strategy. Finally, a software platform integrating each of the above studies was developed.
This software interface allows the entry of radiological data for a scoliotic patient, classifies the AIS using the classification decision tree, and suggests a surgical approach based on the operative-strategy decision tree. An analysis of the post-operative correction obtained shows a trend, although not statistically significant, toward better balance in patients operated on following the strategy recommended by the software platform than in those receiving a different treatment. The studies presented in this thesis highlight that artificial intelligence algorithms for classification and for the development of operative strategies for AIS can be integrated into a software platform and could assist surgeons in their preoperative planning.
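To illustrate the Kohonen-map step, below is a minimal hand-rolled self-organizing map over invented curve descriptors, showing how patients could cluster into typical curve-pattern groups; the features, grid size and training schedule are illustrative only, not those of the study:

```python
# Sketch: a tiny Kohonen self-organizing map over invented curve descriptors,
# illustrating how patients could cluster into typical curve-pattern groups.
import numpy as np

rng = np.random.default_rng(2)
# Invented feature vectors: (proximal thoracic, main thoracic, lumbar Cobb angles)
patients = rng.uniform(0, 80, size=(1776, 3))

grid_w, grid_h, dim = 6, 6, 3
weights = rng.uniform(0, 80, size=(grid_w, grid_h, dim))
coords = np.stack(np.meshgrid(np.arange(grid_w), np.arange(grid_h),
                              indexing="ij"), axis=-1)

for step in range(5000):
    x = patients[rng.integers(len(patients))]
    # Best-matching unit: grid node whose weight vector is closest to x.
    bmu = np.unravel_index(np.argmin(((weights - x) ** 2).sum(-1)),
                           (grid_w, grid_h))
    lr = 0.5 * np.exp(-step / 2000)              # decaying learning rate
    sigma = 2.0 * np.exp(-step / 2000)           # shrinking neighbourhood radius
    dist2 = ((coords - np.array(bmu)) ** 2).sum(-1)
    h = np.exp(-dist2 / (2 * sigma**2))[..., None]  # neighbourhood kernel
    weights += lr * h * (x - weights)            # pull nodes toward the sample

def bmu_of(x):
    return np.unravel_index(np.argmin(((weights - x) ** 2).sum(-1)),
                            (grid_w, grid_h))

print("First patient maps to SOM node:", bmu_of(patients[0]))
```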