862 results for Branch-and-bound algorithm
Resumo:
The techno-economic implications of recycling the components of mixed plastics waste have been studied in a two-part investigation. (a) In an economic survey of the prospects for plastics recycling, the plastics waste arisings from the retailing, building, automotive, light engineering and chemical industries were surveyed by means of questionnaires and interviews. This was partially successful and indicated that very considerable quantities of relatively clean plastics packaging were available in major department chains and household stores. Collection systems for such sources that do not incur any extra cost have been suggested. However, the household collection of plastics waste was found to be uneconomic due to the high cost of collection and transportation and the lack of markets for the end products. (b) In a technical study of blends of PE/PP and PE/PS, which are found in admixture in waste plastics, it was shown that they exhibit poor mechanical properties due to incompatibility. Consequently, reprocessing of such unsegregated blends results in products of little technological value. The inclusion of some commercial block and graft copolymers which behave as solid phase dispersants (SPDs) increases the toughness of the blends (e.g. EPDM in the PE/PP blend and SBS in the PE/PS blend). EPDM was also found to be very effective for improving the toughness of single-component polypropylene. However, the improved technical properties of such blends were accompanied by a fast rate of photo-oxidation and loss of toughness due to the presence of unsaturation in the SPDs. The change in mechanical properties occurring during oven ageing and ultra-violet light accelerated weathering of these binary and ternary blends was followed by a viscoelastometric technique (Rheovibron) over a wide range of temperatures, by impact resistance at room temperature (20±1°C) and by changes in functional groups (i.e. carbonyl and trans-1,4-polybutadiene). The heat and light stability of single and mixed plastics in which thiol antioxidants were bound to the SPD segment have also been studied and compared with conventional antioxidants. The long-term performance of the mixed plastics containing SPDs was improved significantly by the use of conventional and bound antioxidants. It is concluded that an estimated 30,000 tonnes/year of plastics waste is available from department chains and household stores which can be converted to useful end products. This justifies pilot experiments, in collaboration with supermarkets, recyclers and converters, using low-cost SPDs and additives designed to make the materials more compatible.
Resumo:
Image segmentation is one of the most computationally intensive operations in image processing and computer vision. This is because a large volume of data is involved and many different features have to be extracted from the image data. This thesis is concerned with the investigation of practical issues related to the implementation of several classes of image segmentation algorithms on parallel architectures. The Transputer is used as the basic building block of hardware architectures and Occam is used as the programming language. The segmentation methods chosen for implementation are convolution, for edge-based segmentation; the Split and Merge algorithm for segmenting non-textured regions; and the Granlund method for segmentation of textured images. Three different convolution methods have been implemented. The direct method of convolution, carried out in the spatial domain, uses the array architecture. The other two methods, based on convolution in the frequency domain, require the use of the two-dimensional Fourier transform. Parallel implementations of two different Fast Fourier Transform algorithms have been developed, incorporating original solutions. For the Row-Column method the array architecture has been adopted, and for the Vector-Radix method, the pyramid architecture. The texture segmentation algorithm, for which a system-level design is given, demonstrates a further application of the Vector-Radix Fourier transform. A novel concurrent version of the quad-tree based Split and Merge algorithm has been implemented on the pyramid architecture. The performance of the developed parallel implementations is analysed. Many of the obtained speed-up and efficiency measures show values close to their respective theoretical maxima. Where appropriate, comparisons are drawn between different implementations. The thesis concludes with comments on general issues related to the use of the Transputer system as a development tool for image processing applications; and on the issues related to the engineering of concurrent image processing applications.
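The Row-Column decomposition mentioned above factors a 2D Fourier transform into 1D transforms along the rows followed by 1D transforms along the columns, which is what makes it natural to distribute across an array of processors. Below is a minimal sequential sketch (Python/NumPy, not the Occam/Transputer implementation of the thesis) of frequency-domain convolution built on that decomposition; the function and variable names are illustrative assumptions.

```python
import numpy as np

def fft2_row_column(x):
    """2D DFT via the Row-Column method: 1D FFTs over rows, then over columns."""
    return np.fft.fft(np.fft.fft(x, axis=1), axis=0)

def convolve_freq(image, kernel):
    """Circular convolution of an image with a small kernel via the frequency domain."""
    h, w = image.shape
    padded = np.zeros((h, w))
    kh, kw = kernel.shape
    padded[:kh, :kw] = kernel                        # zero-pad kernel to image size
    F_img = fft2_row_column(image.astype(float))
    F_ker = fft2_row_column(padded)
    return np.real(np.fft.ifft2(F_img * F_ker))      # pointwise product, inverse 2D FFT

# Example: smooth a random image with a 3x3 averaging kernel
img = np.random.rand(256, 256)
box = np.ones((3, 3)) / 9.0
smoothed = convolve_freq(img, box)
```

In the parallel setting, the two calls to the 1D FFT are what get mapped onto rows and columns of the processor array.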
Resumo:
This thesis presents results of experiments designed to study the effect of applying electrochemical chloride extraction (ECE) to a range of different hardened cement pastes. Rectangular prism specimens of hydrated cement paste containing sodium chloride at different concentrations were subjected to electrolysis between the embedded steel cathodes and external anodes of activated titanium mesh. The cathodic current density used was in the range of 1 to 5 A/m² with treatment periods of 4 to 12 weeks. After treatment, the specimens were cut into sections which were subjected to pore-solution expression and analysis in order to determine changes in the distribution of free and total ionic species. The effect of the ECE treatment on the physical and microstructural properties of the cements was studied by using microhardness and MIP techniques. XRD was employed to look at the possibility of ettringite redistribution as a result of the accumulation of soluble sulphate ions in the cement matrix near the cathode during ECE. Remigration of chloride which remains after the ECE treatment, and the distribution of other ions, were studied by analysing specimens which had been stored for several months after undergoing ECE treatment. The potentials of the steel cathodes were also monitored over this period to detect any changes in their corrosion state. The main findings of this research were as follows: 1) ECE, as applied in this investigation, was capable of removing both free and bound chloride. The removal process occurred relatively quickly and an equilibrium between free and bound chlorides in the specimens was maintained throughout. At the same time, alkali concentrations in the pore solution near the steel cathode increased. The soluble sulphate ionic concentration near the cathode also increased due to the local increase in the pH of the pore solution. 2) ECE caused some changes in the physical and microstructural properties of the cement matrix. However, these changes were minimal and, in the case of microhardness, the results were highly scattered. Ettringite in the bulk material well away from the cathode was found not to increase significantly with the increase in charge passed. 3) Remigration of chloride and other ionic species occurred slowly after cessation of ECE, with a resultant gradual increase in the Cl-/OH- ratio around the steel. 4) The removal of chloride from blended cements was slower than that from OPC.
Resumo:
This thesis documents an investigation of the effect of solar radiation pressure on the motion of an artificial satellite. Consideration is given to the methods required for the inclusion of the discontinuous effect of the Earth's shadow. The analysis resulting from the description of a deformed diffusely reflecting balloon satellite and an algorithm describing the effects of Earth reflected solar radiation pressure are developed, culminating in the application of the derived theory to the orbital data of the balloon satellite, Explorer 19.
Resumo:
The architecture and learning algorithm of a self-learning spiking neural network for the fuzzy clustering task are outlined. Fuzzy receptive neurons for pulse-position transformation of the input data are considered. It is proposed to treat the spiking neural network in terms of the classical automatic control theory apparatus based on the Laplace transform. It is shown that synapse functioning can be easily modeled by a second-order damped response unit. The spiking neuron soma is presented as a threshold detection unit. Thus, the proposed fuzzy spiking neural network is an analog-digital nonlinear pulse-position dynamic system. It is demonstrated how fuzzy probabilistic and possibilistic clustering approaches can be implemented on the basis of the presented spiking neural network.
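As a rough illustration of the control-theoretic view described above, a synapse modeled as a second-order damped response unit has a transfer function of the form ω²/(s² + 2ζωs + ω²); convolving an input spike train with the corresponding impulse response and thresholding the summed membrane potential gives a minimal discrete-time sketch of such a neuron. The parameter values and function names below are illustrative assumptions, not those of the cited architecture.

```python
import numpy as np

def second_order_impulse_response(t, omega=50.0, zeta=0.5):
    """Impulse response of an underdamped second-order unit omega^2 / (s^2 + 2*zeta*omega*s + omega^2)."""
    wd = omega * np.sqrt(1.0 - zeta ** 2)              # damped natural frequency
    return (omega ** 2 / wd) * np.exp(-zeta * omega * t) * np.sin(wd * t)

def soma_response(spike_train, dt=1e-3, threshold=0.8):
    """Filter an input spike train through the synapse model and fire whenever the
    membrane potential crosses the threshold (threshold detection unit)."""
    t = np.arange(0.0, 0.2, dt)                        # 200 ms synaptic kernel
    kernel = second_order_impulse_response(t)
    potential = np.convolve(spike_train, kernel)[:len(spike_train)] * dt
    return potential, (potential >= threshold).astype(int)

# Example: three closely spaced input spikes
spikes = np.zeros(500)
spikes[[50, 60, 70]] = 1.0
membrane_potential, output_spikes = soma_response(spikes)
```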
Resumo:
Background/aims - To determine which biometric parameters provide optimum predictive power for ocular volume. Methods - Sixty-seven adult subjects were scanned with a Siemens 3-T MRI scanner. Mean spherical error (MSE) (D) was measured with a Shin-Nippon autorefractor, and a Zeiss IOLMaster was used to measure (in mm) axial length (AL), anterior chamber depth (ACD) and corneal radius (CR). Total ocular volume (TOV) was calculated from T2-weighted MRIs (voxel size 1.0 mm³) using an automatic voxel counting and shading algorithm. Each MR slice was subsequently edited manually in the axial, sagittal and coronal planes, the latter enabling location of the posterior pole of the crystalline lens and partitioning of TOV into anterior (AV) and posterior volume (PV) regions. Results - Mean values (±SD) for MSE (D), AL (mm), ACD (mm) and CR (mm) were −2.62±3.83, 24.51±1.47, 3.55±0.34 and 7.75±0.28, respectively. Mean values (±SD) for TOV, AV and PV (mm³) were 8168.21±1141.86, 1099.40±139.24 and 7068.82±1134.05, respectively. TOV showed significant correlation with MSE, AL, PV (all p<0.001), CR (p=0.043) and ACD (p=0.024). With the exception of CR, the correlations were shown to be wholly attributable to variation in PV. Multiple linear regression indicated that the combination of AL and CR provided an optimum R² value of 79.4% for TOV. Conclusion - Clinically useful estimations of ocular volume can be obtained from measurement of AL and CR.
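The multiple linear regression referred to in the Results (TOV predicted from AL and CR, R² = 79.4%) can be reproduced in outline as follows; the data arrays here are placeholders, since the study data are not part of this record.

```python
import numpy as np

# Placeholder measurements: axial length AL (mm), corneal radius CR (mm),
# total ocular volume TOV (mm^3). Real study values are not reproduced here.
AL = np.array([23.1, 24.5, 25.8, 24.0, 26.2])
CR = np.array([7.6, 7.8, 7.9, 7.7, 7.9])
TOV = np.array([7300.0, 8100.0, 9100.0, 7900.0, 9400.0])

# Ordinary least squares fit: TOV = b0 + b1*AL + b2*CR
X = np.column_stack([np.ones_like(AL), AL, CR])
coef, _, _, _ = np.linalg.lstsq(X, TOV, rcond=None)

pred = X @ coef
r2 = 1.0 - np.sum((TOV - pred) ** 2) / np.sum((TOV - TOV.mean()) ** 2)
print(f"coefficients: {coef}, R^2 = {r2:.3f}")
```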
Resumo:
We overview our recent results on polarisation dynamics of vector solitons in erbium doped fibre laser mode locked with carbon nanotubes. Our experimental and theoretical study revealed new families of vector solitons for fundamental and bound-state soliton operations. The observed scenario of the evolution of the states of polarisation (SOPs) on the Poincare sphere includes fast polarisation switching between two and three SOPs along with slow SOP evolution on a double scroll chaotic attractor. The underlying physics presents an interplay between effects of birefringence of the laser cavity and light induced anisotropy caused by polarisation hole burning. © 2014 IEEE.
Resumo:
We describe an approach for recovering the plaintext in block ciphers having a design structure similar to the Data Encryption Standard but with improperly constructed S-boxes. Experiments with a backtracking search algorithm performing this kind of attack against modified DES/Triple-DES in ECB mode show that the unknown plaintext can be recovered with a small amount of uncertainty, and that this algorithm is highly efficient in both time and memory costs for plaintext sources with relatively low entropy. Our investigations demonstrate once again that modifications resulting in S-boxes which still satisfy some design criteria may lead to very weak ciphers. ACM Computing Classification System (1998): E.3, I.2.7, I.2.8.
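A hedged, generic sketch of the kind of backtracking plaintext search described above: candidate plaintext symbols are extended one position at a time and pruned whenever a partial assignment becomes inconsistent with the observed ciphertext constraints. The consistency test below is a stand-in; the actual test against the weakened S-boxes is specific to the cited attack and is not reproduced here.

```python
from typing import Callable, List, Optional, Sequence

def backtracking_search(
    alphabet: Sequence[str],
    length: int,
    consistent: Callable[[List[str]], bool],
) -> Optional[List[str]]:
    """Depth-first backtracking over candidate plaintexts, pruning partial
    assignments that fail the consistency test against the ciphertext."""
    def extend(partial: List[str]) -> Optional[List[str]]:
        if len(partial) == length:
            return partial
        for symbol in alphabet:
            partial.append(symbol)
            if consistent(partial):            # prune inconsistent branches early
                result = extend(partial)
                if result is not None:
                    return result
            partial.pop()
        return None

    return extend([])

# Toy usage: recover a low-entropy "plaintext" whose prefix constraints are known.
secret = list("attack")
found = backtracking_search("abcdefghijklmnopqrstuvwxyz", len(secret),
                            lambda p: p == secret[:len(p)])
```

The efficiency claim in the abstract corresponds to the pruning step: low-entropy plaintext sources make most partial assignments fail the consistency test very early.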
Resumo:
The connectivity between the fish community of estuarine mangroves and that of freshwater habitats upstream remains poorly understood. In the Florida Everglades, mangrove-lined creeks link freshwater marshes to estuarine habitats downstream and may act as dry-season refuges for freshwater fishes. We examined seasonal dynamics in the fish community of ecotonal creeks in the southwestern region of Everglades National Park, specifically Rookery Branch and the North and Watson Rivers. Twelve low-order creeks were sampled via electrofishing, gill nets, and minnow traps during the wet season, transition period, and dry season in 2004-2005. Catches were greater in Rookery Branch than in the North and Watson Rivers, particularly during the transition period. Community composition varied seasonally in Rookery Branch, and to a greater extent for the larger species, reflecting a pulse of freshwater taxa into creeks as marshes upstream dried periodically. The pulse was short-lived; a later sample showed substantial decreases in freshwater fish numbers. No evidence of a similar influx was seen in the North and Watson Rivers, which drain shorter-hydroperiod marshes and exhibit higher salinities. These results suggest that headwater creeks can serve as important dry-season refugia. Increased freshwater flow resulting from Everglades restoration may enhance this connectivity.
Resumo:
The aim of this thesis is to merge two of the emerging paradigms in web programming: RESTful web development and Service-Oriented Programming. REST is the main architectural paradigm for web applications; RESTful applications are characterised by a procedural structure that avoids the use of handshaking mechanisms. Even though REST provides a standard structure for accessing the resources of a web application, the backend is usually not very modular, if not outright complicated. Service-Oriented Programming, instead, has the modularisation of components as one of its fundamental principles. Service-oriented applications are characterised by separate modules that simplify the development of web applications. There are very few examples of integration between these two technologies, so it seems reasonable to merge them. In this thesis the methodologies studied to reach this goal are explored through an application, called MergeFly, that helps several users handle documents and notes. Once all the specifics have been set, the MergeFly practical case is used to develop and handle HTTP requests through SOAP. This document first defines 1) the characteristics of the application, 2) the SOAP technology, 3) a partial introduction to the Jolie language, and 4) REST; finally, 5) a Jolie-REST implementation is offered through the MergeFly case. A token mechanism is implemented for authentication: session- and cookie-based authentication was discarded first, since it does not fit pure REST theory, even though it is often used. In the final part, the functionality and effectiveness of the results are evaluated, judging the Jolie-REST pairing.
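As a minimal illustration of the stateless token mechanism chosen over sessions and cookies, the sketch below issues a signed token at login and verifies it on each request, so the server keeps no per-client session state. This is a generic HMAC-based example in Python, not the MergeFly/Jolie implementation itself; the key and names are illustrative.

```python
import hashlib
import hmac
import time

SECRET_KEY = b"replace-with-a-real-secret"      # illustrative only

def issue_token(user_id: str, ttl_seconds: int = 3600) -> str:
    """Issue a stateless token: a payload plus an HMAC signature."""
    expires = str(int(time.time()) + ttl_seconds)
    payload = f"{user_id}:{expires}"
    signature = hmac.new(SECRET_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}:{signature}"

def verify_token(token: str) -> bool:
    """Check the signature and expiry without any server-side session lookup."""
    try:
        user_id, expires, signature = token.rsplit(":", 2)
    except ValueError:
        return False
    payload = f"{user_id}:{expires}"
    expected = hmac.new(SECRET_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(signature, expected) and time.time() < int(expires)

# Usage: the token travels with every request (e.g. in an Authorization header).
token = issue_token("alice")
assert verify_token(token)
```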
Resumo:
Psychology is a relatively new scientific branch and still lacks a consistent methodological foundation to support its investigations. Given its immaturity, this science finds it difficult to delimit its ontological status, which spawns several epistemological and methodological misconceptions. Because of this, Psychology has failed to demarcate its object of study precisely, thus leading to the emergence of numerous conceptions of the psychic, which resulted in the fragmentation of this science. In its constitution, psychological science inherited a complex philosophical problem: the mind-body issue. Therefore, to define its status, Psychology must still face this problem, seeking to elucidate what the mind is, what the body is, and how they relate. In light of the importance of this issue for a strict demarcation of the psychological object, this research investigated the mind-body problem in the Phenomenological Psychology of Edith Stein (1891-1942), a phenomenologist philosopher who undertook efforts to provide a foundation for Psychology. To that end, the discussion drew on contributions from the Philosophy of Mind and on the support of the phenomenological method for the mind-body problem. From there, using a qualitative bibliographical methodology, the research problem was examined through the analysis of some of the philosopher's philosophical-psychological works, namely "Psychic Causality" (Psychische Kausalität, 1922) and "Introduction to Philosophy" (Einführung in die Philosophie, 1920). For this investigation, a terminological equivalence between the terms mind and psyche was adopted, without prejudice to the discussion, as the philosopher used the latter to refer to the object of Psychology. It was therefore examined how Stein conceived the psyche, the body and the relationship between them. Although it was not the focus of the investigation, the spiritual dimension was also taken into account, as the philosopher conceived the human person as consisting of three dimensions: body, psyche and spirit. In this regard, Stein highlighted the causal mechanism of the psyche, which is based on the variations of the vital force that emerges from the vital sphere. In relation to the corporeal dimension, the philosopher, following the analysis of Edmund Husserl (1859-1938), highlighted the dual aspect of the body, since it is at the same time something material (Körper) and also a living body (Leib). In view of this, it is understood that the psyche and the body are closely connected, so that they constitute a dual unity which is manifested in the Leib. This understanding of the psyche-mind/body problem provides a rich analysis of the issue, enabling the overcoming of some inconsistencies of the monistic and dualistic positions. In doing so, it allows a strict elucidation of the object of Psychology, contributing to the foundation of this science.
Resumo:
Thesis digitized by the Direction des bibliothèques de l'Université de Montréal.
Resumo:
X-ray computed tomography (CT) imaging constitutes one of the most widely used diagnostic tools in radiology today, with nearly 85 million CT examinations performed in the U.S. in 2011. CT imparts a relatively high amount of radiation dose to the patient compared to other x-ray imaging modalities, and as a result of this fact, coupled with its popularity, CT is currently the single largest source of medical radiation exposure to the U.S. population. For this reason, there is a critical need to optimize CT examinations such that the dose is minimized while the quality of the CT images is not degraded. This optimization can be difficult to achieve due to the relationship between dose and image quality. All things being held equal, reducing the dose degrades image quality and can impact the diagnostic value of the CT examination.
A recent push from the medical and scientific community towards using lower doses has spawned new dose reduction technologies such as automatic exposure control (i.e., tube current modulation) and iterative reconstruction algorithms. In theory, these technologies could allow for scanning at reduced doses while maintaining the image quality of the exam at an acceptable level. Therefore, there is a scientific need to establish the dose reduction potential of these new technologies in an objective and rigorous manner. Establishing these dose reduction potentials requires precise and clinically relevant metrics of CT image quality, as well as practical and efficient methodologies to measure such metrics on real CT systems. The currently established methodologies for assessing CT image quality are not appropriate to assess modern CT scanners that have implemented those aforementioned dose reduction technologies.
Thus the purpose of this doctoral project was to develop, assess, and implement new phantoms, image quality metrics, analysis techniques, and modeling tools that are appropriate for image quality assessment of modern clinical CT systems. The project developed image quality assessment methods in the context of three distinct paradigms, (a) uniform phantoms, (b) textured phantoms, and (c) clinical images.
The work in this dissertation used the “task-based” definition of image quality. That is, image quality was broadly defined as the effectiveness by which an image can be used for its intended task. Under this definition, any assessment of image quality requires three components: (1) a well-defined imaging task (e.g., detection of subtle lesions), (2) an “observer” to perform the task (e.g., a radiologist or a detection algorithm), and (3) a way to measure the observer’s performance in completing the task at hand (e.g., detection sensitivity/specificity).
First, this task-based image quality paradigm was implemented using a novel multi-sized phantom platform (with uniform background) developed specifically to assess modern CT systems (Mercury Phantom, v3.0, Duke University). A comprehensive evaluation was performed on a state-of-the-art CT system (SOMATOM Definition Force, Siemens Healthcare) in terms of noise, resolution, and detectability as a function of patient size, dose, tube energy (i.e., kVp), automatic exposure control, and reconstruction algorithm (i.e., Filtered Back-Projection (FBP) vs. Advanced Modeled Iterative Reconstruction (ADMIRE)). A mathematical observer model (i.e., computer detection algorithm) was implemented and used as the basis of image quality comparisons. It was found that image quality increased with increasing dose and decreasing phantom size. The CT system exhibited nonlinear noise and resolution properties, especially at very low doses, large phantom sizes, and for low-contrast objects. Objective image quality metrics generally increased with increasing dose and ADMIRE strength, and with decreasing phantom size. The ADMIRE algorithm could offer comparable image quality at reduced doses or improved image quality at the same dose (increase in detectability index by up to 163% depending on iterative strength). The use of automatic exposure control resulted in more consistent image quality with changing phantom size.
Based on those results, the dose reduction potential of ADMIRE was further assessed specifically for the task of detecting small (<=6 mm) low-contrast (<=20 HU) lesions. A new low-contrast detectability phantom (with uniform background) was designed and fabricated using a multi-material 3D printer. The phantom was imaged at multiple dose levels and images were reconstructed with FBP and ADMIRE. Human perception experiments were performed to measure the detection accuracy from FBP and ADMIRE images. It was found that ADMIRE had equivalent performance to FBP at 56% less dose.
Using the same image data as the previous study, a number of different mathematical observer models were implemented to assess which models would result in image quality metrics that best correlated with human detection performance. The models included naïve simple metrics of image quality such as contrast-to-noise ratio (CNR) and more sophisticated observer models such as the non-prewhitening matched filter observer model family and the channelized Hotelling observer model family. It was found that non-prewhitening matched filter observers and the channelized Hotelling observers both correlated strongly with human performance. Conversely, CNR was found to not correlate strongly with human performance, especially when comparing different reconstruction algorithms.
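For reference, the simplest of the metrics compared above can be written compactly: CNR is the mean signal difference over the background noise, while the non-prewhitening matched filter observer uses the expected signal as a template and reports a detectability index d' from the separation of the template responses under the signal-present and signal-absent hypotheses. The sketch below illustrates both on simulated ROIs; it is a schematic spatial-domain version under simple white-noise assumptions, not the dissertation's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_rois(n=500, size=32, contrast=8.0, noise_sd=10.0, blob_sigma=4.0):
    """Simulated ROIs: a Gaussian blob on a flat background plus white noise
    (purely illustrative; not the dissertation's phantom data)."""
    y, x = np.mgrid[:size, :size]
    c = size / 2.0
    signal = contrast * np.exp(-((x - c) ** 2 + (y - c) ** 2) / (2 * blob_sigma ** 2))
    absent = noise_sd * rng.standard_normal((n, size, size))
    present = signal + noise_sd * rng.standard_normal((n, size, size))
    return signal, present, absent

def cnr(signal, present, absent):
    """Contrast-to-noise ratio: mean pixel difference inside the lesion support
    divided by the background noise standard deviation."""
    mask = signal > 0.5 * signal.max()
    return (present[:, mask].mean() - absent[:, mask].mean()) / absent.std()

def npw_dprime(signal, present, absent):
    """Non-prewhitening matched filter observer: template = expected signal."""
    w = signal.ravel()
    t_p = present.reshape(len(present), -1) @ w
    t_a = absent.reshape(len(absent), -1) @ w
    pooled_var = 0.5 * (t_p.var(ddof=1) + t_a.var(ddof=1))
    return (t_p.mean() - t_a.mean()) / np.sqrt(pooled_var)

sig, roi_p, roi_a = simulate_rois()
print("CNR =", cnr(sig, roi_p, roi_a), "NPW d' =", npw_dprime(sig, roi_p, roi_a))
```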
The uniform background phantoms used in the previous studies provided a good first-order approximation of image quality. However, due to their simplicity and due to the complexity of iterative reconstruction algorithms, it is possible that such phantoms are not fully adequate to assess the clinical impact of iterative algorithms, because patient images obviously do not have smooth uniform backgrounds. To test this hypothesis, two textured phantoms (classified as gross texture and fine texture) and a uniform phantom of similar size were built and imaged on a SOMATOM Flash scanner (Siemens Healthcare). Images were reconstructed using FBP and Sinogram Affirmed Iterative Reconstruction (SAFIRE). Using an image subtraction technique, quantum noise was measured in all images of each phantom. It was found that in FBP, the noise was independent of the background (textured vs uniform). However, for SAFIRE, noise increased by up to 44% in the textured phantoms compared to the uniform phantom. As a result, the noise reduction from SAFIRE was found to be up to 66% in the uniform phantom but as low as 29% in the textured phantoms. Based on this result, it is clear that further investigation was needed to understand the impact that background texture has on image quality when iterative reconstruction algorithms are used.
To further investigate this phenomenon with more realistic textures, two anthropomorphic textured phantoms were designed to mimic lung vasculature and fatty soft tissue texture. The phantoms (along with a corresponding uniform phantom) were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Scans were repeated a total of 50 times in order to get ensemble statistics of the noise. A novel method of estimating the noise power spectrum (NPS) from irregularly shaped ROIs was developed. It was found that SAFIRE images had highly locally non-stationary noise patterns, with pixels near edges having higher noise than pixels in more uniform regions. Compared to FBP, SAFIRE images had 60% less noise on average in uniform regions; for edge pixels, noise was between 20% higher and 40% lower. The noise texture (i.e., NPS) was also highly dependent on the background texture for SAFIRE. Therefore, it was concluded that quantum noise properties in the uniform phantoms are not representative of those in patients for iterative reconstruction algorithms, and texture should be considered when assessing image quality of iterative algorithms.
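The ensemble NPS estimation described above follows a standard recipe: subtract the ensemble mean image to isolate the noise, take the squared magnitude of the 2D Fourier transform of each noise ROI, average over realisations, and normalise by pixel area and ROI size. A sketch for rectangular ROIs is given below; the dissertation's novel handling of irregularly shaped ROIs is not reproduced here, and the pixel size is an assumed placeholder.

```python
import numpy as np

def nps_2d(rois, pixel_size_mm=0.5):
    """Ensemble 2D noise power spectrum from a stack of repeated rectangular ROIs.

    rois : array of shape (n_realisations, ny, nx) taken at the same location
           in repeated scans of the same phantom.
    """
    rois = rois.astype(float)
    noise = rois - rois.mean(axis=0, keepdims=True)     # remove ensemble mean (the signal)
    n, ny, nx = noise.shape
    spectra = np.abs(np.fft.fft2(noise, axes=(-2, -1))) ** 2
    nps = spectra.mean(axis=0) * (pixel_size_mm ** 2) / (nx * ny)
    freqs_x = np.fft.fftfreq(nx, d=pixel_size_mm)       # cycles/mm
    freqs_y = np.fft.fftfreq(ny, d=pixel_size_mm)
    return np.fft.fftshift(nps), np.fft.fftshift(freqs_x), np.fft.fftshift(freqs_y)

# Example with synthetic white noise: the NPS integrates back to the pixel variance.
rois = 20.0 * np.random.default_rng(1).standard_normal((50, 64, 64))
nps, fx, fy = nps_2d(rois)
```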
To move beyond just assessing noise properties in textured phantoms towards assessing detectability, a series of new phantoms was designed specifically to measure low-contrast detectability in the presence of background texture. The textures used were optimized to match the texture in the liver regions of actual patient CT images using a genetic algorithm. The so-called “Clustered Lumpy Background” texture synthesis framework was used to generate the modeled texture. Three textured phantoms and a corresponding uniform phantom were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Images were reconstructed with FBP and SAFIRE and analyzed using a multi-slice channelized Hotelling observer to measure detectability and the dose reduction potential of SAFIRE based on the uniform and textured phantoms. It was found that at the same dose, the improvement in detectability from SAFIRE (compared to FBP) was higher when measured in a uniform phantom than in the textured phantoms.
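The channelized Hotelling observer used above follows the standard CHO construction: project each ROI onto a small set of channels, estimate the channel-output mean difference and covariance, and compute d'² = Δv̄ᵀ S⁻¹ Δv̄. A single-slice sketch with simple difference-of-Gaussian channels is shown below as an illustration under assumed parameters, not the dissertation's exact multi-slice observer.

```python
import numpy as np

def dog_channels(size=32, n_channels=4, sigma0=1.5, ratio=1.67):
    """Difference-of-Gaussian channel bank (a common CHO channel choice)."""
    y, x = np.mgrid[:size, :size]
    r2 = (x - size / 2.0) ** 2 + (y - size / 2.0) ** 2
    channels = []
    for j in range(n_channels):
        s1, s2 = sigma0 * ratio ** j, sigma0 * ratio ** (j + 1)
        ch = np.exp(-r2 / (2 * s2 ** 2)) - np.exp(-r2 / (2 * s1 ** 2))
        channels.append(ch.ravel() / np.linalg.norm(ch))
    return np.array(channels)                   # shape (n_channels, size*size)

def cho_dprime(present, absent, channels):
    """Channelized Hotelling observer detectability index."""
    v_p = present.reshape(len(present), -1) @ channels.T    # channel outputs
    v_a = absent.reshape(len(absent), -1) @ channels.T
    dv = v_p.mean(axis=0) - v_a.mean(axis=0)
    S = 0.5 * (np.cov(v_p, rowvar=False) + np.cov(v_a, rowvar=False))
    return float(np.sqrt(dv @ np.linalg.solve(S, dv)))

# Usage with simulated ROIs (toy Gaussian lesion on white noise)
rng = np.random.default_rng(2)
y, x = np.mgrid[:32, :32]
lesion = 8.0 * np.exp(-((x - 16) ** 2 + (y - 16) ** 2) / (2 * 3.0 ** 2))
absent = 10.0 * rng.standard_normal((400, 32, 32))
present = lesion + 10.0 * rng.standard_normal((400, 32, 32))
print("CHO d' =", cho_dprime(present, absent, dog_channels()))
```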
The final trajectory of this project aimed at developing methods to mathematically model lesions, as a means to help assess image quality directly from patient images. The mathematical modeling framework is first presented. The models describe a lesion’s morphology in terms of size, shape, contrast, and edge profile as an analytical equation. The models can be voxelized and inserted into patient images to create so-called “hybrid” images. These hybrid images can then be used to assess detectability or estimability with the advantage that the ground truth of the lesion morphology and location is known exactly. Based on this framework, a series of liver lesions, lung nodules, and kidney stones were modeled based on images of real lesions. The lesion models were virtually inserted into patient images to create a database of hybrid images to go along with the original database of real lesion images. ROI images from each database were assessed by radiologists in a blinded fashion to determine the realism of the hybrid images. It was found that the radiologists could not readily distinguish between real and virtual lesion images (area under the ROC curve was 0.55). This study provided evidence that the proposed mathematical lesion modeling framework could produce reasonably realistic lesion images.
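A minimal example of the kind of analytical lesion model described above: a spherical lesion parameterised by size, contrast, and an edge-blur profile, voxelized on the image grid and added to a patient volume to form a “hybrid” image. The functional form, parameter names, and values here are illustrative assumptions; the dissertation's models are richer (shape and texture terms are omitted), and the dissertation also performed projection-domain insertion, which is not shown.

```python
import numpy as np

def lesion_model(shape, center, radius_mm, contrast_hu, edge_mm, voxel_mm=0.7):
    """Voxelize an analytical lesion: uniform contrast inside `radius_mm`,
    with a sigmoidal edge profile of width `edge_mm` (illustrative form)."""
    z, y, x = np.indices(shape).astype(float)
    r = voxel_mm * np.sqrt((z - center[0]) ** 2 + (y - center[1]) ** 2 + (x - center[2]) ** 2)
    return contrast_hu / (1.0 + np.exp((r - radius_mm) / edge_mm))   # smooth edge profile

def make_hybrid(patient_volume, lesion):
    """Insert the voxelized lesion into a patient volume (image-domain insertion)."""
    return patient_volume + lesion

# Example: a 6 mm diameter, -15 HU lesion inserted into a synthetic background
background = 60.0 + 10.0 * np.random.default_rng(3).standard_normal((64, 64, 64))
lesion = lesion_model((64, 64, 64), center=(32, 32, 32),
                      radius_mm=3.0, contrast_hu=-15.0, edge_mm=0.5)
hybrid = make_hybrid(background, lesion)
```

Because the inserted lesion's morphology and location are known exactly, such hybrid images provide ground truth for the detectability and estimability studies described in the following paragraphs.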
Based on that result, two studies were conducted which demonstrated the utility of the lesion models. The first study used the modeling framework as a measurement tool to determine how dose and reconstruction algorithm affected the quantitative analysis of liver lesions, lung nodules, and renal stones in terms of their size, shape, attenuation, edge profile, and texture features. The same database of real lesion images used in the previous study was used for this study. That database contained images of the same patient at 2 dose levels (50% and 100%) along with 3 reconstruction algorithms from a GE 750HD CT system (GE Healthcare). The algorithms in question were FBP, Adaptive Statistical Iterative Reconstruction (ASiR), and Model-Based Iterative Reconstruction (MBIR). A total of 23 quantitative features were extracted from the lesions under each condition. It was found that both dose and reconstruction algorithm had a statistically significant effect on the feature measurements. In particular, radiation dose affected five, three, and four of the 23 features (related to lesion size, conspicuity, and pixel-value distribution) for liver lesions, lung nodules, and renal stones, respectively. MBIR significantly affected 9, 11, and 15 of the 23 features (including size, attenuation, and texture features) for liver lesions, lung nodules, and renal stones, respectively. Lesion texture was not significantly affected by radiation dose.
The second study demonstrating the utility of the lesion modeling framework focused on assessing detectability of very low-contrast liver lesions in abdominal imaging. Specifically, detectability was assessed as a function of dose and reconstruction algorithm. As part of a parallel clinical trial, images from 21 patients were collected at 6 dose levels per patient on a SOMATOM Flash scanner. Subtle liver lesion models (contrast = -15 HU) were inserted into the raw projection data from the patient scans. The projections were then reconstructed with FBP and SAFIRE (strength 5). Lesion-less images were also reconstructed. Noise, contrast, CNR, and the detectability index of an observer model (non-prewhitening matched filter) were assessed. It was found that SAFIRE reduced noise by 52%, reduced contrast by 12%, increased CNR by 87%, and increased detectability index by 65% compared to FBP. Further, a 2AFC human perception experiment was performed to assess the dose reduction potential of SAFIRE, which was found to be 22% compared to the standard of care dose.
In conclusion, this dissertation provides to the scientific community a series of new methodologies, phantoms, analysis techniques, and modeling tools that can be used to rigorously assess image quality from modern CT systems. Specifically, methods to properly evaluate iterative reconstruction have been developed and are expected to aid in the safe clinical implementation of dose reduction technologies.
Resumo:
Reductions in administered dose, or even the complete cessation of a chemotherapy treatment, are often the consequence of a drop in the number of neutrophils, the most abundant white blood cells in the blood. This reduction in the absolute neutrophil count, also known as myelosuppression, is precipitated by the non-specific cytotoxic effects of anti-cancer drugs, which, alongside their therapeutic effect, are also toxic to healthy cells. To mitigate this myelosuppressive impact, patients are given recombinant human granulocyte colony-stimulating factor (rhG-CSF), an exogenous form of G-CSF, the hormone responsible for stimulating neutrophil production and their release into the bloodstream. Although the benefits of prophylactic G-CSF treatment during chemotherapy are well established, administration protocols remain poorly defined and are frequently determined ad libitum by clinicians. With the aim of improving therapeutic dosing and rationalising the use of rhG-CSF during chemotherapy, we developed a physiological model of granulopoiesis that incorporates current state-of-the-art knowledge of neutrophil production from hematopoietic stem cells in the bone marrow. To this physiological model we coupled pharmacokinetic/pharmacodynamic (PK/PD) models of two drugs: PM00104 (Zalypsis®), an anti-cancer drug, and rhG-CSF (filgrastim). Relying on first principles underlying the physiology, we estimated the parameters exhaustively without resorting to data fitting, which allowed us to predict clinical data from 172 patients undergoing the CHOP14 protocol (6 chemotherapy cycles with a 14-day period, with rhG-CSF administered from day 4 to day 13 post-chemotherapy). Using this physio-PK/PD model, we showed that the number of rhG-CSF administrations could be reduced from ten (current practice) to four or even three, provided that the start of prophylactic rhG-CSF treatment is delayed. With a view to the clinical applicability of our modelling approach, we investigated the impact of the PK variability present in a patient population on the model predictions by integrating population PK (Pop-PK) models of the two drugs. Considering in silico cohorts of 500 patients for each of five plausible variability scenarios, and using three clinical markers, namely the time to neutrophil nadir, the nadir value, and the area under the concentration-effect curve, we established that there was no significant difference in model predictions between the typical patient and the population. This demonstrates the robustness of the approach we developed, which is akin to a quantitative systems pharmacology (QSP) approach. Motivated by the use of rhG-CSF in the treatment of other diseases, such as periodic pathologies like cyclic neutropenia, we then extended the study of the model to the context of dynamical diseases.
Having shown that the cytokine feedback paradigm is not valid for the exogenous administration of G-CSF mimetics, we developed a novel physiological PK/PD model comprising the free and bound concentrations of G-CSF. This new PK model also required changes to the PD model, since it allowed us to track the concentration of G-CSF bound to neutrophils. We showed that the underlying assumption of equilibrium between the free and bound concentrations, according to the law of mass action, is not valid for G-CSF at endogenous concentrations and would in fact lead to an overestimation of the renal clearance of the drug. In doing so, we were able to reproduce clinical data obtained under various conditions (exogenous G-CSF administration, PM00104 administration, CHOP14). We also provided a coherent explanation of the mechanisms responsible for the physiological response to the two drugs. Finally, to highlight the integrative approach to pharmacology adopted in this thesis, we demonstrated its invaluable contribution to the elucidation and reconstruction of complex living systems, drawing a parallel with other scientific disciplines such as palaeontology and forensics, where a similar approach has largely proven its worth. We also discussed the potential of quantitative systems pharmacology applied to drug development and translational medicine, using the physio-PK/PD model we developed.
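To make the modelling ideas above concrete, the sketch below integrates a generic transit-compartment model of chemotherapy-induced myelosuppression (a proliferating pool, a maturation chain, circulating neutrophils, and a feedback term standing in for G-CSF-mediated regulation), in the spirit of semi-mechanistic granulopoiesis models. It is not the thesis's physiological PK/PD model; the parameter values and the drug-effect term are illustrative assumptions.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters (not those of the thesis)
K_TR  = 0.15    # 1/h, transit rate through maturation compartments
GAMMA = 0.7     # feedback exponent (stands in for G-CSF-mediated regulation)
CIRC0 = 5.0     # baseline circulating neutrophils (1e9 cells/L)
SLOPE = 0.05    # linear drug effect on proliferation
K_EL  = 0.3     # 1/h, drug elimination rate

def rhs(t, y):
    conc, prol, t1, t2, t3, circ = y
    e_drug = SLOPE * conc                              # drug inhibits proliferation
    feedback = (CIRC0 / max(circ, 1e-6)) ** GAMMA      # low counts boost proliferation
    dconc = -K_EL * conc
    dprol = K_TR * prol * (1.0 - e_drug) * feedback - K_TR * prol
    dt1   = K_TR * (prol - t1)
    dt2   = K_TR * (t1 - t2)
    dt3   = K_TR * (t2 - t3)
    dcirc = K_TR * t3 - K_TR * circ
    return [dconc, dprol, dt1, dt2, dt3, dcirc]

# One chemotherapy dose at t=0, then follow neutrophils for three weeks
y0 = [10.0, CIRC0, CIRC0, CIRC0, CIRC0, CIRC0]
sol = solve_ivp(rhs, (0.0, 21 * 24.0), y0, max_step=1.0)
nadir = sol.y[5].min()                                 # depth of the neutrophil nadir
```

The clinical markers mentioned above (time to nadir, nadir value) correspond to properties of the simulated circulating-neutrophil trajectory; a G-CSF dosing term would enter the proliferation and maturation rates in a fuller model.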
Resumo:
Thesis digitized by the Direction des bibliothèques de l'Université de Montréal.