Abstract:
Montréal, Québec, is building toward a compact urban form, but questions arise about the effects on housing affordability and access to homeownership. Taking the urban densification process into account, a survey of a series of residential condominium projects across the city is conducted to disclose the prices of new and under-construction projects. The study first reviews the literature and current research on urban planning, notably work related to Smart Growth, which highlights the context of densification and consumer tendencies to prefer sprawling urban forms. Essentially, Moroni (2010) underscores the dichotomous approach in urban planning between the "teleocratic" and "nomocratic" perspectives. Montréal's current densification is expressed through a multitude of condo models that respond to new demographic trends and lifestyles. Based on the criteria of the Accès Condos program, the CMHC (SCHL) criterion (32% of income), and median household income, the level of accessibility to condominium ownership can be measured. The results indicate that, by these criteria, new and under-construction condominium units are affordable. The analysis contributes empirically to the literature by exposing the links between current urban densification strategies and condo affordability. The research takes a fresh look at the condo phenomenon in Montréal and its demographic trends. The city is divided according to the Burgess model, and the research conducts a comparative price survey to determine affordability. The results suggest that current condo projects are relatively affordable for households with median income or above, according to Accès Condos.
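The 32%-of-gross-income criterion mentioned above reduces to simple arithmetic; a minimal sketch in Python, where the income figure is an illustrative assumption, not a value from the study:

```python
# Affordability check following the CMHC-style 32% gross-income rule
# referenced in the abstract; the dollar figure below is illustrative only.
def affordable_monthly_payment(gross_annual_income, ratio=0.32):
    """Maximum monthly housing cost under a gross-debt-service ratio."""
    return gross_annual_income * ratio / 12

median_income = 60000  # hypothetical median household income
limit = affordable_monthly_payment(median_income)
print(f"max affordable monthly housing cost: ${limit:,.0f}")  # -> $1,600
```

A condo project is then deemed affordable when its carrying costs at median income fall under this limit.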
Abstract:
Therapeutic drug monitoring is recommended for dose adjustment of immunosuppressive agents. A growing number of studies support the relevance of using the area under the concentration-time curve (AUC) as a biomarker for therapeutic monitoring of cyclosporine (CsA) in hematopoietic stem cell transplantation. However, for reasons intrinsic to how the AUC is calculated, its use in the clinical setting is impractical. Limited sampling strategies based on regression approaches (R-LSS) or Bayesian approaches (B-LSS) are practical alternatives for satisfactory AUC estimation. For these methodologies to be applied effectively, however, their design must accommodate clinical reality, notably by requiring a minimal number of concentrations collected over a short sampling window. Moreover, particular attention should be paid to their adequate development and validation. It is also worth noting that irregularity in blood sample collection times can have a non-negligible impact on the predictive performance of R-LSS; to date, this impact has not been studied. This doctoral thesis addresses these issues to enable accurate and practical AUC estimation. The studies were carried out in the context of CsA use in pediatric patients who underwent hematopoietic stem cell transplantation. First, multiple regression approaches as well as population pharmacokinetic (Pop-PK) analysis were used constructively to adequately develop and validate LSS. Then, several Pop-PK models were evaluated, keeping in mind their intended use in the context of AUC estimation.
The performance of B-LSS targeting different versions of the AUC was also studied. Finally, the impact of deviations between actual blood sampling times and planned nominal times on the predictive performance of R-LSS was quantified using a simulation approach covering diverse and realistic scenarios of potential errors in the blood sampling schedule. This work first led to the development of R-LSS and B-LSS with satisfactory clinical performance that are also practical, since they involve 4 or fewer sampling points obtained within 4 hours post-dose. The Pop-PK analysis retained a two-compartment structural model with a lag time. However, the final model (notably, with covariates) did not improve B-LSS performance compared with the structural models (without covariates). In addition, we demonstrated that B-LSS perform better for the AUC derived from simulated concentrations that exclude residual errors, which we termed the "underlying AUC", than for the observed AUC calculated directly from measured concentrations. Finally, our results showed that irregularity in blood collection times has a substantial impact on the predictive performance of R-LSS; this impact depends on the number of samples required, but even more on the duration of the sampling process involved. We also showed that sampling-time errors committed at moments when the concentration changes rapidly are those that most affect the predictive power of R-LSS. More interestingly, we highlighted that even if different R-LSS can perform similarly when based on nominal times, their tolerances to sampling-time errors can differ widely.
Indeed, adequate consideration of the impact of these errors can lead to more reliable selection and use of R-LSS. Through an in-depth investigation of various aspects underlying limited sampling strategies, this thesis provides notable methodological improvements and proposes new avenues for their reliable and informed use, while promoting their suitability for clinical practice.
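The regression-based limited sampling idea (R-LSS) described in this abstract can be illustrated with a toy sketch: simulate concentration-time profiles, compute a reference AUC by the trapezoidal rule, then regress that AUC on a handful of early post-dose concentrations. All pharmacokinetic parameters and sampling times below are illustrative assumptions, not values from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

def profile(times, dose=100.0, ka=1.2, ke=0.3, v=20.0):
    """Synthetic one-compartment oral concentration-time profile (illustrative)."""
    return dose * ka / (v * (ka - ke)) * (np.exp(-ke * times) - np.exp(-ka * times))

# Dense grid for the "reference" AUC over 0-12 h (trapezoidal rule).
t = np.linspace(0, 12, 241)
n = 50
ke_pop = rng.lognormal(np.log(0.3), 0.2, n)   # variable elimination rates
profiles = np.array([profile(t, ke=k) for k in ke_pop])
auc_ref = np.sum((profiles[:, 1:] + profiles[:, :-1]) / 2 * np.diff(t), axis=1)

# R-LSS: regress the reference AUC on a few early concentrations (1, 2, 4 h).
lss_idx = np.searchsorted(t, [1.0, 2.0, 4.0])
X = np.column_stack([np.ones(n), profiles[:, lss_idx]])
coef, *_ = np.linalg.lstsq(X, auc_ref, rcond=None)

auc_pred = X @ coef
bias = np.mean((auc_pred - auc_ref) / auc_ref) * 100  # mean percent prediction error
print(f"mean percent prediction error: {bias:.2f}%")
```

In practice the regression coefficients are estimated in an index dataset and validated in an independent one; the thesis's point about sampling-time deviations can be explored by perturbing `lss_idx` times before predicting.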
Abstract:
Our thesis studies the poetics of space that structures the novel Nadie me verá llorar, published in 1999 by the contemporary Mexican writer Cristina Rivera Garza. By situating her novelistic approach within the postmodern perspective of a new cultural history, Rivera Garza depicts a pivotal moment in Mexican history, from the end of the Porfiriato to the aftermath of the Mexican Revolution, embodying it in the fate of the marginalized. In doing so, she presents a text in which a multitude of narratives merge and blur into a complex whole that puts into perspective a series of spaces of an ambiguous nature. Our analysis seeks to explain this interrelation between the chronotopes of History and of private life, taking into account its impact on the narrative structure. By describing the different modalities of the spaces evoked in the work, we examine the type of relations uniting all of these spaces with the great time of official Mexican History, showing that these elements are governed by a heterotopic politics that cracks the polished surface of the official discourse by inserting a set of elements that subvert it. Identifying and describing this discursive strategy is pertinent insofar as it sheds new light on the novel and seems to characterize Cristina Rivera Garza's work as a whole.
Abstract:
Triple quadrupole mass spectrometers coupled with high-performance liquid chromatography are workhorses in quantitative bioanalysis. They provide substantial benefits, including reproducibility, sensitivity, and selectivity for trace analysis. Selected Reaction Monitoring allows targeted assay development, but the data sets generated contain very limited information. Data mining and analysis of non-targeted high-resolution mass spectrometry profiles of biological samples offer the opportunity to perform more exhaustive assessments, including quantitative and qualitative analysis. The objectives of this study were to test method precision and accuracy, to statistically compare bupivacaine concentrations in real study samples, and to verify whether high-resolution, accurate-mass data collected in scan mode can permit retrospective data analysis, more specifically the extraction of metabolite-related information. The precision and accuracy data from the two instruments were equivalent: overall, accuracy ranged from 106.2 to 113.2% and precision from 1.0 to 3.7%. Statistical comparison of the two methods by linear regression revealed a coefficient of determination (R2) of 0.9996 and a slope of 1.02, demonstrating a very strong correlation. Individual sample comparisons showed differences from -4.5% to 1.6%, well within the accepted analytical error. Moreover, post-acquisition extracted ion chromatograms at m/z 233.1648 ± 5 ppm (M-56) and m/z 305.2224 ± 5 ppm (M+16) revealed the presence of desbutyl-bupivacaine and three distinct hydroxylated bupivacaine metabolites. Post-acquisition analysis allowed us to produce semiquantitative evaluations of the concentration-time profiles of bupivacaine metabolites.
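The cross-platform comparison reported here (slope near 1, R2 near 1, per-sample differences within a few percent) amounts to an ordinary least-squares regression of one method's concentrations on the other's. A sketch on synthetic paired data; every number below is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical paired bupivacaine concentrations (ng/mL) measured by two
# platforms: a triple-quadrupole SRM assay and a high-resolution scan assay.
srm = rng.uniform(5, 500, 40)
hrms = 1.02 * srm + rng.normal(0, 1.0, 40)  # near-unity slope, small scatter

# Ordinary least-squares regression of one method against the other.
slope, intercept = np.polyfit(srm, hrms, 1)
r2 = np.corrcoef(srm, hrms)[0, 1] ** 2

# Per-sample percent difference, as used for individual comparisons.
pct_diff = (hrms - srm) / srm * 100

print(f"slope={slope:.3f}, R^2={r2:.4f}")
print(f"percent difference range: {pct_diff.min():.1f}% to {pct_diff.max():.1f}%")
```

A slope near 1 with negligible intercept and high R2 is what supports declaring the two quantitative methods interchangeable.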
Abstract:
When hurricanes make contact with the built and natural environment, public authorities sometimes have no choice but to order the mandatory evacuation of the population located in at-risk areas. Because of the unpredictability of how a disaster unfolds and of human behavior, evacuation operations face significant uncertainty. Past experience has shown that information and communication technologies (ICT) have the potential to improve the state of the art in evacuation management. Despite this recognition, empirical research on the subject remains limited. This case study of New York City explores how integrating ICT into the operational planning of organizations with transportation responsibilities can improve their response to events and influence the overall success of the disaster management system. The analysis is based on information gathered through semi-structured interviews with New York City transportation and disaster management organizations as well as with academic experts. The results highlight the potential of ICT for internal decision making. Even though ICT are widely recognized as effective means of exchanging information within and between organizations, these uses face certain technological, organizational, structural, and systemic constraints. This observation made it possible to identify the constraints experienced in the usual practices of urban systems management.
Abstract:
By the end of 2004, the Canadian swine population had experienced a severe increase in the incidence of Porcine circovirus-associated disease (PCVAD), a problem that was associated with the emergence of a new Porcine circovirus-2 genotype (PCV-2b), previously unrecovered in North America. Thus it became important to develop a diagnostic tool that could differentiate between the old and new circulating genotypes (PCV-2a and -2b, respectively). Consequently, a multiplex real-time quantitative polymerase chain reaction (mrtqPCR) assay that could sensitively and specifically identify and differentiate PCV-2 genotypes was developed. A retrospective epidemiological survey that used the mrtqPCR assay was performed to determine if cofactors could affect the risk of PCVAD. From 121 PCV-2-positive cases gathered for this study, 4.13%, 92.56% and 3.31% were positive for PCV-2a, PCV-2b, and both genotypes, respectively. In a data analysis using univariate logistic regressions, PCVAD-compatible (PCVAD/c) score was significantly associated with the presence of Porcine reproductive and respiratory syndrome virus (PRRSV), PRRSV viral load, PCV-2 viral load, and PCV-2 immunohistochemistry (IHC) results. Polytomous logistic regression analysis revealed that PCVAD/c score was affected by PCV-2 viral load (P = 0.0161) and IHC (P = 0.0128), but not by the PRRSV variables (P > 0.9), suggesting that mrtqPCR in tissue is a reliable alternative to IHC. Logistic regression analyses revealed that PCV-2 increased the odds ratio of isolating 2 major swine pathogens of the respiratory tract, Actinobacillus pleuropneumoniae and Streptococcus suis serotypes 1/2, 1, 2, 3, 4, and 7, which are serotypes commonly associated with clinical diseases.
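The univariate logistic regressions and odds ratios described in this abstract can be sketched with a minimal Newton-Raphson fit; the data below are simulated with an assumed odds ratio, purely to illustrate the computation, and bear no relation to the study's figures.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical data: PCV-2 status (0/1) and whether a respiratory pathogen
# was co-isolated (0/1); the true odds ratio of 3.0 is an arbitrary choice.
n = 500
pcv2 = rng.integers(0, 2, n)
logit = -1.0 + np.log(3.0) * pcv2
isolated = rng.random(n) < 1 / (1 + np.exp(-logit))

# Univariate logistic regression fitted by Newton-Raphson.
X = np.column_stack([np.ones(n), pcv2])
beta = np.zeros(2)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    grad = X.T @ (isolated - p)          # score vector
    hess = (X * (p * (1 - p))[:, None]).T @ X  # observed information
    beta += np.linalg.solve(hess, grad)

odds_ratio = np.exp(beta[1])  # exponentiated coefficient = odds ratio
print(f"estimated odds ratio: {odds_ratio:.2f}")
```

The exponentiated slope coefficient is the odds ratio reported in such analyses; with real data, its confidence interval would come from the inverse information matrix.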
Abstract:
Directed research project (travail dirigé) presented to the Faculté des arts et sciences in partial fulfillment of the requirements for the degree of Maîtrise in criminology, criminalistics and information option.
Abstract:
Study Design. Reliability study. Objectives. To assess between-acquisition reliability of new multilevel trunk cross-section measurements, in order to define what constitutes a real change when comparing 2 trunk surface acquisitions of the same patient, before and after surgery or throughout clinical monitoring. Summary of Background Data. Several cross-sectional surface measurements have been proposed in the literature for noninvasive assessment of trunk deformity in patients with adolescent idiopathic scoliosis (AIS). However, only the maximum values along the trunk are evaluated and used for monitoring progression and assessing treatment outcome. Methods. Back surface rotation (BSR), trunk rotation (TR), and coronal and sagittal trunk deviation are computed on 300 cross sections of the trunk. Each set of 300 measures is represented as a single functional datum, using a set of basis functions. To evaluate between-acquisition variability at all trunk levels, a test-retest reliability study is conducted on 35 patients with AIS. A functional correlation analysis is also carried out to evaluate any redundancy between the measurements. Results. Each set of 300 measures was successfully described using only 10 basis functions. The test-retest reliability of the functional measurements is good to very good all over the trunk, except above shoulder level. The typical errors of measurement are between 1.20° and 2.2° for the rotational measures and between 2 and 6 mm for the deviation measures. There is a very strong correlation between BSR and TR all over the trunk, a moderate correlation between coronal trunk deviation and both BSR and TR, and no correlation between sagittal trunk deviation and any other measurement. Conclusion. This novel representation of trunk surface measurements allows for a global assessment of trunk surface deformity. Multilevel trunk measurements provide a broader perspective of the trunk deformity and allow reliable multilevel monitoring during clinical follow-up of patients with AIS, as well as a reliable assessment of the esthetic outcome after surgery.
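The "typical error of measurement" used in test-retest reliability studies like this one is computed as the standard deviation of the between-acquisition differences divided by the square root of 2. A sketch with synthetic data; the noise level and cohort values are assumptions for illustration, not the study's numbers:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical test-retest back surface rotation (BSR, degrees) for 35
# patients, two acquisitions each; measurement noise sd is assumed.
true_bsr = rng.normal(8.0, 4.0, 35)
acq1 = true_bsr + rng.normal(0, 1.5, 35)
acq2 = true_bsr + rng.normal(0, 1.5, 35)

# Typical error of measurement: SD of the differences / sqrt(2).
diff = acq2 - acq1
tem = np.std(diff, ddof=1) / np.sqrt(2)

# A change exceeding ~2.77 * TEM (minimal detectable change at 95%)
# can be considered real rather than measurement noise.
mdc95 = 2.77 * tem
print(f"TEM = {tem:.2f} deg, MDC95 = {mdc95:.2f} deg")
```

This is exactly the quantity that lets a clinician decide whether a difference between two acquisitions of the same patient reflects true progression.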
Abstract:
The thesis deals with the analysis of some stochastic inventory models with pooling/retrial of customers. In the first model we analyze an (s,S) production inventory system with retrial of customers. Arrivals of customers from outside the system form a Poisson process, and the inter-production times are exponentially distributed with parameter µ. When the inventory level reaches zero, further arriving demands are sent to the orbit, which has capacity M (< ∞). Customers who find the orbit full and the inventory level at zero are lost to the system. Demands arising from the orbital customers are exponentially distributed with parameter γ. In Model II we extend these results to a perishable inventory system, assuming that the lifetime of each item follows an exponential distribution with parameter θ. The study then deals with an (s,S) production inventory with service times and retrial of unsatisfied customers, where primary demands occur according to a Markovian Arrival Process (MAP); with an (s,S) retrial inventory with service time in which primary demands occur according to a Batch Markovian Arrival Process (BMAP); and with an (s,S) inventory system with service time in which primary demands occur according to a Poisson process with parameter λ. The study then concentrates on two further models. In the first we analyze an (s,S) inventory system with postponed demands, where arrivals of demands form a Poisson process. In the second, we extend our results to a perishable inventory system, assuming that the lifetime of each item follows an exponential distribution with parameter θ. It is also assumed that when the inventory level is zero, an arriving demand chooses to enter the pool with probability β and is lost forever with complementary probability (1 − β). Finally, we analyze an (s,S) production inventory system with switching time. Much work has been reported under the assumption that the switching time is negligible, but this is not the case in several real-life situations.
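The basic (s,S) production dynamic described above is easy to simulate. The sketch below covers only the simplest case (Poisson demand, exponential production, lost sales) and omits the orbit/retrial and perishability features; all rates are illustrative assumptions.

```python
import random

random.seed(4)

def simulate_sS(s=2, S=8, demand_rate=1.0, prod_rate=1.5, horizon=10000.0):
    """Discrete-event sketch of an (s,S) production inventory system.

    Demands arrive as a Poisson process; when the level drops to s,
    production (exponential inter-production times) runs until it is
    restored to S. Demands finding zero inventory are lost. This is a
    simplified illustration, not the thesis's full retrial model.
    """
    t, level, producing = 0.0, S, False
    served = lost = 0
    while t < horizon:
        next_demand = random.expovariate(demand_rate)
        next_prod = random.expovariate(prod_rate) if producing else float("inf")
        if next_demand < next_prod:          # demand event wins the race
            t += next_demand
            if level > 0:
                level -= 1
                served += 1
            else:
                lost += 1
            if level <= s:
                producing = True
        else:                                # production event wins
            t += next_prod
            level += 1
            if level >= S:
                producing = False
    return served, lost

served, lost = simulate_sS()
print(f"fill rate ≈ {served / (served + lost):.3f}")
```

Regenerating both exponential clocks after every event is valid here because the exponential distribution is memoryless; the retrial/orbit extension would add a third competing clock with rate γ per orbiting customer.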
Abstract:
A new compact microstrip antenna element is analyzed. The analysis can accurately predict the resonant frequency, input impedance, and radiation patterns. The predicted results are compared with experimental results, and excellent agreement is observed. These antenna elements are most suitable in applications where limited antenna real estate is available.
Abstract:
The present research problem is to study existing encryption methods and to develop a new technique that is superior in performance to existing techniques and, at the same time, can be incorporated into the communication channels of fault-tolerant hard real-time systems alongside existing error-checking/error-correcting codes, so that attempts at eavesdropping can be defeated. Many encryption methods are available today, each with its own merits and demerits. Similarly, many cryptanalysis techniques used by adversaries are also available.
Abstract:
To ensure quality of machined products at minimum machining cost and maximum machining effectiveness, it is very important to select optimum parameters when metal-cutting machine tools are employed. Traditionally, the experience of the operator plays a major role in the selection of optimum metal-cutting conditions. However, attaining optimum values each time is difficult even for a skilled operator. The non-linear nature of the machining process has compelled engineers to search for more effective methods of optimization. The design objective preceding most engineering design activities is simply to minimize the cost of production or to maximize production efficiency. The main aim of the research work reported here is to build robust optimization algorithms by exploiting ideas that nature has to offer and using them to solve real-world optimization problems in manufacturing processes. In this thesis, after an exhaustive literature review, several optimization techniques used in various manufacturing processes were identified. The selection of optimal cutting parameters, such as depth of cut, feed, and speed, is a very important issue for every machining process. Experiments were designed using the Taguchi technique, and dry turning of SS420 was performed on a Kirloskar Turn Master 35 lathe. S/N and ANOVA analyses were performed to find the optimum level and percentage contribution of each parameter; the S/N analysis yielded the optimum machining parameters from the experiments. Optimization algorithms begin with one or more design solutions supplied by the user and then iteratively evaluate new design solutions in the relevant search spaces in order to reach the true optimum.
A mathematical model was developed using response surface analysis for surface roughness, and the model was validated against published results from the literature. Optimization methodologies such as Simulated Annealing (SA), Particle Swarm Optimization (PSO), the Conventional Genetic Algorithm (CGA), and an Improved Genetic Algorithm (IGA) were applied to optimize machining parameters for dry turning of SS420 material. All of these algorithms were tested for efficiency, robustness, and accuracy, and it was observed how often they outperform conventional optimization methods on difficult real-world problems. The SA, PSO, CGA, and IGA codes were developed in MATLAB. For each evolutionary algorithmic method, optimum cutting conditions are provided to achieve better surface finish. The computational results using SA clearly demonstrated that the proposed solution procedure is quite capable of solving such complicated problems effectively and efficiently. PSO is a relatively recent heuristic search method whose mechanics are inspired by the swarming or collaborative behavior of biological populations; the results show that PSO provides better results and is also more computationally efficient. Based on the results obtained using CGA and IGA for the optimization of the machining process, the proposed IGA, which incorporates a stochastic crossover technique and an artificial initial-population scheme to provide a faster search mechanism, outperforms the conventional GA. Finally, a comparison among these algorithms was made for the specific example of dry turning of SS420 material, arriving at optimum machining parameters of feed, cutting speed, depth of cut, and tool nose radius with minimum surface roughness as the criterion.
To summarize, the research work fills conspicuous gaps between research prototypes and industry requirements by simulating evolutionary procedures, seen in nature, that optimize its own systems.
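The Taguchi analysis mentioned in this abstract ranks parameter settings by a signal-to-noise ratio; for a response to be minimized, such as surface roughness, the "smaller-the-better" form is S/N = -10·log10(mean(y²)). A sketch with hypothetical replicate measurements (the settings and roughness values are invented for illustration):

```python
import math

def sn_smaller_is_better(ys):
    """Taguchi smaller-the-better S/N ratio in dB for replicate responses ys."""
    return -10.0 * math.log10(sum(y * y for y in ys) / len(ys))

# Hypothetical surface-roughness replicates (Ra, um) for three trial settings.
trials = {
    "low speed / high feed": [3.2, 3.5, 3.1],
    "mid speed / mid feed": [2.1, 2.3, 2.0],
    "high speed / low feed": [1.2, 1.1, 1.3],
}
for name, ys in trials.items():
    print(f"{name}: S/N = {sn_smaller_is_better(ys):.2f} dB")

# The setting with the highest S/N ratio is preferred.
best = max(trials, key=lambda k: sn_smaller_is_better(trials[k]))
print("best setting:", best)
```

ANOVA would then apportion the variation in these S/N ratios to each factor to obtain the percentage contributions the abstract refers to.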
Abstract:
In this thesis, applications of recurrence quantification analysis (RQA) to metal-cutting operations on a lathe, with the specific objective of detecting tool wear and chatter, are presented. This study is based on the discovery that the process dynamics in a lathe are low-dimensional chaotic, which implies that the machine dynamics are controllable using principles of chaos theory. This understanding stands to revolutionize the feature extraction methodologies used in condition monitoring systems, as conventional linear methods and models are incapable of capturing the critical and strange behaviors associated with the metal-cutting process. Since sensor-based approaches provide an automated and cost-effective way to monitor and control, an efficient feature extraction methodology based on nonlinear time series analysis is in high demand. The task is more complex when the information must be deduced solely from sensor signals, since traditional methods do not address how to treat the noise present in real-world processes or their non-stationarity. In an effort to overcome these two issues as far as possible, this thesis adopts the recurrence quantification analysis methodology, since this feature extraction technique is found to be robust against noise and non-stationarity in the signals. The work consists of two sets of experiments on a lathe: set 1 and set 2. The set 1 experiments study the influence of tool wear on the RQA variables, whereas set 2 is carried out to identify the RQA variables sensitive to machine tool chatter, followed by validation in actual cutting. To obtain the bounds of the spectrum of the significant RQA variable values, in set 1 a fresh tool and a worn tool are used for cutting. The first part of the set 2 experiments uses a stepped shaft in order to create chatter at a known location, and the second part uses a conical section with a uniform taper along the axis, causing chatter to onset at some distance from the smaller end as the depth of cut gradually increases while the spindle speed and feed rate are kept constant. The study concludes by revealing the unambiguous dependence of certain RQA variables (percent determinism, percent recurrence, and entropy) on tool wear and chatter. The results establish this methodology as viable for detection of tool wear and chatter in metal-cutting operations on a lathe. The key reason is that the dynamics of the system under study are nonlinear, and recurrence quantification analysis can characterize them adequately. This work establishes that the principles and practice of machining can benefit considerably from nonlinear dynamics and chaos theory.
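The RQA variables named in this abstract are derived from a recurrence matrix: the time series is delay-embedded, pairwise distances are thresholded, and quantities like percent recurrence are read off the resulting binary matrix. A toy sketch; the embedding dimension, delay, threshold rule, and signals are all arbitrary illustrative choices.

```python
import numpy as np

def recurrence_matrix(x, dim=3, tau=1, eps=None):
    """Recurrence matrix of a scalar series after delay embedding."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
    if eps is None:
        eps = 0.2 * d.max()  # arbitrary threshold for illustration
    return d <= eps

def percent_recurrence(R):
    """%REC: fraction of recurrent points in the matrix, as a percentage."""
    return 100.0 * R.sum() / R.size

rng = np.random.default_rng(5)
t = np.linspace(0, 8 * np.pi, 400)
periodic = np.sin(t)                 # regular signal
noise = rng.standard_normal(400)     # irregular signal

R_periodic = recurrence_matrix(periodic)
print(f"%REC periodic = {percent_recurrence(R_periodic):.1f}")
print(f"%REC noise    = {percent_recurrence(recurrence_matrix(noise)):.1f}")
```

Percent determinism and entropy extend this by measuring diagonal line structures in the same matrix, which is where the sensitivity to tool wear and chatter reported in the thesis comes from.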
Abstract:
Timely detection of sudden changes in dynamics that adversely affect the performance of systems and the quality of products has great scientific relevance. This work focuses on effective detection of dynamical changes in real-time signals from mechanical as well as biological systems using a fast and robust technique, permutation entropy (PE). The results are used in detecting chatter onset in machine turning and in identifying vocal disorders from speech signals. Permutation entropy is a nonlinear complexity measure that can efficiently distinguish the regular and complex nature of any signal and extract information about changes in the dynamics of a process by indicating sudden changes in its value. Here we propose the use of PE to detect dynamical changes in two nonlinear processes: turning, a mechanical system, and speech, a biological system. The effectiveness of PE in detecting changes in the dynamics of the turning process is studied from time series generated from samples of audio and current signals. Experiments are carried out on a lathe for a sudden increase in depth of cut and for a continuous increase in depth of cut on mild steel workpieces, keeping the speed and feed rate constant. The results are applied to detect chatter onset in machining and are verified using frequency spectra of the signals and the nonlinear measure normalized coarse-grained information rate (NCIR). PE analysis is also carried out to investigate the variation in surface texture caused by chatter on the machined workpiece: a statistical parameter from the optical grey-level intensity histogram of the laser speckle pattern, recorded using a charge-coupled device (CCD) camera, is used to generate the time series required for PE analysis, and a standard optical roughness parameter is used to confirm the results. The application of PE in identifying vocal disorders is studied from speech signals recorded using a microphone. The analysis is carried out using speech signals of subjects with different pathological conditions and of normal subjects, and the results are used to identify vocal disorders; the standard linear technique of FFT is used to substantiate the results. The results of PE analysis in all three cases clearly indicate that this complexity measure is sensitive to changes in the regularity of a signal and hence can suitably be used for detection of dynamical changes in real-world systems. This work establishes the application of the simple, inexpensive, and fast PE algorithm for the benefit of advanced manufacturing processes as well as clinical diagnosis of vocal disorders.
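Permutation entropy is simple to compute: count the ordinal patterns of short windows of the series and take the Shannon entropy of their distribution. A minimal sketch of the Bandt-Pompe procedure; the order and delay below are arbitrary illustrative choices.

```python
import math
import random

def permutation_entropy(x, order=3, delay=1):
    """Normalized permutation entropy: 0 = fully regular, 1 = maximally irregular."""
    counts = {}
    n = len(x) - (order - 1) * delay
    for i in range(n):
        window = tuple(x[i + j * delay] for j in range(order))
        # Ordinal pattern: the ranking of the values within the window.
        pattern = tuple(sorted(range(order), key=window.__getitem__))
        counts[pattern] = counts.get(pattern, 0) + 1
    probs = [c / n for c in counts.values()]
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(math.factorial(order))  # normalize by max entropy

random.seed(6)
monotone = list(range(200))                    # perfectly regular series
noisy = [random.random() for _ in range(200)]  # irregular series

print(permutation_entropy(monotone))  # -> 0.0 (a single ordinal pattern)
print(permutation_entropy(noisy))     # close to 1
```

A sudden rise in PE computed over sliding windows of a machining or speech signal is what flags the dynamical change (chatter onset, pathological voicing) discussed above.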