886 results for real option analysis
Abstract:
This paper describes a realistic simulator of the Computed Tomography (CT) scan process for motion analysis. We are currently developing a new framework to detect small motion from CT scans. In order to validate this framework, or potentially any other algorithm, we present in this paper a simulator that reproduces the whole CT acquisition process with a priori known parameters. In other words, it is a digital phantom for motion analysis that can be used to compare the results of any related algorithm against a ground-truth realistic analytical model. Such a simulator can be used by the community to test different algorithms in the biomedical imaging domain. Its most important features are the many considerations it takes into account to simulate the real acquisition process as faithfully as possible, and its generality.
Abstract:
The computer simulation of reaction dynamics has nowadays reached a remarkable degree of accuracy. Triatomic elementary reactions can be studied rigorously and in great detail using a considerable variety of Quantum Dynamics computational tools available to the scientific community. In our contribution we compare the performance of two quantum scattering codes in the computation of reaction cross sections of a triatomic benchmark reaction, the gas phase reaction Ne + H2+ → NeH+ + H. The computational codes are selected as representative of time-dependent (Real Wave Packet [ ]) and time-independent (ABC [ ]) methodologies. The main conclusion to be drawn from our study is that both strategies are, to a great extent, not competing but rather complementary. While time-dependent calculations offer advantages with respect to the energy range that can be covered in a single simulation, time-independent approaches offer much more detailed information from each single-energy calculation. Further details, such as the calculation of reactivity at very low collision energies or the computational effort required to account for the Coriolis couplings, are analyzed in this paper.
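For orientation, both methodologies ultimately target the same observable: the reaction cross section, obtained by summing partial-wave reaction probabilities. A standard textbook form of this relation (notation assumed here, not taken from the paper) is:

```latex
% Reaction cross section from partial-wave reaction probabilities.
% k is the initial wavenumber (k^2 = 2\mu E_c/\hbar^2), P_r^J the reaction
% probability for total angular momentum J, summed to a converged J_max.
\sigma_r(E_c) \;=\; \frac{\pi}{k^2} \sum_{J=0}^{J_{\max}} (2J+1)\, P_r^{J}(E_c)
```

A time-independent code yields converged P_r^J values at each chosen energy, while a wave-packet propagation yields them over a band of energies at once, which is the complementarity the abstract describes.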
Abstract:
This past winter the sieve analysis of combined aggregate was investigated. This study was designated No. 26 by the Central Laboratory. The purpose of this work was to develop a sieve analysis procedure for combined aggregate which is less time consuming and has the same accuracy as the method described in I.M. 304. In an attempt to use a variety of aggregates for this investigation, a request was made to each District Materials Office to obtain at least three different combined aggregate samples in their respective districts. At the same time it was also requested that the field technicians test these samples prior to submitting them to the Central Laboratory. The field technicians were instructed to test each sample as described in method I.M. 304 and also by a modified AASHTO T27 method, identified in this report as Method A. The modified AASHTO Method A was identical to T27 with the exception that a smaller sample is used for testing. The field technicians submitted the samples, test results and also comments regarding the modified AASHTO procedure. The general comment on the modified AASHTO procedure was that the method was much simpler to follow; however, it took about the same amount of time, so there was no real advantage. After reviewing AASHTO T27, T164, I.M. 304 and Report No. FHWA-RD-77-53, another test method was proposed. Report No. FHWA-RD-77-53 is a report prepared by the FHWA from data they gathered concerning control practices and shortcut or alternative test methods for aggregate gradation. A second test method was developed which was also very similar to AASHTO T27. The test procedure for this method is attached and is identified as Method B. The following is a summary of test results submitted by the field technicians and obtained by the aggregate section of the Central Laboratory.
Abstract:
Unlike the evaluation of single items of scientific evidence, the formal study and analysis of the joint evaluation of several distinct items of forensic evidence has to date received punctual, rather than systematic, attention. Questions about (i) the relationships among a set of (usually unobservable) propositions and a set of (observable) items of scientific evidence, (ii) the joint probative value of a collection of distinct items of evidence, and (iii) the contribution of each individual item within a given group of pieces of evidence still represent fundamental areas of research. To some degree, this is remarkable, since forensic science theory and practice, as well as many daily inference tasks, require the consideration of multiple items if not masses of evidence. A recurrent and particular complication that arises in such settings is that the application of probability theory, i.e. the reference method for reasoning under uncertainty, becomes increasingly demanding. The present paper takes this as a starting point and discusses graphical probability models, i.e. Bayesian networks, as a framework within which the joint evaluation of scientific evidence can be approached in some viable way. Based on a review of the main existing contributions in this area, the article presents instances of real case studies from the author's institution in order to point out the usefulness and capacities of Bayesian networks for the probabilistic assessment of the probative value of multiple and interrelated items of evidence. A main emphasis is placed on underlying general patterns of inference, their representation, as well as their graphical probabilistic analysis. Attention is also drawn to inferential interactions, such as redundancy, synergy and directional change, which distinguish the joint evaluation of evidence from assessments of isolated items of evidence. Together, these topics present aspects of interest to both domain experts and recipients of expert information, because they have a bearing on how multiple items of evidence are meaningfully and appropriately set into context.
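As a concrete illustration of the kind of computation such networks support, the following minimal Python sketch (with entirely hypothetical probabilities, not taken from the paper's case studies) evaluates the joint likelihood ratio of two interrelated items of evidence and contrasts it with the value of one item in isolation:

```python
# Minimal sketch (hypothetical numbers): joint evaluation of two items of
# evidence E1, E2 for a proposition H via a small Bayesian network
# H -> E1, H -> E2, with E2 also depending on E1 (so the items interact).

p_H = 0.5                                   # prior P(H = true)
p_E1_given_H = {True: 0.9, False: 0.2}      # P(E1 | H)
# P(E2 | H, E1): when E2 partly repeats E1's information, the joint
# likelihood ratio falls short of what the items suggest individually.
p_E2_given_H_E1 = {
    (True, True): 0.92, (True, False): 0.60,
    (False, True): 0.55, (False, False): 0.15,
}

def likelihood(h: bool, e1: bool, e2: bool) -> float:
    """P(E1 = e1, E2 = e2 | H = h) by the chain rule of the network."""
    p1 = p_E1_given_H[h] if e1 else 1 - p_E1_given_H[h]
    p2 = p_E2_given_H_E1[(h, e1)] if e2 else 1 - p_E2_given_H_E1[(h, e1)]
    return p1 * p2

# Joint likelihood ratio for observing both items (E1 = E2 = true):
lr_joint = likelihood(True, True, True) / likelihood(False, True, True)

# Individual LR of E1 alone, for comparison:
lr_e1 = p_E1_given_H[True] / p_E1_given_H[False]

posterior_odds = (p_H / (1 - p_H)) * lr_joint
print(f"LR(E1) = {lr_e1:.2f}, joint LR(E1, E2) = {lr_joint:.2f}")
print(f"posterior odds on H = {posterior_odds:.2f}")
```

Because E2 partly repeats E1's information in this toy network, the joint LR differs from a naive product of individual LRs, which is exactly the redundancy/synergy interaction the abstract highlights.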
Abstract:
PURPOSE: The aim of this study was to determine outcomes of total hip replacement (THR) with the Lemania cemented femoral stem. METHODS: A total of 78 THR patients were followed and compared to 17 "fit", healthy elderly subjects and 72 "frail" elderly subjects without THR, using clinical outcome measures and a portable, in-field gait analysis device at five- and ten-year follow-up. RESULTS: Forty-one patients (53%), mean age 83.4 years, available at ten-year follow-up, reported very good to excellent satisfaction. Mean Harris Hip and Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) scores were 81.2 and 10.5 points, respectively, with excellent radiological preservation of proximal femoral bone stock. Spatial and temporal gait parameters were close to those of the fit group and better than those of the frail group. CONCLUSIONS: Lemania THR demonstrated very good, stable clinical and radiological results at ten years in an older patient group, comparable to other cemented systems for primary THR. Gait analysis confirmed good walking performance in a real-life environment.
Abstract:
Background: Mantle cell lymphoma (MCL) is a rare subtype (3-9%) of non-Hodgkin lymphoma (NHL) with a relatively poor prognosis (5-year survival < 40%). Although consolidation of first remission with autologous stem cell transplantation (ASCT) is regarded as the gold standard, less than half of the patients can undergo this intensive treatment due to advanced age and co-morbidities. Standard-dose non-myeloablative radioimmunotherapy (RIT) seems to be a very efficient approach for the treatment of certain NHL. However, there are almost no data available on the efficacy and safety of RIT in MCL. Methods and Patients: In the RIT-Network, a web-based international registry collecting real observational data from RIT-treated patients, 115 MCL patients treated with ibritumomab tiuxetan were recorded. Most of the patients were elderly males with advanced-stage disease: median age 63 (range 31-78); males 70.4%; stage III/IV 92%. RIT (i.e. application of ibritumomab tiuxetan) was part of the first-line therapy in 48 pts. (43%). A further 38 pts. (33%) received ibritumomab tiuxetan after two previous chemotherapy regimens, and 33 pts. (24%) after completing 3-8 lines. In 75 cases RIT was applied as consolidation of a chemotherapy-induced response; the rest of the patients received ibritumomab tiuxetan because of relapsed/refractory disease. At the moment, follow-up data are available for 74 MCL patients. Results: After RIT the patients achieved a high response rate: CR 60.8%, PR 25.7%, and SD 2.7%. Only 10.8% of the patients progressed. For the survival analysis many data had to be censored since the documentation had not yet been completed. The projected 3-year overall survival (OAS, Fig. 1) after radioimmunotherapy was 72% for pts. subjected to RIT consolidation versus 29% for those treated for relapsed/refractory disease (p=0.03). RIT was feasible for almost all patients; only 3 procedure-related deaths were reported in the whole group. The main adverse event was hematological toxicity (grade III/IV cytopenias), with median times to recovery of Hb, WBC and Plt of 45, 40 and 38 days, respectively. Conclusion: Standard-dose non-myeloablative RIT is a feasible and safe treatment modality, even for elderly MCL pts. Consolidation radioimmunotherapy with ibritumomab tiuxetan may prolong the survival of patients who achieve a clinical response after chemotherapy. Therefore, this consolidation approach should be considered as a treatment strategy for those who are not eligible for ASCT. RIT also has a potential role as palliative therapy in relapsed/resistant cases.
Abstract:
The final year project came to us as an opportunity to get involved in a topic which had appeared attractive during the learning process of majoring in economics: statistics and its application to the analysis of economic data, i.e. econometrics. Moreover, the combination of econometrics and computer science is a very hot topic nowadays, given the Information Technologies boom of the last decades and the consequent exponential increase in the amount of data collected and stored day by day. Data analysts able to deal with Big Data and extract useful results from it are in high demand these days and, in our understanding, the work they do, although sometimes controversial in terms of ethics, is a clear source of added value both for private corporations and the public sector. For these reasons, the essence of this project is the study of a statistical instrument valid for the analysis of large datasets and directly related to computer science: Partial Correlation Networks. The structure of the project has been determined by our objectives throughout its development. First, the characteristics of the studied instrument are explained, from the basic ideas up to the features of the model behind it, with the final goal of presenting the SPACE model as a tool for estimating interconnections between elements in large data sets. Afterwards, an illustrated simulation is performed in order to show the power and efficiency of the presented model. Finally, the model is put into practice by analyzing a relatively large set of real-world data, with the objective of assessing whether the proposed statistical instrument is valid and useful when applied to a real multivariate time series. In short, our main goals are to present the model and to evaluate whether Partial Correlation Network Analysis is an effective, useful instrument that allows finding valuable results in Big Data. The findings throughout this project suggest that the Partial Correlation Estimation by Joint Sparse Regression Models approach presented by Peng et al. (2009) works well under the assumption of sparsity of the data. Moreover, partial correlation networks are shown to be a valid tool to represent cross-sectional interconnections between elements in large data sets. The scope of this project is, however, limited, as there are some sections in which a deeper analysis would have been appropriate. Considering intertemporal connections between elements, the choice of the tuning parameter lambda, or a deeper analysis of the results in the real-data application are examples of aspects in which this project could be extended. To sum up, the analyzed statistical tool has proved to be a very useful instrument for finding relationships that connect the elements present in a large data set. After all, partial correlation networks allow the owner of such a set to observe and analyze existing linkages that could otherwise have been overlooked.
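To make the underlying idea concrete: a partial correlation network connects two variables only when their correlation survives conditioning on all the others. The following minimal numpy sketch derives partial correlations from the precision (inverse covariance) matrix and thresholds them into a network; this is the basic unregularized route, not the sparse SPACE estimator of Peng et al. (2009), and the threshold and data are illustrative:

```python
# Minimal sketch of a partial correlation network: estimate partial
# correlations from the precision matrix and keep edges above a threshold.
import numpy as np

def partial_correlation_network(X: np.ndarray, threshold: float = 0.1):
    """X: (n_samples, p_variables). Returns the (p, p) partial correlation
    matrix and a boolean adjacency matrix of the implied network."""
    precision = np.linalg.inv(np.cov(X, rowvar=False))
    d = np.sqrt(np.diag(precision))
    # rho_ij = -omega_ij / sqrt(omega_ii * omega_jj)
    pcorr = -precision / np.outer(d, d)
    np.fill_diagonal(pcorr, 1.0)
    adjacency = (np.abs(pcorr) > threshold) & ~np.eye(len(d), dtype=bool)
    return pcorr, adjacency

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 10))          # placeholder data
pcorr, adj = partial_correlation_network(X)
print(adj.sum() // 2, "edges in the estimated network")
```

SPACE replaces the plain matrix inversion with a joint sparse regression, which is what makes the approach workable when the number of variables is large relative to the sample size.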
Abstract:
PURPOSE: The aim of this study was to characterize oligonucleotide-polyethylenimine (ODN/PEI) complex preparation for potential transfection of retinal cells in vitro and in vivo. METHODS: The effect of medium preparation [HEPES-buffered saline (HBS), water] on particle size and morphology was evaluated. Cultured Lewis rat retinal Müller glial (RMG) cells were transfected using fluorescein isothiocyanate (FITC)-ODN/PEI complexes specifically directed at transforming growth factor beta (TGFbeta)-2. Efficacy of transfection was evaluated using confocal microscopy, and regulation of gene expression was assayed using quantitative real-time RT-PCR and ELISA assay. One, 24, and 72 h after injection of FITC-ODN/PEI complexes into the vitreous of rat eyes, their distribution was analyzed on eye sections. RESULTS: Complexes prepared in HBS were smaller than complexes prepared in pure water and presented a core-shell structure. These particles showed a high cellular internalization efficacy, along with a significant and specific down-regulation of TGFbeta-2 expression and production in RMG cells, correlating with specific inhibition of cell growth at 72 h. In vivo, complexes efficiently transfect retinal cells and follow a transretinal migration at 24 h. After 72 h, ODN seems to preferentially target RMG cells without inducing any detectable toxicity. CONCLUSIONS: Specific down-regulation of TGFbeta-2 expression using ODN/PEI complexes may have potential interest for the treatment of retinal diseases associated with glial proliferation.
Abstract:
We present the application of a real-time quantitative PCR assay, previously developed to measure relative telomere length in humans and mice, to two bird species, the zebra finch Taeniopygia guttata and the Alpine swift Apus melba. This technique is based on the PCR amplification of telomeric (TTAGGG)n sequences using specific oligonucleotide primers. Relative telomere length is expressed as the ratio (T/S) of telomere repeat copy number (T) to control single gene copy number (S). This method is particularly useful for comparisons of individuals within species, or where the same individuals are followed longitudinally. We used glyceraldehyde-3-phosphate dehydrogenase (GAPDH) as the single control gene. In both species, we validated our PCR measurements of relative telomere length against absolute measurements of telomere length determined by the conventional method of quantifying telomere terminal restriction fragment (TRF) lengths, using both traditional Southern blot analysis (Alpine swifts) and in-gel hybridization (zebra finches). As found in humans and mice, telomere lengths in the same sample measured by TRF and PCR were well correlated in both the Alpine swift and the zebra finch. Hence, this PCR assay for the measurement of bird telomeres, which is fast and requires only small amounts of genomic DNA, should open new avenues in the study of environmental factors influencing variation in telomere length, and of how this variation translates into variation in cellular and whole-organism senescence.
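As a rough illustration of the T/S calculation described above, here is a minimal Python sketch; the Ct values and the assumption of ~100% amplification efficiency for both reactions are illustrative simplifications, not the study's exact quantification protocol:

```python
# Minimal sketch of a relative T/S ratio from qPCR cycle-threshold (Ct)
# values, assuming perfect doubling per cycle for both reactions.
def t_s_ratio(ct_telomere: float, ct_single_copy: float) -> float:
    """Telomere repeat copy number relative to a single-copy gene (e.g. GAPDH).
    Under perfect doubling per cycle, copy number ~ 2**(-Ct), so
    T/S = 2**(ct_single_copy - ct_telomere)."""
    return 2.0 ** (ct_single_copy - ct_telomere)

def relative_t_s(ct_tel: float, ct_scg: float,
                 ref_ct_tel: float, ref_ct_scg: float) -> float:
    """T/S of a sample normalized to a reference sample, so that the
    reference has relative telomere length 1.0."""
    return t_s_ratio(ct_tel, ct_scg) / t_s_ratio(ref_ct_tel, ref_ct_scg)

# Hypothetical Ct values for illustration only:
print(relative_t_s(ct_tel=14.2, ct_scg=21.0, ref_ct_tel=14.8, ref_ct_scg=21.1))
```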
Abstract:
The Wigner higher order moment spectra (WHOS) are defined as extensions of the Wigner-Ville distribution (WD) to higher order moment spectra domains. A general class of time-frequency higher order moment spectra is also defined in terms of arbitrary higher order moments of the signal, as generalizations of Cohen's general class of time-frequency representations. The properties of the general class of time-frequency higher order moment spectra can be related to the properties of WHOS, which are, in fact, extensions of the properties of the WD. Discrete time and frequency Wigner higher order moment spectra (DTF-WHOS) distributions are introduced for signal processing applications and are shown to be implemented with two FFT-based algorithms. One application is presented where the Wigner bispectrum (WB), which is a WHOS in the third-order moment domain, is utilized for the detection of transient signals embedded in noise. The WB is compared with the WD in terms of simulation examples and analysis of real sonar data. It is shown that better detection schemes can be derived, at low signal-to-noise ratio, when the WB is applied.
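For context, here is a minimal sketch of the second-order object the WHOS generalize: a discrete Wigner-Ville distribution computed with one FFT per time sample. This is a textbook implementation for orientation only, not the paper's DTF-WHOS algorithms:

```python
# Discrete (pseudo) Wigner-Ville distribution: FFT of the instantaneous
# autocorrelation at each time sample.
import numpy as np

def wigner_ville(x: np.ndarray) -> np.ndarray:
    """x: complex analytic signal of length N. Returns an (N, N) array
    W[n, k]: time index n, frequency bin k."""
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        lag_max = min(n, N - 1 - n)            # largest symmetric lag at time n
        kernel = np.zeros(N, dtype=complex)
        for m in range(-lag_max, lag_max + 1): # instantaneous autocorrelation
            kernel[m % N] = x[n + m] * np.conj(x[n - m])
        W[n] = np.real(np.fft.fft(kernel))     # FFT over the lag variable
    return W

# Example: a linear chirp's WD concentrates along its instantaneous frequency.
t = np.arange(256)
chirp = np.exp(1j * 2 * np.pi * (0.05 + 0.0005 * t) * t)
W = wigner_ville(chirp)
print(W.shape, "peak frequency bin at mid-signal:", W[128].argmax())
```

The Wigner bispectrum extends the same construction to a third-order moment kernel over two lag variables, hence the two FFT-based algorithms the abstract mentions.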
Abstract:
This work is devoted to the problem of reconstructing the basis weight structure of the paper web with black-box techniques. The data that is analyzed comes from a real paper machine and is collected by an off-line scanner. The principal mathematical tool used in this work is Autoregressive Moving Average (ARMA) modelling. When coupled with the Discrete Fourier Transform (DFT), it gives a very flexible and interesting tool for analyzing properties of the paper web. Both ARMA and DFT are used independently to represent the given signal in a simplified version of our algorithm, but the final goal is to combine the two. The Ljung-Box Q-statistic lack-of-fit test, combined with the Root Mean Squared Error coefficient, gives a tool to separate significant signals from noise.
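As a rough illustration of that toolchain, the following sketch fits an ARMA model to a synthetic scan signal, applies the Ljung-Box Q-test to its residuals, and inspects the spectrum with the DFT (via numpy and statsmodels). The model order, lag choice, and the signal itself are illustrative stand-ins, not the work's data or settings:

```python
# Minimal sketch: ARMA fit + Ljung-Box residual whiteness test + DFT view.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(1)
n = 1024
signal = np.sin(2 * np.pi * 0.02 * np.arange(n)) + 0.5 * rng.standard_normal(n)

# ARMA(2, 1) is ARIMA with no differencing:
res = ARIMA(signal, order=(2, 0, 1)).fit()

# Ljung-Box: small p-values mean the residuals still contain structure.
lb = acorr_ljungbox(res.resid, lags=[20], return_df=True)
rmse = np.sqrt(np.mean(res.resid ** 2))
print(f"RMSE = {rmse:.3f}, Ljung-Box p = {lb['lb_pvalue'].iloc[0]:.3f}")

# DFT view of the raw signal (dominant spatial frequencies of the web):
spectrum = np.abs(np.fft.rfft(signal))
print("dominant frequency bin:", spectrum[1:].argmax() + 1)
```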
Abstract:
Performance optimization of a complex computer system requires an understanding of the system's run-time behaviour. As software grows in size and complexity, performance optimization becomes an increasingly important part of the product development process. With the use of more powerful processors, energy consumption and heat generation have also become ever greater problems, especially in small, portable devices. Performance-scaling methods have been developed to limit the heat and energy problems, but they further increase system complexity and the need for performance optimization. In this work, a visualization and analysis tool was developed to make run-time behaviour easier to understand. In addition, a performance metric was developed that allows different scaling methods to be compared and evaluated independently of the execution environment, based either on an execution trace or on theoretical analysis. The tool presents a trace collected at run time in an easily understandable way. Among other things, it displays the processes, processor load, the operation of the scaling methods, and energy consumption using three-dimensional graphics. The tool also produces numerical information from a user-selected part of the execution view, including several relevant performance figures and statistics. The applicability of the tool was examined by analyzing an execution trace obtained from a real device and a simulation of performance scaling. The effect of the scaling mechanism's parameters on the performance of the simulated device was analyzed.
Abstract:
Given the low sensitivity of amoebal coculture, we developed a specific real-time PCR for the detection of Parachlamydia. The analytical sensitivity was high, and the inter- and intrarun variabilities were low. When the PCR was applied to nasopharyngeal aspirates, it was positive for six patients with bronchiolitis. Future studies should assess the role of Parachlamydia in bronchiolitis.
Abstract:
The starting point of this profitability study was that Yhtyneet Sahat Oy's Kaukas sawmill and Luumäki further-processing plant wanted to determine the profitability of a pellet plant in the current market situation. This work is a techno-economic study, i.e. a feasibility study. The pelletizing process is technically simple and does not require high-technology equipment. The industry is quite new worldwide. In Finland, the pellet market is still small and undeveloped, but growth has occurred in recent years. The majority of domestic production is exported. The initial production values obtained in the investment calculation process and the definitions of the cost structure form the basis for the actual profitability calculations. From the calculations, the most common financial indicators associated with investments were determined, and the most sensitive variables were studied and considered with the aid of sensitivity analysis.
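The "most common financial indicators" such a feasibility study reports typically include net present value, payback period, and sensitivity sweeps over the most influential variables. A minimal sketch of that style of calculation, with entirely hypothetical figures, might look like this:

```python
# Minimal sketch of common investment indicators: NPV, payback period, and
# a one-variable sensitivity sweep. All figures are hypothetical placeholders.
def npv(rate: float, cash_flows: list[float]) -> float:
    """Net present value; cash_flows[0] is the (negative) initial investment."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def payback_years(cash_flows: list[float]) -> int | None:
    """First year in which cumulative undiscounted cash flow turns positive."""
    total = 0.0
    for t, cf in enumerate(cash_flows):
        total += cf
        if total >= 0:
            return t
    return None

investment = -2_000_000                       # pellet plant capex (hypothetical)
annual_margin = 350_000                       # yearly net cash flow (hypothetical)
flows = [investment] + [annual_margin] * 10

print(f"NPV @ 8% = {npv(0.08, flows):,.0f}, payback = {payback_years(flows)} y")

# Sensitivity: how NPV reacts to the pellet sales margin.
for factor in (0.8, 0.9, 1.0, 1.1, 1.2):
    scaled = [investment] + [annual_margin * factor] * 10
    print(f"margin x{factor:.1f}: NPV = {npv(0.08, scaled):,.0f}")
```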
Abstract:
The thesis focuses on increasing the rationality of investment appraisal processes in the evaluation of strategic investments in duopoly/oligopoly markets. The main objective of the thesis is to examine how a game-theory-extended, real-options-based investment appraisal method, the extended real options framework, could potentially improve the accuracy of the analyses. The study approaches the problem through the timing of the investment and the interdependencies of the investment's true value attributes. The extended real options framework is an investment analysis and management tool that provides a partially bounded (currently covering only parametric and game-theoretic uncertainty) optimal value range for the true value of the investment. Within the framework, real options analysis (ROA) maps out the potential strategic benefits by identifying the different options and uncertainties related to the investment, while game theory highlights the pressures created by the environment in managing the uncertainty related to the investment. The extended real options framework provides a more rational estimate of the value of a strategic investment because it links the exercise of options, and thereby also their time value, more consistently to the firm's actual constrained, path-dependent capabilities (constrained by the actions of other market participants).
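For orientation, the real-options leg of such a framework is commonly computed on a binomial lattice. The sketch below values an option to defer an investment; all parameters are hypothetical, and the game-theoretic layer is only crudely proxied by a competitive value-erosion rate rather than a proper equilibrium analysis:

```python
# Minimal sketch: option to defer an investment on a Cox-Ross-Rubinstein
# binomial lattice. Competitors' actions are proxied by an "erosion" rate
# that works like a dividend yield on the project value.
import math

def deferral_option_value(V0: float, I: float, sigma: float, r: float,
                          T: float, steps: int, erosion: float = 0.0) -> float:
    """American-style option to invest amount I in a project worth V0 today."""
    dt = T / steps
    u = math.exp(sigma * math.sqrt(dt))                 # up factor
    d = 1 / u                                           # down factor
    p = (math.exp((r - erosion) * dt) - d) / (u - d)    # risk-neutral up prob.
    disc = math.exp(-r * dt)

    # Option values at maturity: invest only if project value exceeds cost.
    values = [max(V0 * u**j * d**(steps - j) - I, 0.0) for j in range(steps + 1)]
    # Roll back, allowing early exercise at every node (American option).
    for n in range(steps - 1, -1, -1):
        values = [
            max(disc * (p * values[j + 1] + (1 - p) * values[j]),   # wait
                V0 * u**j * d**(n - j) - I)                         # invest now
            for j in range(n + 1)
        ]
    return values[0]

# With competitors eroding project value, waiting becomes less attractive:
for erosion in (0.0, 0.03, 0.06):
    v = deferral_option_value(V0=100, I=90, sigma=0.3, r=0.05,
                              T=3, steps=300, erosion=erosion)
    print(f"erosion {erosion:.0%}: option value = {v:.2f}")
```

Raising the erosion rate lowers the value of waiting, which is one simple way competitive pressure of the kind the thesis studies reshapes the timing of investment.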