959 results for efficient algorithms
Abstract:
The goal of this work is to create a statistical model, based only on easily computable parameters of the CSP problem, that predicts the runtime behaviour of the solving algorithms and lets us choose the best algorithm for the problem. Although it seems that the obvious choice should be MAC, experimental results obtained so far show that, for large numbers of variables, other algorithms perform much better, especially on hard problems at the phase transition.
Abstract:
ZnO nanorods grown by both high temperature vapour phase transport and low temperature chemical bath deposition are very promising sources for UV third harmonic generation. Material grown by both methods shows comparable efficiencies, in both cases an order of magnitude higher than surface third harmonic generation at the quartz-air interface of a bare quartz substrate. This result is in stark contrast to the linear optical properties of ZnO nanorods grown by these two methods, which show vastly different PL efficiencies. The third harmonic signal is analysed using intensity-dependent measurements and interferometric frequency resolved optical gating, allowing extraction of the laser pulse parameters. The comparable efficiencies of ZnO grown by these very different methods as sources for third harmonic UV generation provide a broad suite of possible growth methods to suit various substrate, coverage and scalability requirements. Potential application areas range from interferometric frequency resolved optical gating characterization of few-cycle fs pulses to single cell UV irradiation for biophysical studies.
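For context, the intensity-dependent measurements mentioned above probe the textbook cubic scaling of third harmonic generation (a standard relation, not a result specific to this abstract):

    I_{3\omega} \propto \left|\chi^{(3)}\right|^{2} I_{\omega}^{3}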
Abstract:
In this paper we design and develop several filtering strategies for the analysis of data generated by a resonant bar gravitational wave (GW) antenna, with the goal of assessing the presence (or absence) therein of long-duration monochromatic GW signals, as well as the amplitude and frequency of any such signals, within the sensitivity band of the detector. Such signals are most likely generated by the fast rotation of slightly asymmetric spinning stars. We develop practical procedures, together with a study of their statistical properties, which will provide us with useful information on the performance of each technique. The selection of candidate events will then be established according to threshold-crossing probabilities, based on the Neyman-Pearson criterion. In particular, it will be shown that our approach, based on phase estimation, presents a better signal-to-noise ratio than does pure spectral analysis, the most common approach.
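As background for the detection criterion above, the Neyman-Pearson test in its generic form (a standard statement, not the paper's specific detector) compares the likelihood ratio to a threshold \eta fixed by the admissible false-alarm probability \alpha:

    \Lambda(x) = \frac{p(x \mid H_1)}{p(x \mid H_0)} \underset{H_0}{\overset{H_1}{\gtrless}} \eta, \qquad \alpha = P\left(\Lambda(x) > \eta \mid H_0\right)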
Abstract:
BACKGROUND: Left atrial (LA) dilatation is associated with a large variety of cardiac diseases. Current cardiovascular magnetic resonance (CMR) strategies to measure LA volumes are based on multi-breath-hold multi-slice acquisitions, which are time-consuming and susceptible to misregistration. AIM: To develop a time-efficient single breath-hold 3D CMR acquisition and reconstruction method to precisely measure LA volumes and function. METHODS: A highly accelerated compressed-sensing multi-slice cine sequence (CS-cineCMR) was combined with a non-model-based 3D reconstruction method to measure LA volumes with high temporal and spatial resolution during a single breath-hold. This approach was validated in LA phantoms of different shapes and applied in 3 patients. In addition, the influence of slice orientation on accuracy was evaluated in the LA phantoms for the new approach in comparison with a conventional model-based biplane area-length reconstruction. As a reference in patients, a self-navigated high-resolution whole-heart 3D dataset (3D-HR-CMR) was acquired during mid-diastole to yield accurate LA volumes. RESULTS: Phantom studies: LA volumes were accurately measured by CS-cineCMR with a mean difference of -4.73 ± 1.75 ml (-8.67 ± 3.54%, r2 = 0.94). For the new method the calculated volumes were not significantly different when different orientations of the CS-cineCMR slices were applied to cover the LA phantoms. Long-axis "aligned" vs "not aligned" with the phantom long-axis yielded similar differences vs the reference volume (-4.87 ± 1.73 ml vs. -4.45 ± 1.97 ml, p = 0.67), as did short-axis "perpendicular" vs. "not perpendicular" to the LA long-axis (-4.72 ± 1.66 ml vs. -4.75 ± 2.13 ml; p = 0.98). The conventional biplane area-length method was sensitive to slice orientation (p = 0.0085 for the interaction of "slice orientation" and "reconstruction technique", 2-way ANOVA for repeated measures). To use the 3D-HR-CMR as the reference for LA volumes in patients, it was validated in the LA phantoms (mean difference: -1.37 ± 1.35 ml, -2.38 ± 2.44%, r2 = 0.97). Patient study: The CS-cineCMR LA volumes of the mid-diastolic frame matched closely with the reference LA volume (measured by 3D-HR-CMR) with a difference of -2.66 ± 6.5 ml (3.0% underestimation; true LA volumes: 63 ml, 62 ml, and 395 ml). Finally, a high intra- and inter-observer agreement for maximal and minimal LA volume measurement was also shown. CONCLUSIONS: The proposed method combines a highly accelerated single breath-hold compressed-sensing multi-slice CMR technique with a non-model-based 3D reconstruction to accurately and reproducibly measure LA volumes and function.
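For reference, the conventional biplane area-length reconstruction mentioned above is commonly computed from two orthogonal long-axis views (standard formula, not restated in the abstract), with A_1 and A_2 the LA areas in the two views and L the shorter of the two long-axis lengths:

    V_{LA} = \frac{8\, A_1 A_2}{3 \pi L}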
Abstract:
The purpose of this Master's thesis was to find cost-effective ways of reducing extractives in birch kraft pulp. Extractives can cause problems by forming deposits on process equipment. The deposits cause blockages and measurement disturbances, and when they detach they also degrade pulp quality. If they end up in the final product, they can additionally cause odour and taste defects, which are particularly significant, for example, in the manufacture of food-grade board. This work was carried out at Stora Enso's pulp mill, Enocell Oy, in Uimaharju. The theoretical part dealt with the composition of extractives and the problems they cause in pulp and paper mills. In addition, physical and chemical means of reducing birch extractives reported in earlier mill trials were compiled. The areas examined were wood handling, cooking, washing and bleaching. In the experimental part, preliminary experiments were carried out at laboratory and mill scale in order to obtain practical information for the final mill-scale trial. The laboratory experiments investigated, among other things, the effect of cooking kappa number, additives and resin soap on birch extractives. Laboratory experiments on the acid (A) and peracetic acid (Paa) stages were also carried out. At mill scale, the effects of, among other things, cooking kappa number, washing temperature, the A stage, and the peroxide and Paa stages of bleaching on birch extractives were examined. The extractives-removal efficiency of the different methods was compared both quantitatively and in monetary terms. Measured by extractives-removal efficiency, the stage under comparison was most effective at the end of washing and at the beginning of bleaching: at the end of washing the extractives reductions were about 30%, and at the beginning of bleaching about 40%. The peroxide stage was most effective when used at the end of bleaching, with a reduction of about 40%. Measured by cost-effectiveness, the A stage combined with the peroxide stage proved the most effective; savings compared with the reference period were about 0.3 €/ADt. In addition, this combination proved to be a good way of keeping the extractives level below the maximum limit on fibre line 2 while fibre line 1 was simultaneously producing reinforcement pulp.
Abstract:
Several studies contributed to improving the diagnostic and prognostic assessment of delirium in hospitalized older patients. Direct patient education proved effective in benzodiazepine withdrawal. A position statement of the American Geriatrics Society does not recommend tube feeding when eating difficulties arise in older persons suffering from advanced dementia. Several studies emphasized once again the potential importance of preventive interventions (in particular physical activity) to prevent or delay the occurrence of dementia. Two randomized controlled trials of monoclonal antibodies that bind amyloid did not show benefit in patients with mild-to-moderate Alzheimer's dementia (AD). In contrast, vitamin E reduced functional decline in these patients, and citalopram reduced agitation among AD patients as well as their caregivers' stress.
Abstract:
The phonebook is one of the most frequently used features of a mobile phone. The phonebook must therefore be usable as quickly as possible in all situations. This requires efficient data structures and sorting algorithms from the phonebook server. In Nokia mobile phones, the phonebook server uses sorted arrays as its search structure. The goal of this work was to make the sorting of the phonebook server's search arrays as fast as possible. Several sorting algorithms were compared and their execution times analysed in different situations. The insertion sort algorithm was found to be the fastest algorithm for sorting nearly ordered arrays. Based on the analysis, quicksort sorts randomly ordered arrays fastest. A quicksort-insertion sort hybrid algorithm was found to be the best sorting algorithm for sorting the phonebook. With suitable parametrization this algorithm is fast for randomly ordered data, it can exploit any existing order in the data being sorted, and it does not significantly increase memory consumption. Thanks to the new algorithm, sorting of the search arrays becomes up to several tens of percent faster.
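As an illustration of the hybrid approach described above, here is a minimal sketch in Python (the cutoff value and median-of-three pivot choice are illustrative assumptions; the thesis's actual parametrization and Nokia's implementation are not reproduced, and a real phonebook would use locale-aware collation rather than codepoint order):

    # Quicksort/insertion-sort hybrid: quicksort recursion that hands
    # small partitions to insertion sort, which also exploits existing
    # order in nearly sorted data. The cutoff of 16 is illustrative.

    def insertion_sort(a, lo, hi):
        # Each element moves only as far as its disorder requires, so
        # nearly sorted ranges cost close to linear time.
        for i in range(lo + 1, hi + 1):
            key = a[i]
            j = i - 1
            while j >= lo and a[j] > key:
                a[j + 1] = a[j]
                j -= 1
            a[j + 1] = key

    def hybrid_sort(a, lo=0, hi=None, cutoff=16):
        if hi is None:
            hi = len(a) - 1
        while lo < hi:
            if hi - lo < cutoff:
                insertion_sort(a, lo, hi)
                return
            # Median-of-three pivot guards against sorted input.
            mid = (lo + hi) // 2
            if a[mid] < a[lo]:
                a[lo], a[mid] = a[mid], a[lo]
            if a[hi] < a[lo]:
                a[lo], a[hi] = a[hi], a[lo]
            if a[hi] < a[mid]:
                a[mid], a[hi] = a[hi], a[mid]
            pivot = a[mid]
            i, j = lo, hi
            while i <= j:                 # Hoare-style partition
                while a[i] < pivot:
                    i += 1
                while a[j] > pivot:
                    j -= 1
                if i <= j:
                    a[i], a[j] = a[j], a[i]
                    i += 1
                    j -= 1
            # Recurse into the smaller side first to bound stack depth.
            if j - lo < hi - i:
                hybrid_sort(a, lo, j, cutoff)
                lo = i
            else:
                hybrid_sort(a, i, hi, cutoff)
                hi = j

    names = ["Virtanen", "Korhonen", "Nieminen", "Mäkinen", "Järvinen"]
    hybrid_sort(names)
    print(names)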
Abstract:
WDM (Wavelength-Division Multiplexing) optical networks are currently the most popular way to transfer large amounts of data. Each connection is assigned a route, with a wavelength for each link; finding the required route and wavelength is known as the RWA problem. This work describes possible cost-model-based solutions to the RWA problem. Many different optimization objectives exist, and the aforementioned cost models are based on these objectives. The cost models yield efficient solutions and algorithms. The multicommodity model is treated in this work as the basis for the RWA cost model. Heuristic methods for solving the RWA problem are also discussed. The final part of the work covers implementations of a few of the models and various possibilities for improving the cost models.
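As an illustration of the kind of heuristic discussed above, here is a minimal first-fit RWA sketch: route each demand on its shortest path, then assign the lowest-index wavelength free on every link of the path (wavelength-continuity, no converters). The topology, demand list and wavelength count are invented; the thesis's cost models are not reproduced.

    from collections import deque

    links = {("A", "B"), ("B", "C"), ("C", "D"), ("A", "D"), ("B", "D")}
    n_wavelengths = 4
    used = {}  # (link, wavelength) -> demand id

    def neighbors(node):
        for u, v in links:
            if u == node:
                yield v
            if v == node:
                yield u

    def shortest_path(src, dst):
        # Breadth-first search: fewest hops.
        prev = {src: None}
        queue = deque([src])
        while queue:
            node = queue.popleft()
            if node == dst:
                path = []
                while node is not None:
                    path.append(node)
                    node = prev[node]
                return path[::-1]
            for nxt in neighbors(node):
                if nxt not in prev:
                    prev[nxt] = node
                    queue.append(nxt)
        return None

    def assign(demand_id, src, dst):
        path = shortest_path(src, dst)
        hops = [tuple(sorted(h)) for h in zip(path, path[1:])]
        for w in range(n_wavelengths):  # first fit: lowest free index
            if all((h, w) not in used for h in hops):
                for h in hops:
                    used[(h, w)] = demand_id
                return path, w
        return None  # blocked: no continuous wavelength available

    print(assign(1, "A", "C"))
    print(assign(2, "A", "C"))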
Abstract:
The variability observed in drug exposure has a direct impact on the overall response to a drug. The largest part of the variability between dose and drug response resides in the pharmacokinetic phase, i.e. in the dose-concentration relationship. Among the possibilities offered to clinicians, Therapeutic Drug Monitoring (TDM; monitoring of drug concentration measurements) is one of the useful tools to guide pharmacotherapy. TDM aims at optimizing treatments by individualizing dosage regimens based on blood drug concentration measurements. Bayesian calculations, relying on the population pharmacokinetic approach, currently represent the gold-standard TDM strategy. However, they require expertise and computational assistance, limiting their wide implementation in routine patient care. The overall objective of this thesis was to implement robust tools to provide Bayesian TDM to clinicians in modern routine patient care. To that end, the aims were (i) to elaborate an efficient and ergonomic computer tool for Bayesian TDM, EzeCHieL; (ii) to provide algorithms for Bayesian forecasting of drug concentrations and software validation, relying on population pharmacokinetics; and (iii) to address some relevant issues encountered in clinical practice, with a focus on neonates and drug adherence. First, the current state of existing software was reviewed, which allowed the specifications for the development of EzeCHieL to be established. Then, in close collaboration with software engineers, a fully integrated software package, EzeCHieL, was elaborated. EzeCHieL provides population-based predictions, Bayesian forecasting and an easy-to-use interface. It makes it possible to assess how expected an observed concentration is in a patient relative to the whole population (via percentiles), to assess the suitability of the predicted concentration relative to the targeted concentration, and to provide dosing adjustment. It thus allows both a priori and a posteriori Bayesian individualization of drug dosing. Implementation of Bayesian methods requires characterisation of drug disposition and quantification of its variability through the population approach. Population pharmacokinetic analyses were performed and Bayesian estimators were provided for candidate drugs in populations of interest: anti-infective drugs administered to neonates (gentamicin and imipenem). The developed models were implemented in EzeCHieL and also served as a validation tool, by comparing EzeCHieL concentration predictions against predictions from the reference software (NONMEM®). The models used need to be adequate and reliable; for instance, extrapolation from adults or children to neonates is not possible. Therefore, this work proposes models for neonates based on the concept of developmental pharmacokinetics. Patients' adherence is also an important concern for drug model development and for a successful outcome of pharmacotherapy. A last study assessed the impact of routine measurement of patient adherence on model definition and TDM interpretation. In conclusion, our results offer solutions to assist clinicians in interpreting blood drug concentrations and to improve the appropriateness of drug dosing in routine clinical practice.
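A hedged sketch of the Bayesian forecasting step that such tools build on: combine a population prior with one observed concentration by MAP estimation, then forecast. A one-compartment IV bolus model is assumed here, and all population values and the observation are invented for illustration; EzeCHieL's actual models and the NONMEM® estimates are not reproduced.

    import math
    from scipy.optimize import minimize

    # Population prior (log-normal): typical clearance CL and volume V.
    CL_POP, V_POP = 5.0, 50.0        # L/h, L (illustrative)
    OMEGA_CL, OMEGA_V = 0.3, 0.2     # between-subject SD on log scale
    SIGMA = 0.5                      # residual SD (mg/L)

    DOSE = 500.0                     # mg, IV bolus
    t_obs, c_obs = 6.0, 4.2          # one measured concentration (mg/L)

    def conc(cl, v, t):
        # One-compartment IV bolus: C(t) = (Dose/V) * exp(-(CL/V) * t)
        return DOSE / v * math.exp(-cl / v * t)

    def neg_log_posterior(theta):
        log_cl, log_v = theta
        pred = conc(math.exp(log_cl), math.exp(log_v), t_obs)
        # Gaussian residual likelihood + log-normal priors (up to constants)
        return ((c_obs - pred) / SIGMA) ** 2 / 2 \
            + ((log_cl - math.log(CL_POP)) / OMEGA_CL) ** 2 / 2 \
            + ((log_v - math.log(V_POP)) / OMEGA_V) ** 2 / 2

    fit = minimize(neg_log_posterior, [math.log(CL_POP), math.log(V_POP)])
    cl_map, v_map = math.exp(fit.x[0]), math.exp(fit.x[1])
    print(f"MAP CL = {cl_map:.2f} L/h, V = {v_map:.1f} L")
    print(f"forecast C(12 h) = {conc(cl_map, v_map, 12.0):.2f} mg/L")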
Abstract:
(S)-2-(4-Bromo-2,4′-bithiazole)-1-(tert-butoxycarbonyl)pyrrolidine ((S)-1) was obtained as a single enantiomer and in high yield by means of a two-step modified Hantzsch thiazole synthesis when bromoketone 3 and thioamide (S)-4 were used. Further conversion of (S)-1 into the trimethyltin derivative (S)-2 broadens the scope for further cross-coupling reactions.
Abstract:
There are many different views of corporate identity, and no generally accepted definition exists. Many of these views are discussed in this thesis. Although corporate identity is not easy to measure, several methods have nevertheless been developed for this purpose. Communicating the identity requires strategic decisions before any communication can take place, and integration of communications plays a key role in communicating the identity. A well-managed and well-communicated corporate identity can bring several benefits to an organization. However, these benefits do not appear quickly, because communicating a corporate identity is a long-term process.
Abstract:
Although fetal anatomy can be adequately viewed in new multi-slice MR images, many critical limitations remain for quantitative data analysis. To this end, several research groups have recently developed advanced image processing methods, often denoted super-resolution (SR) techniques, to reconstruct a high-resolution (HR) motion-free volume from a set of clinical low-resolution (LR) images. SR is usually modeled as an inverse problem in which the regularization term plays a central role in the reconstruction quality. The literature has been strongly drawn to Total Variation energies because of their edge-preserving ability, but only standard explicit steepest-gradient techniques have been applied for optimization. In preliminary work, it was shown that novel fast convex optimization techniques can be successfully applied to design an efficient Total Variation optimization algorithm for the super-resolution problem. In this work, two major contributions are presented. Firstly, we briefly review the Bayesian and Variational dual formulations of current state-of-the-art methods dedicated to fetal MRI reconstruction. Secondly, we present an extensive quantitative evaluation of our previously introduced SR algorithm on both simulated fetal and real clinical data (with both normal and pathological subjects). Specifically, we study the robustness of the regularization terms with respect to residual registration errors, and we also present a novel strategy for automatically selecting the weight of the regularization relative to the data-fidelity term. Our results show that our TV implementation is highly robust to motion artifacts and that it offers the best trade-off between speed and accuracy for fetal MRI recovery in comparison with state-of-the-art methods.
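A hedged sketch of the kind of TV-regularized SR problem described above: minimize ||DHx - y||^2 + λ·TV(x) by plain gradient descent on a smoothed TV (the paper's fast convex optimization is not reproduced; sizes, weights and operators are illustrative stand-ins for the acquisition model).

    import numpy as np

    rng = np.random.default_rng(0)
    factor, lam, eps, step = 2, 0.05, 1e-2, 0.2

    def blur(img):  # 3x3 moving average: stand-in for the acquisition PSF
        out = np.copy(img)
        for ax in (0, 1):
            out = (np.roll(out, 1, ax) + out + np.roll(out, -1, ax)) / 3.0
        return out

    def down(img):  # decimation D
        return img[::factor, ::factor]

    def up(img):    # adjoint of D: zero-filling upsampling
        out = np.zeros((img.shape[0] * factor, img.shape[1] * factor))
        out[::factor, ::factor] = img
        return out

    def tv_grad(x):  # gradient of smoothed TV: -div(grad x / |grad x|_eps)
        dx = np.roll(x, -1, 0) - x
        dy = np.roll(x, -1, 1) - x
        norm = np.sqrt(dx ** 2 + dy ** 2 + eps)
        px, py = dx / norm, dy / norm
        return -(px - np.roll(px, 1, 0) + py - np.roll(py, 1, 1))

    truth = np.zeros((32, 32)); truth[8:24, 8:24] = 1.0  # toy HR scene
    y = down(blur(truth)) + 0.01 * rng.standard_normal((16, 16))

    x = up(y) * factor ** 2                    # crude initial HR guess
    for _ in range(200):
        residual = down(blur(x)) - y
        data_grad = blur(up(residual))         # H^T D^T (D H x - y)
        x -= step * (data_grad + lam * tv_grad(x))

    print("RMSE vs truth:", np.sqrt(np.mean((x - truth) ** 2)))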
Abstract:
BACKGROUND: HIV surveillance requires monitoring of new HIV diagnoses and differentiation of incident and older infections. In 2008, Switzerland implemented a system for monitoring incident HIV infections based on the results of a line immunoassay (Inno-Lia) mandatorily conducted for HIV confirmation and type differentiation (HIV-1, HIV-2) of all newly diagnosed patients. Based on this system, we assessed the proportion of incident HIV infection among newly diagnosed cases in Switzerland during 2008-2013. METHODS AND RESULTS: Inno-Lia antibody reaction patterns recorded in anonymous HIV notifications to the federal health authority were classified by 10 published algorithms into incident (up to 12 months) or older infections. Utilizing these data, annual incident infection estimates were obtained in two ways, (i) based on the diagnostic performance of the algorithms and utilizing the relationship 'incident = true incident + false incident', (ii) based on the window-periods of the algorithms and utilizing the relationship 'Prevalence = Incidence x Duration'. From 2008-2013, 3'851 HIV notifications were received. Adult HIV-1 infections amounted to 3'809 cases, and 3'636 of them (95.5%) contained Inno-Lia data. Incident infection totals calculated were similar for the performance- and window-based methods, amounting on average to 1'755 (95% confidence interval, 1588-1923) and 1'790 cases (95% CI, 1679-1900), respectively. More than half of these were among men who had sex with men. Both methods showed a continuous decline of annual incident infections 2008-2013, totaling -59.5% and -50.2%, respectively. The decline of incident infections continued even in 2012, when a 15% increase in HIV notifications had been observed. This increase was entirely due to older infections. Overall declines 2008-2013 were of similar extent among the major transmission groups. CONCLUSIONS: Inno-Lia based incident HIV-1 infection surveillance proved useful and reliable. It represents a free, additional public health benefit of the use of this relatively costly test for HIV confirmation and type differentiation.
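The window-based method above reduces to the quoted relationship 'Prevalence = Incidence x Duration': annual incident infections are estimated from the count of notifications classified as recent, scaled by the ratio of a year to the algorithm's window period. A toy calculation (numbers invented, not the Swiss data):

    recent_cases = 120      # notifications an algorithm flags as recent
    window_months = 8.0     # mean time a true incident case stays 'recent'

    annual_incident = recent_cases * 12.0 / window_months
    print(f"estimated incident infections per year: {annual_incident:.0f}")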
Abstract:
BACKGROUND: Lung clearance index (LCI), a marker of ventilation inhomogeneity, is elevated early in children with cystic fibrosis (CF). However, in infants with CF, LCI values are found to be normal, although structural lung abnormalities are often detectable. We hypothesized that this discrepancy is due to inadequate algorithms in the available software package. AIM: Our aim was to challenge the validity of these software algorithms. METHODS: We compared multiple breath washout (MBW) results of the current software algorithms (automatic modus) to refined algorithms (manual modus) in 17 asymptomatic infants with CF and 24 matched healthy term-born infants. The main difference between the two analysis methods lies in the calculation of the molar mass differences that the system uses to define the completion of the measurement. RESULTS: In infants with CF the refined manual modus revealed clearly elevated LCI above 9 in 8 out of 35 measurements (23%), all of which showed LCI values below 8.3 using the automatic modus (paired t-test comparing the means, P < 0.001). Healthy infants showed normal LCI values with both analysis methods (n = 47, paired t-test, P = 0.79). The most relevant reason for falsely normal LCI values in infants with CF using the automatic modus was that the end of test was incorrectly recognized too early during the washout. CONCLUSION: We recommend the use of the manual modus for the analysis of MBW outcomes in infants in order to obtain more accurate results. This will allow appropriate use of infant lung function results for clinical and scientific purposes. Pediatr Pulmonol. 2015; 50:970-977.
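For context, the LCI reported above is conventionally defined as the cumulative expired volume (CEV) needed to wash the tracer out of the functional residual capacity (FRC), with end of test taken as the point where the end-tidal tracer concentration falls to 1/40 of its starting value (standard MBW definition, not specific to this study):

    \mathrm{LCI} = \frac{\mathrm{CEV}}{\mathrm{FRC}} \quad \text{evaluated when} \quad C_{et} = \tfrac{1}{40}\, C_{et,0}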
Abstract:
This work presents new, efficient Markov chain Monte Carlo (MCMC) simulation methods for statistical analysis in various modelling applications. When using MCMC methods, the model is simulated repeatedly to explore the probability distribution describing the uncertainties in model parameters and predictions. In adaptive MCMC methods based on the Metropolis-Hastings algorithm, the proposal distribution needed by the algorithm learns from the target distribution as the simulation proceeds. Adaptive MCMC methods have been the subject of intensive research lately, as they open a way to essentially easier use of the methodology; the lack of user-friendly computer programs has been a main obstacle to wider acceptance of the methods. This work provides two new adaptive MCMC methods: DRAM and AARJ. The DRAM method has been built especially to work in high-dimensional and non-linear problems. The AARJ method is an extension of DRAM for model selection problems, where the mathematical formulation of the model is uncertain and we want to fit several different models to the same observations simultaneously. The methods were developed with the needs of modelling applications typical of the environmental sciences in mind, and the development work was pursued while working on several application projects. The applications presented in this work are: a wintertime oxygen concentration model for Lake Tuusulanjärvi and adaptive control of the aerator; a nutrition model for Lake Pyhäjärvi and lake management planning; and validation of the algorithms of the GOMOS ozone remote sensing instrument on board the European Space Agency's Envisat satellite, together with a study of the effects of aerosol model selection on the GOMOS algorithm.
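As an illustration of the adaptive Metropolis idea these methods build on, here is a minimal sketch (this is the basic AM adaptation in which the proposal covariance learns from the chain history, not the DRAM or AARJ algorithms themselves; the target and tuning constants are illustrative):

    import numpy as np

    rng = np.random.default_rng(1)

    def log_target(theta):
        # Example target: correlated 2-D Gaussian posterior.
        cov = np.array([[1.0, 0.9], [0.9, 1.0]])
        return -0.5 * theta @ np.linalg.solve(cov, theta)

    n_iter, d = 5000, 2
    sd = 2.4 ** 2 / d               # commonly used AM scaling factor
    chain = np.zeros((n_iter, d))
    theta, logp = np.zeros(d), log_target(np.zeros(d))
    prop_cov = np.eye(d) * 0.1      # initial proposal covariance

    for i in range(1, n_iter):
        # Proposal covariance adapts to the chain history after a burn-in;
        # the small jitter keeps it positive definite.
        if i > 500:
            prop_cov = sd * (np.cov(chain[:i].T) + 1e-6 * np.eye(d))
        cand = rng.multivariate_normal(theta, prop_cov)
        logp_cand = log_target(cand)
        if np.log(rng.random()) < logp_cand - logp:  # Metropolis accept
            theta, logp = cand, logp_cand
        chain[i] = theta

    print("posterior mean estimate:", chain[2500:].mean(axis=0))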