957 results for Speed-up
Abstract:
Protein-ligand docking has made important progress during the last decade and has become a powerful tool for drug development, opening the way to virtual high-throughput screening and in silico structure-based ligand design. Despite this flattering picture, recent publications have shown that the docking problem is far from solved, and that further developments are still needed to achieve high prediction success rates and accuracy. Introducing an accurate description of the solvation effect upon binding is thought to be essential to achieve this goal. In particular, EADock uses the Generalized Born Molecular Volume 2 (GBMV2) solvent model, which has been shown to accurately reproduce the desolvation energies calculated by solving the Poisson equation. Here, the implementation of the Fast Analytical Continuum Treatment of Solvation (FACTS) as an implicit solvation model in small-molecule docking calculations has been assessed using the EADock docking program. Our results strongly support the use of FACTS for docking. The success rates of EADock/FACTS and EADock/GBMV2 are similar, i.e. around 75% for local docking and 65% for blind docking. However, these results come at a much lower computational cost: FACTS is 10 times faster than GBMV2 in calculating the total electrostatic energy, and speeds up EADock by a factor of 4. This study also supports the EADock development strategy of relying on the CHARMM package for energy calculations, which enables straightforward implementation and testing of the latest developments in the field of molecular modeling.
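The two speedup figures quoted above are mutually consistent under Amdahl's law. As an illustration (our inference, not stated in the abstract), if the electrostatic energy evaluation were the only accelerated component, a 10× component speedup producing a 4× overall speedup would imply that electrostatics accounts for about 83% of the runtime:

```python
# Amdahl's-law sketch (illustration only, not from the abstract):
# if the electrostatic energy evaluation is the only accelerated part,
# a 10x component speedup yielding a 4x overall speedup implies the
# electrostatics fraction f of total runtime satisfies
#   1 / ((1 - f) + f / 10) = 4
def overall_speedup(f, component_speedup):
    """Amdahl's law: overall speedup when a fraction f of runtime is accelerated."""
    return 1.0 / ((1.0 - f) + f / component_speedup)

# Solve 1 - f + f/10 = 1/4 analytically:  f = (1 - 1/4) / (1 - 1/10)
f = (1 - 1 / 4) / (1 - 1 / 10)
print(round(f, 3))                       # fraction of runtime in electrostatics
print(round(overall_speedup(f, 10), 1))  # recovers the 4x overall speedup
```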
Abstract:
Principles: A surgeon's experience is crucial for the proper application of sentinel node biopsy (SNB) in patients with breast cancer. A learning curve of 20-30 cases of sentinel node (SN) biopsy followed by axillary lymph node dissection (ALND) has been widely practiced. In order to speed up this learning curve, surgeons may be trained intraoperatively by an experienced surgeon. The purpose of this report is to evaluate the results of this procedure. Methods: Patients with one primary invasive breast cancer (cT1-T2[<3 cm]cN0) underwent SNB based on lymphoscintigraphy using technetium Tc 99m colloid and intraoperative gamma probe detection, with or without blue dye mapping. This was followed by completion ALND when the SN was positive or not found. SNB was performed by one experienced surgeon (teacher) or by 10 junior surgeons trained by the experienced surgeon (trainees). Four groups were defined: (i) SNB with immediate ALND for the teacher's learning curve, (ii) SNB by the teacher, (iii) SNB by the trainees under the teacher's supervision, and (iv) SNB by the trainees alone. Results: Between May 1999 and December 2007, a total of 808 evaluable patients underwent SNB. The SN identification rate was 98% in the teacher's group and 99% in the trainees' group (p = 0.196). SNs were positive in 28% and 29% of patients, respectively (p = 0.196). The distribution of isolated tumor cells, micrometastases and metastases was not statistically different between the teacher's and the trainees' groups (p = 0.163). Conclusion: These comparable results confirm the success with which SNB was taught. This strategy avoided the 20-30 SNBs with immediate ALND previously required per surgeon.
Abstract:
The propagation of a pulse in a nonlinear array of oscillators is influenced by the nature of the array and by its coupling to a thermal environment. For example, in some arrays a pulse can be sped up while in others it can be slowed down by raising the temperature. We begin by showing that an energy pulse (one dimension) or energy front (two dimensions) travels more rapidly and remains more localized over greater distances in an isolated (microcanonical) array of hard springs than in a harmonic array or in a soft-springed array. Increasing the pulse amplitude causes it to speed up in a hard chain, leaves the pulse speed unchanged in a harmonic system, and slows the pulse down in a soft chain. Connecting each site to a thermal environment (canonical) affects these results very differently in each type of array. In a hard chain the dissipative forces slow down the pulse, while raising the temperature speeds it up. In a soft chain the opposite occurs: the dissipative forces actually speed up the pulse, while raising the temperature slows it down. In a harmonic chain neither dissipation nor temperature changes affect the pulse speed. These and other results are explained on the basis of the frequency versus energy relations in the various arrays.
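The frequency-versus-energy argument in the closing sentence can be illustrated with a single oscillator. A minimal sketch (ours, with an assumed quartic potential V(x) = x²/2 + αx⁴/4, not the paper's model) shows that the oscillation period shrinks with amplitude for a hard spring (α > 0), stays fixed for a harmonic one (α = 0), and grows for a soft one (α < 0):

```python
import math

def period(alpha, amplitude, dt=1e-4):
    """Oscillation period for V(x) = x^2/2 + alpha*x^4/4 (unit mass),
    starting from rest at x = amplitude; leapfrog (kick-drift-kick)
    integration, period measured between positive-going zero crossings."""
    x, v = amplitude, 0.0
    force = lambda x: -x - alpha * x**3
    a = force(x)
    t, crossings = 0.0, []
    while len(crossings) < 2 and t < 100.0:
        v += 0.5 * dt * a
        x_prev = x
        x += dt * v
        a = force(x)
        v += 0.5 * dt * a
        t += dt
        if x_prev < 0.0 <= x:  # positive-going zero crossing
            crossings.append(t)
    return crossings[1] - crossings[0]

T_harm = period(0.0, 1.0)   # ~2*pi, independent of amplitude
T_hard = period(0.5, 1.0)   # shorter period: hard spring oscillates faster
T_soft = period(-0.5, 1.0)  # longer period: soft spring oscillates slower
```

Since pulse speed in a chain tracks the local oscillation frequency, higher energy (from amplitude or temperature) speeds the pulse up in the hard case and slows it down in the soft case, matching the abstract's summary.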
Abstract:
The Hartman effect is analyzed in both the position and momentum representations of the problem. The importance of Wigner tunneling and deep tunneling is singled out. It is shown quantitatively how the barrier acts as a filter for low momenta (quantum speed up) as the width increases, and a detailed mechanism is proposed. Superluminal transmission is also discussed.
Abstract:
Particle fluxes (including major components and grain size) and oceanographic parameters (near-bottom water temperature, current speed and suspended sediment concentration) were measured along the Cap de Creus submarine canyon in the Gulf of Lions (GoL; NW Mediterranean Sea) during two consecutive winter-spring periods (2009-2010 and 2010-2011). Comparison of these data with measurements of meteorological and hydrological parameters (wind speed, turbulent heat flux, river discharge) has shown the important role of atmospheric forcing in transporting particulate matter through the submarine canyon and towards the deep sea. Indeed, atmospheric forcing during the 2009-2010 and 2010-2011 winter months differed in both intensity and persistence, leading to distinct oceanographic responses. Persistent dry northern winds caused strong heat losses (14.2 × 10³ W m⁻²) in winter 2009-2010 that triggered a pronounced sea surface cooling compared to winter 2010-2011 (1.6 × 10³ W m⁻² lower). As a consequence, a large volume of dense shelf water formed in winter 2009-2010, which cascaded at high speed (up to ∼1 m s⁻¹) down Cap de Creus Canyon, as measured by a current meter at the head of the canyon. The lower heat losses recorded in winter 2010-2011, together with an increased river discharge, resulted in lower-density waters over the shelf, preventing the formation and downslope transport of dense shelf water. High total mass fluxes (up to 84.9 g m⁻² d⁻¹) recorded in winter-spring 2009-2010 indicate that dense shelf water cascading resuspended and transported sediments at least down to the middle canyon. Sediment fluxes were lower (28.9 g m⁻² d⁻¹) under the quieter conditions of winter 2010-2011. The dominance of the lithogenic fraction in mass fluxes during the two winter-spring periods points to a resuspension origin for most of the particles transported down-canyon.
The variability in organic matter and opal contents relates to seasonally controlled inputs associated with the plankton spring bloom during March and April of both years.
Abstract:
HR-394 was a software and database development project. With funding provided by the Iowa Highway Research Board, the Iowa County Engineer's Association Service Bureau oversaw the planning and implementation of an Internet-based application that supports two major local-government transportation project activities: project programming and development tracking. The goals were to reduce errors and inconsistencies, speed up the processes, link people to both project data and each other, and build a framework that could eventually support a 'paperless' work flow. The work started in 1999 and initial development was completed by the fall of 2002. Since going live, several 'piggyback' applications have been required to make the programming side better fit actual work procedures. This part of the system has proven adequate but will be rewritten in 2004 to make it easier to use. The original development-side module was rejected by the users and had to be rewritten in 2003. The second version has proven much better, is heavily used, and is interconnected with Iowa DOT project data systems. Now that the system is in operation, it will be maintained and operated by the ICEA Service Bureau as an ongoing service function.
Abstract:
Dose kernel convolution (DK) methods have been proposed to speed up absorbed dose calculations in molecular radionuclide therapy. Our aim was to evaluate the impact of tissue density heterogeneities (TDH) on dosimetry when using a DK method, and to propose a simple density-correction method. METHODS: This study was conducted on 3 clinical cases: case 1, non-Hodgkin lymphoma treated with ¹³¹I-tositumomab; case 2, a neuroendocrine tumor treatment simulated with ¹⁷⁷Lu-peptides; and case 3, hepatocellular carcinoma treated with ⁹⁰Y-microspheres. Absorbed dose calculations were performed using a direct Monte Carlo approach accounting for TDH (3D-RD) and a DK approach (VoxelDose, or VD). For each voxel, the VD absorbed dose, D(VD), calculated assuming uniform density, was corrected for density, giving D(VDd). The average 3D-RD absorbed dose values, D(3DRD), were compared with D(VD) and D(VDd) using the relative difference Δ(VD/3DRD). At the voxel level, density-binned Δ(VD/3DRD) and Δ(VDd/3DRD) were plotted against density ρ and fitted with a linear regression. RESULTS: The D(VD) calculations showed good agreement with D(3DRD). Δ(VD/3DRD) was less than 3.5%, except for the tumor of case 1 (5.9%) and the renal cortex of case 2 (5.6%). At the voxel level, the Δ(VD/3DRD) range was 0%-14% for cases 1 and 2, and -3% to 7% for case 3. All 3 cases showed a linear relationship between voxel bin-averaged Δ(VD/3DRD) and density ρ: case 1 (Δ = -0.56ρ + 0.62, R² = 0.93), case 2 (Δ = -0.91ρ + 0.96, R² = 0.99), and case 3 (Δ = -0.69ρ + 0.72, R² = 0.91). The density correction improved the agreement of the DK method with the Monte Carlo approach (Δ(VDd/3DRD) < 1.1%), although to a lesser extent for the tumor of case 1 (3.1%). At the voxel level, the Δ(VDd/3DRD) range narrowed for all 3 clinical cases (case 1, -1% to 4%; case 2, -0.5% to 1.5%; case 3, -1.5% to 2%). No linear relationship with density remained for cases 2 and 3; one persisted for case 1 (Δ = 0.41ρ - 0.38, R² = 0.88), although with a less pronounced slope. CONCLUSION: This study shows a small influence of TDH in the abdominal region for 3 representative clinical cases. A simple density-correction method was proposed and improved the agreement of the absorbed dose calculations when using our voxel S value implementation.
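The abstract does not give the density-correction formula itself. A minimal sketch under a common assumption (ours, not necessarily the authors' method) is to rescale the uniform-medium kernel dose by the water-to-voxel density ratio, since dose is energy per unit mass:

```python
import numpy as np

# Hypothetical voxelwise density correction for a dose-kernel-convolution
# result. The exact formula is not given in the abstract; we assume the
# common mass rescaling  D_corr = D_uniform * rho_water / rho_voxel.
RHO_WATER = 1.0  # g/cm^3, density assumed by the uniform-medium kernel

def density_correct(dose_vd, rho, rho_water=RHO_WATER):
    """Rescale a uniform-density dose map by the local density."""
    dose_vd = np.asarray(dose_vd, dtype=float)
    rho = np.asarray(rho, dtype=float)
    return dose_vd * rho_water / rho

# Toy 2x2 dose map (Gy) and density map (g/cm^3): a low-density voxel (0.3)
# receives a larger corrected dose, a bone-like voxel (1.9) a smaller one.
dose = np.array([[1.0, 1.0], [1.0, 1.0]])
rho = np.array([[0.3, 1.0], [1.0, 1.9]])
corrected = density_correct(dose, rho)
print(corrected)
```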
Abstract:
BACKGROUND: Several European HIV observational databases have, over the last decade, accumulated a substantial number of resistance test results and developed large sample repositories. There is a need to link these efforts together. We describe here the development of a novel tool that binds these databases together in a distributed fashion, in which control and data remain with the cohorts rather than in a classic data merger. METHODS: As a proof of concept we entered two basic queries into the tool: available resistance tests and available samples. We asked for patients still alive after 1998-01-01 and between 180 and 195 cm in height, and how many samples or resistance tests were available for these patients. The queries were uploaded with the tool to a central web server, from which each participating cohort downloaded the queries with the tool and ran them against their database. The numbers gathered were then submitted back to the server, where the numbers of available samples and resistance tests were accumulated. RESULTS: We obtained the following results from the cohorts on available samples/resistance tests: EuResist: not available/11,194; EuroSIDA: 20,716/1,992; ICONA: 3,751/500; Rega: 302/302; SHCS: 53,783/1,485. In total, 78,552 samples and 15,473 resistance tests were available among these five cohorts. Once these data items have been identified, it is trivial to generate lists of relevant samples that would be useful for ultra-deep sequencing in addition to the already available resistance tests. Soon the tool will include small analysis packages that allow each cohort to pull a report on their cohort profile and also survey emerging resistance trends in their own cohort. CONCLUSIONS: We plan to provide this tool to all cohorts within the Collaborative HIV and Anti-HIV Drug Resistance Network (CHAIN) and will provide it free of charge to others for any non-commercial use. The potential of this tool is to ease collaborations, for example in projects that need data to speed up the identification of novel resistance mutations by increasing the number of observations across multiple cohorts, instead of waiting for single cohorts or studies to reach the critical number needed to address such issues.
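The distributed workflow (a central server publishes a query; each cohort runs it locally and returns only aggregate counts, so patient-level data never leave the cohort) can be sketched as follows. The cohort names are real; the data structures are illustrative, and the counts are the per-cohort figures reconstructed from the abstract, with EuResist's unavailable sample count recorded as 0:

```python
# Minimal sketch of the distributed aggregation described above: only
# aggregate counts leave each cohort; patient-level data stay local.
cohort_results = {
    "EuResist": {"samples": 0,      "resistance_tests": 11_194},
    "EuroSIDA": {"samples": 20_716, "resistance_tests": 1_992},
    "ICONA":    {"samples": 3_751,  "resistance_tests": 500},
    "Rega":     {"samples": 302,    "resistance_tests": 302},
    "SHCS":     {"samples": 53_783, "resistance_tests": 1_485},
}

def aggregate(results):
    """Central-server step: sum the per-cohort counts per query."""
    totals = {}
    for counts in results.values():
        for key, value in counts.items():
            totals[key] = totals.get(key, 0) + value
    return totals

totals = aggregate(cohort_results)
print(totals)  # {'samples': 78552, 'resistance_tests': 15473}
```

The sums recover the totals stated in the abstract (78,552 samples and 15,473 resistance tests), which is also how the garbled per-cohort figures were cross-checked.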
Abstract:
The goal of this thesis was to adopt high-speed imaging as part of product development at Kemppi Oy. The theoretical part covers the theory of the welding arc and filler-metal transfer, together with an introduction to high-speed imaging technology and the equipment it requires. The theoretical part also examines existing root-pass welding processes for MIG/MAG butt joints. The experimental part produced a research method based on high-speed imaging. With the system that was built, the MIG/MAG arc types and filler-metal transfer modes presented in the theory could be clearly observed. The recorded results could be analyzed and compared with the theory. Developing a process suitable for root-pass welding of butt joints was part of the practical work. The recordings made it possible to observe phenomena of the developed arc and metal transfer; the imaging made these easier to understand, which sped up and enabled the product development work. As a result, a new pulsed short-arc process was completed. The thesis also produced a new product development process based on high-speed imaging, in which the new technique can be integrated into research and development work. The results can also be used to promote marketing and sales.
Abstract:
A highly sensitive ultra-high performance liquid chromatography tandem mass spectrometry (UHPLC-MS/MS) method was developed for the quantification of buprenorphine and its major metabolite norbuprenorphine in human plasma. In order to speed up the process and decrease costs, sample preparation was performed by simple protein precipitation with acetonitrile. To the best of our knowledge, this is the first application of this extraction technique for the quantification of buprenorphine in plasma. Matrix effects were strongly reduced and selectivity increased by using an efficient chromatographic separation on a sub-2 μm column (Acquity UPLC BEH C18, 1.7 μm, 2.1 × 50 mm) in 5 min, with a gradient of 20 mM ammonium formate pH 3.05 and acetonitrile as mobile phase at a flow rate of 0.4 ml/min. Detection was performed on a tandem quadrupole mass spectrometer operating in positive electrospray ionization mode, using multiple reaction monitoring. The procedure was fully validated according to the latest Food and Drug Administration guidelines and those of the Société Française des Sciences et Techniques Pharmaceutiques. Very good results were obtained by using a stable isotope-labeled internal standard for each analyte to compensate for variability in the extraction and ionization steps. The method was very sensitive, with lower limits of quantification of 0.1 ng/ml for buprenorphine and 0.25 ng/ml for norbuprenorphine. The upper limit of quantification was 250 ng/ml for both drugs. Trueness (98.4-113.7%), repeatability (1.9-7.7%), intermediate precision (2.6-7.9%) and internal standard-normalized matrix effects (94-101%) were in accordance with international recommendations. The procedure was successfully used to quantify plasma samples from patients included in a clinical pharmacogenetic study and can be transferred for routine therapeutic drug monitoring in clinical laboratories without further development.
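For readers unfamiliar with the validation figures quoted above, trueness and repeatability are typically computed from replicate measurements of spiked samples. A sketch with hypothetical numbers (illustrative only, not the study's data):

```python
import statistics

# Hedged sketch of how trueness and repeatability are typically computed
# from one series of replicate measurements (hypothetical numbers).
nominal = 10.0  # ng/ml, spiked (nominal) concentration
replicates = [10.1, 9.8, 10.3, 9.9, 10.2, 10.0]  # measured concentrations

mean = statistics.mean(replicates)
trueness_pct = 100.0 * mean / nominal  # bias expressed as percent recovery
repeatability_cv = 100.0 * statistics.stdev(replicates) / mean  # within-series CV%

print(round(trueness_pct, 1))    # 100.5
print(round(repeatability_cv, 1))
```

Intermediate precision is computed the same way but pools series run on different days or by different operators, which is why its range (2.6-7.9%) is slightly wider than repeatability's.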
Abstract:
This study focuses on the product structure and order penetration points of a copper products plant, and on proposing improvements to both. The rolling mill's work-in-process inventory has been reduced considerably, and the goal is that production can nevertheless be controlled with fewer resources, without material shortages, and with shorter lead times. The thesis also presents other development proposals for the production process that support streamlining production and reducing inventory levels. The theoretical part explains how Lean production works in an environment such as the rolling mill. In addition to Lean production, the Agile and Leagile theories are also covered, because their fit with the rolling mill's production is likewise significant. The empirical part describes the current state of the production process and product structure and presents development proposals based on the theories introduced. Based on the study, changes to the current product structure are proposed, because the present structure has become partly obsolete as inventories have moved to the beginning of the production chain. It is further proposed to abandon the hot rolling plan and to shorten the hot rolling mill's control cycle to one day, and it is shown how more accurate information on order completion dates should be communicated to production.
Abstract:
The goal of this thesis was to create a workable written description of a quality system suited to the company and to improve quality management. The quality system standard SFS-EN ISO 9001 was used as the basis for developing the quality functions. The starting point was a thorough analysis of the current state. Based on the results obtained, corrective actions could be targeted correctly, making the quality improvement significant and measurable. At the outset, major quality problems were found both among customers and in the company's internal operations when compared against the ISO 9001 standard. The processes were documented both in the initial state and after the observed shortcomings had been corrected. This documentation formed the basis for the description of the company's quality system - the quality manual. Each documented work or operating instruction was further developed to reflect the best observed practice. The work showed that quality management improved significantly once a process had been described in a practical and simple way. The contribution of the personnel who carry out a process is crucial when describing and developing it. Without the personnel's commitment, descriptions and instructions remain useless papers that are neither followed nor updated. The development of quality, like that of any other parameter, is impossible to track without clear and unambiguous metrics. In this work, quality metrics indicating progress were created for the most important processes. At best, as a result of the measures carried out in the paint shop, the share of defective parts was observed to drop by an order of magnitude. Developing the quality system proved very laborious and time-consuming even in a company of this size. It is nevertheless clear that as quality management improves and evidence accumulates through successes, the whole company's belief in the importance of quality improvement grows, which in turn speeds up the implementation of sometimes painful reforms.
Abstract:
Technological progress has made a huge amount of data available at increasing spatial and spectral resolutions. Therefore, the compression of hyperspectral data is an area of active research. In some fields, the original quality of a hyperspectral image cannot be compromised, and in these cases lossless compression is mandatory. The main goal of this thesis is to provide improved methods for the lossless compression of hyperspectral images. Both prediction- and transform-based methods are studied. Two kinds of prediction-based methods are studied. In the first, the spectra of a hyperspectral image are first clustered and an optimized linear predictor is calculated for each cluster. In the second, the linear prediction coefficients are not fixed but are recalculated for each pixel. A parallel implementation of the latter linear prediction method is also presented. Two transform-based methods are presented as well. Vector Quantization (VQ) is used together with a new coding of the residual image. In addition, we have developed a new back end for a compression method utilizing Principal Component Analysis (PCA) and the Integer Wavelet Transform (IWT). The performance of the compression methods is compared with that of other compression methods. The results show that the proposed linear prediction methods outperform previous methods. In addition, a novel fast exact nearest-neighbor search method is developed and used to speed up the Linde-Buzo-Gray (LBG) clustering method.
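The prediction-based idea can be sketched in miniature (ours, not the thesis implementation): predict each spectral band from the previous one, round the prediction to an integer, and store only the integer residuals, which a decoder reverses exactly; the residuals, being smaller and more peaked than the raw values, entropy-code better:

```python
import numpy as np

# Minimal sketch of lossless inter-band linear prediction. Each band b is
# predicted from band b-1 with a fixed coefficient a; only the integer
# residuals (plus the first band, kept verbatim) need to be entropy-coded.
def encode(cube, a=1.0):
    """cube: integer array of shape (bands, rows, cols) -> residual array."""
    cube = np.asarray(cube, dtype=np.int64)
    residuals = cube.copy()
    for b in range(1, cube.shape[0]):
        pred = np.rint(a * cube[b - 1]).astype(np.int64)
        residuals[b] = cube[b] - pred
    return residuals

def decode(residuals, a=1.0):
    """Invert encode(): rebuild each band from the previous decoded band."""
    cube = residuals.copy()
    for b in range(1, cube.shape[0]):
        pred = np.rint(a * cube[b - 1]).astype(np.int64)
        cube[b] = residuals[b] + pred
    return cube

rng = np.random.default_rng(0)
cube = rng.integers(0, 4096, size=(4, 8, 8))  # toy 12-bit hyperspectral cube
assert np.array_equal(decode(encode(cube)), cube)  # exactly reversible
```

The thesis's clustered and per-pixel variants replace the fixed coefficient here with per-cluster or per-pixel optimized coefficients; rounding the prediction before subtraction is what keeps the scheme lossless despite real-valued coefficients.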
Abstract:
The convergence of data and telecommunications networks has brought new requirements for service creation environments and posed challenges for their development. Modern service creation environments must be able to produce complex yet reliable services quickly. In addition, service creation environments for multiprotocol services must adapt to new conditions so that service providers remain competitive. The purpose of this thesis was to find methods and tools for the fast and reliable creation of services offered in converging networks. The thesis reviews service creation environments available on the market and presents the Intellitel OSN service creation environment and its service creation model, which supports service development throughout the entire service creation process. In the practical part, Intellitel's service creation model and the tools and utilities provided by the service creation environment were improved, and a number translation service was implemented step by step with Intellitel's service creation environment, following the service creation model.