960 results for new methods
Abstract:
The aim of the work was to investigate passive noise abatement methods suitable for a shotgun range used for competition and practice shooting, and to design a working combination of them for the target ranges so that the 50 dB LAImax level is not exceeded in the residential areas near the range and the guideline values for shooting range noise are met elsewhere. First, the existing methods at the target ranges were reviewed, and the directions relevant to the range's noise abatement were determined by measurements. The spectrum of the gunshot noise was also studied by measurements. Replacement and complementary methods were sought for the existing ones. Siting a range in the terrain is an important part of noise abatement when planning a new range, but the siting of the ranges in question could no longer be changed. Because the rules of shotgun shooting do not allow subsonic ammunition or suppressors, the actual noise emission cannot be reduced. Protecting the indoor spaces of noise-exposed buildings is also of little use in practice, because the guideline values for shooting noise apply to outdoor areas. Only absorption of the noise and blocking and redirecting its propagation are viable methods at the target ranges. New noise walls between the individual ranges, with better sound insulation and absorption than the old walls, were designed for the range area. The backstop berm was planned to be extended as an earth bank and raised with a noise fence, which also serves as the frame of a shot recycling system. The ground in the area was planned to be made acoustically softer. In designing the methods, computational estimates, computer modelling, and laboratory tests were carried out for the different options. New methods for shotgun range noise abatement were also studied. Based on the studies, these methods will be investigated further in practice.
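The abstract does not spell out the computational estimates it mentions, but a common first-order check for a noise wall of this kind is Maekawa's barrier attenuation approximation; the sketch below is purely illustrative, and the geometry, frequency, and function name are assumptions rather than values from the thesis.

```python
# Illustrative sketch only: Maekawa's approximation for the attenuation of a thin
# noise barrier. The geometry and frequency below are made-up example values,
# not figures from the thesis.
import math

def barrier_attenuation_db(path_difference_m: float, frequency_hz: float,
                           speed_of_sound_m_s: float = 343.0) -> float:
    """Estimate barrier insertion loss (dB) from the path-length difference
    between the diffracted and direct sound paths (Maekawa, point source)."""
    wavelength = speed_of_sound_m_s / frequency_hz
    fresnel_number = 2.0 * path_difference_m / wavelength
    if fresnel_number <= 0:
        return 0.0  # no shielding when the source is visible over the barrier
    return 10.0 * math.log10(3.0 + 20.0 * fresnel_number)

# Example: a 0.5 m path difference at 1 kHz (roughly the energetic part of
# shotgun noise) gives about 18 dB of attenuation.
print(round(barrier_attenuation_db(0.5, 1000.0), 1))
```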
Abstract:
The aim of the study was to build a model for the case company for estimating short-term profitability. The research method is constructive, and the model was developed with the help of the company's accounting staff. In the theory part, profitability, budgeting, and forecasting itself were reviewed through a literature review, with the aim of finding methods that could be used in short-term profitability estimation. Based on the requirements set for the model, a judgmental method was chosen. According to the study, profitability is affected by sales price and volume, production, raw material prices, and the change in inventory. The constructed model works reasonably well in the target company, and it is notable that the differences between different plants and different machines can be fairly large; these differences are mainly due to plant size and differences between the machine models. The practical usefulness of the model, however, becomes most apparent when it is in the hands of the accounting staff. Forecasting always involves its own problems, and even new methods do not necessarily eliminate them.
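The thesis does not publish its model, but a minimal sketch of a driver-based short-term profitability estimate built around the factors named above (sales price and volume, production, raw material prices, inventory change) might look as follows; all field names and figures are hypothetical.

```python
# Minimal illustrative sketch of a driver-based short-term profitability estimate
# built around the factors named in the abstract. All field names and the
# example figures are hypothetical; the thesis does not publish its model.
from dataclasses import dataclass

@dataclass
class MonthEstimate:
    sales_volume_t: float          # tonnes expected to be sold
    sales_price_per_t: float       # average selling price
    production_volume_t: float     # tonnes expected to be produced
    raw_material_cost_per_t: float
    other_variable_cost_per_t: float
    fixed_costs: float

    def estimated_profit(self) -> float:
        variable_cost_per_t = self.raw_material_cost_per_t + self.other_variable_cost_per_t
        revenue = self.sales_volume_t * self.sales_price_per_t
        production_cost = self.production_volume_t * variable_cost_per_t
        # Production not sold goes to inventory and is valued at variable cost,
        # so the inventory change offsets part of the period's production cost.
        inventory_change_value = (self.production_volume_t - self.sales_volume_t) * variable_cost_per_t
        return revenue - production_cost + inventory_change_value - self.fixed_costs

print(MonthEstimate(9_000, 520.0, 10_000, 300.0, 120.0, 600_000).estimated_profit())
```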
Abstract:
Tissue engineering is a popular topic in peripheral nerve repair. Combining a nerve conduit with supporting adipose-derived cells could offer an opportunity to avoid time-consuming Schwann cell culture or the use of an autograft with its donor site morbidity, and eventually improve clinical outcome. The aim of this study was to provide a broad overview of promising transplantable cells under equal experimental conditions over a long-term period. A 10-mm gap in the sciatic nerve of female Sprague-Dawley rats (7 groups of 7 animals, 8 weeks old) was bridged through a biodegradable fibrin conduit filled with rat adipose-derived stem cells (rASCs), differentiated rASCs (drASCs), human (h)ASCs from the superficial and deep abdominal layer, human stromal vascular fraction (SVF), or rat Schwann cells, respectively. As a control, we resutured a nerve segment as an autograft. Long-term evaluation was carried out after 12 weeks comprising walking track, morphometric, and MRI analyses. The sciatic functional index was calculated. Cross sections of the nerve, proximal, distal, and in between the two sutures, were analyzed for re-/myelination and axon count. Gastrocnemius muscle weights were compared. MRI proved biodegradation of the conduit. Differentiated rat ASCs performed significantly better than undifferentiated rASCs, with less muscle atrophy and superior functional results. Superficial hASCs supported regeneration better than deep hASCs, in line with published in vitro data. The best regeneration potential was achieved by the drASC group when compared with other adipose tissue-derived cells. Considering the ease of the procedure from harvesting to transplantation, we conclude that this comparison of promising cells for nerve regeneration shows that differentiated ASCs in particular could be a clinically translatable route toward new methods to enhance peripheral nerve repair.
Abstract:
There is great scientific and popular interest in understanding the genetic history of populations in the Americas. We wish to understand when different regions of the continent were inhabited, where settlers came from, and how current inhabitants relate genetically to earlier populations. Recent studies unraveled parts of the genetic history of the continent using genotyping arrays and uniparental markers. The 1000 Genomes Project provides a unique opportunity for improving our understanding of population genetic history by providing over a hundred sequenced low coverage genomes and exomes from Colombian (CLM), Mexican-American (MXL), and Puerto Rican (PUR) populations. Here, we explore the genomic contributions of African, European, and especially Native American ancestry to these populations. Estimated Native American ancestry is 48% in MXL, 25% in CLM, and 13% in PUR. Native American ancestry in PUR is most closely related to populations surrounding the Orinoco River basin, confirming the Southern American ancestry of the Taíno people of the Caribbean. We present new methods to estimate the allele frequencies in the Native American fraction of the populations, and model their distribution using a demographic model for three ancestral Native American populations. These ancestral populations likely split in close succession: the most likely scenario, based on a peopling of the Americas 16 thousand years ago (kya), supports that the MXL Ancestors split 12.2kya, with a subsequent split of the ancestors to CLM and PUR 11.7kya. The model also features effective populations of 62,000 in Mexico, 8,700 in Colombia, and 1,900 in Puerto Rico. Modeling Identity-by-descent (IBD) and ancestry tract length, we show that post-contact populations also differ markedly in their effective sizes and migration patterns, with Puerto Rico showing the smallest effective size and the earliest migration from Europe. Finally, we compare IBD and ancestry assignments to find evidence for relatedness among the European founders of the three populations.
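As a hedged illustration of the general idea (not the paper's actual method), the allele frequency in an admixed population can be written as a mixture of the ancestral frequencies weighted by the ancestry proportions, from which a naive estimate of the Native American component frequency follows; apart from the 48% MXL figure quoted above, the ancestry split and allele frequencies below are made-up example numbers.

```python
# Hedged sketch, not the paper's method: under a simple admixture model,
#   p_admixed = a_NAT * p_NAT + a_EUR * p_EUR + a_AFR * p_AFR,
# so a naive estimate of the Native American component frequency is obtained
# by subtracting the known European and African contributions.

def native_component_frequency(p_admixed, p_eur, p_afr, a_nat, a_eur, a_afr):
    p_nat = (p_admixed - a_eur * p_eur - a_afr * p_afr) / a_nat
    return min(max(p_nat, 0.0), 1.0)  # clamp to a valid frequency

# MXL-like proportions: 48% Native American as in the abstract, with the
# remainder split here (purely for illustration) as 42% European and 10% African.
print(round(native_component_frequency(p_admixed=0.30, p_eur=0.20, p_afr=0.15,
                                        a_nat=0.48, a_eur=0.42, a_afr=0.10), 3))
```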
Abstract:
Current-day web search engines (e.g., Google) do not crawl and index a significant portion of the Web and, hence, web users relying on search engines alone are unable to discover and access a large amount of information in the non-indexable part of the Web. Specifically, dynamic pages generated from parameters provided by a user via web search forms (or search interfaces) are not indexed by search engines and cannot be found in searchers' results. Such search interfaces give web users online access to myriad databases on the Web. To obtain information from a web database of interest, a user issues a query by specifying query terms in a search form and receives the query results, a set of dynamic pages that embed the required information from the database. At the same time, issuing a query via an arbitrary search interface is an extremely complex task for any kind of automatic agent, including web crawlers, which, at least up to the present day, do not even attempt to pass through web forms on a large scale. In this thesis, our primary object of study is a huge portion of the Web (hereafter referred to as the deep Web) hidden behind web search interfaces. We concentrate on three classes of problems around the deep Web: characterization of the deep Web, finding and classifying deep web resources, and querying web databases.

Characterizing the deep Web: Though the term deep Web was coined in 2000, which is sufficiently long ago for any web-related concept or technology, we still do not know many important characteristics of the deep Web. Another matter of concern is that the surveys of the deep Web conducted so far are predominantly based on the study of deep web sites in English. One can therefore expect that the findings of these surveys may be biased, especially given the steady increase in non-English web content. Thus, surveying national segments of the deep Web is of interest not only to national communities but to the whole web community as well. In this thesis, we propose two new methods for estimating the main parameters of the deep Web. We use the suggested methods to estimate the scale of one specific national segment of the Web and report our findings. We also build and make publicly available a dataset describing more than 200 web databases from that national segment of the Web.

Finding deep web resources: The deep Web has been growing at a very fast pace, and it has been estimated that there are hundreds of thousands of deep web sites. Due to the huge volume of information in the deep Web, there has been significant interest in approaches that allow users and computer applications to leverage this information. Most approaches have assumed that search interfaces to the web databases of interest are already discovered and known to query systems. However, such assumptions do not hold, mostly because of the large scale of the deep Web: for any given domain of interest there are too many web databases with relevant content. Thus, the ability to locate search interfaces to web databases becomes a key requirement for any application accessing the deep Web. In this thesis, we describe the architecture of the I-Crawler, a system for finding and classifying search interfaces. Specifically, the I-Crawler is intentionally designed to be used in deep Web characterization studies and for constructing directories of deep web resources. Unlike almost all other existing approaches to the deep Web, the I-Crawler is able to recognize and analyze JavaScript-rich and non-HTML searchable forms.

Querying web databases: Retrieving information by filling out web search forms is a typical task for a web user, all the more so as the interfaces of conventional search engines are also web forms. At present, a user has to manually provide input values to search interfaces and then extract the required data from the result pages. Manually filling out forms is cumbersome and infeasible for complex queries, yet such queries are essential for many web searches, especially in the area of e-commerce. Thus, automating the querying of and data retrieval from search interfaces is desirable and essential for tasks such as building domain-independent deep web crawlers and automated web agents, searching for domain-specific information (vertical search engines), and extracting and integrating information from various deep web resources. We present a data model for representing search interfaces and discuss techniques for extracting field labels, client-side scripts, and structured data from HTML pages. We also describe a representation of result pages and discuss how to extract and store the results of form queries. In addition, we present a user-friendly and expressive form query language that allows one to retrieve information behind search interfaces and extract useful data from the result pages based on specified conditions. We implement a prototype system for querying web databases and describe its architecture and component design.
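As a rough illustration of what a data model for search interfaces and a programmatic form query can look like (not the I-Crawler's or the thesis's actual design; the field names and URL are hypothetical):

```python
# Illustrative sketch only: a minimal data model for a web search interface and
# a helper that submits a query to it, in the spirit of the form representation
# and form query language described in the thesis.
from dataclasses import dataclass, field
from urllib.parse import urlencode
from urllib.request import urlopen

@dataclass
class FormField:
    name: str           # the HTML input's name attribute
    label: str          # the human-readable label extracted from the page
    default: str = ""

@dataclass
class SearchInterface:
    action_url: str     # where the form is submitted
    method: str         # "GET" or "POST"
    fields: list[FormField] = field(default_factory=list)

    def query(self, **values: str) -> bytes:
        """Fill the form with the given values and fetch the result page."""
        data = {f.name: values.get(f.label, f.default) for f in self.fields}
        if self.method.upper() == "GET":
            return urlopen(self.action_url + "?" + urlencode(data)).read()
        return urlopen(self.action_url, data=urlencode(data).encode()).read()

# Hypothetical usage: query a book database by author.
books = SearchInterface("http://example.org/search", "GET",
                        [FormField("q", "author"), FormField("lang", "language", "en")])
# html = books.query(author="Tolstoy")  # returns the result page for later extraction
```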
Abstract:
In recent years, technological advances have allowed manufacturers to implement dual-energy computed tomography (DECT) on clinical scanners. With its unique ability to differentiate basis materials by their atomic number, DECT has opened new perspectives in imaging. DECT has been used successfully in musculoskeletal imaging with applications ranging from detection, characterization, and quantification of crystal and iron deposits; to simulation of noncalcium (improving the visualization of bone marrow lesions) or noniodine images. Furthermore, the data acquired with DECT can be postprocessed to generate monoenergetic images of varying kiloelectron volts, providing new methods for image contrast optimization as well as metal artifact reduction. The first part of this article reviews the basic principles and technical aspects of DECT including radiation dose considerations. The second part focuses on applications of DECT to musculoskeletal imaging including gout and other crystal-induced arthropathies, virtual noncalcium images for the study of bone marrow lesions, the study of collagenous structures, applications in computed tomography arthrography, as well as the detection of hemosiderin and metal particles.
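As a hedged sketch of how virtual monoenergetic images are typically formed after basis-material decomposition (not a vendor algorithm; the attenuation coefficients below are rough illustrative numbers, not tabulated data):

```python
# Hedged illustration: after dual-energy basis-material decomposition into
# (for example) water- and iodine-density images, a virtual monoenergetic image
# at energy E is a pixel-wise linear combination of the two density maps
# weighted by the materials' mass attenuation coefficients at E.
import numpy as np

def virtual_monoenergetic_hu(water_density, iodine_density,
                             mu_rho_water, mu_rho_iodine, mu_water_ref):
    """water_density, iodine_density: basis images in g/cm^3.
    mu_rho_*: mass attenuation coefficients (cm^2/g) at the chosen keV.
    mu_water_ref: linear attenuation of water at that keV, for the HU scale."""
    mu = water_density * mu_rho_water + iodine_density * mu_rho_iodine
    return 1000.0 * (mu - mu_water_ref) / mu_water_ref

# Toy 2x2 example at a hypothetical 70 keV.
water = np.array([[1.0, 1.0], [1.0, 1.05]])     # g/cm^3
iodine = np.array([[0.0, 0.002], [0.01, 0.0]])  # g/cm^3
print(virtual_monoenergetic_hu(water, iodine,
                               mu_rho_water=0.19, mu_rho_iodine=5.0,
                               mu_water_ref=0.19))
```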
Abstract:
This master's thesis examined the SCOR model (Supply Chain Operations Reference model), which is used as a tool for managing and steering the supply chain. SCOR is a standard model from which each organization should pick the elements relevant to developing and streamlining its own operations, and it is a good instrument for managing growth. The theory part of the work covers logistics, supply chain management, purchasing, and the SCOR model. The applied part uses the SCOR model to examine the processes of a company operating in the indoor air business, focusing on the company's purchasing and manufacturing processes. Applying the SCOR model to the analysis of purchasing and production brought up several development ideas: introducing new methods to improve the manageability of inventory and production, clarifying the purchasing organization, and developing the enterprise resource planning system.
Abstract:
Over the past decade, open and platform-independent XML-based standard frameworks have emerged as serious challengers to the old EDI systems. The new methods aim at more flexible ways of working, broader interoperability, and platform independence. The purpose of this work was to develop an interface based on open standards for a previously implemented WWW service. The theory part introduces three open e-business frameworks: RosettaNet, Web services, and ebXML. In the practical part, an e-business service was implemented that allows organizations to exchange documents with each other. The service was implemented in the Web service style; it uses the HTTP protocol for data transfer and sends and receives strictly defined XML messages. A medium-sized Finnish company uses the implemented service to automate the sending and receiving of attachments to electronic invoices.
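As an illustrative sketch of the kind of Web-service-style exchange described above, where a strictly defined XML message is posted over HTTP (the element names and URL are hypothetical, not the schema used in the thesis):

```python
# Illustrative sketch only: send an XML message over HTTP in a Web-service style.
import urllib.request
import xml.etree.ElementTree as ET

def send_invoice_attachment(service_url: str, invoice_id: str, payload_b64: str) -> str:
    """POST an XML document describing an e-invoice attachment and return the
    service's XML response as text."""
    root = ET.Element("AttachmentMessage")
    ET.SubElement(root, "InvoiceId").text = invoice_id
    ET.SubElement(root, "Content", encoding="base64").text = payload_b64
    body = ET.tostring(root, encoding="utf-8", xml_declaration=True)
    request = urllib.request.Request(service_url, data=body,
                                     headers={"Content-Type": "text/xml; charset=utf-8"})
    with urllib.request.urlopen(request) as response:
        return response.read().decode("utf-8")

# Hypothetical usage:
# reply = send_invoice_attachment("https://example.org/attachments", "INV-2004-001", "UEsDB...")
```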
Abstract:
Because their symptoms overlap heavily, pathogen-specific diagnosis and treatment of infectious diseases are difficult on the basis of clinical symptoms alone; therefore, patients are often treated empirically. More efficient treatment and management of infectious diseases would require rapid, point-of-care compatible in vitro diagnostic methods. However, current point-of-care methods are unsatisfactory in both performance and cost structure. The lack of point-of-care methods results in unnecessary use of antibiotics, suboptimal use of virus-specific drugs, and compromised patient care. In this thesis, the applicability of two-photon excitation fluorometry is evaluated as a tool for rapid detection of infectious diseases. New separation-free immunoassay methodologies were developed and validated for the following application areas: general inflammation markers, pathogen-specific antibodies, pathogen-specific antigens, and antimicrobial susceptibility testing. In addition, dry-reagent methodology and nanoparticulate tracers are introduced in the context of the technique. The results show that the new assay technique is a versatile tool for rapid detection of infectious diseases across many application areas. One particularly attractive area is rapid multianalyte testing of respiratory infections, where the technique was shown to allow simple assay protocols with performance comparable to state-of-the-art laboratory methods. If implemented in clinical diagnostic use, the new methods could improve diagnostic testing routines, especially in rapid testing of respiratory tract infections.
Abstract:
Performance standards for positron emission tomography (PET) were developed to allow comparison of systems from different generations and manufacturers. This resulted in the NEMA methodology in North America and the IEC methodology in Europe. In practice, the NEMA NU 2-2001 standard is the method of choice today. These standardized methods allow assessment of the physical performance of new commercial dedicated PET/CT tomographs. The point spread in image formation is one of the factors that blur the image; the phenomenon is often called the partial volume effect. Several methods for correcting for partial volume are under research, but there is no real agreement on how to solve it. The influence of the effect varies in different clinical settings, and it is likely that new methods are needed to solve this problem. Most clinical PET work is done in the field of oncology, where whole-body PET combined with CT is the standard investigation today. Despite progress in PET imaging technique, visualization, and especially quantification, of small lesions remains a challenge. In addition to partial volume, movement of the object is a significant source of error; the main causes of movement are respiratory and cardiac motion. Most new commercial scanners are capable of respiratory gating in addition to cardiac gating, and this technique has been used in patients with cancer of the thoracic region and in patients studied for radiation therapy planning. For routine cardiac applications, such as assessment of viability and perfusion, only cardiac gating has been used. However, new targets such as plaque or molecular imaging of new therapies require better control of cardiac motion, including the component caused by respiratory motion. To overcome these problems in cardiac work, a dual gating approach has been proposed. In this study, we investigated the physical performance of a new whole-body PET/CT scanner with the NEMA standard, compared methods for partial volume correction in PET studies of the brain, and developed and tested a new robust method for dual cardiac-respiratory gated PET with phantom, animal, and human data. Results from the performance measurements showed the feasibility of the new scanner design in 2D and 3D whole-body studies. Partial volume was corrected, but none of the tested methods is best in all cases, as the correction also depends on the radiotracer and its distribution; new methods need to be developed for proper correction. The dual gating algorithm developed is shown to handle dual-gated data, preserving quantification and clearly eliminating the majority of contraction and respiration movement.
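As a hedged sketch of the core idea of dual gating (not the algorithm developed in the thesis), each event is assigned to a joint bin defined by its phase in both the cardiac and the respiratory cycle; bin counts and trigger times below are illustrative.

```python
# Hedged sketch: phase-based dual cardiac-respiratory gating of list-mode events.
import bisect

def dual_gate_bin(event_time, ecg_r_peaks, resp_peaks,
                  n_cardiac_bins=8, n_resp_bins=4):
    """Return (cardiac_bin, respiratory_bin) for an event, using the fraction
    of the current cycle elapsed at the event time."""
    def phase(triggers):
        i = bisect.bisect_right(triggers, event_time) - 1
        if i < 0 or i + 1 >= len(triggers):
            return None  # outside the recorded cycles
        return (event_time - triggers[i]) / (triggers[i + 1] - triggers[i])
    c_phase, r_phase = phase(ecg_r_peaks), phase(resp_peaks)
    if c_phase is None or r_phase is None:
        return None  # event rejected from gated reconstruction
    return (min(int(c_phase * n_cardiac_bins), n_cardiac_bins - 1),
            min(int(r_phase * n_resp_bins), n_resp_bins - 1))

# Example: an event 0.35 s after an R peak in a 0.8 s cardiac cycle, and about
# 1.9 s into a 4 s breathing cycle, lands in cardiac bin 3 and respiratory bin 1.
print(dual_gate_bin(10.35, ecg_r_peaks=[10.0, 10.8, 11.6], resp_peaks=[8.5, 12.5]))
```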
Abstract:
Fatal and permanently disabling accidents form only one per cent of all occupational accidents, but in many branches of industry they account for more than half of the accident costs. Furthermore, the human suffering of the victim and his family is greater in severe accidents than in slight ones. For both human and economic reasons, severe accident risks should be identified before injuries occur. It is for this purpose that different safety analysis methods have been developed. This study shows two new possible approaches to the problem. The first is the hypothesis that it is possible to estimate the potential severity of accidents independently of their actual severity. The second is the hypothesis that when workers are also asked to report near accidents, they are particularly prone to report potentially severe near accidents on the basis of their own subjective risk assessment. A field study was carried out in a steel factory. The results supported both hypotheses. The reliability and validity of post-incident estimates of an accident's potential severity were reasonable. About 10 % of accidents were estimated to be potentially critical; they could have led to death or very severe permanent disability. Reported near accidents were significantly more severe: about 60 % of them were estimated to be critical. Furthermore, the validity of workers' subjective risk assessment, manifested in the near accident reports, proved to be reasonable. The new methods studied require further development and testing. They could be used both routinely in workplaces and in research for identifying and prioritizing accident risks.
Abstract:
The chemical analysis of the acetone, chloroform, toluene, and methanol extracts of a pitch sample was carried out by IR and GC-MS, leading to the identification of sixty-nine compounds, including fatty acids, alcohols, and hydrocarbons. Analysis of the acetone extract of a eucalyptus wood used in Brazil for pulp production was also carried out, resulting in the identification of fifty-nine compounds, mainly fatty acids, phenolic compounds, beta-sitosterol, and other steroids. This analysis showed that pitch formation had contributions from wood extractives and from other sources of contamination. The results obtained and the methodology applied can be used by the pulp industry to develop new methods of pitch control.
Abstract:
The study focuses on the front end of the innovation process. Due to changes in innovation policies and paradigms, customers, users, and shop-floor employees are becoming increasingly important sources of knowledge, and new methods are needed for processing information and ideas coming from multiple sources more effectively. The aim of this study is to develop an idea evaluation tool suitable for the front end of the innovation process and capable of utilizing collective intelligence. The study is carried out as case study research using the constructive research approach, which is well suited to the purposes of the study: the constructive approach focuses on designing new constructs and testing them in real-life applications. In this study, a tool for evaluating ideas emerging in the course of everyday work is developed and tested in a case organization. Development of the tool is based on the current scientific literature on knowledge creation, innovation management, and collective intelligence, and the tool is tested at LUT Lahti School of Innovation. The results are encouraging: the idea evaluation tool improves performance at the front end of the innovation process, and it has been accepted for use in the case organization. This study provides insights into what kind of tool is required for facilitating collective intelligence at the front end of the innovation process.
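As a minimal sketch of the kind of collective-intelligence aggregation an idea evaluation tool might use (the criteria, scores, and ideas below are hypothetical; the thesis does not publish its scoring scheme):

```python
# Minimal sketch: many evaluators score each idea on a few criteria,
# and ideas are ranked by their mean score.
from statistics import mean

def rank_ideas(ratings: dict[str, list[dict[str, int]]]) -> list[tuple[str, float]]:
    """ratings maps an idea to a list of per-evaluator criterion scores (1-5).
    Returns ideas sorted from the highest average score to the lowest."""
    scores = {idea: mean(mean(r.values()) for r in evaluations)
              for idea, evaluations in ratings.items()}
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)

ratings = {
    "self-service analytics portal": [{"novelty": 4, "feasibility": 3},
                                      {"novelty": 5, "feasibility": 4}],
    "shared tool library":           [{"novelty": 2, "feasibility": 5},
                                      {"novelty": 3, "feasibility": 4}],
}
print(rank_ideas(ratings))  # the portal averages 4.0, the tool library 3.5
```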
Abstract:
Many industrial processes produce effluents with a wide variety of xenobiotic organic pollutants, which cannot be efficiently degraded by conventional biological treatments. Thus, the development of new technologies to eliminate these refractory compounds from water has become imperative in order to ensure the quality of this important resource. Ozonation is a very promising process for the treatment of wastewaters containing organic compounds that are not easily removed. The present work aims at highlighting new methods of enhancing the efficiency of ozone for the removal of organic pollutants in aqueous solution. Special attention is given to catalytic ozonation processes, covering homogeneous and heterogeneous catalysis, their activity, and their mechanisms. Recent results and future prospects for the application of these processes to real effluents are also evaluated.