994 results for SUGGESTED METHODS
Abstract:
A population-based cross-sectional survey of socio-environmental factors associated with the prevalence of Dracunculus medinensis (guinea worm disease) was conducted in Idere, a rural agricultural community in Ibarapa, Oyo State, Nigeria, during 1982. The epidemiologic data were collected by household interview of all 501 households. The environmental data were collected by analysis of water samples from all domestic water sources and from rainfall records. The specific objectives of this research were to: (a) describe the prevalence of guinea worm disease in Idere during 1982 by age, sex, area of residence, drinking water source, religion, and weekly amount of money spent by the household to obtain potable drinking water; (b) compare the characteristics of cases with non-cases of guinea worm in order to identify factors associated with a high risk of infection; (c) investigate domestic water sources for the distribution of Cyclops; (d) determine the extent of the potable water shortage with a view to identifying the factors responsible for that shortage in the community; and (e) describe the effects of guinea worm on school attendance during the 1980-1982 school years by class and by distance of the school from the piped water supply. The findings indicate that during 1982, 31.8 percent of Idere's 6,527 residents experienced guinea worm infection, with higher prevalence recorded in males in their most productive years and in females in their teenage years. The contribution of sex and age to the risk of infection was explained in the context of water-related exposure and of increased water intake due to dehydration from the physical occupational activities of these subgroups. Potable water available to residents was considerably below the minimum recommended by WHO for tropical climates, with sixty-eight percent of the residents' water needs coming from unprotected surface water harbouring Cyclops, the obligatory intermediate host of Dracunculus medinensis. An association was found between periods of relatively high density of Cyclops in domestic water and rainfall. The impact of guinea worm infection on educational activities was considerable, and its implications were discussed, including the implications of the research findings for the control of guinea worm disease in Ibarapa.
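The prevalence tabulation described above reduces to counting cases within each subgroup of the household survey. The following is a minimal sketch of that calculation, assuming a hypothetical survey table with columns such as 'infected', 'sex', 'age_group', and 'water_source'; it is not the study's actual analysis.

# Minimal sketch of the prevalence tabulation described above, assuming a
# household-survey table with hypothetical columns: 'infected' (0/1), plus
# grouping variables such as 'sex', 'age_group', and 'water_source'.
import pandas as pd

def prevalence_by_group(survey: pd.DataFrame, group_col: str) -> pd.DataFrame:
    """Percent of surveyed residents infected, within each level of group_col."""
    out = (survey.groupby(group_col)["infected"]
                 .agg(cases="sum", n="count"))
    out["prevalence_pct"] = 100.0 * out["cases"] / out["n"]
    return out

# Illustrative overall figure consistent with the abstract:
# 31.8% of 6,527 residents is roughly 0.318 * 6527 ≈ 2,076 infected persons.
if __name__ == "__main__":
    demo = pd.DataFrame({
        "sex": ["M", "M", "F", "F", "F"],
        "infected": [1, 0, 1, 1, 0],
    })
    print(prevalence_by_group(demo, "sex"))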
Abstract:
The central issue for pillar design in underground coal mining is the in situ uniaxial compressive strength (σ_cm). The paper proposes a new method for estimating the in situ uniaxial compressive strength of coal seams based on laboratory strength and P-wave propagation velocity. It describes the collection of samples from the Bonito coal seam, Fontanella Mine, southern Brazil; the techniques used for structural mapping of the coal seam and for determining seismic wave propagation velocity; and the laboratory procedures used to determine strength and ultrasonic wave velocity. The results obtained with the new methodology are compared with those from seven other techniques for estimating the in situ rock mass uniaxial compressive strength.
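The abstract does not give the paper's actual relation, but one common way to scale laboratory strength to the rock mass uses the ratio of field to laboratory P-wave velocities. The sketch below illustrates that general idea only, with hypothetical input values and an assumed exponent; it is not the method proposed in the paper.

# Hedged sketch of a velocity-ratio scaling for in situ strength, NOT the
# paper's formula: sigma_cm ≈ sigma_c * (Vp_field / Vp_lab)**n, with n an
# assumed empirical exponent (n = 2 is a common choice in the literature).
def in_situ_strength(sigma_c_mpa: float,
                     vp_field_ms: float,
                     vp_lab_ms: float,
                     exponent: float = 2.0) -> float:
    """Scale laboratory UCS (MPa) by the field/lab P-wave velocity ratio."""
    ratio = vp_field_ms / vp_lab_ms
    return sigma_c_mpa * ratio ** exponent

# Hypothetical numbers for illustration only.
print(in_situ_strength(sigma_c_mpa=25.0, vp_field_ms=1800.0, vp_lab_ms=2400.0))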
Abstract:
Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.
Abstract:
Present-day web search engines (e.g., Google) do not crawl and index a significant portion of the Web, and hence web users who rely on search engines alone are unable to discover and access a large amount of information in the non-indexable part of the Web. Specifically, dynamic pages generated from parameters that a user provides via web search forms (or search interfaces) are not indexed by search engines and cannot be found in search results. Such search interfaces give web users online access to myriads of databases on the Web. To obtain information from a web database of interest, a user issues a query by specifying query terms in a search form and receives the query results: a set of dynamic pages that embed the required information from the database. At the same time, issuing a query via an arbitrary search interface is an extremely complex task for any kind of automatic agent, including web crawlers, which, at least up to the present day, do not even attempt to pass through web forms on a large scale. In this thesis, our primary object of study is the huge portion of the Web (hereafter referred to as the deep Web) hidden behind web search interfaces. We concentrate on three classes of problems around the deep Web: characterization of the deep Web, finding and classifying deep web resources, and querying web databases. Characterizing the deep Web: Though the term deep Web was coined in 2000, which is sufficiently long ago for any web-related concept or technology, we still do not know many important characteristics of the deep Web. Another matter of concern is that the surveys of the deep Web carried out so far are based predominantly on studies of deep web sites in English. One can therefore expect that findings from these surveys may be biased, especially owing to the steady increase in non-English web content. In this way, surveying national segments of the deep Web is of interest not only to national communities but to the whole web community as well. In this thesis, we propose two new methods for estimating the main parameters of the deep Web. We use the suggested methods to estimate the scale of one specific national segment of the Web and report our findings. We also build and make publicly available a dataset describing more than 200 web databases from the national segment of the Web. Finding deep web resources: The deep Web has been growing at a very fast pace. It has been estimated that there are hundreds of thousands of deep web sites. Due to the huge volume of information in the deep Web, there has been significant interest in approaches that allow users and computer applications to leverage this information. Most approaches have assumed that the search interfaces to the web databases of interest are already discovered and known to query systems. However, such assumptions mostly do not hold, because of the large scale of the deep Web: for any given domain of interest there are too many web databases with relevant content. Thus, the ability to locate search interfaces to web databases becomes a key requirement for any application accessing the deep Web. In this thesis, we describe the architecture of the I-Crawler, a system for finding and classifying search interfaces. Specifically, the I-Crawler is intentionally designed to be used in deep Web characterization studies and for constructing directories of deep web resources.
Unlike almost all other approaches to the deep Web proposed so far, the I-Crawler is able to recognize and analyze JavaScript-rich and non-HTML searchable forms. Querying web databases: Retrieving information by filling out web search forms is a typical task for a web user. This is all the more so because the interfaces of conventional search engines are also web forms. At present, a user needs to manually provide input values to search interfaces and then extract the required data from the result pages. Manually filling out forms is cumbersome and not feasible for complex queries, yet such queries are essential for many web searches, especially in the area of e-commerce. In this way, the automation of querying and retrieving data behind search interfaces is desirable and essential for such tasks as building domain-independent deep web crawlers and automated web agents, searching for domain-specific information (vertical search engines), and extracting and integrating information from various deep web resources. We present a data model for representing search interfaces and discuss techniques for extracting field labels, client-side scripts, and structured data from HTML pages. We also describe a representation of result pages and discuss how to extract and store the results of form queries. In addition, we present a user-friendly and expressive form query language that allows one to retrieve information behind search interfaces and to extract useful data from the result pages based on specified conditions. We implement a prototype system for querying web databases and describe its architecture and component design.
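The thesis's own data model and query language are not reproduced in the abstract. The sketch below only illustrates, under assumed names and a hypothetical URL, how a search interface might be represented and queried programmatically; it is not the I-Crawler or the thesis's actual model.

# Hedged sketch of one possible representation of a web search interface and
# of an automated form query; the field names, URL, and structure here are
# assumptions for illustration, not the thesis's actual data model.
from dataclasses import dataclass, field
from typing import Dict, List
import requests

@dataclass
class FormField:
    name: str            # the HTML input's name attribute
    label: str           # human-readable label extracted from the page
    field_type: str      # e.g. "text", "select", "checkbox"
    options: List[str] = field(default_factory=list)

@dataclass
class SearchInterface:
    action_url: str      # the form's submission endpoint
    method: str          # "get" or "post"
    fields: List[FormField]

    def query(self, values: Dict[str, str]) -> str:
        """Submit the form with the given field values and return raw HTML."""
        if self.method.lower() == "get":
            resp = requests.get(self.action_url, params=values, timeout=30)
        else:
            resp = requests.post(self.action_url, data=values, timeout=30)
        resp.raise_for_status()
        return resp.text

# Hypothetical usage: a book-search form with a single keyword field.
books = SearchInterface(
    action_url="https://example.org/search",
    method="get",
    fields=[FormField(name="q", label="Title keywords", field_type="text")],
)
# html = books.query({"q": "deep web"})  # result page to be parsed downstream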
Abstract:
The biggest problem in membrane technology is still membrane fouling, as a result of which the separation capability of the membrane can change and the flux of the solution through the membrane can decrease considerably. In industrial applications of membrane technology, membrane cleaning is one of the most important issues, since it determines the service life and operating efficiency of the membrane. The most commonly used approach, chemical cleaning, produces a washing solution that must be disposed of and that contains both chemicals and the foulants removed from the membrane. In this work, membrane cleaning by ultraviolet light or ultrasound treatment in the presence of titanium dioxide was studied. Possible advantages of these methods are the decomposition of the removed foulants into harmless components and possibly less wear of the membrane during cleaning. The literature part of the work discusses the decomposition of organic and inorganic substances by ultraviolet light or ultrasound treatment in the presence of titanium dioxide, as well as the effect of process conditions on the efficiency of the methods. In addition, the work focuses on the resistance of polymeric membranes to UV light and ultrasound treatment. In the experimental part, the suitability of UV light and ultrasound treatments for cleaning a fouled PVDF membrane in the presence of titanium dioxide was studied. The aim was to restore the permeability of the fouled membrane to the level of a clean membrane by means of the treatment. The durability of the membrane was also studied. Based on this work, the studied methods cannot be applied to the cleaning of the examined PVDF membrane, at least not under the conditions used in the tests, since the properties of the membrane change during the treatments.
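The permeability-recovery target mentioned above can be expressed as a simple ratio of fluxes before and after cleaning. The sketch below is a minimal illustration with hypothetical flux values, not the thesis's measurement procedure.

# Minimal sketch of a flux-recovery calculation often used to judge membrane
# cleaning: recovery = J_cleaned / J_clean, where J is the permeate flux (or
# permeability) of the clean, fouled, and cleaned membrane. Values below are
# hypothetical.
def flux_recovery(j_clean: float, j_fouled: float, j_cleaned: float) -> dict:
    """Return fouling-induced flux loss and cleaning recovery as fractions."""
    return {
        "flux_loss": 1.0 - j_fouled / j_clean,
        "recovery": j_cleaned / j_clean,
    }

# Hypothetical permeabilities in L/(m^2 h bar).
print(flux_recovery(j_clean=120.0, j_fouled=45.0, j_cleaned=80.0))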
Abstract:
"The full version of this thesis is available for individual consultation only at the Music Library of the Université de Montréal (www.bib.umontreal.ca/MU)."
Abstract:
List of topics and slides that summarise legal perspectives, with suggested methods on how to revise for the exam.
Abstract:
Solutions to combinatorial optimization problems, such as the p-median problem of locating facilities, frequently rely on heuristics to minimize the objective function. The minimum is sought iteratively, and a criterion is needed to decide when the procedure (almost) attains it. However, pre-setting the number of iterations dominates in OR applications, which implies that the quality of the solution cannot be ascertained. A small branch of the literature suggests using statistical principles to estimate the minimum and to use the estimate either for stopping or for evaluating the quality of the solution. In this paper we use test problems taken from Beasley's OR-Library and apply Simulated Annealing to these p-median problems. We do this for the purpose of comparing suggested methods of minimum estimation and, eventually, providing a recommendation for practitioners. The paper ends with an illustration: the problem of locating some 70 distribution centers of the Swedish Post in a region.
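One statistical minimum-estimation idea in this line of work (in the spirit of Golden and Alt's extreme-value approach) treats the best objective values from independent heuristic runs as a sample whose lower endpoint is the unknown global minimum, fits a three-parameter Weibull distribution, and reads off the location parameter as the point estimate. The sketch below illustrates that idea on synthetic data; it is not necessarily one of the estimators compared in the paper.

# Hedged sketch of extreme-value estimation of the global minimum from
# independent heuristic runs. The data are synthetic: the true minimum is
# 1000 by construction.
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(0)

# Hypothetical best objective values from 25 independent Simulated Annealing
# runs on a p-median instance.
best_values = 1000.0 + 50.0 * rng.weibull(1.5, size=25)

shape, loc, scale = weibull_min.fit(best_values)
print(f"estimated minimum (location parameter): {loc:.1f}")
print(f"best value actually observed:           {best_values.min():.1f}")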
Abstract:
Graduate Program in Materials Science and Technology - FC
Abstract:
Several clinical tests have been developed to qualitatively describe complex motor tasks through functional testing, but these methods often depend on the clinician's interpretation, experience, and training, which makes the assessment results inconsistent and without the precision required to objectively assess the effect of a rehabilitative intervention. A more detailed characterization is required to fully capture the various aspects of motor control and performance during complex movements of the lower and upper limbs. Cost-effective and clinically applicable instrumented tests would enable quantitative assessment of performance on a subject-specific basis, overcoming the limitations due to the lack of objectiveness of individual judgment, and possibly disclosing subtle alterations that are not clearly visible to the observer. Postural motion measurements at additional locations, such as the lower and upper limbs and the trunk, may be necessary in order to obtain information about inter-segmental coordination during the different functional tests used in clinical practice. With these considerations in mind, this Thesis aims: i) to suggest a novel quantitative assessment tool for the kinematic and dynamic evaluation of a multi-link kinematic chain during several functional motor tasks (i.e. squat, sit-to-stand, postural sway), using one single-axis accelerometer per segment; ii) to present a novel quantitative technique for upper limb joint kinematics estimation, considering a 3-link kinematic chain during the Fugl-Meyer Motor Assessment and using one inertial measurement unit per segment. The suggested methods could bring several benefits to clinical practice. The use of objective biomechanical measurements provided by inertial sensor-based techniques may help clinicians to: i) objectively track changes in motor ability, ii) provide timely feedback about the effectiveness of administered rehabilitation interventions, iii) enable intervention strategies to be modified or changed if found to be ineffective, and iv) speed up the experimental sessions when several subjects are asked to perform different functional tests.
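A basic principle behind single-axis accelerometer assessment of a kinematic chain is that, under quasi-static motion, each sensor reads the gravity component along its axis, from which segment inclination and inter-segmental angles can be derived. The sketch below illustrates that principle only, with hypothetical readings and an assumed sensor placement; it is not the thesis's actual algorithm.

# Hedged sketch of inclination estimation from one single-axis accelerometer
# per segment, assuming quasi-static movement (accelerations much smaller than
# gravity) and a sensing axis aligned with the segment's longitudinal axis,
# which is vertical in the reference posture.
import numpy as np

G = 9.81  # m/s^2

def segment_inclination(acc_axis: np.ndarray) -> np.ndarray:
    """Inclination (rad) of the segment from vertical, per sample."""
    ratio = np.clip(acc_axis / G, -1.0, 1.0)
    return np.arccos(ratio)

def joint_angle(inclination_proximal: np.ndarray,
                inclination_distal: np.ndarray) -> np.ndarray:
    """Relative angle between two adjacent segments of the kinematic chain."""
    return inclination_distal - inclination_proximal

# Hypothetical readings (m/s^2) for trunk and thigh during a slow squat.
trunk = np.array([9.81, 9.5, 9.0, 8.5])
thigh = np.array([9.81, 8.8, 7.2, 5.5])
print(np.degrees(joint_angle(segment_inclination(trunk),
                             segment_inclination(thigh))))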
Abstract:
Specific cutting energy (SE) has been widely used to assess rock cuttability for mechanical excavation purposes. Several prediction models have been developed for SE by correlating rock properties with SE values. However, some textural and compositional rock parameters, i.e. the texture coefficient and the feldspar, mafic, and felsic mineral contents, were not considered. The present study investigates the effects of these previously ignored rock parameters, along with engineering rock properties, on SE. Mineralogical and petrographic analyses, rock mechanics tests, and linear rock cutting tests were performed on sandstone samples taken from sites around Ankara, Turkey. Relationships between SE and rock properties were evaluated using bivariate correlation and linear regression analyses. The tests and subsequent analyses revealed that the texture coefficient and feldspar content of the sandstones affected rock cuttability, as evidenced by significant correlations between these parameters and SE at a 90% confidence level. The felsic and mafic mineral contents of the sandstones did not exhibit any statistically significant correlation with SE. The cementation coefficient, effective porosity, and pore volume had good correlations with SE. Poisson's ratio, Brazilian tensile strength, Shore scleroscope hardness, Schmidt hammer hardness, dry density, and point load strength index showed very strong linear correlations with SE at confidence levels of 95% and above, and all of them were also found suitable for predicting SE individually, based on the results of the regression analyses, ANOVA, Student's t-tests, and R² values. Poisson's ratio exhibited the highest correlation with SE and appears to be the most reliable SE prediction tool for sandstones.
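The kind of bivariate analysis described above amounts to regressing SE on a single rock property and checking the R² value and the significance of the slope. The sketch below is a generic illustration with made-up numbers, not data from the study.

# Hedged sketch of a bivariate SE regression: simple linear regression of SE
# on one rock property, reporting R^2 and the p-value of the slope.
import numpy as np
from scipy.stats import linregress

poissons_ratio = np.array([0.18, 0.21, 0.24, 0.27, 0.30, 0.33])
specific_energy = np.array([9.5, 11.2, 13.0, 15.1, 17.4, 19.0])  # e.g. MJ/m^3

fit = linregress(poissons_ratio, specific_energy)
print(f"SE = {fit.slope:.1f} * nu + {fit.intercept:.1f}")
print(f"R^2 = {fit.rvalue**2:.3f}, slope p-value = {fit.pvalue:.4f}")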
Abstract:
The modelling of mechanical structures using finite element analysis has become an indispensable stage in the design of new components and products. Once the theoretical design has been optimised a prototype may be constructed and tested. What can the engineer do if the measured and theoretically predicted vibration characteristics of the structure are significantly different? This thesis considers the problems of changing the parameters of the finite element model to improve the correlation between a physical structure and its mathematical model. Two new methods are introduced to perform the systematic parameter updating. The first uses the measured modal model to derive the parameter values with the minimum variance. The user must provide estimates for the variance of the theoretical parameter values and the measured data. Previous authors using similar methods have assumed that the estimated parameters and measured modal properties are statistically independent. This will generally be the case during the first iteration but will not be the case subsequently. The second method updates the parameters directly from the frequency response functions. The order of the finite element model of the structure is reduced as a function of the unknown parameters. A method related to a weighted equation error algorithm is used to update the parameters. After each iteration the weighting changes so that on convergence the output error is minimised. The suggested methods are extensively tested using simulated data. An H frame is then used to demonstrate the algorithms on a physical structure.
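A minimum-variance update of the kind described in the first method can be written as a Gauss-Newton-like step weighted by the user-supplied parameter and measurement covariances (cf. standard treatments of statistical model updating). The sketch below is a generic illustration of that step with a tiny hypothetical example; it is not the thesis's exact formulation and, in particular, it keeps the usual independence assumption that the thesis relaxes.

# Hedged sketch of a minimum-variance updating step. S is the sensitivity of
# the predicted modal data to the parameters; V_theta and V_eps are the
# covariances of the parameter estimates and of the measured data.
import numpy as np

def minimum_variance_update(theta, z_measured, z_predicted, S, V_theta, V_eps):
    """One update step weighted by parameter/measurement variances."""
    gain = V_theta @ S.T @ np.linalg.inv(S @ V_theta @ S.T + V_eps)
    theta_new = theta + gain @ (z_measured - z_predicted)
    V_theta_new = V_theta - gain @ S @ V_theta   # updated parameter covariance
    return theta_new, V_theta_new

# Tiny hypothetical example: two stiffness parameters, three measured modes.
theta = np.array([1.0, 1.0])
S = np.array([[0.8, 0.1], [0.3, 0.6], [0.2, 0.9]])
z_pred = S @ theta
z_meas = z_pred + np.array([0.05, -0.02, 0.04])
print(minimum_variance_update(theta, z_meas, z_pred, S,
                              V_theta=0.1 * np.eye(2),
                              V_eps=0.01 * np.eye(3))[0])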
Abstract:
Selecting the best alternative in group decision making is the subject of many recent studies. The most popular method proposed for ranking the alternatives is based on the distance of each alternative to the ideal alternative. The ideal alternative may never exist; hence the ranking results are biased towards the ideal point. The main aim of this study is to calculate a fuzzy ideal point that is more realistic than the crisp ideal point. On the other hand, Data Envelopment Analysis (DEA) has recently been used to find the optimum weights for ranking the alternatives. This paper proposes a four-stage approach, based on DEA in a fuzzy environment, to aggregate preference rankings. An application to a preferential voting system shows how the new model can be applied to rank a set of alternatives. Two other examples indicate the superiority of the proposed method over some other suggested methods.
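For context, a crisp DEA-style preferential-voting model in the spirit of Cook and Kress lets each candidate choose the rank weights most favourable to itself, subject to every candidate's weighted score being at most 1 and the weights being decreasing. The sketch below illustrates only that underlying idea with made-up vote counts; it is not the paper's fuzzy four-stage approach.

# Hedged sketch of a DEA-style preferential-voting score, solved per candidate
# as a small linear program.
import numpy as np
from scipy.optimize import linprog

votes = np.array([      # rows: candidates, columns: 1st/2nd/3rd-place votes
    [10, 6, 4],
    [8, 9, 3],
    [7, 5, 8],
])
n_cand, n_ranks = votes.shape
eps = 1e-3

def dea_score(i: int) -> float:
    c = -votes[i]                                   # maximize own weighted votes
    A_ub = [votes[k] for k in range(n_cand)]        # every candidate's score <= 1
    b_ub = [1.0] * n_cand
    for j in range(n_ranks - 1):                    # w_j - w_{j+1} >= eps
        row = np.zeros(n_ranks)
        row[j], row[j + 1] = -1.0, 1.0
        A_ub.append(row)
        b_ub.append(-eps)
    bounds = [(eps, None)] * n_ranks                # all weights >= eps
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub), bounds=bounds)
    return -res.fun

print([round(dea_score(i), 3) for i in range(n_cand)])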
Abstract:
Problems of intellectualisation of the man-machine interface and methods of self-organization for network control in multi-agent info-telecommunication systems are discussed. An architecture and principles for the construction of network and neural agents for next-generation telecommunication systems are suggested. Methods for adaptive, multi-agent routing of information flows, driven by the requests of external agents (users of global telecommunication systems and computer networks), are described.
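As background, one well-known adaptive routing rule in the multi-agent spirit is Q-routing (Boyan and Littman, 1994), where each node keeps delivery-time estimates per destination and neighbour and updates them from its neighbours' own estimates. The sketch below is a generic illustration of that rule, not the specific methods suggested in the abstract.

# Hedged sketch of Q-routing: Q[dest][next_hop] estimates the delivery time
# via that neighbour and is nudged towards (link delay + neighbour's estimate).
from collections import defaultdict

class QRouter:
    def __init__(self, node, neighbours, eta=0.5):
        self.node = node
        self.neighbours = list(neighbours)
        self.eta = eta                       # learning rate
        self.Q = defaultdict(lambda: {n: 1.0 for n in self.neighbours})

    def best_next_hop(self, dest):
        return min(self.Q[dest], key=self.Q[dest].get)

    def update(self, dest, via, link_delay, neighbour_estimate):
        """neighbour_estimate: the neighbour's own best Q-value towards dest."""
        target = link_delay + neighbour_estimate
        self.Q[dest][via] += self.eta * (target - self.Q[dest][via])

# Hypothetical usage: node A learns that routing to D via B is currently faster.
a = QRouter("A", neighbours=["B", "C"])
a.update("D", via="C", link_delay=4.0, neighbour_estimate=6.0)
a.update("D", via="B", link_delay=1.0, neighbour_estimate=2.0)
print(a.best_next_hop("D"))   # -> "B"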
Abstract:
Meat remains the most common protein source in people's everyday diet. In addition, processed meat products are an asset in their busy lives. This type of product makes it difficult to differentiate the meats used in its preparation and is therefore prone to adulteration. The Polymerase Chain Reaction (PCR) has gained increasing importance in molecular biology laboratories, proving to be a fast, sensitive, and highly specific analytical technique for the identification of species in food products. However, several factors can interfere with the amplification process, so precautions must be taken from the acquisition of the sample to be analysed, through its storage, to the subsequent DNA extraction. There are numerous DNA extraction protocols, and for each study the most suitable one must be evaluated and chosen, considering the intended purpose of the extracted sample. The laboratory work presented in this dissertation comprised three main stages. Initially, different DNA extraction protocols were evaluated using meat samples purchased from a butcher's shop. Among the protocols tested, the modified Cetyl Trimethyl Ammonium Bromide (CTAB) method yielded the DNA samples with the highest concentration and a high level of purity. Subsequently, different real-time PCR amplification protocols were tested and optimised for the detection of the species Bos taurus (cattle), Sus scrofa (pig), Equus caballus (horse), and Ovis aries (sheep). Species-specific primers were employed for the detection of mitochondrial and genomic genes, depending on each protocol. For the specific case of pork, two protocols were evaluated, a singleplex with EvaGreen® and a tetraplex with AllHorse, for their possible application in its quantification. The results demonstrated high specificity and sensitivity of the reactions for this species, allowing its detection down to limits of 0.001 ng and 0.1%, respectively. Only the first methodology proved suitable for quantification. Finally, the suggested methodologies were successfully applied to the analysis of 4 commercial hamburger samples, and the labelling was found to be consistent in all cases with regard to the composition in terms of animal species. The interest of work in this field lies in the importance of the authenticity of food product labels, especially for meat products, for consumer safety and the protection of producers.
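Quantification in real-time PCR is commonly done with a standard curve: Cq values are fitted against the log10 of the DNA amount for a dilution series, and an unknown's amount is read off the curve. The sketch below illustrates that generic calculation with hypothetical numbers; it is not the dissertation's exact protocol.

# Hedged sketch of standard-curve quantification for real-time PCR.
import numpy as np

# Hypothetical pork-DNA dilution series (ng) and measured Cq values.
amount_ng = np.array([10.0, 1.0, 0.1, 0.01, 0.001])
cq = np.array([22.1, 25.5, 28.9, 32.3, 35.8])

slope, intercept = np.polyfit(np.log10(amount_ng), cq, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0          # ~1.0 means 100% efficiency

def quantify(cq_unknown: float) -> float:
    """Estimated DNA amount (ng) for an unknown sample's Cq."""
    return 10 ** ((cq_unknown - intercept) / slope)

print(f"slope={slope:.2f}, efficiency={efficiency:.1%}")
print(f"unknown with Cq=30.0 -> {quantify(30.0):.3f} ng")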