898 results for tool use
Abstract:
A novel cantilever pressure sensor was developed in the Department of Physics at the University of Turku to solve the sensitivity problems encountered when condenser microphones are used in photoacoustic spectroscopy. The cantilever pressure sensor, combined with a laser interferometer for measuring the cantilever movements, proved to be highly sensitive. The original aim of this work was to integrate the sensor into a photoacoustic gas detector working in a differential measurement scheme. The integration was made successfully into three prototypes. In addition, the cantilever was also integrated into the photoacoustic FTIR measurement schemes of gas-, liquid-, and solid-phase samples. A theoretical model for the signal generation in each measurement scheme was created and the optimal cell design discussed. The sensitivity and selectivity of the differential method were evaluated when a blackbody radiator and a mechanical chopper were used with CO2, CH4, CO, and C2H4 gases. The detection limits were at the sub-ppm level for all four gases with only a 1.3-second integration time, and the cross-interference was well below one percent for all gas combinations other than those between hydrocarbons. Sensitivity with other infrared sources was compared using ethylene as an example gas: the electrically modulated blackbody radiator gave a 35 times higher, and the CO2 laser a 100 times lower, detection limit than the blackbody radiator with a mechanical chopper. In conclusion, the differential system is well suited to rapid single-gas measurements. Gas-phase photoacoustic FTIR spectroscopy gives the best performance when several components have to be analyzed simultaneously from multicomponent samples. Multicomponent measurements were demonstrated with a sample that contained different concentrations of CO2, H2O, CO, and four different hydrocarbons.
Achieving the same detection limit for a single gas required an approximately 10 times longer measurement time than with the differential system. The properties of photoacoustic FTIR spectroscopy were also compared to conventional transmission FTIR spectroscopy by simulations. Solid- and liquid-phase photoacoustic FTIR spectroscopy has several advantages over other techniques and therefore also a great variety of applications. A comparison of the signal-to-noise ratio between photoacoustic cells with a cantilever microphone and a condenser microphone was made with standard carbon black, polyethene, and sunflower oil samples. The cell with the cantilever microphone proved to have a 5-10 times higher signal-to-noise ratio than the reference detector, depending on the sample. Cantilever-enhanced photoacoustics will be an effective tool for gas detection and for the analysis of solid- and liquid-phase samples. The preliminary prototypes gave good results in all three measurement schemes that were studied. According to simulations, there are possibilities for further enhancement of the sensitivity, as well as other properties, of each system.
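The integration-time and measurement-time trade-offs quoted above follow the usual signal-averaging behaviour. A minimal sketch of the standard white-noise scaling, under which the detection limit improves with the square root of the integration time (the numbers are hypothetical, and the abstract's factor-of-10 comparison is between different methods, not pure averaging):

```python
import math

def lod_scaling(lod_ref, t_ref, t):
    """Detection limit after integrating for time t, assuming white noise
    so that the LOD scales as 1/sqrt(integration time)."""
    return lod_ref * math.sqrt(t_ref / t)

# Hypothetical: a 0.5 ppm limit at the 1.3 s integration time quoted above.
# Ten times longer averaging lowers the LOD by sqrt(10) ~ 3.2x.
lod_13s = lod_scaling(0.5, 1.3, 13.0)
```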
Abstract:
Nonnative brook trout Salvelinus fontinalis are abundant in Pine Creek and its main tributary, Bogard Spring Creek, California. These creeks historically provided the most spawning and rearing habitat for endemic Eagle Lake rainbow trout Oncorhynchus mykiss aquilarum. Three-pass electrofishing removal was conducted in 2007–2009 over the entire 2.8-km length of Bogard Spring Creek to determine whether brook trout removal was a feasible restoration tool and to document the life history characteristics of brook trout in a California meadow stream. After the first 2 years of removal, brook trout density and biomass were severely reduced from 15,803 to 1,192 fish/ha and from 277 to 31 kg/ha, respectively. Average removal efficiency was 92–97%, and most of the remaining fish were removed in the third year. The lack of a decrease in age-0 brook trout abundance between 2007 and 2008 after the removal of more than 4,000 adults in 2007 suggests compensatory reproduction of mature fish that survived and higher survival of age-0 fish. However, recruitment was greatly reduced after 2 years of removal and is likely to be even more depressed after the third year of removal, assuming that immigration of fish from outside the creek continues to be minimal. Brook trout condition, growth, and fecundity indicated a stunted population at the start of the study, but all three features increased significantly every year, demonstrating compensatory effects. Although highly labor intensive, the use of electrofishing to eradicate brook trout may be feasible in Bogard Spring Creek and similar small streams if removal and monitoring are continued annually and if other control measures (e.g., construction of barriers) are implemented. Our evidence shows that if brook trout control measures continue and if only Eagle Lake rainbow trout are allowed access to the creek, then a self-sustaining population of Eagle Lake rainbow trout can become reestablished.
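Multi-pass depletion data like those described above are commonly analyzed with the classic Zippin removal estimator. A minimal sketch with hypothetical catch numbers (not the study's data), assuming a closed population and a constant per-pass capture probability:

```python
def zippin_estimate(catches, tol=1e-10):
    """Zippin removal estimator for k-pass depletion electrofishing.

    Assumes a closed population and a constant capture probability p
    per pass. Returns (N_hat, p_hat): the estimated initial population
    size and the per-pass capture probability.
    """
    k = len(catches)
    T = sum(catches)                                # total catch
    X = sum(i * c for i, c in enumerate(catches))   # pass-weighted catch
    # Solve the moment equation for q = 1 - p by bisection:
    #   X/T = q/(1-q) - k*q^k/(1-q^k)
    lo, hi = 1e-9, 1 - 1e-9
    while hi - lo > tol:
        q = (lo + hi) / 2
        f = q / (1 - q) - k * q**k / (1 - q**k) - X / T
        if f > 0:
            hi = q
        else:
            lo = q
    q = (lo + hi) / 2
    N_hat = T / (1 - q**k)      # overall capture efficiency is 1 - q^k
    return N_hat, 1 - q

# Hypothetical catches from three successive passes (not data from the study):
N_hat, p_hat = zippin_estimate([120, 40, 13])
```

With a steeply declining catch series like this one, the estimated overall three-pass efficiency (1 - q^3) comes out above 95%, in the same ballpark as the 92–97% removal efficiencies reported in the abstract.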
Abstract:
One of the techniques used to detect faults in dynamic systems is analytical redundancy. An important difficulty in applying this technique to real systems is dealing with the uncertainties associated with the system itself and with the measurements. In this paper, this uncertainty is taken into account by the use of intervals for the parameters of the model and for the measurements. The method proposed in this paper checks the consistency between the system's behavior, obtained from the measurements, and the model's behavior; if they are inconsistent, there is a fault. The problem of detecting faults is stated as a quantified real constraint satisfaction problem, which can be solved using modal interval analysis (MIA). MIA is used because it provides powerful tools to extend calculations over real functions to intervals. To improve the results of fault detection, the simultaneous use of several sliding time windows is proposed. The result of implementing this method is semiqualitative tracking (SQualTrack), a fault-detection tool that is robust in the sense that it does not generate false alarms, i.e., if there are false alarms, they indicate either that the interval model does not represent the system adequately or that the interval measurements do not represent the true values of the variables adequately. SQualTrack is currently being used to detect faults in real processes. Some of these applications using real data have been developed within the European project "Advanced Decision Support System for Chemical/Petrochemical Manufacturing Processes" and are also described in this paper.
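The core consistency test can be illustrated with plain interval arithmetic. A minimal sketch using a hypothetical one-parameter gain model (MIA itself provides much richer machinery for quantified interval constraints than this):

```python
def interval_mul(a, b):
    """Multiply two intervals given as (lo, hi) pairs (classic interval arithmetic)."""
    products = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(products), max(products))

def consistent(model_out, measured):
    """Two intervals are consistent when they overlap; an empty
    intersection between the model envelope and the measurement
    signals a fault."""
    return max(model_out[0], measured[0]) <= min(model_out[1], measured[1])

# Hypothetical first-order model y = a*x with an uncertain interval gain:
a = (1.8, 2.2)                 # interval model parameter
x = (0.9, 1.1)                 # interval measurement of the input
y_model = interval_mul(a, x)   # predicted output envelope: (1.62, 2.42)

print(consistent(y_model, (1.9, 2.1)))   # overlapping -> no fault
print(consistent(y_model, (3.0, 3.5)))   # disjoint -> fault detected
```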
Abstract:
The present thesis is focused on minimizing the experimental effort needed for the prediction of pollutant propagation in rivers by mathematical modelling and knowledge re-use. Mathematical modelling is based on the well-known advection-dispersion equation, while the knowledge re-use approach employs the methods of case-based reasoning, graphical analysis and text mining. The thesis contributes to the pollutant transport research field with: (1) analytical and numerical models for pollutant transport prediction; (2) two novel techniques which enable the use of variable parameters along rivers in analytical models; (3) models for the estimation of pollutant transport characteristic parameters (velocity, dispersion coefficient and nutrient transformation rates) as functions of water flow, channel characteristics and/or seasonality; (4) a graphical analysis method for the identification of pollution sources along rivers; (5) a case-based reasoning tool for the identification of crucial information related to pollutant transport modelling; and (6) the application of a software tool for the reuse of information during pollutant transport modelling research. These support tools are applicable in the water quality research field as well as in practice, as they can be involved in multiple activities. The models are capable of predicting pollutant propagation along rivers in cases of both ordinary pollution and accidents. They can also be applied to other, similar rivers when modelling pollutant transport with little available experimental concentration data, because the parameter estimation models developed in the present thesis enable the calculation of transport characteristic parameters as functions of river hydraulic parameters and/or seasonality.
The similarity between rivers is assessed using case-based reasoning tools, and additional necessary information can be identified using the software for information reuse. Such systems support users and open up possibilities for new modelling methods, monitoring facilities and better river water quality management tools. They are also useful for estimating the environmental impact of possible technological changes and can be applied in the pre-design stage and/or in the practical use of processes.
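For reference, the advection-dispersion equation underlying these models, in a standard one-dimensional form with a first-order transformation term (a textbook form; the thesis may use a different variant):

```latex
\frac{\partial C}{\partial t}
  = D\,\frac{\partial^{2} C}{\partial x^{2}}
  - v\,\frac{\partial C}{\partial x}
  - kC
```

Here C(x,t) is the pollutant concentration, v the flow velocity, D the dispersion coefficient, and k a first-order transformation rate — the same characteristic parameters (velocity, dispersion coefficient, transformation rates) that the estimation models in point (3) express as functions of flow, channel characteristics and seasonality.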
Abstract:
Rich and Suter diagrams are a very useful tool to explain the electron configurations of all transition elements, and in particular, the s¹ and s⁰ configurations of the elements Cr, Cu, Nb, Mo, Ru, Rh, Pd, Ag, and Pt. The application of these diagrams to the inner transition elements also explains the electron configurations of lanthanoids and actinoids, except for Ce, Pa, U, Np, and Cm, whose electron configurations are indeed very special because they are a mixture of several configurations.
Abstract:
Intermolecular forces are a useful concept that can explain the attraction between particulate matter as well as numerous phenomena in our lives such as viscosity, solubility, drug interactions, and the dyeing of fibers. However, studies show that students have difficulty understanding this important concept, which led us to develop free educational software in English and Portuguese. The software can be used interactively by teachers and students, thus facilitating better understanding. Professors and students, both graduate and undergraduate, were questioned about the software's quality, intuitiveness of use, ease of navigation, and pedagogical applicability, using a Likert scale. The results led to the conclusion that the developed computer application can be characterized as an auxiliary tool to assist teachers in their lectures and students in their learning process of intermolecular forces.
Abstract:
The coat protein gene of Apple stem grooving virus (ASGV) was amplified by RT-PCR, cloned, sequenced and subcloned in the expression vector pMal-c2. This plasmid was used to transform Escherichia coli BL21c+ competent cells. The ASGV coat protein (cp) was expressed as a fusion protein containing a fragment of the E. coli maltose binding protein (MBP). Bacterial cells were disrupted by sonication and the ASGVcp/MBP fusion protein was purified by amylose resin affinity chromatography. Polyclonal antibodies from rabbits immunized with the fusion protein gave specific reactions to ASGV from infected apple (Malus domestica) cv. Fuji Irradiada and Chenopodium quinoa at dilutions of up to 1:1,000 and 1:2,000, respectively, in plate-trapped ELISA. The ASGVcp/MBP fusion protein reacted to a commercial antiserum against ASGV in an immunoblotting assay. The IgG against ASGVcp/MBP performed favorably in specificity and sensitivity to the virus. This method represents an additional tool for the efficient ASGV-indexing of apple propagative and mother stock materials, and for use in support of biological and molecular techniques.
Abstract:
Radiostereometric analysis (RSA) is a highly accurate method for the measurement of in vivo micromotion of orthopaedic implants. Validation of the RSA method is a prerequisite for performing clinical RSA studies. Only a limited number of studies have utilised the RSA method in the evaluation of migration and inducible micromotion during fracture healing. Volar plate fixation of distal radial fractures has increased in popularity. There is still very little prospective randomised evidence supporting the use of these implants over other treatments. The aim of this study was to investigate the precision, accuracy, and feasibility of using RSA in the evaluation of healing in distal radius fractures treated with a volar fixed-angle plate. A physical phantom model was used to validate the RSA method for simple distal radius fractures. A computer simulation model was then used to validate the RSA method for more complex interfragmentary motion in intra-articular fractures. A separate pre-clinical investigation was performed in order to evaluate the possibility of using novel resorbable markers for RSA. Based on the validation studies, a prospective RSA cohort study of fifteen patients with plated AO type-C distal radius fractures with a 1-year follow-up was performed. RSA was shown to be highly accurate and precise in the measurement of fracture micromotion using both physical and computer simulated models of distal radius fractures. Resorbable RSA markers demonstrated potential for use in RSA. The RSA method was found to have a high clinical precision. The fractures underwent significant translational and rotational migration during the first two weeks after surgery, but not thereafter. Maximal grip caused significant translational and rotational interfragmentary micromotion. This inducible micromotion was detectable up to eighteen weeks, even after the achievement of radiographic union. 
The application of RSA in the measurement of fracture fragment migration and inducible interfragmentary micromotion in AO type-C distal radius fractures is feasible but technically demanding. RSA may be a unique tool in defining the progress of fracture union.
Abstract:
The development of correct programs is a core problem in computer science. Although formal verification methods for establishing correctness with mathematical rigor are available, programmers often find these difficult to put into practice. One hurdle is deriving the loop invariants and proving that the code maintains them. So-called correct-by-construction methods aim to alleviate this issue by integrating verification into the programming workflow. Invariant-based programming is a practical correct-by-construction method in which the programmer first establishes the invariant structure, and then incrementally extends the program in steps of adding code and proving after each addition that the code is consistent with the invariants. In this way, the program is kept internally consistent throughout its development, and the construction of the correctness arguments (proofs) becomes an integral part of the programming workflow. A characteristic of the approach is that programs are described as invariant diagrams, a graphical notation similar to the state charts familiar to programmers. Invariant-based programming is a new method that has not yet been evaluated in large-scale studies. The most important prerequisite for feasibility on a larger scale is a high degree of automation. The goal of the Socos project has been to build tools to assist the construction and verification of programs using the method. This thesis describes the implementation and evaluation of a prototype tool in the context of the Socos project. The tool supports the drawing of the diagrams, automatic derivation and discharging of verification conditions, and interactive proofs. It is used to develop programs that are correct by construction. The tool consists of a diagrammatic environment connected to a verification condition generator and an existing state-of-the-art theorem prover.
Its core is a semantics for translating diagrams into verification conditions, which are sent to the underlying theorem prover. We describe a concrete method for 1) deriving sufficient conditions for total correctness of an invariant diagram; 2) sending the conditions to the theorem prover for simplification; and 3) reporting the results of the simplification to the programmer in a way that is consistent with the invariant-based programming workflow and that allows errors in the program specification to be efficiently detected. The tool uses an efficient automatic proof strategy to prove as many conditions as possible automatically and lets the remaining conditions be proved interactively. The tool is based on the verification system PVS and uses the SMT (Satisfiability Modulo Theories) solver Yices as a catch-all decision procedure. Conditions that were not discharged automatically may be proved interactively using the PVS proof assistant. The programming workflow is very similar to the process by which a mathematical theory is developed inside a computer-supported theorem prover environment such as PVS. The programmer reduces a large verification problem with the aid of the tool into a set of smaller problems (lemmas), and can substantially improve the degree of proof automation by developing specialized background theories and proof strategies to support the specification and verification of a specific class of programs. We demonstrate this workflow by describing in detail the construction of a verified sorting algorithm. Tool-supported verification often has little to no presence in computer science (CS) curricula. Furthermore, program verification is frequently introduced as an advanced and purely theoretical topic that is not connected to the workflow taught in the early and practically oriented programming courses.
Our hypothesis is that verification could be introduced early in the CS education, and that verification tools could be used in the classroom to support the teaching of formal methods. A prototype of Socos has been used in a course at Åbo Akademi University targeted at first- and second-year undergraduate students. We evaluate the use of Socos in the course as part of a case study carried out in 2007.
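The invariant-first discipline described above can be mimicked at run time in any language. A small sketch (plain Python assertions, not Socos notation; the tool discharges these obligations statically rather than checking them at run time) in which the invariant is stated first and re-checked after every step:

```python
def sum_upto(n):
    """Sum of 0..n-1, with the loop invariant checked at run time.

    Invariant-based programming fixes the invariant first; here the
    invariant 'total == i*(i-1)//2 and 0 <= i <= n' is asserted before
    the loop and after every iteration, mirroring the proof obligations
    a verification tool would discharge statically.
    """
    total, i = 0, 0
    assert total == i * (i - 1) // 2 and 0 <= i <= n   # invariant holds initially
    while i < n:
        total += i
        i += 1
        assert total == i * (i - 1) // 2 and 0 <= i <= n   # preserved by the step
    # invariant + exit condition (i == n) together give the postcondition
    assert total == n * (n - 1) // 2
    return total
```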
Abstract:
Early identification of beginning readers at risk of developing reading and writing difficulties plays an important role in prevention and in the provision of appropriate intervention. In Tanzania, as in other countries, there are children in schools who are at risk of developing reading and writing difficulties. Many of these children complete school without being identified and without proper and relevant support. The main language in Tanzania is Kiswahili, a transparent language. Contextually relevant, reliable and valid instruments of identification are needed in Tanzanian schools. This study aimed at the construction and validation of a group-based screening instrument in the Kiswahili language for identifying beginning readers at risk of reading and writing difficulties. In studying the function of the test, there was special interest in analyzing the explanatory power of certain contextual factors related to the home and school. Halfway through grade one, 337 children from four purposively selected primary schools in Morogoro municipality were screened with a group test consisting of 7 subscales measuring phonological awareness, word and letter knowledge and spelling. A questionnaire about background factors and the home and school environments related to literacy was also used. The schools were chosen based on performance status (i.e. high, good, average and low performing schools) in order to include variation. For validation, 64 children were chosen from the original sample to take an individual test measuring nonsense word reading, word reading, actual text reading, one-minute reading and writing. School marks from grade one and a follow-up test halfway through grade two were also used for validation. The correlations between the results from the group test and the three measures used for validation were very high (.83-.95). Content validity of the group test was established by using items drawn from authorized text books for reading in grade one.
Construct validity was analyzed through item analysis and principal component analysis. The difficulty level of most items in both the group test and the follow-up test was good. The items also discriminated well. Principal component analysis revealed one powerful latent dimension (an initial literacy factor), accounting for 93% of the variance. This implies that it could be possible to use any set of the subtests of the group test for screening and prediction. A K-means cluster analysis revealed four clusters: at-risk children, strugglers, readers and good readers. The main concern in this study was with the groups of at-risk children (24%) and strugglers (22%), who need the most assistance. The predictive validity of the group test was analyzed by correlating the measures from the two school years and by cross-tabulating grade one and grade two clusters. All the correlations were positive and very high, and 94% of the at-risk children in grade two had already been identified by the group test in grade one. The explanatory power of some of the home and school factors was very strong. The number of books at home accounted for 38% of the variance in reading and writing ability measured by the group test. Parents' reading ability and the support children received at home for schoolwork were also influential factors. Among the studied school factors, school attendance had the strongest explanatory power, accounting for 21% of the variance in reading and writing ability. Having been in nursery school was also of importance. Based on the findings in the study, a short version of the group test was created. It is suggested for use in the screening processes in grade one aimed at identifying children at risk of reading and writing difficulties in the Tanzanian context. Suggestions for further research as well as for actions for improving the literacy skills of Tanzanian children are presented.
Abstract:
Objective: To develop and validate an instrument for measuring the acquisition of technical skills in conducting operations of increasing difficulty for use in General Surgery Residency (GSR) programs. Methods: We built a surgical skills assessment tool containing 11 operations in increasing levels of difficulty. For instrument validation we used the face validity method. Through an electronic survey tool (SurveyMonkey®) we sent a questionnaire to Full and Emeritus members of the Brazilian College of Surgeons (CBC), all holders of the CBC Specialist Title. Results: Of the 307 questionnaires sent, we received 100 responses. For the analysis of the data collected we used Cronbach's alpha test. We observed that, in general, the overall alpha was near or above 0.70, indicating good consistency in assessing the points of interest. Conclusion: The evaluation instrument built was validated and can be used as a method of assessing technical skill acquisition in General Surgery Residency programs in Brazil.
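Cronbach's alpha mentioned in the Results can be computed directly from item scores. A minimal sketch with hypothetical Likert responses (not the survey's data): alpha = k/(k-1) · (1 - Σ item variances / variance of total scores).

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of k item-score columns.

    items: list of k lists, each holding one item's scores across the
    same respondents.  alpha = k/(k-1) * (1 - sum(item variances) /
    variance of the respondents' total scores).
    """
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Hypothetical Likert responses (rows = items, columns = respondents):
scores = [
    [4, 5, 3, 4, 5],
    [4, 4, 3, 5, 5],
    [5, 4, 2, 4, 5],
]
alpha = cronbach_alpha(scores)  # ~0.86, above the 0.70 rule of thumb
```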
Abstract:
Bronchoalveolar lavage (BAL) is a procedure that retrieves cells and other elements from the lungs for evaluation, which helps in the diagnosis of pulmonary diseases. The aim of this study was to perform this procedure for cellular analysis of BAL fluid alterations during experimental infection with Aelurostrongylus abstrusus in cats. Fourteen cats were individually inoculated with 800 third-stage larvae of A. abstrusus, and five non-infected cats served as a control group. The BAL procedure was performed through the use of an endotracheal tube on the nineteen cats, with a mean age of 18 months, on days 0, 30, 60, 90, 120, 180 and 270 after infection. Absolute cell counts in the infected cats revealed that alveolar macrophages and eosinophils were the predominant cells following infection. This study shows that the technique allows the retrieval of cells and first-stage larvae, which provides information about the inflammatory process caused by aelurostrongylosis.
Abstract:
Several tools of precision agriculture have been developed for specific uses. However, this specificity may hinder the implementation of precision agriculture due to increased costs and operational complexity. Using vegetation index sensors, traditionally developed for crop fertilization, for site-specific weed management can provide multiple uses for these sensors and help optimize precision agriculture. The aim of this study was to evaluate the relationship between reflectance indices of weeds obtained by the GreenSeeker™ sensor and conventional parameters used for quantifying weed interference. Two experiments were conducted with soybean and corn by establishing a gradient of weed interference through the use of pre- and post-emergence herbicides. Weeds were quantified by the normalized difference vegetation index (NDVI) and the ratio of red to near-infrared (Red/NIR) obtained using the GreenSeeker™ sensor, by visual weed control ratings, by weed dry matter, and by digital photographs, which supplied information about the leaf area coverage proportions of weeds and straw. The weed leaf coverage obtained using digital photography was highly associated with the NDVI (r = 0.78) and the Red/NIR (r = -0.74). The weed dry matter also correlated positively with the NDVI obtained along 1 linear m (r = 0.66). The results indicated that the GreenSeeker™ sensor, originally used for crop fertilization, could also be used to obtain reflectance indices in the area between crop rows to support decision-making programs for weed control.
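The two reflectance indices in the study are simple band combinations. A minimal sketch with hypothetical reflectance values (a sensor such as the GreenSeeker™ reports NDVI directly, so this only illustrates the arithmetic behind the index):

```python
def ndvi(nir, red):
    """Normalized difference vegetation index from band reflectances."""
    return (nir - red) / (nir + red)

def red_nir_ratio(nir, red):
    """The Red/NIR ratio used alongside NDVI in the study."""
    return red / nir

# Hypothetical reflectances (fractions) for a weedy vs. a clean inter-row strip:
print(ndvi(0.45, 0.08))   # dense green cover -> high NDVI
print(ndvi(0.30, 0.25))   # bare soil / straw -> NDVI near zero
```

Green vegetation absorbs red and reflects near-infrared light strongly, so weedy inter-row strips push NDVI up and the Red/NIR ratio down, consistent with the opposite signs of the two correlations (r = 0.78 and r = -0.74) reported above.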
Abstract:
Oseltamivir phosphate is a potent viral inhibitor produced from shikimic acid extracted from seeds of Ilicium verum, its most important natural source. Glyphosate is the only compound capable of inhibiting the enzyme 5-enolpyruvylshikimate-3-phosphate synthase (EPSPS), its site of action, with the consequent accumulation of shikimic acid in plants. Corn and soybean plants were sprayed with reduced rates of glyphosate (0.0 to 230.4 g a.i. ha⁻¹) and the shikimic acid content in the dry mass was determined by HPLC 3, 7 and 10 days after application. Results showed shikimic acid accumulation in dry mass, with increases of up to 969% in corn and 33,000% in soybean, and peak concentrations 3 days after treatment (DAT). The industrial feasibility of shikimic acid production, combined with favorable climatic conditions for growing corn and soybean in virtually all of Brazil, favors the use of reduced rates of glyphosate to induce shikimic acid biosynthesis, with potential for use as an inducer in the exploration of alternative sources for the production of oseltamivir phosphate with low environmental impact.
Abstract:
The history of receptor autoradiography, its development and applications, testifies to the utility of this histochemical technique for localizing radiolabeled hormones and drugs at cellular and subcellular sites of action in intact tissues. Localization of diffusible compounds has been a challenge that was met through the introduction of the "thaw-mount" and "dry-mount" autoradiographic techniques thirty years ago. With this cellular receptor autoradiography, used alone or combined with other histochemical techniques, sites of specific binding and deposition in vivo and in vitro have been characterized. Numerous discoveries, some reviewed in this article, provided information that led to new concepts and opened new areas of research. As an example, in recent years more than fifty target tissues for vitamin D have been identified, challenging the conventional view about the main biological role of vitamin D. The functions of most of these vitamin D target tissues are unrelated to the regulation of systemic calcium homeostasis, but pertain to the (seasonal) regulation of endo- and exocrine secretion, cell proliferation, reproduction, neural, immune and cardiovascular responses, and adaptation to stress. Receptor autoradiography with cellular resolution has become an indispensable tool in drug research and development, since information can be obtained that is difficult or impossible to gain otherwise.