Abstract:
Concept maps are a technique used to obtain a visual representation of a person's ideas about a concept or a set of related concepts. Specifically, in this paper, through a qualitative methodology, we analyze the concept maps produced by 52 groups of teacher training students in order to characterize the maps and assess the adequacy of their content with regard to the teaching of human nutrition in the 3rd cycle of primary education. The participants were enrolled in the Teacher Training Degree majoring in Primary Education, and data collection was carried out through a training activity on the theme of what to teach about science in primary school. The results show that the maps are a useful tool in teacher education, as they allow students to organize, synthesize, and communicate what they know. Moreover, through this work it has been possible to see that future teachers have acceptable skills for representing concepts and ideas in a concept map, although the adequacy of their concepts/ideas about human nutrition and its relations is usually medium or low. These results are a wake-up call for teacher training, both initial and ongoing, because they show an inability to change priorities as far as the selection of content is concerned.
Abstract:
This article aims to investigate the possibilities of building puppets in an art therapy workshop. To this end, it traces the primitive use of the puppet as a magical twin and then surveys the authors who have used puppets as a therapeutic tool since the first half of the twentieth century. It proposes its own theoretical framework, which considers the significance of the body in the construction and handling of the puppet, and views the puppet from a transitional perspective: halfway between external reality and psychic reality, an object built where fantasy and reality meet.
Abstract:
Legionella pneumophila, the causative agent of a severe pneumonia named Legionnaires' disease, is an important human pathogen that infects and replicates within alveolar macrophages. Its virulence depends on the Dot/Icm type IV secretion system (T4SS), which is essential to establish a replication-permissive vacuole known as the Legionella-containing vacuole (LCV). L. pneumophila infection can be modeled in mice; however, most mouse strains are not permissive, which has led to the search for novel infection models. We have recently shown that larvae of the wax moth Galleria mellonella are suitable for the investigation of L. pneumophila infection. G. mellonella is increasingly used as an infection model for human pathogens, and a good correlation exists between the virulence of several bacterial species in the insect and in mammalian models. A key component of the larvae's immune defenses is the hemocytes, professional phagocytes that take up and destroy invaders. L. pneumophila is able to infect these cells, form an LCV, and replicate within them. Here we demonstrate protocols for analyzing L. pneumophila virulence in the G. mellonella model, including how to grow infectious L. pneumophila, pretreat the larvae with inhibitors, infect the larvae, and extract infected cells for quantification and immunofluorescence microscopy. We also describe how to quantify bacterial replication and fitness in competition assays. These approaches allow rapid screening of mutants to determine factors important in L. pneumophila virulence, providing a new tool to aid our understanding of this complex pathogen.
Abstract:
The creation of Causal Loop Diagrams (CLDs) is a major phase in the System Dynamics (SD) life-cycle, since the created CLDs express dependencies and feedback in the system under study, as well as guide modellers in building meaningful simulation models. The creation of CLDs is still subject to the modeller's domain expertise (mental model) and her ability to abstract the system, because of the strong dependency on semantic knowledge. Since the beginning of SD, available system data sources (written and numerical models) have been sparse, limited, and imperfect, and thus of little benefit to the modelling process. However, in recent years we have seen an explosion in generated data, especially in the business-related domains that are analysed via Business Dynamics (BD). In this paper, we introduce a systematic, tool-supported CLD creation approach that analyses and utilises available disparate data sources within the business domain. We demonstrate the application of our methodology on a given business use-case and evaluate the resulting CLD. Finally, we propose directions for future research to further push the automation of CLD creation and increase confidence in the generated CLDs.
Abstract:
Qualitative Comparative Analysis (QCA) is a method for the systematic analysis of cases. A holistic view of cases and an approach to causality that emphasizes complexity are among its core features. Over the last decades, QCA has found application in many fields of the social sciences. In spite of this, its uptake in feminist research has been slower, and only recently has QCA been applied to topics related to social care, the political representation of women, and reproductive politics. Despite the comparative turn in feminist studies, researchers still privilege qualitative methods, in particular case studies, and are often skeptical of quantitative techniques (Spierings 2012). These studies show that the meaning and measurement of many gender concepts differ across countries and that the factors leading to feminist success and failure are context-specific. However, case-study analyses struggle to account systematically for the ways in which these forces operate in different locations.
Abstract:
The business model of an organization is an important strategic tool for its success, and should therefore be understood by business professionals and information technology professionals alike. In this context, and considering the importance of information technology in contemporary business models, this article aims to examine the use of business model components in the information technology (IT) project management process in enterprises. To achieve this goal, this exploratory research investigated the use of the business model concept in IT project management through a survey of 327 professionals conducted from February to April 2012. It was observed that the business model concept, together with its practices and building blocks, is not yet explored to its full potential, possibly because it is relatively new. One benefit of this conceptual tool is that it gives different areas an understanding of the core business, enabling IT professionals and the business area to build deeper knowledge of the enterprise's essential activities.
Abstract:
Land use change is a major source of greenhouse gas emissions. The conversion of ecosystems with permanent natural vegetation into cropland with temporarily bare soil (e.g. after tillage before sowing) frequently leads to increased greenhouse gas emissions and reduced carbon sequestration. Worldwide, cropping is expanding in both smallholder and agro-industrial systems, often into neighbouring semi-arid to sub-humid rangeland ecosystems. This thesis examines land use change trends in the Borana rangelands of southern Ethiopia. Population growth, land privatization and the associated fencing, changing land use policies, and increasing climate variability are driving rapid changes in the traditionally livestock-based pastoral systems. Based on a literature review of case studies in East African rangelands, a schematic model of the interrelations between land use, greenhouse gas emissions, and carbon sequestration was developed. Using satellite data and household survey data, the nature and extent of land use and vegetation change at five study sites (Darito/Yabelo district; Soda, Samaro, Haralo, and Did Mega/all Dire district) were analysed for the period 1985 to 2011. In Darito, cropland expanded by 12%, mainly at the expense of bushland. At the other sites cropland remained relatively constant, but grassland vegetation increased by between 16 and 28%, while bushland decreased by between 23 and 31%. Only at the Haralo site did bare, unvegetated land also increase, by 13%. The factors driving cropland expansion were examined in more detail at the Darito site.
GPS data and cropping-history data from 108 fields on 54 farms were overlaid in a geographic information system (GIS) with thematic soil, rainfall, and slope maps as well as a digital elevation model. Multiple linear regression identified slope and elevation as significant explanatory variables for the expansion of cropping into lower-lying areas, whereas soil type, distance to the seasonal river course, and rainfall were not significant. The low coefficient of determination (R² = 0.154) indicates that there are further explanatory variables, not captured here, for the direction of the spatial expansion of cropland. Scatter plots of field size and years of cultivation against elevation show an expansion of cropping into areas below 1620 m a.s.l. since the year 2000 and an increase in field size (>3 ha). Analysis of crop phenology over the year, combined with rainfall data and normalized difference vegetation index (NDVI) time series, served to identify periods of particularly high (green-up before harvest) or low (after tillage) plant biomass on cropland, in order to distinguish cropland and its expansion from other vegetation types by remote sensing. Based on the NDVI spectral profiles, cropland could be distinguished well from forest, but less well from grassland and bushland. The coarse resolution (250 m) of the Moderate Resolution Imaging Spectroradiometer (MODIS) NDVI data led to a mixed-pixel effect, i.e. the area of one pixel frequently contained different vegetation types in varying proportions, which impaired their discrimination. Developing a real-time monitoring system for cropland expansion would require higher-resolution NDVI data (e.g. the multispectral bands of the Hyperion EO-1 sensor) to obtain better small-scale differentiation between cropland and natural rangeland vegetation. The development and use of such methods as decision support for land and resource use planning could help reconcile the production and development goals of Borana land users with national climate change mitigation efforts based on increasing carbon sequestration in rangelands.
Abstract:
MAIDL, André Murbach; CARVILHE, Claudio; MUSICANTE, Martin A. Maude Object-Oriented Action Tool. Electronic Notes in Theoretical Computer Science, [S.l.: s.n.], 2008.
Abstract:
The Highway Safety Manual (HSM) is a compilation of national safety research that provides quantitative methods for analyzing highway safety. The HSM presents crash modification functions related to freeway work zone characteristics such as work zone duration and length. These crash modification functions were based on freeway work zones with high traffic volumes in California. When the HSM-referenced model was calibrated for Missouri, the calibration factor was 3.78, which is not ideal since it is significantly larger than 1. Therefore, new models were developed in this study using Missouri data to capture geographical, driver behavior, and other factors in the Midwest. New models were also developed for expressway and rural two-lane work zones, which had barely been studied in the literature. A large sample of 20,837 freeway, 8,993 expressway, and 64,476 rural two-lane work zones in Missouri was analyzed to derive 15 work zone crash prediction models. The most appropriate samples of 1,546 freeway, 1,189 expressway, and 6,095 rural two-lane work zones, each longer than 0.1 mile and with a duration greater than 10 days, were used to build eight, four, and three models, respectively. A challenging question for practitioners is always how to use crash prediction models to make the best estimate of the work zone crash count. To address this problem, a user-friendly spreadsheet tool was developed to predict work zone crashes based on work zone characteristics. This software selects the best model, estimates work zone crashes by severity, and converts them to monetary values using standard crash cost estimates. This study also included a survey of departments of transportation (DOTs), Federal Highway Administration (FHWA) representatives, and contractors to assess the current state of practice regarding work zone safety. The survey results indicate that many agencies assess work zone safety informally, using engineering judgment.
Respondents indicated that they would like a tool that could help them to balance work zone safety across projects by looking at crashes and user costs.
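The severity-to-cost conversion step described above can be sketched in a few lines. This is a hypothetical illustration only: the unit costs and predicted counts below are placeholders, not the standard crash cost figures or model outputs used in the study.

```python
# Hypothetical sketch: convert predicted work zone crashes by severity
# into a monetary estimate. The unit costs below are illustrative
# placeholders, not the standard crash cost figures used in the study.

ILLUSTRATIVE_UNIT_COSTS = {          # dollars per crash, by severity
    "fatal": 1_500_000,
    "injury": 80_000,
    "property_damage_only": 10_000,
}

def monetize_crashes(predicted: dict) -> float:
    """Sum predicted crash counts weighted by their unit cost."""
    return sum(ILLUSTRATIVE_UNIT_COSTS[sev] * count
               for sev, count in predicted.items())

# Example: expected crash counts for one work zone (hypothetical values).
cost = monetize_crashes({"fatal": 0.02, "injury": 1.3,
                         "property_damage_only": 4.5})
```

A spreadsheet implementation would do the same weighted sum per severity column after the best-fitting prediction model has been selected.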
Abstract:
Humanitarian action evaluation (HAE) is a valued tool for supporting the accountability, transparency, and efficiency of humanitarian programmes that help reduce inequities and promote global health. HAE is essential for programme stakeholders, donors, decision-makers, and practitioners who wish to integrate evidence into practice and decision-making. However, evaluation use remains uncertain: HAE is frequently conducted but left unused. Moreover, the conditions influencing evaluation use vary across contexts, and their presence and applicability within humanitarian non-governmental organizations (NGOs) remain poorly documented. Evaluators, stakeholders, and decision-makers in humanitarian settings who wish to ensure sustained evaluation use have few points of reference, since studies examining evaluation use and its conditions over the long term are rare. This thesis seeks to clarify these issues by documenting, over a two-year period, evaluation use and the conditions that determine it within an evaluation strategy embedded in a humanitarian NGO's health care user-fee exemption programme. The goal of this programme is to facilitate access to health care for mothers, children under five, and indigent people in health districts in Niger and Burkina Faso, regions of the Sahel where food and economic crises have produced high rates of malnutrition, morbidity, and mortality. A first evaluation of the exemption programme in Niger led to the development of the evaluation strategy embedded in the same programme in Burkina Faso. The thesis comprises three articles. The first presents an evaluability assessment, a preliminary step that established the feasibility of the thesis. Its results show a coherent and plausible logic for the evaluation strategy, the accessibility of data, and the usefulness of studying evaluation use by the NGO.
The second article documents stakeholders' use of the evaluation strategy and how it served the exemption programme. Use of the findings was instrumental, conceptual, and persuasive, whereas use of the process was only instrumental and conceptual. The third article documents the conditions that, according to stakeholders, progressively influenced evaluation use. Users' attitudes, interpersonal relationships and communication, and the evaluators' ability to produce and share knowledge adapted to users' needs were the key conditions linked to evaluation use. The thesis advances knowledge on evaluation use in humanitarian settings and offers recommendations to the NGO's stakeholders.
Abstract:
This work explores the development of MemTri, a memory forensics triage tool that can assess the likelihood of criminal activity in a memory image, based on evidence data artefacts generated by several applications. Fictitious illegal-suspect-activity scenarios were performed on virtual machines to generate 60 test memory images as input for MemTri. Four categories of applications (internet browsers, instant messengers, FTP clients, and document processors) are examined for data artefacts, which are located through the use of regular expressions. The identified data artefacts are then analysed using a Bayesian network to assess the likelihood that a seized memory image contains evidence of illegal activity. MemTri is currently under development, and this paper introduces only the basic concept and the components on which the application is built. A complete description of MemTri, together with extensive experimental results, is expected to be published in the first semester of 2017.
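MemTri's two-stage pipeline, regular-expression artefact extraction followed by probabilistic scoring, can be illustrated with a minimal sketch. Everything here is hypothetical: the patterns, the likelihood values, and the naive log-likelihood-ratio score merely stand in for MemTri's actual artefact definitions and Bayesian network, which the paper does not specify.

```python
import math
import re

# Hypothetical sketch of a MemTri-style pipeline: (1) locate data
# artefacts in a memory image with regular expressions, then
# (2) combine the hits into an evidence score. Patterns and
# probabilities are illustrative placeholders only.

ARTEFACT_PATTERNS = {
    "ftp_login": re.compile(rb"USER \w+\r\nPASS \S+"),
    "browser_url": re.compile(rb"https?://[\w./-]+"),
}

# (P(artefact | illegal activity), P(artefact | benign)); made-up values.
LIKELIHOODS = {"ftp_login": (0.7, 0.1), "browser_url": (0.9, 0.8)}

def scan(image: bytes) -> dict:
    """Return which artefact patterns appear in the memory image."""
    return {name: bool(p.search(image)) for name, p in ARTEFACT_PATTERNS.items()}

def log_likelihood_ratio(hits: dict) -> float:
    """Naive-Bayes-style evidence score; positive favours 'illegal'."""
    llr = 0.0
    for name, present in hits.items():
        p_ill, p_ben = LIKELIHOODS[name]
        if present:
            llr += math.log(p_ill / p_ben)
        else:
            llr += math.log((1 - p_ill) / (1 - p_ben))
    return llr

score = log_likelihood_ratio(scan(b"USER alice\r\nPASS hunter2 https://example.com"))
```

A full Bayesian network would additionally model dependencies between artefacts rather than treating them as independent, as this naive score does.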
Abstract:
The early bird and night owl restaurant tool found in the accompanying Excel file provides an estimate of the effects of offering off-peak special menu prices. Unlike the classic back-of-envelope calculation, the tool includes the effect of anticipated cannibalization of full-price covers and seeks to optimize table use. The tool also considers the revenue from new customers attracted by the early bird or night owl promotions, as well as the level of increased business needed to achieve the net monetary value target for the promotion.
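The core trade-off the tool models, margin gained from new discounted covers versus margin lost to full-price covers that trade down ("cannibalization"), can be written as a short back-of-envelope formula. The figures below are hypothetical; the actual tool also optimizes table use and targets a net monetary value.

```python
# Illustrative back-of-envelope model of off-peak promotion economics:
# net value = margin from new discounted covers
#           - margin lost to cannibalized full-price covers.
# All figures are hypothetical, not taken from the accompanying Excel file.

def promo_net_value(new_covers: float, cannibalized_covers: float,
                    discount_margin: float, full_margin: float) -> float:
    gained = new_covers * discount_margin
    lost = cannibalized_covers * (full_margin - discount_margin)
    return gained - lost

# e.g. 30 new early-bird covers at an $8 margin, while 10 regulars
# trade down from a $14 full-price margin:
net = promo_net_value(30, 10, 8.0, 14.0)
```

The classic back-of-envelope calculation keeps only the first term; including the second is what makes the estimate sensitive to cannibalization.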
Abstract:
In the past, many papers have shown that coating cutting tools often yields decreased wear rates and reduced coefficients of friction. Although different theories have been proposed, covering areas such as hardness theory, diffusion barrier theory, thermal barrier theory, and reduced friction theory, most have not dealt with the question of how and why coating tool substrates with hard materials such as titanium nitride (TiN), titanium carbide (TiC), and aluminium oxide (Al2O3) transforms the performance and life of cutting tools. This project discusses the complex interrelationship between the thermal barrier function and the relatively low sliding friction coefficient of TiN on an undulating tool surface, and presents the results of an investigation into the cutting characteristics and performance of EDMed surface-modified carbide cutting tool inserts. The tool inserts were coated with TiN by physical vapour deposition (PVD). PVD coating is also known as ion plating, the general term for coating methods in which the film is created by attracting ionized metal vapour (here titanium) and ionized gas onto a negatively biased substrate surface. PVD was chosen because it is carried out at a temperature of no more than 500°C, whereas chemical vapour deposition (CVD) takes place at a much higher temperature of about 850°C and in two stages of heating the substrates; the high temperatures involved in CVD affect the strength of the tool substrates. In this study, comparative cutting tests using TiN-coated control specimens with no EDM surface structures and TiN-coated EDMed tools with a crater-like surface topography were carried out on mild steel grade EN-3. Various cutting speeds were investigated, up to 40% above the tool manufacturer's recommended speed. Fifteen minutes of cutting were carried out for each insert at each speed investigated.
Conventional tool inserts normally have a tool life of approximately 15 minutes of cutting. After every five cuts (passes), microscopic pictures of the tool wear profiles were taken in order to monitor progressive wear on the rake face and on the flank of the insert. The power load was monitored for each cut using an on-board meter on the CNC machine, to establish the power needed for each stage of operation; the spindle drive of the machine is an 11 kW motor. The results confirmed the advantages of cutting with EDMed coated inserts at all speeds investigated, in terms of reduced tool wear and lower power loads. Moreover, the surface finish on the workpiece was consistently better for the EDMed inserts. The thesis also discusses the relevance of the finite element method to the analysis of metal cutting processes, so that machinists can design, manufacture, and deliver tools to market quickly, without resorting to trial and error for new products. Improvements in manufacturing technologies require better models of metal cutting processes, and computational models are of great value in reducing or even eliminating the number of experiments traditionally used for tool design, process selection, machinability evaluation, and chip breakage investigations. In this work, theoretical and experimental investigations of metal machining were given special attention, and finite element analysis (FEA) was used to predict tool wear and coating deformations during machining. Particular attention was devoted to the complicated mechanisms usually associated with metal cutting, such as interfacial friction, heat generated by friction, severe strain in the cutting region, and high strain rates.
It is therefore concluded that a roughened contact surface comprising peaks and valleys coated with hard materials (TiN) provides wear-resisting properties, as the coating becomes entrapped in the valleys and helps reduce friction at the chip-tool interface. The contributions to knowledge are: a. A wear-resisting surface structure for contact surfaces in metal cutting and forming tools, able to give a wear-resisting surface profile. b. A technique for designing tools with a roughened surface of peaks and valleys covered in a conformal coating of a material such as TiN or TiC; the valleys entrap residual coating material during wear, and the entrapped material gives improved wear resistance. c. Knowledge of increased tool life through wear resistance, hardness, and chemical stability at high temperatures, owing to reduced friction at the tool-chip and work-tool interfaces due to the coating, which leads to reduced heat generation in the cutting zones. d. The finding that undulating surface topographies on cutting tips tend to hold coating materials longer in the valleys, giving enhanced protection, so that the tool can cut 40% faster and last 60% longer than conventional tools on the market today.
Abstract:
This report describes a tool for global optimization that implements the Differential Evolution optimization algorithm as a new Excel add-in. The tool takes a step beyond Excel's Solver add-in: Solver often returns a local minimum, that is, a point no worse than its nearby points, while Differential Evolution searches for the global minimum over all feasible points. Despite the complex underlying mathematics, the tool is relatively easy to use and can be applied to practical optimization problems, such as setting pricing and awards in a hotel loyalty program. The report demonstrates how to develop an optimal approach to that problem.
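A minimal sketch of the widely used DE/rand/1/bin variant of Differential Evolution, the kind of algorithm such an add-in implements, may clarify how it searches globally: each candidate is perturbed by a scaled difference of two other population members and replaced only if the trial point is at least as good. The parameter values below are common defaults, not necessarily those used by the add-in.

```python
import random

# Sketch of Differential Evolution (DE/rand/1/bin). Minimises f over a
# box given by `bounds`; parameters F (differential weight) and CR
# (crossover rate) use typical default values.

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=200, seed=1):
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    scores = [f(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Pick three distinct population members other than i.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)       # at least one mutated coordinate
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == j_rand:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                    lo, hi = bounds[j]
                    v = min(max(v, lo), hi)   # clip to the feasible box
                else:
                    v = pop[i][j]
                trial.append(v)
            s = f(trial)
            if s <= scores[i]:                # greedy selection
                pop[i], scores[i] = trial, s
    best = min(range(pop_size), key=lambda k: scores[k])
    return pop[best], scores[best]

# Minimise a simple quadratic with its minimum at (3, -2):
x, fx = differential_evolution(lambda v: (v[0] - 3) ** 2 + (v[1] + 2) ** 2,
                               [(-10, 10), (-10, 10)])
```

Because selection is population-based rather than gradient-based, the search does not get trapped at the nearest local minimum the way a hill-climbing solver can.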
Abstract:
The highly dynamic nature of some sandy shores, with continuous morphological changes, requires the development of efficient and accurate methodological strategies for coastal hazard assessment and morphodynamic characterisation. During the past decades, the general methodological approach for establishing coastal monitoring programmes was based on photogrammetry or classical geodetic techniques. With the advent of new space-based and airborne geodetic techniques, new methodologies were introduced into coastal monitoring programmes. This paper describes the development of a monitoring prototype based on the global positioning system (GPS). The prototype has a GPS multi-antenna mounted on a fast surveying platform, a land vehicle appropriate for driving on sand (a four-wheel quad). This system was conceived to survey a network of shore profiles along stretches of sandy shore (subaerial beach) extending for several kilometres, from which high-precision digital elevation models can be generated. An analysis of the accuracy and precision of several differential GPS kinematic methodologies is presented. The development of an adequate survey methodology is the first step in morphodynamic shore characterisation and in coastal hazard assessment. The sampling method and the computational interpolation procedures are important steps in producing reliable three-dimensional surface maps that are as realistic as possible. The quality of several interpolation methods used to generate grids was tested in areas with data gaps. The results obtained allow us to conclude that, with the developed survey methodology, it is possible to survey stretches of sandy shore on spatial scales of kilometres with a vertical accuracy better than 0.10 m in the final digital elevation models.
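One common gridding method of the kind such studies compare for filling data gaps, inverse distance weighting (IDW), can be sketched as follows. This is a generic textbook implementation under assumed names, not the specific interpolation procedure evaluated in the paper.

```python
# Generic inverse distance weighting (IDW): the elevation at an unsampled
# grid node is a weighted mean of nearby samples, with weights that decay
# with distance. Illustrative sketch only; the study's actual gridding
# procedures are not described here.

def idw(points, x, y, power=2.0):
    """points: iterable of (px, py, z). Return interpolated z at (x, y)."""
    num = den = 0.0
    for px, py, z in points:
        d2 = (px - x) ** 2 + (py - y) ** 2
        if d2 == 0.0:
            return z                      # exact hit on a sample point
        w = 1.0 / d2 ** (power / 2.0)     # weight = 1 / distance**power
        num += w * z
        den += w
    return num / den

# Three hypothetical GPS shore-profile samples (x, y, elevation in m):
samples = [(0, 0, 1.0), (1, 0, 2.0), (0, 1, 3.0)]
z = idw(samples, 0.5, 0.5)
```

IDW is exact at the sample points, which matters when the vertical accuracy target for the final digital elevation model is on the order of 0.10 m.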