904 results for New cutting tool
Abstract:
The machining of hardened steel is becoming increasingly important in manufacturing processes. Parts machined from hardened steel are often subjected to high service demands, which require high strength and surface quality. Machining this material subjects the tools to high mechanical and thermal loads, which accelerates tool wear and affects the surface integrity of the part. In this context, this work presents a study of drilling AISI P20 steel with carbide tools, analyzing the effects of reducing the cutting fluid supply on the process and its relation to tool wear and the surface integrity of the workpiece. The major problem observed in the tests was the difficulty of chip flow through the drill flutes, which compromised chip expulsion from the hole. After careful analysis, a different machining strategy was adopted to solve the problem.
Abstract:
Current practices in agricultural management involve the application of rules and techniques to ensure high-quality and environmentally friendly production. Based on their experience, agricultural technicians and farmers make critical decisions affecting crop growth while considering several interwoven agricultural, technological, environmental, legal and economic factors. In this context, decision support systems, and the knowledge models that underpin them, enable valuable experience to be incorporated into software systems that help agricultural technicians make rapid and effective decisions for efficient crop growth. Pest control is an important issue in agricultural management, because pests reduce crop yields and controlling them requires expert knowledge. This paper presents a formalisation of the pest control problem and of the workflow followed by agricultural technicians and farmers in integrated pest management, the crop production strategy that combines different practices for growing healthy crops whilst minimising pesticide use. A generic decision schema for estimating the infestation risk of a given pest on a given crop is defined; it acts as a metamodel for maintaining and extending the knowledge embedded in a pest management decision support system, which is also presented. This software tool has been implemented by integrating a rule-based tool into a web-based architecture. Evaluation from validity and usability perspectives concluded that both agricultural technicians and farmers considered it a useful tool in pest control, particularly for training new technicians and inexperienced farmers.
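The generic decision schema described above lends itself to a rule-based implementation. Below is a minimal sketch of what such an infestation-risk rule might look like; the factor names, thresholds and scoring are hypothetical illustrations, not the paper's actual knowledge base.

```python
# Minimal rule-based infestation-risk sketch in the spirit of the decision
# schema above. All factors, thresholds and rules are hypothetical.
from dataclasses import dataclass

@dataclass
class FieldObservation:
    crop: str
    pest: str
    temperature_c: float   # mean daily temperature
    humidity_pct: float    # relative humidity
    trap_count: int        # pests caught per trap per week

def infestation_risk(obs: FieldObservation) -> str:
    """Combine simple expert rules into a LOW/MEDIUM/HIGH risk estimate."""
    score = 0
    if obs.trap_count > 20:            # strong direct evidence of pest pressure
        score += 2
    elif obs.trap_count > 5:
        score += 1
    if 18 <= obs.temperature_c <= 30:  # favourable development window
        score += 1
    if obs.humidity_pct > 70:          # humid conditions favour many pests
        score += 1
    return "HIGH" if score >= 3 else "MEDIUM" if score == 2 else "LOW"

print(infestation_risk(FieldObservation("tomato", "whitefly", 24.0, 80.0, 12)))
# -> HIGH (moderate trap counts plus favourable weather)
```

In a production system such rules would live in the rule engine's knowledge base, so that technicians can maintain and extend them without touching application code, which is precisely the role the paper assigns to the decision schema.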
Abstract:
Describing all the species present in nature is a vast task if approached solely through the classical morphological description of organisms. In recent years, traditional taxonomy, based primarily on species identification keys, has shown a number of limitations in the use of distinctive features in many animal taxa, as well as inconsistencies with genetic data. Furthermore, the increasing need for a true estimate of biodiversity has led zoological taxonomy to seek new approaches and methodologies to support traditional methods. The classification procedure has added modern criteria such as the evolutionary relationships and the genetic, biochemical and morphological characteristics of organisms. Until now, the Linnean binomial was the only abbreviated code associated with the description of the morphology of a species. The new technologies aim to obtain a short nucleotide sequence of DNA to be used as a unique label for a particular species: a specific genetic barcode. Both the morphological and the genetic approaches require skills and experience. Taxonomy is one of the zoological disciplines that has benefited from the achievements of modern molecular biotechnology. Using a molecular approach it is possible to identify cryptic species, to establish relationships between species and their membership of taxonomic categories, or to reconstruct the evolutionary history of a taxon.
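In practice, barcode-based identification reduces to comparing a query sequence against a reference library and assigning the closest species. The sketch below illustrates the idea with an uncorrected p-distance; the sequences and the 2% threshold are illustrative placeholders, not values from the text.

```python
# Toy barcode identification: assign a query to the reference species with
# the smallest uncorrected p-distance. Sequences and threshold are invented.
def p_distance(a: str, b: str) -> float:
    """Proportion of differing sites between two aligned, equal-length sequences."""
    assert len(a) == len(b), "sequences must be aligned to equal length"
    return sum(x != y for x, y in zip(a, b)) / len(a)

def identify(query: str, references: dict, threshold: float = 0.02) -> str:
    species, dist = min(((sp, p_distance(query, seq)) for sp, seq in references.items()),
                        key=lambda t: t[1])
    return species if dist <= threshold else "no match below threshold"

refs = {"Species A": "ACGTACGTAC", "Species B": "ACGTTCGTTC"}
print(identify("ACGTACGTAC", refs))  # exact match -> Species A
```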
Abstract:
Knowledge Exchange examined different routes to achieving the vision of 'having a layer of scholarly and scientific content openly available on the internet'. One of these routes involves exploring new developments in the future of publishing, and work is under way investigating alternative business models which could contribute to the transition to open access. In this light, KE commissioned a study investigating whether submission fees could play a role in a business model for Open Access journals. The general conclusion of the report, titled 'Submission fees: a tool in the transition to open access?' and written by Mark Ware, is that in certain cases there are benefits to publishers in switching to a model in which an author pays a fee when submitting an article. Journals with a high rejection rate in particular might be interested in combining submission fees with article processing charges in order to ease the transition to open access. In certain disciplines, notably economics and finance journals and some areas of the experimental life sciences, submission fees are already common. Overall there seems to be interest in the model, but the risks, particularly those involved in any transition, are seen by publishers to outweigh the perceived benefits. A further problem is that the advantages offered by submission fees are often general benefits that might improve the system but do not give publishers and authors direct incentives to switch to open access. To support the transition, funders, institutions and publication funds could make it clear that submission fees would be an allowable cost; at present their policies are often unclear on this point. Author acceptance of submission fees is critical to their success. It is an observable fact that authors will accept them in some circumstances, although author acceptance would require further study. Based on the interviews and the modelling in the study, one model in particular is regarded as the most suitable way to meet the current requirement of strengthening open access to research publications: authors pay a submission fee plus an article processing charge, and the article is subsequently made available in open access. Both fees are set at levels that balance acceptability to the author community against securing a meaningful mix of revenues for the publisher.
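The appeal of submission fees to high-rejection journals comes down to simple arithmetic: every published article is backed by many submissions, so a modest per-submission fee adds substantial revenue per published article and allows a lower article processing charge. The figures below are hypothetical, purely to make the mechanism concrete.

```python
# Hypothetical arithmetic behind the submission-fee-plus-APC model:
# with a low acceptance rate, per-submission fees spread the cost of peer
# review across all authors, not just the successful ones.
submission_fee = 200.0   # paid by every submitting author (hypothetical)
apc = 1000.0             # article processing charge on acceptance (hypothetical)
acceptance_rate = 0.20   # i.e. an 80% rejection rate

# Each published article is backed by 1 / acceptance_rate submissions.
revenue_per_published_article = submission_fee / acceptance_rate + apc
print(revenue_per_published_article)  # -> 2000.0
```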
Abstract:
Problem: This dissertation presents a literature-based framework for communication in science (with the elements partners, purposes, message, and channel), which it then applies in, and amends through, an empirical study of how geoscientists use two social computing technologies (SCTs): blogging and Twitter (both general use and tweeting from conferences). How are these technologies used, and what value do scientists derive from them? Method: The empirical part used a two-pronged qualitative study, drawing on (1) purposive samples of ~400 blog posts and ~1000 tweets and (2) a purposive sample of 8 geoscientist interviews. Blog posts, tweets, and interviews were coded using the framework, adding new codes as needed. The results were aggregated into 8 geoscientist case studies, and general patterns were derived through cross-case analysis. Results: A detailed picture of how geoscientists use blogs and Twitter emerged, including a number of new functions not served by traditional channels. Some highlights: geoscientists use SCTs for communication among themselves as well as with the public; blogs serve persuasion and personal knowledge management; Twitter often amplifies the signal of traditional communications such as journal articles. Blogs include tutorials for peers, reviews of basic science concepts, and book reviews. Twitter includes links to readings, requests for assistance, and discussions of politics and religion, and Twitter at conferences provides live coverage of sessions. Conclusions: Both blogs and Twitter are routine parts of scientists' communication toolbox: blogs for in-depth, well-prepared essays, Twitter for faster and broader interactions. Both have important roles in supporting community building, mentoring, and learning and teaching. The framework of communication in science was a useful tool for studying these two SCTs in this domain. The results should encourage science administrators to facilitate SCT use by scientists in their organization, and information providers to make SCT documents searchable as an important source of information.
Abstract:
Here we characterize a new animal model that spontaneously develops chronic inflammation and fibrosis in multiple organs: the non-obese diabetic inflammation and fibrosis (N-IF) mouse. In the liver, the N-IF mouse displays inflammation and fibrosis that are particularly evident around portal tracts and central veins and are accompanied by evidence of abnormal intrahepatic bile ducts. The extensive cellular infiltration consists mainly of macrophages, granulocytes (particularly eosinophils), and mast cells. This inflammatory syndrome is mediated by a transgenic population of natural killer T (NKT) cells induced in an immunodeficient NOD genetic background. The disease is transferable to immunodeficient recipients, while polyclonal T cells from unaffected syngeneic donors can inhibit the disease phenotype. Because of its fibrotic component, early onset, spontaneous nature and reproducibility, this novel mouse model provides a unique tool for gaining further insight into the mechanisms that transform chronic inflammation into fibrosis and for evaluating intervention protocols for treating fibrotic disorders.
Microincision Vitrectomy Trocars – Redefining Surgical Practices Through a New Range of Applications
Abstract:
Transconjunctival microincision vitrectomy surgery (MIVS) has grown increasingly popular among vitreoretinal surgeons over the last few years. Technical advances have led to the development of cutting-edge vitrectomy systems and instruments that have contributed significantly to the success of MIVS, and trocar evolution has made the technique safer and more effective. In the hands of an experienced surgeon, microincision vitrectomy trocars offer a new range of applications that can redefine surgical practices and facilitate otherwise complex surgical techniques.
Abstract:
Capacity analysis using simulation is not new in the literature: most of the UMTS standardization work has relied on simulation tools. However, we believe that the use of GIS planning tools and the matrix manipulation capabilities of MATLAB can expose different scenarios and support a more realistic analysis. Related work is being done in COST 273 in order to obtain more realistic scenarios for UMTS planning. Our work was initially centered on uplink analysis, but we are now working on downlink analysis, specifically in two areas: capacity in number of users for RT and NRT services, and Node B power. In this work we show results for uplink capacity, together with some results for downlink capacity and BS power consumption.
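As a point of reference for the uplink results, the textbook load-factor method gives a quick analytical estimate of WCDMA uplink capacity; the sketch below uses standard parameter values for a voice service, which are illustrative and not the scenario modelled in this work.

```python
# Textbook WCDMA uplink capacity estimate via the load-factor method.
# Parameter values are illustrative, not this paper's simulation scenario.
import math

W = 3.84e6                 # WCDMA chip rate [chips/s]
R = 12.2e3                 # service bit rate [bit/s], AMR voice
ebno = 10 ** (5.0 / 10)    # required Eb/N0 of 5 dB as a linear ratio
activity = 0.67            # voice activity factor
other_cell = 0.55          # other-to-own-cell interference ratio

# Per-user uplink load factor.
load_per_user = 1.0 / (1.0 + W / (ebno * R * activity))

# Maximum users for a target uplink load of 75%, including other-cell
# interference.
target_load = 0.75
max_users = math.floor(target_load / ((1 + other_cell) * load_per_user))
print(max_users)  # -> 72 voice users per cell under these assumptions
```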
Abstract:
Efficient crop monitoring and pest damage assessments are key to protecting the Australian agricultural industry and ensuring its leading position internationally. An important element in pest detection is gathering reliable crop data frequently and integrating analysis tools for decision making. Unmanned aerial systems are emerging as a cost-effective solution to a number of precision agriculture challenges, and an important advantage of this technology is that it provides a non-invasive aerial sensor platform for accurately monitoring broad-acre crops. In this presentation, we give an overview of how unmanned aerial systems and machine learning can be combined to address crop protection challenges, and a recent 2015 study on insect damage in sorghum illustrates the effectiveness of this methodology. A UAV platform equipped with a high-resolution camera was deployed to autonomously perform a flight pattern over the target area. We describe the image processing pipeline implemented to create a georeferenced orthoimage and to visualize the spatial distribution of the damage. An image analysis tool has been developed to minimize human input requirements; the computer program is based on a machine learning algorithm that automatically creates a meaningful partition of the image into clusters. Results show that the algorithm delivers decision boundaries that accurately classify the field into crop health levels. The methodology presented in this paper represents an avenue for further research towards automated crop protection assessments in the cotton industry, with applications in detecting, quantifying and monitoring the presence of mealybug, mite and aphid pests.
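The clustering step can be illustrated with a minimal example: unsupervised partitioning of an image by running k-means on per-pixel colour values. The choice of k-means and of three "health level" clusters is an assumption for illustration; the study's actual algorithm may differ.

```python
# Minimal sketch of unsupervised image partitioning: k-means on per-pixel
# RGB values. Algorithm choice and k=3 clusters are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans

def cluster_field_image(image: np.ndarray, n_clusters: int = 3) -> np.ndarray:
    """Label every pixel of an (H, W, 3) RGB image with a cluster id."""
    h, w, _ = image.shape
    pixels = image.reshape(-1, 3).astype(float)
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(pixels)
    return labels.reshape(h, w)  # per-pixel "health level" map

# Toy usage with a random image standing in for a georeferenced orthoimage.
rng = np.random.default_rng(0)
label_map = cluster_field_image(rng.integers(0, 256, size=(64, 64, 3)))
print(np.unique(label_map))  # -> [0 1 2]
```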
Abstract:
China's Silk Road Economic Belt plan is part of the One Belt, One Road initiative, which aims to create trade routes from China all the way to Europe. Despite the potential benefits, there are also problems along the way. In this research I examine the adverse effects of one part of the Silk Road Economic Belt, focusing on the Xinjiang Uyghur minority and their rights and on Central Asian regional stability. Moreover, I suggest that China's past commitments in international society, as well as its actions in relation to the undertaking, can give an insight into a regime in which China would be the dominant power in international society. I have used qualitative analysis to study these topics, with the following methodological tools: conceptual analysis to borrow concepts from the international relations field, situation analysis to describe the current circumstances in China's Xinjiang and in Central Asia, and inductive analysis as the overall method, since I suggest that the content I have examined can give an insight into how China will regard and relate to international law in the future. The theoretical framework of the research sees international law as a tool that a state can use to gain more power, while at the same time international law restricts the state's behaviour. Based on the findings of this research, in the case of Xinjiang the New Silk Road is likely to worsen the Uyghurs' situation because of Beijing's worries and its harsh actions to prevent any disturbance. However, the New Silk Road could bring stability and maintain regional security in Central Asia if the states come to see it as beneficial to unite for cooperation that yields greater benefits. China's potential future regime will emphasize sovereignty and non-interference in states' domestic matters. Moreover, there will be no room for minority rights in China's concept of human rights: human rights are meant to protect the rights of the masses, but they are of secondary importance, since development and security will be more important goals to pursue. In the field of cooperation, China increasingly uses multilateral forums to discuss matters but reserves bilateral negotiations for executing its plans.
Abstract:
The goal was to understand, document and model how information currently flows internally in the largest dairy organization in Finland. The organization has undergone radical changes in the past years due to economic sanctions between the European Union and Russia; its ultimate goal is therefore to continue its growth by managing its sales process more efficiently. The thesis consists of a literature review and an empirical part. The literature review covers knowledge management and process modeling theories. First, the knowledge management part discusses how data, information and knowledge are exchanged in the process; knowledge management models and processes describe how knowledge is created, exchanged and managed in an organization. Secondly, the process modeling part addresses the visualization of information flow, discussing modeling approaches and presenting different methods and techniques. Finally, the process documentation procedure is presented. A constructive research approach was used to identify the problems and bottlenecks related to the process, and possible solutions were presented on this basis. The empirical part of the study is based on 37 interviews, the organization's internal data sources and the theoretical framework. The acquired data and information were used to document and model the sales process in question with a flowchart diagram. Results were derived from the construction of the flowchart diagram and the analysis of the documentation; the answers to the research questions draw on both the empirical and the theoretical parts. In the end, 14 problems and two bottlenecks were identified in the process. The most important problems concern the approach to and standardization of information sharing, insufficient utilization of information technology tools, and the lack of systematic documentation. The bottlenecks are caused by the alarming number of changes made to files after their deadlines.
Abstract:
An early and accurate diagnosis of dementia is still difficult to achieve, so much research focuses on finding new dementia biomarkers that can help with this purpose, ideally through noninvasive, rapid, and relatively inexpensive procedures. Several studies have demonstrated that spectroscopic techniques, such as Fourier transform infrared (FTIR) spectroscopy and Raman spectroscopy, could provide a useful and accurate procedure for diagnosing dementia. As several biochemical mechanisms related to neurodegeneration and dementia can lead to changes in plasma components and other peripheral body fluids, blood-based samples combined with spectroscopic analysis can be used as a simpler and less invasive technique. This work is intended to confirm some of the hypotheses of previous studies in which FTIR was used to analyse plasma samples of possible Alzheimer's disease (AD) patients and respective controls, and to verify the reproducibility of this spectroscopic technique in the analysis of such samples. Through spectroscopic analysis combined with multivariate analysis it is possible to discriminate between control and demented samples and to identify key spectroscopic differences between these two groups, which allows the identification of metabolites altered in this disease. It can be concluded that there are three spectral regions, 3500-2700 cm-1, 1800-1400 cm-1 and 1200-900 cm-1, from which relevant spectroscopic information can be extracted. In the first region, the main conclusion is that there is an imbalance between the content of saturated and unsaturated lipids. In the 1800-1400 cm-1 region it is possible to see the presence of protein aggregates and a change in protein conformation towards highly stable parallel β-sheets. The last region showed the presence of products of lipid peroxidation, related to membrane impairment, and of oxidative damage to nucleic acids. The FTIR technique and the information gathered in this work can be used to build classification models for the diagnosis of cognitive dysfunction.
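A common way to realise the "spectroscopy plus multivariate analysis" discrimination described above is dimensionality reduction followed by a linear classifier. The sketch below uses PCA and linear discriminant analysis on synthetic stand-ins for FTIR plasma spectra; the specific pipeline is an assumption, not necessarily the one used in this work.

```python
# Sketch of multivariate discrimination of control vs. demented spectra:
# PCA for dimensionality reduction, then a linear classifier. The data are
# synthetic stand-ins for FTIR plasma spectra.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
n_per_group, n_points = 20, 600   # spectra per group, points per spectrum
controls = rng.normal(0.0, 1.0, (n_per_group, n_points))
patients = rng.normal(0.3, 1.0, (n_per_group, n_points))  # slightly shifted mean

X = np.vstack([controls, patients])
y = np.array([0] * n_per_group + [1] * n_per_group)

model = make_pipeline(PCA(n_components=5), LinearDiscriminantAnalysis())
print(cross_val_score(model, X, y, cv=5).mean())   # cross-validated accuracy
```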
Abstract:
Background: The prevalence of psychosis is known to be higher in adults with intellectual disabilities (ID) than in the general adult population. However, there have been no attempts to develop a psychosis screening tool specifically for the adult ID population. The present study describes the development and preliminary evaluation of a new measure, the Glasgow Psychosis Screening tool for use in Adults with Intellectual Disabilities (GPS-ID). Method: An item pool was generated following 1) focus groups with adults with ID and psychosis, and their carers and/or workers, and 2) expert input from clinicians. A draft scale was compiled and refined following expert feedback. The new scale, along with the Psychotic Symptom Rating Scales, was administered to 20 adults with ID (10 with and 10 without psychosis) and their relatives or carers. Results: The GPS-ID total score, self-report subscale and informant-rating subscale differentiated the psychosis and non-psychosis groups. The tool had good internal consistency (Cronbach's α=0.91), and a cut-off score ≥4 yielded high sensitivity (90%) and specificity (100%). The method of tool development supports face and content validity; criterion validity was not supported. Conclusions: Preliminary investigation of the tool's psychometric properties is positive, although further investigation is required. The tool is accessible to adults with mild to moderate ID and can be completed in 15-30 minutes. The GPS-ID is not a diagnostic tool; therefore, any adult exceeding the cut-off score of ≥4 should receive further assessment.
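The reported 90% sensitivity and 100% specificity follow directly from applying the ≥4 cut-off to the two groups of 10. The scores below are invented solely to reproduce that arithmetic; they are not the study's data.

```python
# How sensitivity/specificity follow from a screening cut-off: a score >= 4
# counts as a positive screen. Scores are hypothetical (10 per group, as in
# the study design), chosen only to reproduce the reported figures.
def screen(scores, cutoff=4):
    return [s >= cutoff for s in scores]

psychosis_scores = [4, 5, 6, 7, 4, 5, 8, 6, 3, 9]      # hypothetical
non_psychosis_scores = [0, 1, 2, 0, 3, 1, 2, 0, 1, 2]  # hypothetical

sensitivity = sum(screen(psychosis_scores)) / len(psychosis_scores)
specificity = sum(not p for p in screen(non_psychosis_scores)) / len(non_psychosis_scores)
print(sensitivity, specificity)  # -> 0.9 1.0
```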
Abstract:
Background: The nitration of tyrosine residues in proteins is associated with nitrosative stress, resulting in the formation of 3-nitrotyrosine (3-NT). 3-NT levels in biological samples have been associated with numerous physiological and pathological conditions. For this reason, several attempts have been made to develop methods that accurately quantify 3-NT in biological samples. Chromatographic methods appear to be very accurate, showing very good sensitivity and specificity. However, accurate quantification of this molecule, which is present at very low concentrations in both physiological and pathological states, is always a complex task and a target of intense research. Objectives: We aimed to develop a simple, rapid, low-cost and sensitive 3-NT quantification method for use in medical laboratories as an additional tool for the diagnosis and/or treatment monitoring of a wide range of pathologies. We also aimed to evaluate the performance of the HPLC-based method developed here in a wide range of biological matrices. Material and methods: All experiments were performed on a Hitachi LaChrom Elite® HPLC system, and separation was carried out using a Lichrocart® 250-4 Lichrospher 100 RP-18 (5 μm) column. The method was validated according to ICH guidelines. The biological matrices tested were serum, whole blood, urine, the B16-F10 melanoma cell line, growth medium conditioned with the same cell line, and bacterial and yeast suspensions. Results: Of all the protocols tested, the best results were obtained using 0.5% CH3COOH:MeOH:H2O (15:15:70) as the mobile phase, with detection at wavelengths of 215, 276 and 356 nm, at 25 °C, and a flow rate of 1 mL/min. With this protocol it was possible to obtain a linear calibration curve (correlation coefficient = 1), limits of detection and quantification on the order of ng/mL, and a short analysis time (<15 minutes per sample). Additionally, the developed protocol allowed the successful detection and quantification of 3-NT in all biological matrices tested, with detection at 356 nm. Conclusion: The method described in this study, which was successfully developed and validated for 3-NT quantification, is simple, cheap and fast, rendering it suitable for analysis in a wide range of biological matrices.
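The reported limits of detection and quantification follow from the standard ICH arithmetic: fit a linear calibration curve, then LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation and S the slope. The concentrations and peak areas below are synthetic placeholders, not the study's data.

```python
# ICH-style calibration arithmetic: linear fit, then LOD = 3.3*sigma/S and
# LOQ = 10*sigma/S. Standard concentrations and responses are synthetic.
import numpy as np

conc = np.array([10, 25, 50, 100, 250, 500])               # ng/mL standards (hypothetical)
area = np.array([12.1, 29.8, 60.3, 119.5, 301.2, 599.8])   # detector response (hypothetical)

slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)        # residual std. dev. (2 fitted parameters)

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"LOD = {lod:.1f} ng/mL, LOQ = {loq:.1f} ng/mL")
```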
Abstract:
The poor heating efficiency of most reported magnetic nanoparticles (MNPs), allied to the lack of comprehensive biocompatibility and haemodynamic studies, hampers the adoption of multifunctional nanoparticles as the next generation of therapeutic bio-agents in medicine. The present work reports the synthesis and characterization, with special focus on biological/toxicological compatibility, of superparamagnetic nanoparticles with a diameter of around 18 nm, suitable for theranostic applications (i.e. simultaneous diagnosis and therapy of cancer). To gain more insight into the complex interaction between nanoparticles and the red blood cell (RBC) membrane, the deformability of human RBCs in contact with MNPs was assessed for the first time with a microfluidic extensional approach and used as an indicator of haematological disorders, in comparison with a conventional haematological test, i.e. haemolysis analysis. The microfluidic results highlight the potential of this tool over traditional haemolysis analysis: it detected small increments in the rigidity of the blood cells where traditional haemotoxicological analysis showed no significant alteration (haemolysis rates lower than 2%). The detected rigidity is predicted to be due to the wrapping of small MNPs by the bilayer membrane of the RBCs, which is directly related to MNP size, shape and composition. The proposed microfluidic tool adds a new dimension to the field of nanomedicine, since it can be applied as a high-sensitivity technique capable of bringing a better understanding of the biological impact of nanoparticles developed for clinical applications.
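Extensional microfluidic assays commonly quantify deformability with a deformation index computed from the cell's major and minor axes in the constriction region, DI = (L - W)/(L + W); whether this exact index was used here is an assumption, and the measurements below are hypothetical.

```python
# Deformation index commonly used in extensional microfluidic RBC assays:
# DI = (L - W) / (L + W). A more rigid cell stays closer to DI = 0.
# Axis values are hypothetical; use of this exact index is an assumption.
def deformation_index(major_um: float, minor_um: float) -> float:
    return (major_um - minor_um) / (major_um + minor_um)

healthy_rbc = deformation_index(major_um=12.0, minor_um=4.0)  # -> 0.50
mnp_exposed = deformation_index(major_um=10.0, minor_um=6.0)  # -> 0.25
print(healthy_rbc, mnp_exposed)  # a lower DI suggests a more rigid cell
```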