938 results for Automatic mortar
Abstract:
In the last few years, the number of systems and devices that use voice-based interaction has grown significantly. For continued use of these systems, the interface must be reliable and pleasant in order to provide an optimal user experience. However, there are currently very few studies that evaluate how good a voice is when the application is a speech-based interface. In this paper we present a new automatic voice pleasantness classification system based on prosodic and acoustic patterns of voice preference. Our study is based on a multi-language database composed of female voices. In the objective performance evaluation, the system achieved a 7.3% error rate.
Abstract:
BACKGROUND: Wireless capsule endoscopy (CE) has been introduced as an innovative, non-invasive diagnostic technique for evaluation of the gastrointestinal tract, reaching places where conventional endoscopy cannot. However, the output of this technique is an 8-hour video, whose analysis by the expert physician is very time consuming. Thus, a computer-assisted diagnosis tool to help physicians evaluate CE exams faster and more accurately is an important technical challenge and an excellent economic opportunity. METHOD: The set of features proposed in this paper to code textural information is based on statistical modeling of second-order textural measures extracted from co-occurrence matrices. To cope with both joint and marginal non-Gaussianity of second-order textural measures, higher-order moments are used. These statistical moments are taken from the two-dimensional color-scale feature space, where two different scales are considered. Second- and higher-order moments of textural measures are computed from co-occurrence matrices of images synthesized by the inverse wavelet transform of the wavelet transform containing only the selected scales for the three color channels. The dimensionality of the data is reduced by Principal Component Analysis. RESULTS: The proposed textural features are then used as the input of a classifier based on artificial neural networks. Classification performances of 93.1% specificity and 93.9% sensitivity are achieved on real data. These promising results open the path towards a deeper study of the applicability of this algorithm in computer-aided diagnosis systems to assist physicians in their clinical practice.
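The second-order textural measures this abstract builds on can be sketched in a few lines of numpy. This is only an illustrative reduction, assuming a single pixel offset, four grey levels and three classic co-occurrence measures; the paper's actual feature set (color-scale space, wavelet-selected scales, higher-order moments) is much richer:

```python
import numpy as np

def cooccurrence_matrix(img, levels=4, dx=1, dy=0):
    """Normalized grey-level co-occurrence matrix for one pixel offset."""
    cm = np.zeros((levels, levels), dtype=float)
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            cm[img[y, x], img[y + dy, x + dx]] += 1
    total = cm.sum()
    return cm / total if total else cm

def texture_measures(cm):
    """Classic second-order measures derived from the co-occurrence matrix."""
    i, j = np.indices(cm.shape)
    return {
        "contrast": float(((i - j) ** 2 * cm).sum()),
        "energy": float((cm ** 2).sum()),
        "homogeneity": float((cm / (1.0 + np.abs(i - j))).sum()),
    }

# Tiny synthetic image with four grey levels arranged in 2x2 blocks.
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [2, 2, 3, 3],
                [2, 2, 3, 3]])
cm = cooccurrence_matrix(img, levels=4)
feats = texture_measures(cm)
```

In a full pipeline these per-offset measures would be collected over several offsets and scales, their statistical moments taken, and the resulting vectors reduced with PCA before classification.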
Abstract:
Research project submitted in partial fulfilment of the requirements for the Master's Degree in Statistics and Information Management
Abstract:
Materials Science Forum Vols. 730-732 (2013) pp 617-622
Abstract:
Dissertation presented to obtain the Master's Degree in Electrical Engineering and Computer Science at Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia
Abstract:
2nd Historic Mortars Conference - HMC 2010 and RILEM TC 203-RHM Final Workshop, Prague, September 2010
Abstract:
Eradication of code smells is often pointed out as a way to improve readability, extensibility and design in existing software. However, code smell detection remains time consuming and error-prone, partly due to the inherent subjectivity of the detection processes presently available. To mitigate this subjectivity problem, this dissertation presents a tool, developed as an Eclipse plugin, that automates a technique for the detection and assessment of code smells in Java source code. The technique is based upon a Binary Logistic Regression model that uses complexity metrics as independent variables and is calibrated by experts' knowledge. An overview of the technique is provided, and the tool is described and validated through an example case study.
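A minimal sketch of how a Binary Logistic Regression over complexity metrics yields a smell probability. The metric names and coefficients below are made up for illustration; the dissertation's actual model is calibrated by expert knowledge:

```python
import math

# Hypothetical, illustrative coefficients — not the dissertation's calibrated model.
COEFFS = {"intercept": -4.0, "loc": 0.05, "cyclomatic": 0.30}

def smell_probability(metrics):
    """P(code smell) from a binary logistic regression over complexity metrics."""
    z = COEFFS["intercept"]
    z += COEFFS["loc"] * metrics["loc"]              # lines of code
    z += COEFFS["cyclomatic"] * metrics["cyclomatic"]  # cyclomatic complexity
    return 1.0 / (1.0 + math.exp(-z))                # logistic (sigmoid) link

# A long, complex method scores high; a short, simple one scores low.
p_long = smell_probability({"loc": 120, "cyclomatic": 15})
p_short = smell_probability({"loc": 10, "cyclomatic": 1})
```

The appeal of the logistic model here is that its output is a calibrated probability rather than a hard threshold, so the subjective "is this a smell?" judgment becomes a tunable cutoff.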
Abstract:
Retinal ultra-wide field of view images (fundus images) provide the visualization of a large part of the retina; however, artifacts may appear in those images. Eyelashes and eyelids often cover the clinical region of interest and, worse, eyelashes can be mistaken for arteries and/or veins when those images are put through automatic diagnosis or segmentation software, creating in those cases false positive results. Correcting this problem is the first step in the development of qualified automatic disease diagnosis programs, and in that way the development of an objective tool to assess diseases, eradicating human error from those processes, can also be achieved. In this work, the development of a tool that automatically delimits the clinical region of interest is proposed, by retrieving features from the images that are then analyzed by an automatic classifier. This classifier evaluates the information and decides which part of the image is of interest and which part contains artifacts. The method was implemented as software in the C# language and the results were validated through statistical analysis. Those results confirmed that the presented methodology is capable of detecting artifacts and selecting the clinical region of interest in fundus images of the retina.
Abstract:
The extraction of relevant terms from texts is an extensively researched task in Text Mining. Relevant terms have been applied in areas such as Information Retrieval or document clustering and classification. However, relevance has a rather fuzzy nature, since the classification of some terms as relevant or not relevant is not consensual. For instance, while words such as "president" and "republic" are generally considered relevant by human evaluators, and words like "the" and "or" are not, terms such as "read" and "finish" gather no consensus about their semantics and informativeness. Concepts, on the other hand, have a less fuzzy nature. Therefore, instead of deciding on the relevance of a term during the extraction phase, as most extractors do, I propose to first extract from texts what I have called generic concepts (all concepts) and postpone the decision about relevance to downstream applications, according to their needs. For instance, a keyword extractor may assume that the most relevant keywords are the most frequent concepts in the documents. Moreover, most statistical extractors are incapable of extracting single-word and multi-word expressions using the same methodology. These factors led to the development of the ConceptExtractor, a statistical and language-independent methodology which is explained in Part I of this thesis. In Part II, I show that the automatic extraction of concepts has great applicability. For instance, for the extraction of keywords from documents, using the Tf-Idf metric only on concepts yields better results than using Tf-Idf without concepts, especially for multi-word expressions. In addition, since concepts can be semantically related to other concepts, they allow us to build implicit document descriptors. These applications led to published work. Finally, I present some work that, although not yet published, is briefly discussed in this document.
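The idea of applying Tf-Idf directly to extracted concepts, treating a multi-word expression as a single unit, can be sketched as follows. The toy documents and the plain tf·log(N/df) weighting are illustrative assumptions, not the thesis's exact formulation:

```python
import math
from collections import Counter

def tfidf(docs):
    """docs: list of lists of extracted concepts (single- or multi-word).
    Returns one {concept: tf-idf score} dict per document."""
    n = len(docs)
    df = Counter()                      # document frequency per concept
    for d in docs:
        df.update(set(d))
    scores = []
    for d in docs:
        tf = Counter(d)
        scores.append({c: tf[c] / len(d) * math.log(n / df[c]) for c in tf})
    return scores

# Toy corpus: each document is already reduced to its extracted concepts;
# note "prime minister" is handled exactly like a single-word concept.
docs = [["president", "republic", "president"],
        ["republic", "prime minister"],
        ["football"]]
scores = tfidf(docs)
```

Because "president" is frequent in the first document but rare across the corpus, it outscores "republic", which appears in two of the three documents.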
Abstract:
Earthen plastering mortars are becoming recognized as highly eco-efficient. The assessment of their technical properties needs to be standardized, but at the moment only the German standard DIN 18947 exists. An extended experimental campaign was developed in order to assess multiple properties of a ready-mixed earth plastering mortar and also to increase scientific knowledge of the influence of test procedures on those properties. The experimental campaign showed that some aspects related to the equipment, type of samples and sample preparation can be very important, while others seemed to have less influence on the results and the classification of mortars. It also showed that some complementary tests can easily be performed and considered together with the standardized ones, while others may need to be improved. The plaster satisfied the requirements of the existing German standard but, most importantly, it seemed adequate for application as a rehabilitation plaster on historic and modern masonry buildings. Apart from their aesthetic aspect, the contribution of earthen plasters to eco-efficiency and particularly to hygrometric indoor comfort should be highlighted.
Abstract:
In cataract surgery, the eye's natural lens is removed because it has gone opaque and no longer allows clear vision. To maintain the eye's optical power, a new artificial lens, called an Intraocular Lens (IOL), must be inserted; it needs to be modelled in order to have the correct refractive power to substitute the natural lens. Calculating the refractive power of this substitution lens requires precise anterior eye chamber measurements. An interferometry instrument, the AC Master from Zeiss Meditec AG, was in use for half a year to perform these measurements. A Low Coherence Interferometry (LCI) measurement beam is aligned with the eye's optical axis for precise measurements of anterior eye chamber distances, and the eye follows a fixation target in order to align the visual axis with the optical axis. Performance problems occurred, however, at this step. Therefore, it was necessary to develop a new procedure that ensures better alignment between the eye's visual and optical axes, allows a more user-friendly and versatile workflow, and eventually automates the whole process. With this instrument, alignment between the eye's optical and visual axes is detected when Purkinje reflections I and III overlap as the eye follows a fixation target. In this project, image analysis is used to detect the positions of these Purkinje reflections and, eventually, to automatically detect when they overlap. Automatic detection of the third Purkinje reflection of an eye following a fixation target is possible with some restrictions. Each pair of detected third Purkinje reflections is used to automatically calculate an acceptable starting position for the fixation target, required for precise measurements of anterior eye chamber distances.
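A crude illustration of the kind of image analysis involved: locating bright reflections in a camera frame and testing whether two of them overlap. The flood-fill spot detector and the pixel tolerance below are assumptions for illustration only, not the project's actual Purkinje-detection pipeline:

```python
import numpy as np

def bright_spots(frame, threshold):
    """Centroids of connected bright regions (4-connectivity flood fill) —
    a crude stand-in for locating corneal/lens reflections in a frame."""
    mask = frame > threshold
    seen = np.zeros_like(mask, dtype=bool)
    spots = []
    for y0, x0 in zip(*np.nonzero(mask)):
        if seen[y0, x0]:
            continue
        stack, pixels = [(y0, x0)], []
        seen[y0, x0] = True
        while stack:
            y, x = stack.pop()
            pixels.append((y, x))
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not seen[ny, nx]):
                    seen[ny, nx] = True
                    stack.append((ny, nx))
        ys, xs = zip(*pixels)
        spots.append((sum(ys) / len(ys), sum(xs) / len(xs)))
    return spots

def overlapped(spots, tol=2.0):
    """True when any two detected reflections lie within `tol` pixels
    (Manhattan distance) — the alignment criterion, schematically."""
    return any(abs(a[0] - b[0]) + abs(a[1] - b[1]) <= tol
               for i, a in enumerate(spots) for b in spots[i + 1:])

# Synthetic frame: two bright pixels forming one spot, plus a distant one.
frame = np.zeros((10, 10))
frame[2, 2] = frame[2, 3] = 255
frame[7, 7] = 255
spots = bright_spots(frame, threshold=128)
```

In the real instrument the two reflections differ in size and brightness, so the detector would also need to classify which spot is Purkinje I and which is Purkinje III before testing overlap.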
Abstract:
Ship tracking systems allow maritime organizations concerned with safety at sea to obtain information on the current location and route of merchant vessels. Thanks to space technology, in recent years the geographical coverage of ship tracking platforms has increased significantly, from radar-based near-shore traffic monitoring towards a worldwide picture of the maritime traffic situation. The long-range tracking systems currently in operation allow the storage of ship position data over many years: a valuable source of knowledge about the shipping routes between different ocean regions. The outcome of this Master's project is a software prototype for the estimation of the most operated shipping route between any two geographical locations. The analysis is based on historical ship positions acquired with long-range tracking systems. The proposed approach applies a Genetic Algorithm to a training set of relevant ship positions extracted from the long-term storage tracking database of the European Maritime Safety Agency (EMSA). The analysis of some representative shipping routes is presented, and the quality of the results and their operational applications are assessed by a maritime safety expert.
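A minimal sketch of the genetic-algorithm idea: evolving candidate routes whose waypoints are drawn from historical ship positions. Everything here is a toy assumption (random synthetic positions on a plane, fitness = route length only); the actual prototype works on EMSA tracking data with operationally meaningful criteria:

```python
import random

random.seed(0)
# Toy stand-in for historical ship positions between two ports.
HISTORY = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(50)]
START, END = (0.0, 0.0), (10.0, 10.0)

def length(route):
    """Fitness (to minimize): total Euclidean length START -> waypoints -> END."""
    pts = [START] + route + [END]
    return sum(((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
               for a, b in zip(pts, pts[1:]))

def evolve(generations=100, pop_size=30, n_way=3):
    """Evolve fixed-length waypoint sequences drawn from HISTORY."""
    pop = [random.sample(HISTORY, n_way) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=length)
        parents = pop[: pop_size // 2]               # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_way)         # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:                # mutation: swap in a new waypoint
                child[random.randrange(n_way)] = random.choice(HISTORY)
            children.append(child)
        pop = parents + children
    return min(pop, key=length)

best = evolve()
```

By the triangle inequality no route can be shorter than the straight line between START and END (here √200 ≈ 14.14), and the GA should settle on waypoints close to that line.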
Abstract:
Premature degradation of ordinary Portland cement (OPC) concrete infrastructures is a current and serious problem, with overwhelming costs amounting to several trillion dollars. The use of concrete surface treatments with waterproofing materials to prevent the access of aggressive substances is an important way of enhancing concrete durability. The most common surface treatments use polymeric resins based on epoxy, silicone (siloxane), acrylics, polyurethanes or polymethacrylate. However, epoxy resins have low resistance to ultraviolet radiation, while polyurethanes are sensitive to high-alkalinity environments. Geopolymers constitute a group of materials with high resistance to chemical attack that could also be used for coating concrete infrastructures exposed to harsh chemical environments. This article presents results of an experimental investigation on the resistance to chemical attack (by sulfuric and nitric acid) of several materials: OPC concrete, high performance concrete (HPC), epoxy resin, acrylic paint and a fly ash based geopolymeric mortar. The acids, in high concentrations of 10%, 20% and 30%, were used to simulate long-term degradation by chemical attack. The results show that the epoxy resin had the best resistance to chemical attack, irrespective of the acid type and acid concentration.
Abstract:
Polymer binder modification with inorganic nanomaterials (NM) could be a potential and efficient solution to control the matrix flammability of polymer concrete (PC) materials without sacrificing other important properties. Occupational exposures can occur all along the life cycle of an NM and of "nanoproducts", from research through scale-up, product development, manufacturing, and end of life. The main objective of the present study is to analyse and compare different qualitative risk assessment methods during the production of polymer mortars (PM) with NM. The laboratory-scale production process was divided into three main phases (pre-production, production and post-production), which allowed testing the assessment methods in different situations. The risk assessment of the PM manufacturing process was made using qualitative analyses based on: the French Agency for Food, Environmental and Occupational Health & Safety method (ANSES); Control Banding Nanotool (CB Nanotool); the Ecole Polytechnique Fédérale de Lausanne method (EPFL); the Guidance for working safely with nanomaterials and nanoproducts (GWSNN); the Istituto Superiore per la Prevenzione e la Sicurezza del Lavoro, Italy, method (ISPESL); the Precautionary Matrix for Synthetic Nanomaterials (PMSN); and Stoffenmanager Nano. It was verified that the different methods produce different final results. In phases 1 and 3 the risk tends to be classified as medium-high, while for phase 2 the most common result is a medium level. It is necessary to improve the use of qualitative methods by defining narrower criteria for selecting a method for each assessed situation, bearing in mind that uncertainty is also a relevant factor when dealing with risks in the nanotechnology field.