26 results for floatation techniques
Abstract:
Recent studies of mobile Web trends show the continued explosion of mobile-friendly content. However, the sheer number and heterogeneity of mobile devices pose several challenges for Web programmers, who want automatic detection of the usage context and adaptation of the content to mobile devices. Hence, the device detection phase assumes an important role in this process. In this chapter, the authors compare the most widely used approaches for mobile device detection. Based on this study, they present an architecture for detecting and delivering uniform m-Learning content to students in a higher education school. The authors focus mainly on the XML device capabilities repository and on the REST API Web Service for dealing with device data. For the former, the authors detail the respective capabilities schema and present a new caching approach. For the latter, they present an extension of the current API for handling device data. Finally, the authors validate their approach by presenting the overall data and statistics collected through the Google Analytics service, in order to better understand the adherence to the mobile Web interface, its evolution over time, and its main weaknesses.
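In outline, the device detection step described above can be sketched as a lookup of user-agent tokens against a capabilities repository. The tokens and capability fields below are hypothetical placeholders for illustration, not the authors' actual XML schema or API:

```python
# Minimal sketch of user-agent based device detection against a small
# capabilities repository. Real systems query a full device database
# (e.g., a WURFL-style repository); the records here are made up.

DEVICE_CAPABILITIES = {
    "iphone": {"mobile": True, "screen_width": 390},
    "android": {"mobile": True, "screen_width": 412},
}

# Fallback record when no mobile token matches the user agent.
DESKTOP_DEFAULT = {"mobile": False, "screen_width": 1280}

def detect_device(user_agent: str) -> dict:
    """Return the capability record whose token appears in the user agent."""
    ua = user_agent.lower()
    for token, caps in DEVICE_CAPABILITIES.items():
        if token in ua:
            return caps
    return DESKTOP_DEFAULT

caps = detect_device("Mozilla/5.0 (iPhone; CPU iPhone OS 16_0 like Mac OS X)")
```

A server would then select the mobile or desktop interface (and cache the result, as the chapter's caching approach suggests) based on the returned record.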
Abstract:
Mathematical models and statistical analysis are key instruments in soil science research, as they can describe and/or predict the current state of a soil system. These tools allow us to explore the behavior of soil-related processes and properties, as well as to generate new hypotheses for future experimentation. A good model and analysis of soil property variation, permitting sound conclusions and the estimation of spatially correlated variables at unsampled locations, clearly depend on the amount and quality of the data and on the robustness of the techniques and estimators. The quality of the data, in turn, depends on a competent data collection procedure and on capable laboratory analytical work. Following the standard soil sampling protocols available, soil samples should be collected according to key factors such as a convenient spatial scale, landscape homogeneity (or non-homogeneity), land color, soil texture, land slope and solar exposure. Obtaining good quality data from forest soils is predictably expensive, as it is labor intensive and demands considerable manpower and equipment, both in field work and in laboratory analysis. Moreover, the sample collection scheme to be used in forest fieldwork is not simple to design, as the sampling strategies chosen depend strongly on soil taxonomy. In fact, a sampling grid cannot be followed if rocks are found at the intended collection depth, if no soil is found at all, or if large trees obstruct collection. Considering this, the proficient design of a soil sampling campaign in a forest field is not always a simple process and sometimes represents a real challenge. In this work, we present some difficulties that occurred during two experiments on forest soil conducted to study the spatial variation of some soil physico-chemical properties.
Two different sampling protocols were considered for monitoring two types of forest soils located in NW Portugal: umbric regosol and lithosol. Two different types of sampling equipment were also used: a manual auger and a shovel. Both scenarios were analyzed, and the results allow us to conclude that monitoring forest soil for mathematical and statistical investigation requires a data collection procedure compatible with established protocols, but a pre-defined grid often fails when the variability of the soil property is not uniform in space. In this case, the sampling grid should be conveniently adapted from one part of the landscape to another, and this fact should be taken into consideration in the mathematical procedure.
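The estimation of a spatially correlated soil property at unsampled locations, mentioned in the abstract, can be illustrated with inverse distance weighting. This is a generic spatial estimator chosen for illustration only; the abstract does not state which estimator the study used:

```python
import math

def idw_estimate(samples, x0, y0, power=2.0):
    """Inverse distance weighting: estimate a soil property at an
    unsampled location (x0, y0) from measured points [(x, y, value), ...].
    Nearer samples receive larger weights (distance ** -power)."""
    num = den = 0.0
    for x, y, value in samples:
        d = math.hypot(x - x0, y - y0)
        if d == 0.0:
            return value  # exact hit on a sample point
        w = d ** -power
        num += w * value
        den += w
    return num / den
```

With two equidistant samples the estimate is simply their average, which gives a quick sanity check of the weighting.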
Abstract:
More than ever, the number of decision support methods and computer-aided diagnostic systems applied to various areas of medicine is increasing. In breast cancer research, much work has been done to reduce false positives when such systems are used as a double-reading method. In this study, we present a set of data mining techniques applied to build a decision support system in the area of breast cancer diagnosis. The method is geared towards assisting clinical practice in identifying mammographic findings such as microcalcifications, masses and even normal tissue, in order to avoid misdiagnosis. A reliable database was used, with 410 images from about 115 patients, containing findings previously reviewed by radiologists as microcalcifications, masses and normal tissue. Two feature extraction techniques were used throughout this work: the gray level co-occurrence matrix and the gray level run length matrix. For classification purposes, we considered various scenarios corresponding to distinct lesion patterns, together with several classifiers, in order to determine the best performance in each case. The classifiers used were Naïve Bayes, Support Vector Machines, k-Nearest Neighbors and Decision Trees (J48 and Random Forests). The results in distinguishing mammographic findings revealed high positive predictive values (PPV) and very good accuracy. Related results are also presented for the classification of breast density and the BI-RADS® scale. The best predictive method for all tested groups was the Random Forest classifier, and the best performance was achieved in distinguishing microcalcifications. The conclusions drawn from the several tested scenarios represent a new perspective on breast cancer diagnosis using data mining techniques.
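The gray level co-occurrence matrix feature extraction named above can be sketched in a few lines. The quantisation level, offset, and the single Haralick descriptor shown are illustrative choices, not necessarily those used in the study:

```python
import numpy as np

def glcm(image, dx=1, dy=0, levels=4):
    """Gray level co-occurrence matrix for one pixel offset (dx, dy):
    counts how often gray level i co-occurs with gray level j, then
    normalises the counts to co-occurrence probabilities."""
    m = np.zeros((levels, levels), dtype=float)
    rows, cols = image.shape
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dy, c + dx
            if 0 <= r2 < rows and 0 <= c2 < cols:
                m[image[r, c], image[r2, c2]] += 1
    return m / m.sum()

def haralick_contrast(p):
    """Contrast texture descriptor derived from a normalised GLCM:
    sum over (i - j)^2 * p(i, j)."""
    i, j = np.indices(p.shape)
    return float(((i - j) ** 2 * p).sum())
```

Descriptors such as contrast (and others, e.g. energy or homogeneity) computed from the GLCM would then serve as the feature vector fed to the classifiers.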
Abstract:
Recently, operational matrices have been adapted for solving several kinds of fractional differential equations (FDEs). The use of numerical techniques in conjunction with operational matrices of some orthogonal polynomials, for the solution of FDEs on finite and infinite intervals, has produced highly accurate solutions for such equations. This article discusses spectral techniques based on operational matrices of fractional derivatives and integrals for solving several kinds of linear and nonlinear FDEs. More precisely, we present the operational matrices of fractional derivatives and integrals for several polynomials on bounded domains, such as the Legendre, Chebyshev, Jacobi and Bernstein polynomials, and we use them with different spectral techniques for solving the aforementioned equations on bounded domains. The operational matrices of fractional derivatives and integrals are also presented for orthogonal Laguerre and modified generalized Laguerre polynomials, and their use with numerical techniques for solving FDEs on a semi-infinite interval is discussed. Several examples are presented to illustrate the numerical and theoretical properties of the various spectral techniques for solving FDEs on finite and semi-infinite intervals.
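In outline, these spectral schemes rest on two standard ingredients: a fractional derivative operator (commonly the Caputo definition) and the operational matrix that maps a vector of basis polynomials to its fractional derivative. A generic statement follows; the entries of the matrix depend on the chosen polynomial family and are not given here:

```latex
% Caputo fractional derivative of order \nu, with n - 1 < \nu \le n:
D^{\nu} f(x) = \frac{1}{\Gamma(n-\nu)} \int_{0}^{x}
               \frac{f^{(n)}(t)}{(x-t)^{\nu-n+1}} \, dt .

% With \Phi(x) = [\phi_0(x), \dots, \phi_N(x)]^{T} a vector of basis
% polynomials, the operational matrix \mathbf{D}^{(\nu)} satisfies
D^{\nu} \Phi(x) \approx \mathbf{D}^{(\nu)} \Phi(x) ,

% so that for an expansion u(x) \approx \mathbf{c}^{T} \Phi(x),
D^{\nu} u(x) \approx \mathbf{c}^{T} \mathbf{D}^{(\nu)} \Phi(x) ,
% which reduces a linear FDE to an algebraic system in \mathbf{c}.
```

Collocation, tau, or Galerkin conditions then fix the coefficient vector, which is what the different spectral techniques in the article vary.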
Abstract:
The integrity of multi-component structures is usually determined by their joints. Adhesive bonding is often chosen over traditional joining methods because it reduces stress concentrations, carries a smaller weight penalty, and is easy to manufacture. Commercial adhesives range from strong and brittle (e.g., Araldite® AV138) to less strong and ductile (e.g., Araldite® 2015). A new family of polyurethane adhesives combines high strength and ductility (e.g., Sikaforce® 7888). In this work, the performance of the three above-mentioned adhesives was tested in single lap joints with varying values of overlap length (LO). The experimental work is accompanied by a detailed numerical analysis by finite elements, based either on cohesive zone models (CZM) or on the extended finite element method (XFEM). This procedure enabled a detailed assessment of these predictive techniques as applied to bonded joints. Moreover, it was possible to evaluate which family of adhesives is best suited to each joint geometry. CZM proved to be highly accurate, except for largely ductile adhesives, although this could be circumvented with a different cohesive law. XFEM is not the most suitable technique for mixed-mode damage growth, but a rough prediction was achieved.
Abstract:
Adhesive bonding is an excellent alternative to traditional joining techniques such as welding, mechanical fastening or riveting. However, many factors have to be accounted for during joint design to accurately predict the joint strength. One of these is the adhesive layer thickness (tA). Most available results are for structural epoxy adhesives, tailored to perform best with small values of tA, and these show that lap joint strength decreases as tA increases (the optimum joint strength is usually obtained with tA values between 0.1 and 0.2 mm). Recently, polyurethane adhesives designed to perform with larger tA values became available on the market, and their fracture behaviour is still not studied. In this work, the effect of tA on the tensile fracture toughness of a bonded joint is studied, considering a novel high-strength and ductile polyurethane adhesive for the automotive industry. The work consists of the fracture characterization of the bond by a conventional technique and by the J-integral, the latter accurately accounting for root rotation effects. An optical measurement method is used to evaluate the crack tip opening (δn) and the adherend rotation at the crack tip (θo) during the test, supported by a Matlab® sub-routine for the automated extraction of these parameters. As an output of this work, tensile fracture data is provided for the selected adhesive, enabling the subsequent strength prediction of bonded joints.
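A commonly used form of the J-integral for a DCB specimen with root rotation correction, taken from the direct-method fracture literature, is shown below. The authors' exact expression may differ; here P is the applied load, a the crack length, b the specimen width, h the adherend thickness, E the adherend Young's modulus, and θo the measured adherend rotation at the crack tip:

```latex
% Simple beam theory term plus root rotation correction:
J_I = \frac{12\,(P a)^2}{E\, b^{2} h^{3}} + \frac{P}{b}\,\theta_o .

% The tensile cohesive law then follows by differentiating J with
% respect to the measured crack tip opening \delta_n:
t_n(\delta_n) = \frac{\partial J_I}{\partial \delta_n} ,
% with the toughness taken as the steady-state value of J_I.
```

This is why the optical measurement of δn and θo during the test is central to the method: both enter the expressions directly.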
Abstract:
The use of adhesive joints has increased in recent decades due to their competitive features compared with traditional methods. This work aims to estimate the tensile critical strain energy release rate (GIC) of adhesive joints by the Double-Cantilever Beam (DCB) test. The J-integral is used, since it enables obtaining the tensile Cohesive Zone Model (CZM) law. An optical measuring method was developed for assessing the crack tip opening (δn) and the adherend rotation (θo). The proposed CZM laws were best approximated by a triangular shape for the brittle adhesive and a trapezoidal shape for the two ductile adhesives.
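A triangular CZM law of the kind fitted to the brittle adhesive can be sketched as a bilinear traction-separation function. The parameter names and values below are generic illustrations, not the paper's notation or measured data:

```python
def triangular_czm(delta, k, t0, delta_f):
    """Triangular (bilinear) traction-separation law in tension:
    linear rise with initial stiffness k up to the peak traction t0,
    then linear softening to zero traction at the final opening delta_f."""
    delta_0 = t0 / k  # opening at damage onset (peak of the triangle)
    if delta <= 0.0:
        return 0.0
    if delta <= delta_0:
        return k * delta           # elastic branch
    if delta < delta_f:
        return t0 * (delta_f - delta) / (delta_f - delta_0)  # softening
    return 0.0                      # fully debonded

def g_ic(t0, delta_f):
    """Toughness GIC = area under the triangular law."""
    return 0.5 * t0 * delta_f
```

The area under the law equals GIC, which is how the J-integral measurement and the fitted CZM shape are tied together; a trapezoidal law for the ductile adhesives would simply insert a constant-traction plateau between the two branches.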
Abstract:
Background: Musicians are frequently affected by playing-related musculoskeletal disorders (PRMD). Common solutions used by Western medicine to treat musculoskeletal pain include rehabilitation programs and drugs, but their results are sometimes disappointing. Objective: To study the effects of self-administered exercises based on Tuina techniques on the pain intensity caused by PRMD in professional orchestra musicians, using a numeric visual scale (NVS). Design, setting, participants and interventions: We performed a prospective, controlled, single-blinded, randomized study with musicians suffering from PRMD. Participating musicians were randomly distributed into the experimental (n = 39) and the control (n = 30) groups. After an individual diagnostic assessment, specific Tuina self-administered exercises were developed and taught to the participants. Musicians were instructed to repeat the exercises every day for 3 weeks. Main outcome measures: Pain intensity was measured by NVS before the intervention and after 1, 3, 5, 10, 15 and 20 days of treatment. The procedure was the same for the control group, except that the Tuina exercises were performed on points away from the commonly used acupuncture points. Results: In the treatment group, but not in the control group, pain intensity was significantly reduced on days 1, 3, 5, 10, 15 and 20. Conclusion: The results are consistent with the hypothesis that self-administered exercises based on Tuina techniques can help professional musicians control the pain caused by PRMD. Although our results are very promising, further studies are needed, employing larger sample sizes and double-blind designs.
Abstract:
BACKGROUND: Musicians are a group prone to work-related musculoskeletal disorders (WRMD). Conventional solutions to control musculoskeletal pain include pharmacological treatment and rehabilitation programs, but their efficiency is sometimes disappointing. OBJECTIVE: The aim of this research is to study the immediate effects of Tuina techniques on the WRMD of professional orchestra musicians from the north of Portugal. DESIGN, SETTING, PARTICIPANTS AND INTERVENTIONS: We performed a prospective, controlled, single-blinded, randomized study. Professional orchestra musicians with a diagnosis of WRMD were randomly distributed into the experimental group (n = 39) and the control group (n = 30). During an individual interview, a Chinese diagnosis took place and treatment points were chosen. Real acupoints were treated with Tuina techniques in the experimental group, and non-specific skin points were treated in the control group. Pain was measured on a verbal numerical scale before and immediately after the intervention. RESULTS: After one treatment session, pain was reduced in 91.8% of the cases in the experimental group and in 7.9% in the control group. CONCLUSION: Although the results showed that Tuina techniques effectively reduce WRMD in professional orchestra musicians in the north of Portugal, further investigations with stronger measurements, double-blind designs and bigger sample sizes are needed.
Abstract:
The Azores archipelago is a zone with a vast cultural heritage, presenting a building stock mainly constructed in traditional stone masonry. This type of construction is known to exhibit poor behaviour under seismic excitation; however, it is extensively used in seismic-prone areas such as this one. The 9 July 1998 earthquake was the most recent seismic event in the islands, leaving many traditional stone constructions severely damaged or totally destroyed. This scenario led to an effort by the local government to improve the seismic resistance of these constructions through the application of several reinforcement techniques. This work aims to study some of the reinforcement schemes most used after the 1998 earthquake and to assess their effectiveness in mitigating the constructions' seismic vulnerability. A brief cost-versus-benefit evaluation of these retrofitting techniques is also made, seeking to identify those most suitable for each building typology. To this end, real structures with different geometrical and physical characteristics were analysed, establishing a comparison between the seismic performance of reinforced and non-reinforced structures. The first section contains the analysis of a total of six reinforcement scenarios for each building chosen. Using the recorded 1998 earthquake accelerograms, a linear time-history analysis was performed for each reinforcement scenario. A comparison was then established between the maximum displacements, inter-storey drifts and maximum stresses obtained, in order to evaluate the global seismic response of each reinforced structure. In the second part of the work, examining the performance obtained in the previous section against the cost of implementing each reinforcement technique allowed conclusions to be drawn concerning the viability of each reinforcement method, based on the book value of the buildings under study.
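The inter-storey drift comparison mentioned above reduces to a simple computation on the floor displacement envelopes produced by the time-history analysis. The floor displacements and storey heights below are illustrative numbers, not results from the study:

```python
def interstorey_drifts(displacements, storey_heights):
    """Inter-storey drift ratios from a floor displacement envelope:
    relative displacement of consecutive floors divided by the storey
    height, starting from a fixed base (zero displacement at ground)."""
    drifts = []
    prev = 0.0  # ground level displacement
    for u, h in zip(displacements, storey_heights):
        drifts.append(abs(u - prev) / h)
        prev = u
    return drifts

# Hypothetical two-storey envelope: 9 mm and 24 mm peak displacements,
# 3 m storey heights, displacements in metres.
drifts = interstorey_drifts([0.009, 0.024], [3.0, 3.0])
```

Comparing such drift profiles for the reinforced and non-reinforced models is one direct way to quantify the benefit of each retrofitting scheme.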