979 results for Piecewise linear techniques
Abstract:
Given the limitations of different types of remote sensing images, automated land-cover classifications of the Amazon várzea may yield poor accuracy indices. One way to improve accuracy is to combine images from different sensors, by either image fusion or multi-sensor classification. The objective of this study was therefore to determine which classification method is more efficient in improving land-cover classification accuracy for the Amazon várzea and similar wetland environments: (a) classification of synthetically fused optical and SAR images, or (b) multi-sensor classification of paired SAR and optical images. Land-cover classifications based on images from a single sensor (Landsat TM or Radarsat-2) are compared with multi-sensor and image fusion classifications. Object-based image analysis (OBIA) and the J.48 data-mining algorithm were used for automated classification, and classification accuracies were assessed using the kappa index of agreement and the recently proposed allocation and quantity disagreement measures. Overall, optical-based classifications were more accurate than SAR-based classifications. When both datasets were combined using the multi-sensor approach, there was a 2% decrease in allocation disagreement, as the method was able to overcome part of the limitations present in both images. When image fusion methods were used, however, accuracy decreased. We therefore conclude that the multi-sensor classification method is more appropriate for classifying land cover in the Amazon várzea.
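The accuracy measures named in this abstract can all be computed from a confusion matrix. A minimal sketch in plain Python, using a hypothetical 3-class matrix (the study's actual matrices are not given here): Cohen's kappa, plus the quantity and allocation disagreement components of Pontius and Millones.

```python
# Sketch of the accuracy measures mentioned above, computed from a
# hypothetical confusion matrix (rows = reference, columns = map).
def accuracy_measures(cm):
    n = sum(sum(row) for row in cm)
    k = len(cm)
    p = [[v / n for v in row] for row in cm]                    # proportions
    diag = sum(p[i][i] for i in range(k))                       # overall agreement
    rows = [sum(r) for r in p]                                  # reference marginals
    cols = [sum(p[i][j] for i in range(k)) for j in range(k)]   # map marginals
    # Cohen's kappa index of agreement
    pe = sum(rows[i] * cols[i] for i in range(k))               # chance agreement
    kappa = (diag - pe) / (1 - pe)
    # Quantity disagreement: mismatch in class proportions
    quantity = sum(abs(rows[j] - cols[j]) for j in range(k)) / 2
    # Allocation disagreement: remaining (spatial) disagreement
    allocation = (1 - diag) - quantity
    return kappa, quantity, allocation
```

Total disagreement (1 minus overall agreement) splits exactly into the quantity and allocation components, which is what makes the pair more informative than kappa alone.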
Abstract:
The spatial distribution of forest biomass in the Amazon is heterogeneous, varying in both time and space, especially in relation to the different vegetation types of this biome. Biomass estimates in this region vary significantly depending on the approach applied and the dataset used for modeling. In this context, this study aimed to evaluate three different geostatistical techniques to estimate the spatial distribution of aboveground biomass (AGB). The selected techniques were: 1) ordinary least-squares regression (OLS), 2) geographically weighted regression (GWR) and 3) geographically weighted regression-kriging (GWR-K). These techniques were applied to the same field dataset, using the same environmental variables derived from cartographic information and high-resolution remote sensing data (RapidEye). The study was carried out in the Amazon rainforest of Sucumbíos, Ecuador. The results showed that GWR-K, a hybrid technique, provided statistically satisfactory estimates with the lowest prediction error compared with the other two techniques. Furthermore, we observed that 75% of the AGB was explained by the combination of remote sensing data and environmental variables, with forest type being the most important variable for estimating AGB. It should be noted that although the use of high-resolution images significantly improves the estimation of the spatial distribution of AGB, processing this information is computationally demanding.
Abstract:
Commercial stents, especially metallic ones, present several disadvantages, which gives rise to the need to produce or coat stents with different materials, such as natural polymers, in order to improve their biocompatibility and minimize the drawbacks of metallic stents. This review paper discusses some applications of natural-based polymers in stents, namely polylactic acid (PLA) for stent development and chitosan for biocompatible stent coatings. Furthermore, some effective stent functionalization techniques are discussed, namely the layer-by-layer (LbL) technique.
Abstract:
OBJECTIVE: The study presents the Brazilian norms for 240 new stimuli from the International Affective Picture System (IAPS), a database of affective images widely used in research, compared with the North-American normative ratings. METHODS: The participants were 448 Brazilian university students from several courses (269 women and 179 men), with a mean age of 24.2 years (SD = 7.8), who rated the IAPS pictures on the valence, arousal and dominance dimensions using the Self-Assessment Manikin (SAM) scales. Data were compared across the populations by Pearson linear correlation and Student's t-tests. RESULTS: Correlations were highly significant for all dimensions; however, Brazilians' mean arousal ratings were higher than North-Americans'. CONCLUSIONS: The results show stability in relation to the first part of the Brazilian standardization and are also consistent with the North-American standards, despite minor differences in the interpretation of the arousal dimension, demonstrating that the IAPS is a reliable instrument for experimental studies in the Brazilian population.
Abstract:
Enzymatic polymerization of aniline was performed for the first time in a lignosulfonate (LGS) template system. The high-redox-potential catalyst laccase, isolated from Aspergillus, was used as a biocatalyst in the synthesis of the conducting polyaniline/lignosulfonate (PANI-ES-LGS) complex, with atmospheric oxygen as the oxidizing agent. The linear LGS templates, which also serve as dopants, could facilitate the directional alignment of the monomer and improve the solubility of the conducting polymer. The polymerization process was monitored using UV-Vis spectroscopy, by which the conditions for the laccase-catalyzed synthesis of the PANI-ES-LGS complex were also optimized. The structure and solubility of the complex were characterized using the corresponding techniques. The PANI-ES-LGS suspensions obtained were used to coat cotton with a conventional padder to explore applications of the complex. The variable optoelectronic properties of the coated cotton were confirmed by cyclic voltammetry and color strength tests. The molecular weight changes of LGS treated by laccase were also studied to discuss the mechanism of laccase-catalyzed aniline polymerization in the LGS template system.
Abstract:
Integrated master's dissertation in Civil Engineering
Abstract:
Inspired by the relational algebra of data processing, this paper addresses the foundations of data analytical processing from a linear algebra perspective. The paper investigates, in particular, how aggregation operations such as cross tabulations and data cubes essential to quantitative analysis of data can be expressed solely in terms of matrix multiplication, transposition and the Khatri–Rao variant of the Kronecker product. The approach offers a basis for deriving an algebraic theory of data consolidation, handling the quantitative as well as qualitative sides of data science in a natural, elegant and typed way. It also shows potential for parallel analytical processing, as the parallelization theory of such matrix operations is well acknowledged.
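The central idea above, that a cross tabulation is just a product of matrices, can be sketched in a few lines of plain Python. The data (a tiny table of sales records) and the attribute encodings are hypothetical; following the paper's approach, each attribute column is encoded as a 0/1 indicator matrix, and the cross tab of two attributes weighted by a measure is an ordinary matrix product.

```python
# Sketch: cross tabulation as matrix multiplication. Each attribute
# column becomes a 0/1 indicator matrix T (one row per attribute value,
# one column per record); the cross tab of A against B, weighted by a
# measure vector m, is T_A * diag(m) * transpose(T_B).
def indicator(column, values):
    return [[1 if c == v else 0 for c in column] for v in values]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

# Hypothetical sales records: city, product, quantity sold.
city    = ["porto", "lisbon", "porto", "lisbon"]
product = ["pen", "pen", "ink", "ink"]
qty     = [5, 3, 2, 7]

TA = indicator(city, ["porto", "lisbon"])
TB = indicator(product, ["pen", "ink"])
D  = [[qty[i] if i == j else 0 for j in range(4)] for i in range(4)]  # diag(m)

# crosstab[i][j] = total qty for city i and product j
crosstab = matmul(matmul(TA, D), [list(r) for r in zip(*TB)])
```

Replacing the measure by the all-ones vector turns the same product into a plain count table, which hints at why a single algebraic theory covers both the qualitative and quantitative sides.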
Abstract:
In the trend towards tolerating hardware unreliability, accuracy is exchanged for cost savings. Running on less reliable machines, functionally correct code becomes risky, and one needs to know how risk propagates so as to mitigate it. Risk estimation, however, seems to live outside the average programmer's technical competence and core practice. In this paper we propose that program design by source-to-source transformation be risk-aware, in the sense of making probabilistic faults visible and supporting equational reasoning on the probabilistic behaviour of programs caused by faults. This reasoning is carried out in a linear algebra extension to the standard, à la Bird-de Moor, algebra of programming. The paper studies, in particular, the propagation of faults across the standard program transformation techniques known as tupling and fusion, enabling the fault of the whole to be expressed in terms of the faults of its parts.
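For readers unfamiliar with the tupling transformation studied above, a minimal illustrative sketch (deterministic, without the paper's fault probabilities): two independent traversals of a list are tupled into a single fold, so that a fault in the one remaining pass affects both components at once.

```python
# Illustrative only: the paper reasons about tupling algebraically,
# with probabilistic faults; this just shows the transformation itself.

# Before tupling: two independent folds over the same list.
def average_two_pass(xs):
    return sum(xs) / len(xs)

# After tupling: a single fold computing the pair (sum, length),
# so the list is traversed once.
def average_tupled(xs):
    s, n = 0, 0
    for x in xs:
        s, n = s + x, n + 1
    return s / n
```

Both versions agree on every non-empty input; the point of the paper is that, on faulty hardware, their probabilistic behaviours need not agree, and the calculus quantifies the difference.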
Abstract:
Doctoral thesis in Civil Engineering.
Abstract:
This paper reports on an innovative approach to measuring intraluminal pressure in the upper gastrointestinal (GI) tract, especially for monitoring GI motility and peristaltic movements. The proposed approach relies on thin-film aluminum strain gauges deposited on top of a Kapton membrane, which in turn lies on top of an SU-8 diaphragm-like structure. This structure enables the Kapton membrane to bend when pressure is applied, thereby straining the gauges and effectively changing their electrical resistance. The sensor, with an area of 3.4 mm², is fabricated using photolithography and standard microfabrication techniques (wet etching). It features a linear response (R² = 0.9987) and an overall sensitivity of 2.6 mV mmHg⁻¹. Additionally, its topology allows a high integration capability. The strain gauges' responses to pressure were studied, and the fabrication process was optimized to achieve high sensitivity, linearity, and reproducibility. The sequential acquisition of the different signals is carried out by a microcontroller with a 10-bit ADC and a sample rate of 250 Hz. The pressure signals are then presented in a user-friendly interface, developed in the Qt Creator IDE, for better visualization by physicians.
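Given the linear response and the stated sensitivity, the conversion from a raw ADC sample to pressure is a one-line affine map. A minimal sketch: the 2.6 mV mmHg⁻¹ sensitivity and 10-bit resolution are from the abstract, while the ADC reference voltage and the zero-pressure offset are assumptions made up for illustration.

```python
# Sketch: mapping a raw 10-bit ADC sample to pressure for a sensor
# like the one described above.
V_REF_MV = 3300.0     # ASSUMED ADC reference voltage, in mV (not from the paper)
SENSITIVITY = 2.6     # mV per mmHg (from the abstract)
OFFSET_MV = 0.0       # ASSUMED amplified output at zero pressure

def adc_to_mmhg(sample):
    """Convert a raw 10-bit sample (0..1023) to pressure in mmHg."""
    mv = sample * V_REF_MV / 1023          # sample -> millivolts
    return (mv - OFFSET_MV) / SENSITIVITY  # millivolts -> mmHg
```

In a real acquisition chain the offset and gain would be calibrated per device; the linearity figure (R² = 0.9987) is what justifies using a single affine map at all.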
Abstract:
We report new percutaneous techniques for perforating the pulmonary valve in 3 newborns with pulmonary atresia and intact ventricular septum. All had a mildly to moderately hypoplastic right ventricle, a patent infundibulum, and no coronary-cavitary communications. The procedure succeeded in all cases, and no procedure-related complications occurred. The new coaxial radiofrequency system was easy to handle, which simplified the procedure. Two patients required an additional source of pulmonary flow (Blalock-Taussig shunt) in the first week after catheterization. All patients had a satisfactory short-term clinical evolution and will undergo recatheterization within 1 year to define the next therapeutic strategy. We conclude that this technique may be performed safely and efficiently, especially when the new coaxial radiofrequency system is used, and it may become the initial treatment of choice in select neonates with pulmonary atresia and intact ventricular septum.
Abstract:
Decision support models in intensive care units are developed to support medical staff in the decision-making process. However, optimizing these models is particularly difficult due to their dynamic, complex and multidisciplinary nature. Thus, there is constant research and development of new algorithms capable of extracting knowledge from large volumes of data, in order to obtain better predictive results than current algorithms. To test the optimization techniques, a case study with real data provided by the INTCare project was explored. The data concern extubation cases. On this dataset, several models, such as Evolutionary Fuzzy Rule Learning, Lazy Learning, Decision Trees and many others, were analysed in order to detect early extubation. The hybrid Decision Trees Genetic Algorithm, the Supervised Classifier System and KNNAdaptive obtained the highest accuracy rates (93.2%, 93.1% and 92.97%, respectively), thus showing their feasibility to work in a real environment.
Abstract:
OBJECTIVE: To determine the technical procedures and criteria used by Brazilian physicians for measuring blood pressure and diagnosing hypertension. METHODS: A questionnaire with 5 questions about practices and behaviors regarding blood pressure measurement and the diagnosis of hypertension was sent to 25,606 physicians in all Brazilian regions through a mailing list. The responses were compared with the recommendations of a specific consensus statement and analyzed descriptively. RESULTS: Of the 3,621 (14.1%) responses obtained, 57% came from the southeastern region of Brazil. The following practices were reported: use of an aneroid device by 67.8%; use of a mercury column device by 14.6%; 11.9% of the participants never calibrated their devices; 35.7% calibrated the devices at intervals of less than 1 year; 85.8% measured blood pressure in 100% of medical visits; 86.9% measured blood pressure more than once and on more than one occasion. For the diagnosis of hypertension, 55.7% considered the patient's age, and only one third relied on consensus statements. CONCLUSION: Although both practices were performed at adequate frequency, this fell short of what was expected, and some contradictions were found between the diagnostic criterion for hypertension and the number of blood pressure measurements. The results suggest that, to reach the great majority of medical professionals, dissemination of consensus statements and blood pressure measurement techniques should go beyond the boundaries of medical events and specialized journals.
Abstract:
OBJECTIVE: To assess blood pressure measurement by health professionals at a public hospital in São Paulo State. METHODS: Semi-structured interviews and direct observation were performed with a checklist based on the criteria reported by Perloff et al. One hundred and five health professionals took part in the study. After blood pressure was measured, the concordance between the procedure as performed and the recommended procedure was assessed. RESULTS: Nurses and nurse's aides complied with 40% of the procedures recommended for adequate blood pressure measurement. The other categories of health professionals (nursing and medical teachers, physicians, residents, and nursing students) complied with approximately 70%. CONCLUSION: Permanent educational activities aimed at standardizing blood pressure measurement should be implemented among the different categories of health professionals.
Abstract:
In this project, numerical algorithms will be developed for nonlinear hyperbolic-parabolic systems of partial differential equations. Such systems have applications in wave propagation in aerospace and astrophysical settings. General objectives: 1) Development and improvement of numerical algorithms in order to increase the quality of simulations of the propagation and interaction of nonlinear gasdynamic and magnetogasdynamic waves. 2) Development of computational codes to simulate high-enthalpy gasdynamic flows, including chemical changes and dispersive and diffusive effects. 3) Development of computational codes to simulate ideal and real magnetogasdynamic flows. 4) Application of the new algorithms and computational codes to the solution of the aerothermodynamic flow around bodies entering the Earth's atmosphere. 5) Application of the new algorithms and computational codes to the simulation of the nonlinear dynamic behaviour of magnetic arches in the solar corona. 6) Development of new models to describe the nonlinear behaviour of magnetic arches in the solar corona. The main objective of this project is the introduction of improvements in numerical algorithms to simulate the propagation and interaction of nonlinear waves in two gaseous media: those without free electric charge (gasdynamic flows) and those with free electric charge (magnetogasdynamic flows).
At the same time, computational codes implementing the improved numerical techniques will be developed. The numerical algorithms will be applied in order to advance knowledge on topics of interest in aerospace engineering, such as the calculation of the heat flux and aerothermodynamic forces borne by objects entering the Earth's atmosphere, and on topics in astrophysics, such as the propagation and interaction of waves, both for energy transfer and for the generation of instabilities in magnetic arches of the solar corona. These two topics share the numerical techniques and algorithms with which they will be treated. The ideal gasdynamic and magnetogasdynamic equations form hyperbolic systems of differential equations and can be solved using Riemann solvers together with the finite volume method (Toro 1999; Udrea 1999; LeVeque 1992 and 2005). The inclusion of diffusive effects makes the systems of equations hyperbolic-parabolic. The parabolic contribution can be treated as source terms, handled either explicitly or implicitly (Udrea 1999; LeVeque 2005). To analyse the flow around bodies entering the atmosphere, the chemically reacting Navier-Stokes equations will be used, as long as the temperature does not exceed 6000 K. For higher temperatures it is necessary to consider ionization effects (Anderson, 1989). Both diffusive effects and chemical changes will be treated as source terms in the Euler equations. To treat wave propagation, energy transfer, and instabilities in magnetic arches of the solar corona, the ideal and real magnetogasdynamic equations will be used. In this case it will also be convenient to implement source terms for the treatment of transport phenomena such as heat flux and radiation.
The codes will use the finite volume technique, together with Total Variation Diminishing (TVD) schemes, on structured and unstructured meshes.
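As a minimal sketch of the finite volume + TVD combination mentioned above, consider 1-D linear advection on a periodic grid with a minmod-limited upwind update. This is a textbook scheme (cf. LeVeque), not the project's codes; the grid size and CFL number in the test are illustrative choices.

```python
# 1-D finite volume TVD scheme for u_t + a*u_x = 0 (a > 0), periodic grid.
# Minmod-limited slopes keep the update Total Variation Diminishing.
def minmod(p, q):
    """Pick the smaller-magnitude slope, or zero at extrema (sign change)."""
    if p * q <= 0:
        return 0.0
    return p if abs(p) < abs(q) else q

def tvd_step(u, cfl):
    """Advance cell averages u by one time step; cfl = a*dt/dx in (0, 1]."""
    n = len(u)
    # limited slope in each cell, from left and right differences
    s = [minmod(u[i] - u[i - 1], u[(i + 1) % n] - u[i]) for i in range(n)]
    # upwind interface value at i+1/2 (left state, since a > 0)
    face = [u[i] + 0.5 * (1 - cfl) * s[i] for i in range(n)]
    # conservative update: difference of interface fluxes
    return [u[i] - cfl * (face[i] - face[i - 1]) for i in range(n)]
```

Two properties are easy to check numerically: the update is conservative (the sum of cell averages is unchanged, by telescoping of the periodic fluxes), and the total variation of the solution does not grow, which is exactly the TVD property that suppresses spurious oscillations near shocks.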