949 results for Resolution in azimuth direction
Abstract:
PURPOSE: Acute pyelonephritis is a common condition in children and can lead to renal scarring. The aim of this study was to analyze the progression of renal scarring with time and its impact on renal growth. MATERIALS AND METHODS: A total of 50 children who had renal scarring on dimercapto-succinic acid scan 6 months after acute pyelonephritis underwent a repeat scan 3 years later. Lesion changes were evaluated by 3 blinded observers, and were classified as no change, partial resolution or complete disappearance. Renal size at the time of acute pyelonephritis and after 3 years was obtained by ultrasound, and renal growth was assessed by comparing the z-score for age between the 2 measurements. Robust linear regression was used to identify determinants of renal growth. RESULTS: At 6 months after acute pyelonephritis 88 scars were observed in 100 renal units. No change was observed in 27%, partial resolution in 63% and complete disappearance in 9% of lesions. Overall, 72% of lesions improved. An increased number of scars was associated with high grade vesicoureteral reflux (p = 0.02). Multivariate analysis showed that the number of scars was the most important parameter leading to decreased renal growth (CI -1.05 to -0.35, p <0.001), and with 3 or more scars this finding was highly significant on univariate analysis (-1.59, CI -2.10 to -1.09, p <0.0001). CONCLUSIONS: Even 6 months after acute pyelonephritis, 72% of dimercapto-succinic acid defects improved, demonstrating that some of the lesions may not be definitive. The number of scars was significantly associated with loss of renal growth at 3 years.
Abstract:
The objective of this work was to evaluate the effect of sampling density on the prediction accuracy of soil orders, at high spatial resolution, in a viticultural zone of Serra Gaúcha, Southern Brazil. A digital elevation model (DEM), a cartographic base, a conventional soil map, and the Idrisi software were used. Seven predictor variables were calculated and read, along with soil classes, at randomly distributed points, with sampling densities of 0.5, 1, 1.5, 2, and 4 points per hectare. Data were used to train a decision tree (Gini) and three artificial neural networks: adaptive resonance theory, fuzzy ARTMap; self-organizing map, SOM; and multi-layer perceptron, MLP. Estimated maps were compared with the conventional soil map to calculate omission and commission errors, overall accuracy, and quantity and allocation disagreement. The decision tree was the least sensitive to sampling density and had the highest accuracy and consistency. The SOM was the least sensitive and most consistent of the networks. The MLP had a critical minimum and showed high inconsistency, whereas fuzzy ARTMap was the most sensitive and least accurate. Results indicate that the sampling densities used in conventional soil surveys can serve as a reference for predicting soil orders in Serra Gaúcha.
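The accuracy measures named above (omission and commission errors, overall accuracy) can be sketched in plain Python. The class names and point counts below are illustrative only, not the study's data:

```python
from collections import Counter

def error_rates(reference, predicted):
    """Overall accuracy plus per-class omission/commission errors
    from paired lists of reference and predicted soil classes."""
    assert len(reference) == len(predicted)
    classes = sorted(set(reference) | set(predicted))
    correct = sum(r == p for r, p in zip(reference, predicted))
    ref_n = Counter(reference)                     # reference points per class
    pred_n = Counter(predicted)                    # predicted points per class
    hits = Counter(r for r, p in zip(reference, predicted) if r == p)
    report = {}
    for c in classes:
        omission = (1 - hits[c] / ref_n[c]) if ref_n[c] else 0.0
        commission = (1 - hits[c] / pred_n[c]) if pred_n[c] else 0.0
        report[c] = (omission, commission)
    return correct / len(reference), report

# Hypothetical validation points (soil order names only for flavor):
ref  = ["Argissolo", "Cambissolo", "Cambissolo", "Neossolo"]
pred = ["Argissolo", "Cambissolo", "Neossolo",  "Neossolo"]
acc, rep = error_rates(ref, pred)   # acc = 0.75
```

The same confusion-matrix bookkeeping underlies the quantity and allocation disagreement measures as well.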
Abstract:
Small centrifugal compressors are more and more widely used in many industrial systems because of their higher efficiency and better off-design performance compared to piston and scroll compressors, as well as their higher work coefficient per stage than in axial compressors. Higher efficiency is always the aim of the compressor designer. In the present work, the influence of four parts of a small centrifugal compressor that compresses a heavy-molecular-weight real gas has been investigated in order to achieve higher efficiency. Two parts concern the impeller: the tip clearance and the circumferential position of the splitter blade. The other two concern the diffuser: the pinch shape and the vane shape. Computational fluid dynamics is applied in this study. The Reynolds-averaged Navier-Stokes flow solver Finflo is used, with a quasi-steady approach, and Chien's k-ε turbulence model is used to model the turbulence. A new practical real gas model is presented in this study. The real gas model is easily generated, its accuracy is controllable, and it is fairly fast. The numerical results and measurements show good agreement. The influence of tip clearance on the performance of a small compressor is obvious. The pressure ratio and efficiency decrease as the size of the tip clearance is increased, while the total enthalpy rise remains almost constant. The decrease in pressure ratio and efficiency is larger at higher mass flow rates and smaller at lower mass flow rates. The flow angles at the inlet and outlet of the impeller increase as the size of the tip clearance is increased. The detailed flow field shows that leaking flow is the main reason for the performance drop. The secondary flow region becomes larger as the size of the tip clearance is increased, and the area of the main flow is compressed; the flow uniformity is thus decreased. A detailed study shows that the leaking flow rate is higher near the exit of the impeller than near its inlet.
Based on this phenomenon, a new partially shrouded impeller is used: the impeller is shrouded near its exit. The results show that the flow field near the exit of the impeller is greatly changed by the partially shrouded impeller, and better performance is achieved than with the unshrouded impeller. The loading distribution on the impeller blade and the flow fields in the impeller are changed by moving the splitter of the impeller in the circumferential direction. Moving the splitter slightly towards the suction side of the long blade can improve the performance of the compressor. The total enthalpy rise is reduced if only the leading edge of the splitter is moved to the suction side of the long blade. The performance of the compressor is decreased if the blade is bent away from the radial direction at the leading edge of the splitter. The total pressure rise and the enthalpy rise of the compressor are increased if a pinch is used at the diffuser inlet. Among the five different pinch shape configurations, at the design and lower mass flow rates the efficiency of a straight-line pinch is the highest, while at higher mass flow rates the efficiency of a concave pinch is the highest. The sharp corner of the pinch is the main reason for the decrease of efficiency and should be avoided. The spanwise variation of the flow angles entering the diffuser is decreased if a pinch is applied. A three-dimensional low-solidity twisted vaned diffuser is designed to match the flow angles entering the diffuser. The numerical results show that the pressure recovery in the twisted diffuser is higher than in a conventional low-solidity vaned diffuser, which also leads to higher efficiency of the twisted diffuser. Investigation of the detailed flow fields shows that separation at lower mass flow rates occurs later in the twisted diffuser than in the conventional low-solidity vaned diffuser, which suggests a possibly wider flow range for the twisted diffuser.
Abstract:
Quest for Orthologs (QfO) is a community effort with the goal of improving and benchmarking orthology predictions. As quality assessment assumes prior knowledge on species phylogenies, we investigated the congruency between existing species trees by comparing the relationships of 147 QfO reference organisms from six Tree of Life (ToL)/species tree projects: the National Center for Biotechnology Information (NCBI) taxonomy, the Open Tree of Life, the sequenced species/species ToL, the 16S ribosomal RNA (rRNA) database, and trees published by Ciccarelli et al. (Ciccarelli FD, et al. 2006. Toward automatic reconstruction of a highly resolved tree of life. Science 311:1283-1287) and by Huerta-Cepas et al. (Huerta-Cepas J, Marcet-Houben M, Gabaldon T. 2014. A nested phylogenetic reconstruction approach provides scalable resolution in the eukaryotic Tree Of Life. PeerJ PrePrints 2:223). Our study reveals that each species tree suggests a different phylogeny: 87 of the 146 (60%) possible splits of a dichotomous and rooted tree are congruent, while all other splits are incongruent in at least one of the species trees. Topological differences are observed not only at deep speciation events, but also within younger clades, such as Hominidae, Rodentia, Laurasiatheria, or rosids. The evolutionary relationships of 27 archaea and bacteria are highly inconsistent. By assessing 458,108 gene trees from 65 genomes, we show that consistent species topologies are more often supported by gene phylogenies than contradicting ones. The largest concordant species tree includes at most 77 of the QfO reference organisms. Results are summarized in the form of a consensus ToL (http://swisstree.vital-it.ch/species_tree) that can serve different benchmarking purposes.
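Counting congruent splits between two rooted trees amounts to comparing their collections of clades (leaf sets of internal nodes). A minimal sketch with nested tuples standing in for rooted dichotomous trees; the four-taxon trees below are made up for the example:

```python
def leaf_sets(tree):
    """Return (all leaves, list of clade leaf-sets) of a rooted tree
    given as nested tuples, where a leaf is a plain string."""
    if not isinstance(tree, tuple):            # a single leaf
        return frozenset([tree]), []
    leaves, clades = frozenset(), []
    for child in tree:
        child_leaves, child_clades = leaf_sets(child)
        leaves |= child_leaves
        clades += child_clades
    clades.append(leaves)                      # this internal node's clade
    return leaves, clades

def shared_splits(t1, t2):
    """Clades present in both trees (congruent splits)."""
    _, c1 = leaf_sets(t1)
    _, c2 = leaf_sets(t2)
    return set(c1) & set(c2)

a = ((("human", "chimp"), "mouse"), "chicken")
b = ((("human", "mouse"), "chimp"), "chicken")
common = shared_splits(a, b)   # the two trees disagree on the innermost pair
```

Scaling this comparison over six species trees and 147 taxa is, in essence, how the 87-of-146 congruency figure is obtained.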
Abstract:
This paper examines argumentative talk-in-interaction in the workplace. It focuses on counter-argumentative references, which consist of the various resources that the opponent uses to refer to the origin/source of his/her opposition, namely the confronted position and the person who expressed it. Particular attention is paid to the relationship - in terms of sequential positioning and referential extension - between reported speech, polyphony, pointing gestures and shifts in gaze direction. Data are taken from workplace management meetings that have been recorded in New Zealand by the Language in the Workplace Project.
Abstract:
Given their high sensitivity and their ability to limit the field of view (FOV), surface coils are often used in magnetic resonance spectroscopy (MRS) and imaging (MRI). A major downside of surface coils is their inherent radiofrequency (RF) B1 heterogeneity across the FOV, decreasing with increasing distance from the coil and giving rise to image distortions due to non-uniform spatial responses. A robust way to compensate for B1 inhomogeneities is to employ adiabatic inversion pulses, yet these are not well adapted to all imaging sequences, including single-shot approaches like echo planar imaging (EPI). Hybrid spatiotemporal encoding (SPEN) sequences relying on frequency-swept pulses provide an alternative ultrafast MRI approach that could help solve this problem thanks to their built-in heterogeneous spatial manipulations. This study explores how this intrinsic SPEN-based spatial discrimination could be used to compensate for the B1 inhomogeneities inherent to surface coils. Experiments carried out in both phantoms and in vivo rat brains demonstrate that, by suitably modulating the amplitude of a SPEN chirp pulse that progressively excites the spins in a direction normal to the coil, it is possible to compensate for the RF transmit inhomogeneities and thus improve sensitivity and image fidelity.
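The compensation idea, scaling the pulse amplitude inversely to the coil's B1 transmit profile so that every position along the excitation direction sees the same effective drive, can be sketched as follows. The B1 values are assumed for illustration; this is not the authors' actual pulse design:

```python
def compensated_amplitude(b1_profile, nominal=1.0):
    """Amplitude-modulate a progressive (chirp) excitation inversely
    to the B1 transmit profile, so each position receives the same
    effective excitation. Illustrative sketch under an assumed profile."""
    return [nominal / b1 for b1 in b1_profile]

# B1 decays with distance from a surface coil (assumed profile, a.u.):
b1 = [1.0, 0.8, 0.5, 0.25]
amp = compensated_amplitude(b1)
# Effective drive (amplitude x local B1) is flat again:
effective = [a * b for a, b in zip(amp, b1)]
```

In practice the modulation is bounded by available RF power, which is why the deepest (lowest-B1) positions limit how far the compensation can reach.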
Abstract:
Snow cover is an important control in mountain environments, and a shift of the snow-free period triggered by climate warming can strongly impact ecosystem dynamics. Changing snow patterns can have severe effects on alpine plant distribution and diversity. It thus becomes urgent to provide spatially explicit assessments of snow cover changes that can be incorporated into correlative or empirical species distribution models (SDMs). Here, we provide for the first time a comparison of two physically based snow distribution models (PREVAH and SnowModel) used to produce snow cover maps (SCMs) at a fine spatial resolution in a mountain landscape in Austria. SCMs were evaluated with SPOT-HRVIR images, and predictions of snow water equivalent from the two models with ground measurements. Finally, the SCMs of the two models were compared under a climate warming scenario for the end of the century. The predictive performances of PREVAH and SnowModel were similar when validated with the SPOT images. However, the tendency to overestimate snow cover was slightly lower with SnowModel during the accumulation period, whereas it was lower with PREVAH during the melting period. The rate of true positives during the melting period was on average two times higher with SnowModel, with a lower overestimation of snow water equivalent. Our results allow us to recommend the use of SnowModel in SDMs because it better captures persisting snow patches at the end of the snow season, which is important when modelling the response of species to long-lasting snow cover and evaluating whether they might survive under climate change.
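The map-evaluation statistics used above (rate of true positives, tendency to overestimate) come from a pixelwise comparison of binary snow maps. A minimal sketch; the tiny grids below are made up for the example, not SPOT data:

```python
def snow_map_scores(observed, modelled):
    """Pixelwise scores for binary snow cover maps (1 = snow, 0 = snow-free).
    Returns (true positive rate, overestimated fraction of all pixels)."""
    tp = fp = fn = tn = 0
    for o_row, m_row in zip(observed, modelled):
        for o, m in zip(o_row, m_row):
            if o and m:
                tp += 1
            elif not o and m:
                fp += 1        # modelled snow absent in the image: overestimation
            elif o and not m:
                fn += 1        # missed snow
            else:
                tn += 1
    tpr = tp / (tp + fn) if tp + fn else 0.0
    overestimation = fp / (tp + fp + fn + tn)
    return tpr, overestimation

obs = [[1, 1, 0],
       [0, 0, 0]]
mod = [[1, 0, 1],
       [0, 0, 0]]
tpr, over = snow_map_scores(obs, mod)
```

The same counts, aggregated per acquisition date, separate the accumulation-period and melting-period behaviour of the two models.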
Abstract:
Biomedical research is currently facing a new type of challenge: an excess of information, both in terms of raw data from experiments and in the number of scientific publications describing their results. Mirroring the focus on data mining techniques to address the issues of structured data, there has recently been great interest in the development and application of text mining techniques to make more effective use of the knowledge contained in biomedical scientific publications, accessible only in the form of natural human language. This thesis describes research done in the broader scope of projects aiming to develop methods, tools and techniques for text mining tasks in general and for the biomedical domain in particular. The work described here involves more specifically the goal of extracting information from statements concerning relations of biomedical entities, such as protein-protein interactions. The approach taken is one using full parsing—syntactic analysis of the entire structure of sentences—and machine learning, aiming to develop reliable methods that can further be generalized to apply also to other domains. The five papers at the core of this thesis describe research on a number of distinct but related topics in text mining. In the first of these studies, we assessed the applicability of two popular general English parsers to biomedical text mining and, finding their performance limited, identified several specific challenges to accurate parsing of domain text. In a follow-up study focusing on parsing issues related to specialized domain terminology, we evaluated three lexical adaptation methods. We found that the accurate resolution of unknown words can considerably improve parsing performance and introduced a domain-adapted parser that reduced the error rate of the original by 10% while also roughly halving parsing time.
To establish the relative merits of parsers that differ in the applied formalisms and the representation given to their syntactic analyses, we have also developed evaluation methodology, considering different approaches to establishing comparable dependency-based evaluation results. We introduced a methodology for creating highly accurate conversions between different parse representations, demonstrating the feasibility of unification of diverse syntactic schemes under a shared, application-oriented representation. In addition to allowing formalism-neutral evaluation, we argue that such unification can also increase the value of parsers for domain text mining. As a further step in this direction, we analysed the characteristics of publicly available biomedical corpora annotated for protein-protein interactions and created tools for converting them into a shared form, thus contributing also to the unification of text mining resources. The introduced unified corpora allowed us to perform a task-oriented comparative evaluation of biomedical text mining corpora. This evaluation established clear limits on the comparability of results for text mining methods evaluated on different resources, prompting further efforts toward standardization. To support this and other research, we have also designed and annotated BioInfer, the first domain corpus of its size combining annotation of syntax and biomedical entities with a detailed annotation of their relationships. The corpus represents a major design and development effort of the research group, with manual annotation that identifies over 6,000 entities, 2,500 relationships and 28,000 syntactic dependencies in 1,100 sentences. In addition to combining these key annotations for a single set of sentences, BioInfer was also the first domain resource to introduce a representation of entity relations that is supported by ontologies and able to capture complex, structured relationships.
Part I of this thesis presents a summary of this research in the broader context of a text mining system, and Part II contains reprints of the five included publications.
Abstract:
Discussions about the culture-economy articulation have occurred largely within the confines of economic geography. In addition, much attention has been diverted into caricaturized discussions over the demise of political economy or the invalidity of culturalist arguments. Moving the argument from the inquiry on the "nature" of the economy itself to the transformation of the role of culture and economy in understanding the production of the urban form from an urban political economy (UPE) perspective, this paper focuses on how the challenges posed by the cultural turn have enabled urban political economy to participate constructively in interdisciplinary efforts to reorient political economy in the direction of a critical cultural political economy.
Abstract:
This paper compares different periods in the life of museums, considering the situation of the past two decades. It reflects on the upswing and the downturn, noting that we partake of the latter and find ourselves in a deep crisis, with contradictory and unsustainable policies being dictated. In this situation, museums with a project of their own and with creative professionals can better overcome difficulties, because they do not suffer a crisis of ideas. Some paradoxes are evident, as in the case of Greece: museums that are repositories of highly relevant cultural values, and the very institutions that were enhanced during the years of the upswing, are now subjected to cuts that put them at risk. Similarly, we can see that the policies applied thin out the identity of museums, since they rely on adjustments that do not study each particular case. The alternative lies in the creativity and effort of managers and professionals in general, and in cooperative work. Details are given of some of our participatory projects that go in this direction and have been successfully applied.
Abstract:
The Roll-to-Roll process makes it possible to print electronic products continuously onto a uniform substrate. Printing components on flexible surfaces can bring down the costs of simple electronic devices such as RFID tags, antennas and transistors. The possibility of quickly printing flexible electronic components opens up a wide array of novel products previously too expensive to produce on a large scale. Several different printing methods can be used in Roll-to-Roll printing, such as gravure, spray, offset, flexographic and others. Most of the methods can also be mixed in one production line, and most of them still require years of research to reach a significant commercial level. The research for this thesis was carried out at the Konkuk University Flexible Display Research Center (KU-FDRC) in Seoul, Korea. A system using Roll-to-Roll printing requires that the motion of the web be controlled in every direction in order to align the different layers of ink properly. Between printers the ink is dried with hot air. The effects of thermal expansion on the tension of the web are studied in this work, and a mathematical model was constructed in Matlab and Simulink. Simulations and experiments led to the conclusion that the thermal expansion of the web has a great influence on the tension of the web. Experimental evidence was also gained that the particular printing machine used for these experiments at KU-FDRC may have a problem in controlling the speeds of the cylinders which pull the web.
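The coupling between heating and web tension can be written down in its simplest form from Hooke's law with a thermal strain term, T = E * A * (strain - alpha * dT): heating expands the web, relieving part of the mechanical strain and dropping the tension. This is only a minimal lumped sketch, not the thesis model, and the material values below are assumed for illustration:

```python
def web_tension(E, A, strain, alpha, dT):
    """Web tension [N] when the imposed mechanical strain is partly
    offset by thermal expansion: T = E * A * (strain - alpha * dT)."""
    return E * A * (strain - alpha * dT)

# Assumed values for a PET-like web: E = 4 GPa, cross-section
# 0.2 m wide x 50 um thick, alpha = 1.7e-5 1/K, 0.1% strain.
E, A, alpha = 4e9, 0.2 * 50e-6, 1.7e-5
cold = web_tension(E, A, strain=1e-3, alpha=alpha, dT=0)    # before the dryer
hot  = web_tension(E, A, strain=1e-3, alpha=alpha, dT=40)   # after +40 K heating
```

Even a modest temperature rise removes a sizable fraction of the tension, which is why the dryers between printing units matter for register control.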
Abstract:
The papermaking industry has been continuously developing intelligent solutions to characterize the raw materials it uses, to control the manufacturing process in a robust way, and to guarantee the desired quality of the end product. Based on much improved imaging techniques and image-based analysis methods, it has become possible to look inside the manufacturing pipeline and propose more effective alternatives to human expertise. This study is focused on the development of image analysis methods for the pulping process of papermaking. Pulping starts with wood disintegration, forming the fiber suspension that is subsequently bleached, mixed with additives and chemicals, and finally dried and shipped to the papermaking mills. At each stage of the process it is important to analyze the properties of the raw material to guarantee the product quality. In order to evaluate the properties of fibers, the main component of the pulp suspension, a framework for fiber characterization based on microscopic images is proposed in this thesis as the first contribution. The framework allows computation of fiber length and curl index correlating well with the ground truth values. The bubble detection method, the second contribution, was developed in order to estimate the gas volume at the delignification stage of the pulping process based on high-resolution in-line imaging. The gas volume was estimated accurately and the solution enabled just-in-time process termination, whereas the accurate estimation of bubble size categories remained challenging. As the third contribution of the study, optical flow computation was studied and the methods were successfully applied to pulp flow velocity estimation based on double-exposed images.
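The curl index mentioned above is commonly defined as the fiber's contour length divided by its end-to-end distance, minus one (zero for a perfectly straight fiber). A minimal sketch on a digitized fiber centerline; the points are illustrative, not thesis data:

```python
import math

def curl_index(points):
    """Curl index of a fiber centerline given as (x, y) points:
    contour length / end-to-end distance - 1."""
    contour = sum(math.dist(p, q) for p, q in zip(points, points[1:]))
    chord = math.dist(points[0], points[-1])
    return contour / chord - 1

straight = [(0, 0), (1, 0), (2, 0)]   # straight fiber: curl index 0
bent     = [(0, 0), (1, 1), (2, 0)]   # kinked fiber: curl index sqrt(2) - 1
```

In the imaging framework the point sequence would come from skeletonizing a segmented fiber in the microscope image; the formula itself is the standard definition.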
Finally, a framework for classifying dirt particles in dried pulp sheets, including the semisynthetic ground truth generation, feature selection, and performance comparison of the state-of-the-art classification techniques, was proposed as the fourth contribution. The framework was successfully tested on the semisynthetic and real-world pulp sheet images. These four contributions assist in developing an integrated factory-level vision-based process control.
Abstract:
Laser additive manufacturing (LAM), also known as 3D printing, has gained considerable interest in recent years within various industries, such as the medical and aerospace industries. LAM enables the fabrication of complex 3D geometries by melting metal powder layer by layer with a laser beam. Over the past 10 years, research in laser additive manufacturing has focused on the development of new materials and new applications. Since this technology is on the cutting edge, the efficiency of the manufacturing process plays a central role in research in this industry. The aim of this thesis is to characterize methods for process efficiency improvement in laser additive manufacturing, and to clarify the effect of process parameters on the stability of the process and on the microstructure of the manufactured pieces. The experimental tests of this thesis were made with various process parameters, and their effect on the built pieces was studied; additive manufacturing was performed with a modified research machine representing the EOSINT M series and with an EOS EOSINT M280. The material used was stainless steel 17-4 PH. Some of the methods for process efficiency improvement were also tested. The literature review of this thesis presents the basics of laser additive manufacturing, methods to improve process efficiency, and laser beam-material interaction. It was observed that there are only a few public studies on the process efficiency of laser additive manufacturing of stainless steel. According to the literature, it is possible to improve process efficiency with higher-power lasers and thicker layers. Process efficiency improvement is possible if the effect of process parameter changes on the manufactured pieces is known. According to the experiments carried out in this thesis, process parameters play a major role in single-track formation in laser additive manufacturing. Rough estimation equations were created to describe the effect of the input parameters on the output parameters.
The experimental results showed that the WDA (width, depth and cross-sectional area of a single track) correlates exponentially with the energy density input. The energy density input is a combination of the input parameters of laser power, laser beam spot diameter and scan speed. The use of the skin-core technique enables improvement of process efficiency, as the core of the part is manufactured with higher laser power and a thicker layer thickness, and the skin with lower laser power and a thinner layer thickness in order to maintain high resolution. In this technique the interface between skin and core must overlap in order to achieve fully dense parts. It was also noticed in this thesis that a keyhole can form in the LAM process: the threshold intensity of 10^6 W/cm^2 was exceeded during the tests, which means that keyhole formation was possible in these tests.
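The two quantities above can be sketched with the simplifications commonly used in the LAM literature: a surface energy density E = P / (v * d) combining laser power, scan speed and spot diameter, and a mean beam intensity I = P / (pi * r^2) to compare against the keyhole threshold. The formulas are standard simplifications, and the parameter values below are illustrative, not the thesis settings:

```python
import math

def energy_density(power_w, speed_mm_s, spot_d_mm):
    """Surface energy density E = P / (v * d) in J/mm^2, combining
    laser power, scan speed and beam spot diameter."""
    return power_w / (speed_mm_s * spot_d_mm)

def mean_intensity(power_w, spot_d_cm):
    """Mean beam intensity I = P / (pi * r^2) in W/cm^2."""
    return power_w / (math.pi * (spot_d_cm / 2) ** 2)

KEYHOLE_THRESHOLD = 1e6   # W/cm^2, threshold cited in the text

E = energy_density(200, 800, 0.1)   # 200 W, 800 mm/s, 100 um spot -> 2.5 J/mm^2
I = mean_intensity(200, 0.01)       # 100 um spot expressed in cm
keyhole_possible = I > KEYHOLE_THRESHOLD
```

With these assumed settings the mean intensity is already a few times above 10^6 W/cm^2, which mirrors the observation that the threshold was exceeded in the tests.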
Abstract:
Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
Leaves and fruits from 63 Stryphnodendron adstringens trees were sampled in the Rio Preto State Park to analyze allozyme segregation, tissue-specific expression of allozyme loci, and their genetic parameters. The enzyme systems ADH, EST, ACP, PGM, PGI, GDH, G6PDH, GOT, IDH, LAP, MDH, PER and SKDH were assessed by means of starch-gel electrophoresis. The polymorphic systems PGI, IDH, MDH and GOT demonstrated a dimeric quaternary structure, while EST and PER were monomeric. The total expected genetic diversity (H_E) was 0.325 for leaves and 0.244 for seeds. The effective number of alleles per locus (A_E) was 1.58 in leaves and 1.42 in seeds. The values of H_E and A_E observed in S. adstringens were comparatively higher than the average values seen in allozyme studies of other woody plants. The values of the fixation indices for the population, considering leaves (f = 0.070) and seeds (f = 0.107), were not significant. The high values of genetic diversity and of effective number of alleles per locus, as well as the non-significant fixation index and the adjustments of the Hardy-Weinberg proportions between generations for the pgi-1, mdh-2 and idh-1 loci, indicated random mating in this population. The enzyme systems EST and PER demonstrated their best resolution in leaf tissues, while the MDH, IDH, PGI and GOT systems demonstrated their best resolution in seed tissues.
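The diversity statistics reported above follow standard definitions from allele frequencies p_i at a locus: expected heterozygosity H_E = 1 - sum(p_i^2) and effective number of alleles A_E = 1 / sum(p_i^2). A minimal sketch with made-up allele frequencies, not the study's data:

```python
def expected_heterozygosity(freqs):
    """Expected heterozygosity at one locus: H_E = 1 - sum(p_i^2)."""
    return 1 - sum(p * p for p in freqs)

def effective_alleles(freqs):
    """Effective number of alleles at one locus: A_E = 1 / sum(p_i^2)."""
    return 1 / sum(p * p for p in freqs)

# Hypothetical three-allele locus:
p = [0.5, 0.3, 0.2]
he = expected_heterozygosity(p)   # 0.62
ae = effective_alleles(p)         # about 2.63
```

Multilocus values like the reported 0.325 and 1.58 would be averages of these per-locus quantities over all assayed loci.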