928 results for Hypergraph Partitioning
Abstract:
The focus of this paper is to develop computationally efficient mathematical morphology operators on hypergraphs. To this end we consider lattice structures on hypergraphs on which we build morphological operators. We develop a pair of dual adjunctions between the vertex set and the hyperedge set of a hypergraph H by defining a vertex-hyperedge correspondence. This allows us to recover the classical notion of a dilation/erosion of a subset of vertices and to extend it to subhypergraphs of H. This paper also studies the concept of morphological adjunction on hypergraphs for which both the input and the output are hypergraphs.
Abstract:
The focus of this article is to develop computationally efficient mathematical morphology operators on hypergraphs. To this end we consider lattice structures on hypergraphs on which we build morphological operators. We develop a pair of dual adjunctions between the vertex set and the hyperedge set of a hypergraph H by defining a vertex-hyperedge correspondence. This allows us to recover the classical notion of a dilation/erosion of a subset of vertices and to extend it to subhypergraphs of H. Afterwards, we propose several new openings, closings, granulometries and alternate sequential filters acting (i) on the subsets of the vertex and hyperedge sets of H and (ii) on the subhypergraphs of a hypergraph.
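A minimal sketch of a vertex/hyperedge adjunction of the kind the two abstracts above describe. The particular correspondence assumed here (hyperedges meeting a vertex set for the dilation, hyperedges contained in it for the erosion) is one standard choice and not necessarily the papers' exact operators:

```python
# Minimal sketch (not the authors' exact operators): one common vertex/hyperedge
# adjunction on a hypergraph H = (V, E), where each hyperedge e in E is a set of vertices.

def dilate_vertices_to_edges(X, E):
    """Hyperedges touching at least one vertex of X (a dilation from V to E)."""
    return {i for i, e in E.items() if e & X}

def erode_vertices_to_edges(X, E):
    """Hyperedges entirely contained in X (an erosion from V to E)."""
    return {i for i, e in E.items() if e <= X}

def dilate_edges_to_vertices(Y, E):
    """All vertices covered by the hyperedges in Y (a dilation from E to V)."""
    return set().union(*(E[i] for i in Y)) if Y else set()

# Composing the two directions yields operators that send a vertex set back to a
# vertex set, i.e. the classical dilation/erosion the abstracts refer to.
if __name__ == "__main__":
    E = {0: {1, 2}, 1: {2, 3, 4}, 2: {5}}
    X = {1, 2}
    Y = dilate_vertices_to_edges(X, E)       # {0, 1}
    print(dilate_edges_to_vertices(Y, E))    # {1, 2, 3, 4}
    print(erode_vertices_to_edges(X, E))     # {0}
```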
Abstract:
In this article, techniques are presented for faster evolution of wavelet lifting coefficients for fingerprint image compression (FIC). In addition to increasing the computational speed by 81.35%, the evolved coefficients performed much better than the coefficients reported in the literature. Generally, full-size images are used for evolving wavelet coefficients, which is time consuming. To overcome this, in this work, wavelets were evolved with resized, cropped, resized-average and cropped-average images. On comparing the peak signal-to-noise ratios (PSNR) offered by the evolved wavelets, it was found that the cropped images outperformed the resized images and are on par with the results reported to date. Wavelet lifting coefficients evolved from an average of four 256 × 256 centre-cropped images took less than one fifth of the evolution time reported in the literature and produced an improvement of 1.009 dB in average PSNR. Improvements in average PSNR were also observed for other compression ratios (CR) and for degraded images. The proposed technique gave better PSNR at various bit rates with the set partitioning in hierarchical trees (SPIHT) coder. The coefficients also performed well with other fingerprint databases.
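A small sketch of the evaluation ingredients mentioned above: PSNR and the 256 × 256 centre-cropped average used as a cheap stand-in for full-size training images. The evolutionary search over the lifting coefficients itself is not shown, and the function names are illustrative only:

```python
# Sketch of the evaluation ingredients only (PSNR and the centre-cropped average);
# the evolution of the wavelet lifting coefficients is not reproduced here.
import numpy as np

def psnr(original: np.ndarray, reconstructed: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB between two greyscale images."""
    mse = np.mean((original.astype(np.float64) - reconstructed.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

def centre_crop(img: np.ndarray, size: int = 256) -> np.ndarray:
    """Take a size x size window around the centre of a 2-D image."""
    h, w = img.shape
    top, left = (h - size) // 2, (w - size) // 2
    return img[top:top + size, left:left + size]

def cropped_average(images, size: int = 256) -> np.ndarray:
    """Average of centre crops -- the cheap training surrogate compared in the article."""
    return np.mean([centre_crop(im, size) for im in images], axis=0)
```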
Abstract:
In this paper, a novel fast method for modeling mammograms by a deterministic fractal coding approach to detect the presence of microcalcifications, which are early signs of breast cancer, is presented. The modeled mammogram obtained using the fractal encoding method is visually similar to the original image containing microcalcifications; therefore, when it is subtracted from the original mammogram, the presence of microcalcifications can be enhanced. The limitation of fractal image modeling is the tremendous time required for encoding. In the present work, instead of searching for a matching domain in the entire domain pool of the image, three methods based on mean and variance, dynamic range of the image blocks, and mass center features are used. This reduced the encoding time by factors of 3, 89, and 13, respectively, for the three methods with respect to the conventional fractal image coding method with quadtree partitioning. The mammograms obtained from the Mammographic Image Analysis Society database (ground truth available) gave total detection scores of 87.6%, 87.6%, 90.5%, and 87.6% for the conventional method and the three proposed methods, respectively.
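A rough sketch of the first of the three speed-ups (mean/variance pre-filtering of the domain pool), assuming equally sized range and domain blocks and illustrative tolerance values; the paper's actual thresholds and its dynamic-range and mass-centre variants are not reproduced here:

```python
# Restrict the fractal-coding domain search to blocks whose mean and variance are
# close to the range block's, instead of scanning the whole domain pool.
import numpy as np

def candidate_domains(range_block, domain_blocks, mean_tol=10.0, var_tol=50.0):
    """Indices of domain blocks whose mean and variance are close to the range block's."""
    r_mean, r_var = float(range_block.mean()), float(range_block.var())
    return [i for i, d in enumerate(domain_blocks)
            if abs(float(d.mean()) - r_mean) <= mean_tol
            and abs(float(d.var()) - r_var) <= var_tol]

def best_match(range_block, domain_blocks):
    """Least-squares contrast/brightness fit, searched over the pre-filtered candidates only."""
    r = range_block.astype(np.float64).ravel()
    best_i, best_err = None, np.inf
    for i in candidate_domains(range_block, domain_blocks):
        d = domain_blocks[i].astype(np.float64).ravel()
        A = np.column_stack([d, np.ones_like(d)])        # fit r ~ s*d + o
        (s, o), *_ = np.linalg.lstsq(A, r, rcond=None)
        err = float(np.sum((s * d + o - r) ** 2))
        if err < best_err:
            best_i, best_err = i, err
    return best_i, best_err
```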
Abstract:
As the number of resources on the web exceeds by far the number of documents one can track, it becomes increasingly difficult to remain up to date on one's own areas of interest. The problem becomes more severe with the increasing fraction of multimedia data, from which it is difficult to extract a conceptual description of the contents. One way to overcome this problem is offered by social bookmarking tools, which are rapidly emerging on the web. In such systems, users set up lightweight conceptual structures called folksonomies, thus overcoming the knowledge acquisition bottleneck. As more and more people participate in the effort, the use of a common vocabulary becomes more and more stable. We present an approach for discovering topic-specific trends within folksonomies. It is based on a differential adaptation of the PageRank algorithm to the triadic hypergraph structure of a folksonomy. The approach works for any kind of data, as it does not rely on the internal structure of the documents; in particular, this makes it possible to consider different data types in the same analysis step. We run experiments on a large-scale real-world snapshot of a social bookmarking system.
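A minimal sketch of the differential-PageRank idea described above, under the simplifying assumption that the triadic structure is folded into a weighted co-occurrence graph; the authors' exact weighting and spreading scheme may differ:

```python
# Fold the (user, tag, resource) assignments into a weighted co-occurrence graph,
# run PageRank once with and once without a topic preference, and take the difference.
from collections import defaultdict

def build_graph(assignments):
    """Undirected weighted co-occurrence graph from (user, tag, resource) triples."""
    adj = defaultdict(lambda: defaultdict(float))
    for u, t, r in assignments:
        for a, b in ((u, t), (t, r), (u, r)):
            adj[a][b] += 1.0
            adj[b][a] += 1.0
    return adj

def pagerank(adj, preference=None, d=0.85, iters=50):
    """Weighted PageRank with an optional non-uniform teleport distribution."""
    nodes = list(adj)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    pref = {n: (preference or {}).get(n, 0.0) for n in nodes}
    if not any(pref.values()):                  # no preference given: uniform teleport
        pref = {n: 1.0 for n in nodes}
    z = sum(pref.values())
    pref = {n: v / z for n, v in pref.items()}
    for _ in range(iters):
        new = {n: (1.0 - d) * pref[n] for n in nodes}
        for n in nodes:
            out = sum(adj[n].values())
            for m, w in adj[n].items():
                new[m] += d * rank[n] * w / out
        rank = new
    return rank

def topic_specific(adj, topic_nodes):
    """Differential ranking: preference-biased rank minus the unbiased baseline."""
    base = pagerank(adj)
    biased = pagerank(adj, preference={n: 1.0 for n in topic_nodes})
    return {n: biased[n] - base[n] for n in adj}
```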
Abstract:
'The problem of the graphic artist' is a small example of applying elementary mathematics (divisibility of natural numbers) to a real problem which we ourselves have actually experienced. It deals with the possibilities for partitioning a sheet of paper into strips. In this contribution we report on a teaching unit in grade 6 as well as on informal tests with students in school and university. Finally we analyse this example methodologically, summarise our observations with pupils and students, and draw some didactical conclusions.
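As a tiny illustration of the arithmetic behind the problem (the 210 mm sheet width below is my own example, not taken from the teaching unit): a strip width partitions the sheet exactly if and only if it divides the sheet width without remainder.

```python
# Which strip widths partition a sheet of a given width into whole strips?
def strip_widths(sheet_width_mm: int) -> dict:
    """Map each strip width that divides the sheet exactly to the resulting strip count."""
    return {w: sheet_width_mm // w
            for w in range(1, sheet_width_mm + 1)
            if sheet_width_mm % w == 0}

print(strip_widths(210))   # {1: 210, 2: 105, 3: 70, 5: 42, 6: 35, 7: 30, ...}
```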
Abstract:
Increasing the organic matter (OM) content of the soil is one of the main goals in arable soil management. The adoption of tillage systems with reduced tillage depth and/or frequency (reduced tillage) or of no-tillage has been found to increase the concentration of soil OM compared to conventional tillage (CT; ploughing to 20-30 cm). However, the underlying processes are not yet clear and are discussed controversially. So far, few investigations have been conducted on tillage systems with a shallow tillage depth (minimum tillage = MT; maximum tillage depth of 10 cm). A better understanding of the interactions between MT implementation and changes in OM transformation in soils is essential in order to evaluate the possible contribution of MT to a sustainable management of arable soils. The objectives of the present thesis were (i) to compare OM concentrations, microbial biomass, water-stable aggregates, and particulate OM (POM) between CT and MT soils, (ii) to estimate the temporal variability of water-stable aggregate size classes occurring in the field and the dynamics of macroaggregate (>250 µm) formation and disruption under controlled conditions, (iii) to investigate whether a lower disruption or a higher formation rate accounts for the higher occurrence of macroaggregates under MT compared to CT, (iv) to determine which fraction is the major agent for storing the surplus of OM found under MT compared to CT, and (v) to observe the early OM transformation after residue incorporation in different simulated tillage systems. Two experimental sites (Garte-Süd and Hohes Feld) near Göttingen, Germany, were investigated. The soil type at both sites is a Haplic Luvisol. For about 40 years, both sites have received MT with a rotary harrow (to 5-8 cm depth) and CT with a plough (to 25 cm depth). Surface soils (0-5 cm) and subsoils (10-20 cm) from two sampling dates (after fallow and directly after tillage) were investigated for concentrations of organic C (Corg) and total N (N), different water-stable aggregate size classes, different density fractions (for the sampling date after fallow only), microbial biomass, and biochemically stabilized Corg and N (by acid hydrolysis; for the sampling date after tillage only). In addition, two laboratory incubations were performed under controlled conditions. Firstly, MT and CT soils were incubated (28 days at 22°C) as bulk soil and with destroyed macroaggregates in order to estimate the importance of macroaggregates for the physical protection of the very labile OM against mineralization. Secondly, in a microcosm experiment simulating MT and CT systems with soil <250 µm and with 15N- and 13C-labelled maize straw incorporated to different depths, the mineralization, the formation of new macroaggregates, and the partitioning of the recently added C and N were followed (28 days at 15°C). Forty years of the MT regime led to higher concentrations of microbial biomass and of Corg and N compared to CT, especially in the surface soil. After fallow and directly after tillage, a higher proportion of water-stable macroaggregates rich in OM was found in the MT (36% and 66%, respectively) than in the CT (19% and 47%, respectively) surface soils of both sites (data shown are for the site Garte-Süd only). The subsoils followed the same trend. For the sampling date after fallow, no differences in the POM fractions were found, but more OM associated with the mineral fraction was detected in the MT soils. A large temporal variability was observed for the abundance of macroaggregates.
In the field and in the microcosm simulations, macroaggregates were found to have a higher formation rate after the incorporation of residues under MT than under CT. Thus, the lower occurrence of macroaggregates in CT soils cannot be attributed to a higher disruption rate but to a lower formation rate. A higher rate of macroaggregate formation in MT soils may be due to (i) the more concentrated input of residues in the surface soil and/or (ii) a higher abundance of fungal biomass in contrast to CT soils. Overall, water-stable macroaggregates were found to play a key role as the location of storage for the surplus of OM detected under MT compared to CT. In the incubation experiment, macroaggregates were not found to protect the very labile OM against mineralization. Nevertheless, the surplus of OM detected after tillage in the MT soil was biochemically degradable. MT simulations in the microcosm experiment showed a lower specific respiration and a less efficient translocation of recently added residues than the CT simulations. Differences in the early processes of OM translocation between CT and MT simulations were attributed to a higher residue-to-soil ratio and to a higher proportion of fungal biomass in the MT simulations. Overall, MT was found to have several beneficial effects on soil structure and on the storage of OM, especially in the surface soil. Furthermore, it was concluded that the high concentration of residues in the surface soil under MT may alter the processes of OM storage and decomposition. Further investigations should focus in particular on analysis of the residue-soil interface and on the effects of the depth of residue incorporation. Moreover, further evidence is needed on differences in the microbial community between CT and MT soils.
Abstract:
Metabolic disorders are a key problem in the transition period of dairy cows and often appear before the onset of further health problems. They mainly derive from difficulties the animals have in adapting to changes and disturbances occurring both outside and inside the organism, and from varying gaps between nutrient supply and demand. Adaptation is a functional and target-oriented process involving the whole organism and thus cannot be narrowed down to single factors. Most problems which challenge the organisms can be solved in a number of different ways. To understand the mechanisms of adaptation, the interconnectedness of variables and the nutrient flow within a metabolic network need to be considered. Metabolic disorders indicate an overstressed ability to balance input, partitioning and output variables. Dairy cows will more easily succeed in adapting and in avoiding dysfunctional processes in the transition period when the gap between nutrient and energy demands and their supply is restricted. Dairy farms vary widely in relation to the living conditions of the animals. The complexity of nutritional and metabolic processes and their large variations on various scales contradict any attempt to predict the outcome of the animals' adaptation in a farm-specific situation. Any attempt to reduce the prevalence of metabolic disorders and associated production diseases should rely on continuous and comprehensive monitoring with appropriate indicators at the farm level. Furthermore, low levels of disorders and diseases should be seen as a further significant goal which carries weight in addition to productivity goals. In the long run, low disease levels can only be expected when farmers realize that they can gain a competitive advantage over competitors with higher levels of disease.
Abstract:
The combined effects of shoot pruning (one or two stems) and inflorescence thinning (five or ten flowers per inflorescence) on greenhouse tomato yield and fruit quality were studied during the dry season (DS) and rainy season (RS) in Central Thailand. Poor fruit set, development of undersized (mostly parthenocarpic) fruits, as well as the physiological disorders blossom-end rot (BER) and fruit cracking (FC), turned out to be the prevailing causes of reduced fruit yield and quality. The proportion of marketable fruits was less than 10% in the RS and around 65% in the DS. In both seasons, total yield was significantly increased when plants were cultivated with two stems, resulting in higher marketable yields only in the DS. While the fraction of undersized fruits was increased in both seasons when plants were grown with a secondary stem, the proportions of BER and FC were significantly reduced. Restricting the number of flowers per inflorescence invariably resulted in reduced total yield. However, in neither season did fruit load considerably affect the quantity or proportion of the marketable yield fraction. Inflorescence thinning tended to promote BER and FC, an effect which was significant only for BER in the RS. In conclusion, for greenhouse tomato production under the climatic conditions prevalent in Central Thailand, cultivation with two stems appears highly recommendable, whereas the measures to control fruit load tested in this study did not prove advisable.
Abstract:
Over the past decades, macro-scale hydrological models have become established as important tools for assessing the state of the global renewable freshwater resources in a spatially comprehensive manner. Today they are used to answer a wide range of scientific questions, in particular regarding the impacts of anthropogenic interventions on the natural flow regime and the impacts of global change and climate change on water resources. These impacts can be estimated via a variety of water-related indicators, e.g. renewable (ground)water resources, flood risk, droughts, water stress and water scarcity. The further development of macro-scale hydrological models has been fostered in particular by steadily increasing computing capacities, but also by the growing availability of remote sensing data and derived data products that can be used to drive and improve the models. Like all macro- to global-scale modelling approaches, macro-scale hydrological simulations are subject to considerable uncertainties, which stem (i) from spatial input data sets such as meteorological variables or land-surface parameters and (ii) in particular from the (often) simplified representation of physical processes in the model. Given these uncertainties, it is indispensable to verify the actual applicability and predictive capability of the models under diverse climatic and physiographic conditions. So far, however, most evaluation studies have been carried out in only a few large river basins or have focused on continental water fluxes. This contrasts with many application studies whose analyses and conclusions are based on simulated state variables and fluxes at a much finer spatial resolution (grid cell). The core of this dissertation is a comprehensive evaluation of the general applicability of the global hydrological model WaterGAP3 for simulating monthly flow regimes and low and high flows, based on more than 2400 discharge time series for the period 1958-2010. The river basins considered represent a broad spectrum of climatic and physiographic conditions, with basin sizes ranging from 3000 to several million square kilometres. The model evaluation has two objectives: first, the achieved model performance is to serve as a benchmark against which any further model improvements can be compared; second, a method for diagnostic model evaluation is to be developed and tested that reveals clear starting points for model improvement where model performance is insufficient. To this end, complementary performance measures are linked with nine catchment descriptors that quantify the climatic and physiographic conditions as well as the degree of anthropogenic influence in the individual basins. WaterGAP3 achieves medium to high model performance for the simulation of both monthly flow regimes and low and high flows, but clear spatial patterns are apparent for all performance measures considered. Of the nine catchment characteristics considered, the degree of aridity and the mean basin slope in particular exert a strong influence on model performance. The model tends to overestimate the annual runoff volume with increasing aridity. This behaviour is characteristic of macro-scale hydrological models and can be attributed to the inadequate representation of runoff generation and concentration processes in water-limited regions. In steep basins, low model performance is found with respect to the representation of monthly flow variability and temporal dynamics, which is also reflected in the quality of the low- and high-flow simulations. This observation points to necessary model improvements regarding (i) the partitioning of total runoff into fast and delayed runoff components and (ii) the calculation of flow velocity in the river channel. The method for diagnostic model evaluation developed in this dissertation, which links complementary performance measures with catchment characteristics, was tested using the WaterGAP3 model as an example. The method proved to be an efficient tool for explaining spatial patterns in model performance and for identifying deficits in the model structure. The method is generally applicable to any hydrological model, but it is particularly relevant for macro-scale models and multi-basin studies, as it can partly compensate for the lack of site-specific knowledge and dedicated measurement campaigns that catchment-scale modelling usually relies on.
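A minimal sketch of the diagnostic idea behind this evaluation, under strong simplifications (two performance measures, one catchment descriptor, a hypothetical input structure); the thesis itself links several complementary measures with nine catchment descriptors:

```python
# Compute complementary performance measures per basin and relate them to a catchment
# attribute such as aridity via a simple rank correlation.
import numpy as np

def nse(obs: np.ndarray, sim: np.ndarray) -> float:
    """Nash-Sutcliffe efficiency of simulated vs. observed monthly discharge."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def volume_bias(obs: np.ndarray, sim: np.ndarray) -> float:
    """Relative bias of the simulated runoff volume."""
    return (sim.sum() - obs.sum()) / obs.sum()

def simple_ranks(a: np.ndarray) -> np.ndarray:
    """Ranks 1..n (no tie handling; sufficient for a sketch)."""
    r = np.empty(len(a))
    r[np.argsort(a)] = np.arange(1, len(a) + 1)
    return r

def diagnose(basins):
    """basins: list of dicts with 'obs', 'sim' (arrays) and 'aridity' (scalar).
    Returns rank correlations between aridity and each performance measure; a clearly
    positive correlation with the volume bias would echo the overestimation of runoff
    in dry basins noted above."""
    perf = {"NSE": np.array([nse(b["obs"], b["sim"]) for b in basins]),
            "volume_bias": np.array([volume_bias(b["obs"], b["sim"]) for b in basins])}
    aridity = simple_ranks(np.array([b["aridity"] for b in basins]))
    return {name: float(np.corrcoef(aridity, simple_ranks(v))[0, 1])
            for name, v in perf.items()}
```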
Abstract:
In 2000 the European Statistical Office published the guidelines for developing the Harmonized European Time Use Surveys system. Under this unified framework, the first Time Use Survey of national scope was conducted in Spain during 2002-03. The aim of these surveys is to understand human behaviour and the lifestyle of people. Time allocation data are compositional in nature, that is, they are subject to non-negativity and constant-sum constraints. Thus, standard multivariate techniques cannot be applied to them directly. The goal of this work is to identify homogeneous Spanish Autonomous Communities with regard to the typical activity pattern of their respective populations. To this end, a fuzzy clustering approach is followed. Rather than the hard partitioning of classical clustering, where objects are allocated to only a single group, fuzzy methods identify overlapping groups of objects by allowing them to belong to more than one group. Concretely, the probabilistic fuzzy c-means algorithm is adapted to deal with the Spanish Time Use Survey microdata. As a result, a map distinguishing Autonomous Communities with similar activity patterns is drawn. Keywords: time use data; fuzzy clustering; FCM; simplex space; Aitchison distance
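A minimal sketch, assuming the common route of first mapping the compositions into coordinates where Euclidean distance coincides with the Aitchison distance (here via the centred log-ratio transform) and then running a standard probabilistic fuzzy c-means; the paper's concrete adaptation of FCM to the survey microdata may differ:

```python
# clr-transform the daily time-use compositions, then run probabilistic fuzzy c-means.
import numpy as np

def clr(X: np.ndarray) -> np.ndarray:
    """Centred log-ratio transform of strictly positive compositions (one per row)."""
    logX = np.log(X)
    return logX - logX.mean(axis=1, keepdims=True)

def fuzzy_cmeans(X: np.ndarray, c: int = 3, m: float = 2.0, iters: int = 100, seed: int = 0):
    """Probabilistic fuzzy c-means; returns (memberships U of shape n x c, centres V)."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=X.shape[0])       # membership rows sum to 1
    for _ in range(iters):
        W = U ** m
        V = (W.T @ X) / W.sum(axis=0)[:, None]           # weighted cluster centres
        D = np.linalg.norm(X[:, None, :] - V[None, :, :], axis=2) + 1e-12
        ratio = D[:, :, None] / D[:, None, :]            # d_ij / d_ik
        U = 1.0 / (ratio ** (2.0 / (m - 1.0))).sum(axis=2)
    return U, V

# Usage sketch: memberships, centres = fuzzy_cmeans(clr(time_use_shares), c=3), where
# time_use_shares holds, per Autonomous Community, the positive fractions of the day
# spent on each activity; each row of the memberships sums to 1.
```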
Abstract:
Obesity is a global health problem, and bariatric surgery is the treatment with the best demonstrated results. Roux-en-Y gastric bypass (RYGB) is the most widely used technique, combining restriction and malabsorption; however, restrictive procedures have recently gained popularity. Gastro-gastroplasty (GG) produces reversible gastric restriction by means of a gastric pouch with a gastro-gastric anastomosis, and we proposed to evaluate it. Methods: Retrospective, non-randomized study that reviewed the records of patients who underwent laparoscopic GG or RYGB between February 2008 and April 2011. Results: 289 patients were identified, 180 GG and 109 RYGB, of whom 138 met the inclusion criteria: 77 (55.8%) GG and 61 (44.2%) RYGB; 18 (13%) men and 120 (87%) women. For GG, the median initial weight was 97.15 (±17.3) kg, initial BMI 39.35 (±3.38) kg/m2, and excess weight 37.1 (±11.9). Median BMI at 1, 6 and 12 months was 34.8 (±3.58), 30.81 (±3.81) and 29.58 (±4.25) kg/m2, respectively. Median percentage of excess weight loss (%EWL) at 1, 6 and 12 months was 30.9 (±14.2)%, 61.88 (±18.27)% and 68.4 (±19.64)%, respectively. For RYGB, the median initial weight was 108.1 (±25.4) kg, initial BMI 44.4 (±8.1) kg/m2, and excess weight 48.4 (±15.2)%. Median BMI at 1, 6 and 12 months was 39 (±7.5), 33.31 (±4.9) and 30.9 (±4.8) kg/m2, respectively. Median %EWL at 1, 6 and 12 months was 25.9 (±12.9)%, 61.87 (±18.62)% and 71.41 (±21.09)%, respectively. Follow-up was one year. Conclusions: Gastro-gastroplasty emerges as a restrictive, reversible technique with good weight-loss results and as a surgical alternative for patients with obesity. Longer-term studies are needed to demonstrate that the changes are maintained over time.
Abstract:
This presentation discusses the role and purpose of testing in the Systems/Software Development Life Cycle. We examine the consequences of the 'cost curve' on defect removal and how agile methods can reduce its effects. We concentrate on Black Box Testing and use Equivalence Partitioning and Boundary Value Analysis to construct the smallest number of test cases and test scenarios necessary for a test plan.
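A generic illustration of the two techniques named above (not taken from the presentation): a hypothetical discount(age) rule with the partitions child (0-12), adult (13-64) and senior (65-120), tested with one representative per equivalence partition and with values at and around each boundary.

```python
# Black-box test cases derived with Equivalence Partitioning and Boundary Value Analysis.
import unittest

def discount(age: int) -> float:
    """Hypothetical system under test: 50% for children, 0% for adults, 30% for seniors."""
    if not 0 <= age <= 120:
        raise ValueError("age out of range")
    if age <= 12:
        return 0.5
    if age < 65:
        return 0.0
    return 0.3

class BlackBoxTests(unittest.TestCase):
    def test_equivalence_partitions(self):
        # one representative value from each valid partition
        self.assertEqual(discount(6), 0.5)    # child
        self.assertEqual(discount(30), 0.0)   # adult
        self.assertEqual(discount(80), 0.3)   # senior

    def test_boundary_values(self):
        # values at and just around each partition boundary
        for age, expected in [(0, 0.5), (12, 0.5), (13, 0.0), (64, 0.0), (65, 0.3), (120, 0.3)]:
            self.assertEqual(discount(age), expected)
        for invalid in (-1, 121):             # invalid partitions
            with self.assertRaises(ValueError):
                discount(invalid)

if __name__ == "__main__":
    unittest.main()
```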
Abstract:
Systemic sclerosis (SSc) is a multisystemic autoimmune disease that mainly affects the skin, lungs, gastrointestinal tract, heart and kidneys. Lung disease, present in nearly 100% of cases, is the factor with the greatest influence on mortality. The purpose of this study is to carry out a detailed analysis of lung disease by high-resolution computed tomography (HRCT) in Colombian patients with SSc. To this end, an analytical prevalence study was performed in 44 patients with SSc evaluated at the Hospital Universitario Mayor Méderi over the last 7 years. The results showed demographic and clinical characteristics similar to those previously described. The prevalence of interstitial lung disease was high, and findings of pulmonary fibrosis such as ground-glass opacity and honeycombing were associated with the presence of the anti-Scl-70 autoantibody. The oesophageal diameter measured by HRCT was larger in patients with dysphagia, anti-Scl-70 and lymphopenia, which are markers of poor prognosis.