960 results for Graph partitioning
Abstract:
We describe a direct method of partitioning the 840 Steiner triple systems of order 9 into 120 large sets. The method produces partitions in which all of the large sets are isomorphic and we apply the method to each of the two non-isomorphic large sets of STS(9).
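As a quick consistency check on the numbers quoted above (not part of the paper's method): a large set of STS(9) consists of pairwise disjoint systems that together use every triple on the 9 points exactly once, which fixes the size of a large set and hence the number of large sets in such a partition.

\[
\binom{9}{3} = 84, \qquad
b_{\mathrm{STS}(9)} = \frac{9 \cdot 8}{6} = 12, \qquad
\frac{84}{12} = 7 \ \text{systems per large set},
\]
\[
\frac{840}{7} = 120 \ \text{large sets}.
\]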
Abstract:
Purpose. As reductions in dermal clearance increase the residence time of solutes in the skin and underlying tissues, we compared the topical penetration of potentially useful vasoconstrictors (VCs) through human epidermis as both free bases and ion-pairs with salicylic acid (SA). Methods. We determined the in vitro epidermal flux of ephedrine, naphazoline, oxymetazoline, phenylephrine, and xylometazoline applied as saturated solutions in propylene glycol:water (1:1) and of ephedrine, naphazoline and tetrahydrozoline as 10% solutions of 1:1 molar ratio ion-pairs with SA in liquid paraffin. Results. As free bases, ephedrine had the highest maximal flux, Jmax = 77.4 ± 11.7 μg/cm²/h, being 4-fold higher than tetrahydrozoline and xylometazoline, 6-fold higher than phenylephrine, 10-fold higher than naphazoline and 100-fold higher than oxymetazoline. Stepwise regression of solute physicochemical properties identified melting point as the most significant predictor of flux. As ion-pairs with SA, ephedrine and naphazoline had similar fluxes (11.5 ± 2.3 and 12.0 ± 1.6 μg/cm²/h, respectively), whereas tetrahydrozoline was approximately 3-fold slower. Corresponding fluxes of SA from the ion-pairs were 18.6 ± 0.6, 7.8 ± 0.8 and 1.1 ± 0.1 μg/cm²/h, respectively. Transdermal transport of VCs is discussed. Conclusions. Epidermal retention of VCs and SA did not correspond to their molar ratio on application and confirmed that, following partitioning into the stratum corneum, ion-pairs separate and further penetration is governed by individual solute characteristics.
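Maximal fluxes of the kind reported above are conventionally taken from the slope of the steady-state portion of the cumulative-permeation curve. The fragment below is a minimal sketch of that calculation with entirely invented data and an assumed steady-state onset; it is not the paper's data or analysis.

import numpy as np

# Hypothetical cumulative amounts permeated (ug/cm^2) at each sampling time;
# the later, approximately linear portion is assumed to be at steady state.
time_h = np.array([0, 2, 4, 8, 12, 24, 36, 48], dtype=float)
cumulative_ug_cm2 = np.array([0, 5, 20, 120, 300, 900, 1500, 2100], dtype=float)

steady_state = time_h >= 12  # assumed onset of steady state (illustrative choice)

# Slope of the linear fit over the steady-state points = flux J (ug/cm^2/h).
slope, intercept = np.polyfit(time_h[steady_state], cumulative_ug_cm2[steady_state], 1)

# Lag time = x-intercept of the extrapolated steady-state line.
lag_time_h = -intercept / slope

print(f"Estimated flux J ~ {slope:.1f} ug/cm^2/h, lag time ~ {lag_time_h:.1f} h")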
Unexpected clobetasol propionate profile in human stratum corneum after topical application in vitro
Abstract:
Purpose. The validity of using drug amount-depth profiles in stratum corneum to predict the uptake of clobetasol propionate into stratum corneum and its transport into deeper skin layers was investigated. Methods. In vitro diffusion experiments through human epidermis were carried out using Franz-type glass diffusion cells. A saturated solution of clobetasol propionate in 20% (v/v) aqueous propylene glycol was applied topically for 48 h. Steady-state flux was calculated from the cumulative amount of drug permeated vs. time profile. Epidermal partitioning was determined by applying a saturated drug solution to both sides of the epidermis and allowing time to equilibrate. The tape stripping technique was used to obtain drug concentration-depth profiles in stratum corneum for both the diffusion and the equilibrium experiments. Results. The concentration-depth profile of clobetasol propionate in stratum corneum for the diffusion experiment is biphasic: a logarithmic decline of the drug concentration over the first four to five tape strips flattens to a relatively constant low concentration level in deeper layers. The drug concentration-depth profile for the equilibrium studies displays a similar shape. Conclusions. The shape of the concentration-depth profile of clobetasol propionate is mainly due to the variable partition coefficient in different stratum corneum layers.
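To illustrate what a biphasic tape-strip profile looks like numerically, the sketch below characterises invented strip data by fitting a log-linear decline over the outer strips and averaging the deeper plateau. The numbers and the split after five strips are assumptions for illustration only, not the paper's measurements.

import numpy as np

# Hypothetical drug amounts (ng) recovered on successive tape strips.
strip = np.arange(1, 11)
amount_ng = np.array([420.0, 250.0, 150.0, 95.0, 60.0, 22.0, 20.0, 19.0, 21.0, 20.0])

# Phase 1: approximately log-linear decline over the first five strips.
outer = strip <= 5
slope, intercept = np.polyfit(strip[outer], np.log(amount_ng[outer]), 1)
print(f"outer layers: log-linear decline, slope = {slope:.2f} per strip")

# Phase 2: roughly constant low level in the deeper strips.
plateau = amount_ng[~outer].mean()
print(f"deeper layers: plateau at ~{plateau:.0f} ng per strip")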
Abstract:
In this paper we present a technique for visualising hierarchical and symmetric, multimodal fitness functions that have been investigated in the evolutionary computation literature. The focus of this technique is on landscapes in moderate-dimensional, binary spaces (i.e., fitness functions defined over {0, 1}^n, for n ≤ 16). The visualisation approach involves an unfolding of the hyperspace into a two-dimensional graph, whose layout represents the topology of the space using a recursive relationship, and whose shading defines the shape of the cost surface defined on the space. Using this technique we present case-study explorations of three fitness functions: royal road, hierarchical-if-and-only-if (H-IFF), and hierarchically decomposable functions (HDF). The visualisation approach provides insight into the properties of these functions, particularly with respect to the size and shape of the basins of attraction around each of the local optima.
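One of the three benchmarks visualised, H-IFF, has a compact recursive definition (a single bit scores 1; a block additionally contributes its length when all of its bits agree; the two halves are always scored recursively). The sketch below is a plain reference implementation of that definition, included only to make the benchmark concrete; it is not the authors' visualisation code.

def hiff(bits):
    """Hierarchical-if-and-only-if fitness of a bit string whose length is a power of two."""
    n = len(bits)
    if n == 1:
        return 1.0
    left, right = bits[: n // 2], bits[n // 2 :]
    # A block scores its own length only if all of its bits agree;
    # the two halves are scored recursively in every case.
    bonus = float(n) if all(b == bits[0] for b in bits) else 0.0
    return bonus + hiff(left) + hiff(right)

# The two global optima are the all-zeros and all-ones strings.
print(hiff([1] * 16))     # 80.0 for n = 16
print(hiff([0, 1] * 8))   # 16.0: only the single-bit contributions remain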
Abstract:
Graphical user interfaces (GUIs) are critical components of today's software. Given their increased relevance, the correctness and usability of GUIs are becoming essential. This paper describes the latest results in the development of our tool to reverse engineer the GUI layer of interactive computing systems. We use static analysis techniques to generate models of the user interface behaviour from source code. Models help in graphical user interface inspection by allowing designers to concentrate on its more important aspects. One particular type of model that the tool is able to generate is state machines. The paper shows how graph theory can be useful when applied to these models. A number of metrics and algorithms are used in the analysis of aspects of the user interface's quality. The ultimate goal of the tool is to enable the analysis of interactive systems through inspection of their GUI source code.
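As a concrete illustration of the kind of graph-theoretic analysis mentioned above, the sketch below encodes a small, entirely hypothetical GUI state machine as a directed graph and computes two simple indicators: which windows are unreachable from the initial one, and the cyclomatic complexity E - N + 2 of the behaviour graph. It is a toy model of the idea, not the tool described in the paper.

from collections import deque

# Hypothetical GUI behaviour model: windows as states, events as transitions.
transitions = {
    "Login":    ["Main"],
    "Main":     ["Settings", "About", "Login"],
    "Settings": ["Main"],
    "About":    ["Main"],
    "Orphan":   ["Main"],   # a window no event ever leads to
}

def reachable(graph, start):
    """Breadth-first search for the set of states reachable from `start`."""
    seen, queue = {start}, deque([start])
    while queue:
        for nxt in graph.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

states = set(transitions) | {s for targets in transitions.values() for s in targets}
edges = sum(len(t) for t in transitions.values())

unreachable = states - reachable(transitions, "Login")
cyclomatic = edges - len(states) + 2   # E - N + 2 for a single connected component

print("unreachable from Login:", unreachable)   # {'Orphan'}
print("cyclomatic complexity:", cyclomatic)     # 7 - 5 + 2 = 4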
Abstract:
Program slicing is a well known family of techniques used to identify code fragments which depend on, or are depended upon by, specific program entities. They are particularly useful in the areas of reverse engineering, program understanding, testing and software maintenance. Most slicing methods, usually oriented towards the imperative or object paradigms, are based on some sort of graph structure representing program dependencies. Slicing techniques amount, therefore, to (sophisticated) graph traversal algorithms. This paper proposes a completely different approach to the slicing problem for functional programs. Instead of extracting program information to build an underlying dependency structure, we resort to standard program calculation strategies, based on the so-called Bird-Meertens formalism. The slicing criterion is specified either as a projection or a hiding function which, once composed with the original program, leads to the identification of the intended slice. Going through a number of examples, the paper suggests this approach may be an interesting, even if not completely general, alternative to slicing functional programs.
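The central idea, specifying a slice as the composition of the original program with a projection (or hiding) function, can be mimicked even outside the Bird-Meertens setting. The fragment below is a deliberately small Python analogue with made-up names: a program that returns a record of results is composed with a projection that keeps only the field of interest; the calculational step that would simplify the composition is only described in a comment.

def program(transactions):
    """A toy 'program' producing several independent results at once (hypothetical example)."""
    total = sum(amount for _, amount in transactions)
    by_account = {}
    for account, amount in transactions:
        by_account[account] = by_account.get(account, 0) + amount
    return {"total": total, "by_account": by_account}

def project(field):
    """Slicing criterion expressed as a projection over the program's result."""
    return lambda result: result[field]

def compose(f, g):
    return lambda x: f(g(x))

# The 'slice' is the composition projection . program; program calculation rules
# would then simplify it so that only the code contributing to 'total' remains.
total_slice = compose(project("total"), program)

print(total_slice([("a", 10), ("b", 5), ("a", 7)]))   # 22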
Abstract:
A large and growing number of software systems rely on non-trivial coordination logic to make use of third party services or components. It is therefore of utmost importance to understand and rigorously capture this continuously growing coordination layer, as this will ease not only the verification of such systems with respect to their original specifications, but also their maintenance, further development, testing, deployment and integration. This paper introduces a method based on several program analysis techniques (namely, dependence graphs, program slicing, and graph pattern analysis) to extract coordination logic from the source code of legacy systems. The process is driven by a series of pre-defined coordination patterns and captured by a special purpose graph structure from which coordination specifications can be generated in a number of different formalisms.
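To give a flavour of the graph-pattern step described above (with invented call names, not the paper's actual pattern catalogue), the sketch below scans a toy call-dependence graph for a simple "scatter" coordination shape: a caller that invokes two or more distinct external services.

# Hypothetical call-dependence graph: caller -> callees, with external services marked.
calls = {
    "process_order": ["billing.charge", "stock.reserve", "log"],
    "render_page":   ["template.fill"],
    "sync_all":      ["crm.push", "erp.push"],
}
external = {"billing.charge", "stock.reserve", "crm.push", "erp.push", "template.fill"}

def scatter_pattern(call_graph, external_services, min_fanout=2):
    """Return callers that invoke at least `min_fanout` distinct external services."""
    matches = {}
    for caller, callees in call_graph.items():
        targets = sorted(set(callees) & external_services)
        if len(targets) >= min_fanout:
            matches[caller] = targets
    return matches

print(scatter_pattern(calls, external))
# {'process_order': ['billing.charge', 'stock.reserve'], 'sync_all': ['crm.push', 'erp.push']}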
Abstract:
Current software development often relies on non-trivial coordination logic for combining autonomous services, possibly running on different platforms. As a rule, however, such a coordination layer is strongly woven into the application at source code level. Therefore, its precise identification becomes a major methodological (and technical) problem and a challenge to any program understanding or refactoring process. The approach introduced in this paper resorts to slicing techniques to extract coordination data from source code. Such data are captured in a specific dependency graph structure from which a coordination model can be recovered either in the form of an Orc specification or as a collection of code fragments corresponding to typical coordination patterns identified in the system. Tool support is also discussed.
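Continuing in the same toy style (this is not the paper's tool), once a parallel-invocation pattern has been located in the dependency graph, a first approximation of an Orc-style specification can be pretty-printed from it; Orc composes parallel calls with | and sequences them with the >x> combinator. The site names below are hypothetical.

def orc_parallel(sites):
    """Render detected parallel service calls as an Orc parallel composition."""
    return "(" + " | ".join(sites) + ")"

def orc_sequence(first, var, then):
    """Render a 'call, then use the result' dependency with Orc's sequential combinator."""
    return f"{first} >{var}> {then}"

# Hypothetical pattern recovered from the dependency graph: query two services in
# parallel, then hand each published result to a comparison site.
spec = orc_sequence(orc_parallel(["AirlineA(q)", "AirlineB(q)"]), "quote", "Compare(quote)")
print(spec)   # (AirlineA(q) | AirlineB(q)) >quote> Compare(quote)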
Abstract:
The integration and composition of software systems requires a good architectural design phase to speed up communication between (remote) components. However, during the implementation phase, the code that coordinates such components often ends up mixed into the main business code. This leads to maintenance problems, raising the need for, on the one hand, separating the coordination code from the business code and, on the other hand, providing mechanisms for the analysis and comprehension of the architectural decisions once they are made. In this context, our aim is to develop a domain-specific language, CoordL, to describe typical coordination patterns. From our point of view, coordination patterns are abstractions, in graph form, over the composition of coordination statements in the system code. These patterns allow us to identify, by means of pattern-based graph search strategies, the code responsible for coordinating the several components in a system. Recovering and separating these architectural decisions, for a better comprehension of the software, is the main purpose of this pattern language.
Abstract:
What sort of component coordination strategies emerge in a software integration process? How can such strategies be discovered and further analysed? How close are they to the coordination component of the envisaged architectural model which was supposed to guide the integration process? This paper introduces a framework in which such questions can be discussed and illustrates its use by describing part of a real case study. The approach is based on a methodology which enables semi-automatic discovery of coordination patterns from source code, combining generalized slicing techniques and graph manipulation.
Abstract:
Program slicing is a well known family of techniques used to identify code fragments which depend on, or are depended upon by, specific program entities. They are particularly useful in the areas of reverse engineering, program understanding, testing and software maintenance. Most slicing methods, usually targeting either the imperative or the object oriented paradigms, are based on some sort of graph structure representing program dependencies. Slicing techniques amount, therefore, to (sophisticated) graph traversal algorithms. This paper proposes a completely different approach to the slicing problem for functional programs. Instead of extracting program information to build an underlying dependency structure, we resort to standard program calculation strategies, based on the so-called Bird-Meertens formalism. The slicing criterion is specified either as a projection or a hiding function which, once composed with the original program, leads to the identification of the intended slice. Going through a number of examples, the paper suggests this approach may be an interesting, even if not completely general, alternative to slicing functional programs.
Abstract:
Graphical user interfaces (GUIs) are critical components of today's open source software. Given their increased relevance, the correctness and usability of GUIs are becoming essential. This paper describes the latest results in the development of our tool to reverse engineer the GUI layer of interactive computing open source systems. We use static analysis techniques to generate models of the user interface behavior from source code. Models help in graphical user interface inspection by allowing designers to concentrate on its more important aspects. One particular type of model that the tool is able to generate is state machines. The paper shows how graph theory can be useful when applied to these models. A number of metrics and algorithms are used in the analysis of aspects of the user interface's quality. The ultimate goal of the tool is to enable the analysis of interactive systems through inspection of their GUI source code.
Abstract:
Sorghum arundinaceum (Desv.) Stapf is a weed that belongs to the Poaceae family and is widespread throughout Brazil. Despite its frequent occurrence infesting cultivated areas, there is little research on the biology and physiology of this species. The objective of this research was to evaluate the growth, carbon partitioning and physiological characteristics of the weed Sorghum arundinaceum under greenhouse conditions. Plants were collected at regular intervals of seven days, from 22 to 113 days after transplanting (DAT). In each sample we determined plant height, root volume, leaf area and dry matter; we then performed the growth analysis, determining the dry matter partitioning among organs, the accumulation of dry matter, the specific leaf area, the relative growth rate and the leaf weight ratio. At 36, 78 and 113 DAT, the photosynthetic and transpiration rates, stomatal conductance, CO2 concentration and chlorophyll fluorescence were evaluated. Sorghum arundinaceum reached 1.91 m in height, showed slow initial growth and allocated much of its biomass to the roots. The photosynthetic rate and the maximum quantum yield of PSII were similar throughout the growth cycle. At maturity, Sorghum arundinaceum presented higher values of transpiration rate, stomatal conductance and the non-photochemical quenching coefficient (NPQ).
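For reference, the classical growth-analysis quantities named in this abstract are computed from the total dry matter W, leaf area A and leaf dry matter W_L measured at successive sampling times t_1 and t_2; the standard definitions (not spelled out in the abstract) are:

\[
\mathrm{RGR} = \frac{\ln W_2 - \ln W_1}{t_2 - t_1}, \qquad
\mathrm{SLA} = \frac{A}{W_L}, \qquad
\mathrm{LWR} = \frac{W_L}{W},
\]

where RGR is the relative growth rate, SLA the specific leaf area and LWR the leaf weight ratio.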
Abstract:
The appropriate use of ICT in the teaching of mathematics is nowadays considered by some to be justified and inevitable, in the expectation that it will improve the teaching and learning of mathematics. In this investigation we set out to test the Winplot software in the teaching and learning of the graph of the quadratic function with 10th-grade students of second-cycle Secondary School no. 9099, located in the municipality of Viana (Luanda, Angola), in order to verify whether it improves the teaching and learning of this topic. We selected two groups of 10th-grade students to act as control and experimental groups; after both groups had taken two pre-tests, the experimental group carried out its learning in the computer laboratory with the aid of Winplot over 8 weeks during the second term of the 2009/2010 school year, while the control group carried out the same learning, at the same time, in a regular classroom without Winplot. Comparing the two groups with a t-test for independent samples shows no statistically significant differences between them, since the significance levels are greater than p = 0.05; the experimental group did not obtain better results than the control group, and thus Winplot did not produce the desired effect on the learning of these students.
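The comparison reported above rests on an independent-samples t-test at the 5% level. The fragment below is a minimal sketch of such a test using scipy, with invented scores (a 0-20 scale is assumed); it does not use the study's data.

from scipy import stats

# Invented post-test scores for illustration only.
control      = [9, 11, 12, 10, 13, 8, 12, 11, 10, 9]
experimental = [10, 12, 11, 13, 12, 9, 11, 12, 10, 11]

# Two-sided independent-samples t-test; equal_var=False applies Welch's correction.
t_stat, p_value = stats.ttest_ind(experimental, control, equal_var=False)

print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
if p_value > 0.05:
    print("No statistically significant difference at the 5% level.")
else:
    print("Statistically significant difference at the 5% level.")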
Abstract:
We provide an agent with the capability to infer the relations (assertions) entailed by the rules that describe the formal semantics of an RDFS knowledge base. The proposed inferencing process formulates each semantic restriction as a rule implemented within a SPARQL query statement. The process expands the original RDF graph into a fuller graph that explicitly captures the semantics described by the rules. The approach is currently being explored in order to support descriptions that follow the generic Semantic Web Rule Language. An experiment using the Fire-Brigade domain, a small-scale knowledge base, is adopted to illustrate the agent modeling method and the inferencing process.
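To make the rule-as-SPARQL idea concrete, the sketch below encodes one standard RDFS entailment rule (rdfs9: an instance of a subclass is also an instance of the superclass) as a SPARQL INSERT executed over an rdflib graph. The vocabulary and data are invented; the paper's actual rule set and engine may differ.

from rdflib import Graph

g = Graph()
g.parse(data="""
@prefix ex:   <http://example.org/> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .

ex:FireTruck rdfs:subClassOf ex:Vehicle .
ex:truck1    a ex:FireTruck .
""", format="turtle")

# RDFS rule rdfs9 expressed as a SPARQL Update statement: materialise the
# entailed type assertions into the same graph.
g.update("""
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
INSERT { ?x a ?super }
WHERE  { ?x a ?sub . ?sub rdfs:subClassOf ?super }
""")

for row in g.query("SELECT ?type WHERE { <http://example.org/truck1> a ?type }"):
    print(row.type)   # ex:FireTruck and, after inference, ex:Vehicle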