899 results for Numeric sets
Abstract:
The classification of minimal sets is a central theme in abstract topological dynamics. Recently this work has been strengthened and extended by consideration of homomorphisms. Background material is presented in Chapter I. Given a flow on a compact Hausdorff space, the action extends naturally to the space of closed subsets, taken with the Hausdorff topology. These hyperspaces are discussed and used to give a new characterization of almost periodic homomorphisms. Regular minimal sets may be described as minimal subsets of enveloping semigroups. Regular homomorphisms are defined in Chapter II by extending this notion to homomorphisms with minimal range. Several characterizations are obtained. In Chapter III, some additional results on homomorphisms are obtained by relativizing enveloping semigroup notions. In Veech's paper on point distal flows, hyperspaces are used to associate an almost one-to-one homomorphism with a given homomorphism of metric minimal sets. In Chapter IV, a non-metric generalization of this construction is studied in detail using the new notion of a highly proximal homomorphism. An abstract characterization is obtained, involving only the abstract properties of homomorphisms. A strengthened version of the Veech Structure Theorem for point distal flows is proved. In Chapter V, the work in the earlier chapters is applied to the study of homomorphisms for which the almost periodic elements of the associated hyperspace are all finite. In the metric case, this is equivalent to having at least one fiber finite. Strong results are obtained by first assuming regularity, and then assuming that the relative proximal relation is closed as well.
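For concreteness, in the simplified case of a compact metric space (an assumption made here only for illustration; the chapter works in the general compact Hausdorff setting), the hyperspace construction mentioned above looks roughly as follows:

```latex
% Sketch, assuming (X, d) is a compact metric space; 2^X denotes the
% nonempty closed subsets of X.
% Hausdorff metric on 2^X:
d_H(A, B) = \max\Bigl\{ \sup_{a \in A} \inf_{b \in B} d(a, b),\;
                        \sup_{b \in B} \inf_{a \in A} d(a, b) \Bigr\}
% A flow (T, X), (t, x) \mapsto t x, extends to 2^X by acting elementwise:
% t A = \{ t a : a \in A \}, giving a flow on the hyperspace (2^X, d_H).
```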
Abstract:
The thesis is concerned with a number of problems in Combinatorial Set Theory. The Generalized Continuum Hypothesis is assumed. Suppose X and K are non-zero cardinals. By successively identifying K with pairwise disjoint sets of power K, a function f: X → K can be viewed as a transversal of a pairwise disjoint (X, K)-family A. Questions about families of functions from X into K can thus be thought of as referring to families of transversals of A. We wish to consider generalizations of such questions to almost disjoint families; in particular we are interested in extensions of the following two problems: (i) What is the 'maximum' cardinality of an almost disjoint family of functions each mapping X into K? (ii) Describe the cardinalities of maximal almost disjoint families of functions each mapping X into K. Bulletin of the Australian Mathematical Society 27(3): 477–479, June 1983.
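For orientation, the following conventions for almost disjoint families of functions are common in this area; they are stated here as an assumption and may differ in detail from the thesis:

```latex
% Assumed conventions, not quoted from the thesis.
% f, g : X \to K are almost disjoint if they agree only on a small set:
\left| \{ \alpha \in X : f(\alpha) = g(\alpha) \} \right| < |X|
% A family F of functions from X into K is almost disjoint if its members are
% pairwise almost disjoint; it is maximal if it is contained in no strictly
% larger almost disjoint family of functions from X into K.
```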
Abstract:
Pesticide application has been described by many researchers as a very inefficient process. In some cases, there are reports that only 0.02% of the applied product contributes to effective control of the target problem. The main factor influencing pesticide application is the droplet size formed at the spray nozzles. Many parameters affect the dynamics of the droplets, such as wind, temperature and relative humidity. Small droplets are biologically more active, but they are affected by evaporation and drift. On the other hand, large droplets do not promote a good distribution of the product over the target. For this reason, together with the risk of contaminating non-target areas and the high costs involved in applications, knowledge of the droplet size is of fundamental importance in application technology. When sophisticated technology for droplet analysis is unavailable, it is common to sample droplets with artificial targets such as water-sensitive paper. In field sampling, water-sensitive papers are placed on the trial plots where the product will be applied. When droplets impinge on the paper, its yellow surface is stained dark blue, making the droplets easy to recognize. The droplets collected on these papers have a range of sizes, so determining the droplet size distribution gives the mass distribution of the material and hence the efficiency of the application. The stains produced by the droplets show a spread factor proportional to their respective initial sizes. One methodology for analysing the droplets is to count and measure them under a microscope. The Porton N-G12 graticule, which has equally spaced class intervals in a geometric progression of √2, is coupled to the microscope lens. The droplet size parameters most frequently used are the Volumetric Median Diameter (VMD) and the Numeric Median Diameter (NMD). The VMD divides a representative droplet sample into two parts of equal volume, such that one part contains droplets smaller than the VMD and the other contains droplets larger than the VMD. The NMD is obtained by the same process, dividing the sample into two equal parts with respect to the number of droplets. The ratio between VMD and NMD allows the uniformity of the droplets to be evaluated. Next, graphs of cumulative probability of droplet volume and number are plotted on log-scale paper (cumulative probability versus the median diameter of each size class). The graphs provide the VMD and NMD as the x-axis values corresponding to 50% on the y-axis. This whole process is very slow and subject to operator error. Therefore, in order to reduce the difficulty involved in measuring droplets, a numeric model was developed, implemented in a simple and accessible computational language, which yields approximate VMD and NMD values with good precision. The inputs to this model are the frequencies of the droplet sizes collected on the water-sensitive paper, observed with the Porton N-G12 graticule fitted to the microscope. With these data, the cumulative distributions of droplet volume and number are evaluated. The graphs obtained by plotting these distributions allow the VMD and NMD to be obtained by linear interpolation, since the curves are approximately linear in the middle of the distributions. These values are essential to evaluate the uniformity of the droplets and to estimate the volume deposited on the observed paper from the droplet density (droplets/cm²).
This methodology for estimating droplet volume was developed under Project 11.0.94.224 of CNPMA/EMBRAPA. Observed data from aerial herbicide spraying samples, collected by the Project in the county of Pelotas/RS, were used to compare values obtained with the manual graphic method against those obtained with the model. The model reproduced, with great precision, the VMD and NMD values for each sampled collector, allowing the quantity of deposited product and, consequently, the quantity lost to drift to be estimated. The plots of the variability of VMD and NMD showed that the number of droplets reaching the collectors had a small dispersion, while the deposited volume showed a wide interval of variation, probably because of the strong action of air turbulence on the droplet distribution, emphasizing the need for a deeper study to verify this influence on drift.
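As an illustration of the interpolation step described above, the following is a minimal sketch (not the EMBRAPA numeric model itself; class midpoints, counts and function names are hypothetical placeholders) of estimating NMD and VMD from droplet counts per size class:

```python
# Hypothetical sketch: estimate NMD and VMD from droplet counts per size class
# by linear interpolation of the cumulative number and volume distributions,
# as described above. All values below are illustrative, not measured data.

def median_diameter(class_midpoints, weights):
    """Diameter at which the cumulative weight reaches 50%, interpolating
    linearly between adjacent class midpoints."""
    total = sum(weights)
    cumulative = 0.0
    previous_cum, previous_d = 0.0, class_midpoints[0]
    for d, w in zip(class_midpoints, weights):
        cumulative += w
        if cumulative >= 0.5 * total:
            fraction = (0.5 * total - previous_cum) / w
            return previous_d + fraction * (d - previous_d)
        previous_cum, previous_d = cumulative, d
    return class_midpoints[-1]

def vmd_nmd(class_midpoints, counts):
    """NMD weights each class by its droplet count; VMD weights it by the
    class volume, proportional to count * diameter**3."""
    nmd = median_diameter(class_midpoints, counts)
    volumes = [n * d ** 3 for d, n in zip(class_midpoints, counts)]
    vmd = median_diameter(class_midpoints, volumes)
    return vmd, nmd

if __name__ == "__main__":
    # Illustrative size classes (micrometres), roughly a sqrt(2) progression,
    # with made-up droplet counts per class.
    midpoints = [50, 71, 100, 141, 200, 283, 400]
    counts = [120, 180, 240, 160, 90, 40, 10]
    vmd, nmd = vmd_nmd(midpoints, counts)
    print(f"VMD ~ {vmd:.1f} um, NMD ~ {nmd:.1f} um, VMD/NMD ~ {vmd / nmd:.2f}")
```

The VMD/NMD ratio printed at the end corresponds to the droplet uniformity measure mentioned in the abstract.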
Abstract:
We generalize the classical notion of Vapnik–Chervonenkis (VC) dimension to ordinal VC-dimension, in the context of logical learning paradigms. Logical learning paradigms encompass the numerical learning paradigms commonly studied in Inductive Inference. A logical learning paradigm is defined as a set W of structures over some vocabulary, and a set D of first-order formulas that represent data. The sets of models of ϕ in W, where ϕ varies over D, generate a natural topology on W. We show that if D is closed under boolean operators, then the notion of ordinal VC-dimension offers a perfect characterization for the problem of predicting the truth of the members of D in a member of W, with an ordinal bound on the number of mistakes. This shows that the notion of VC-dimension has a natural interpretation in Inductive Inference, when cast into a logical setting. We also study the relationships between predictive complexity, selective complexity (a variation on predictive complexity) and mind change complexity. The assumptions that D is closed under boolean operators and that W is compact often play a crucial role in establishing connections between these concepts. We then consider a computable setting with effective versions of the complexity measures, and show that the equivalence between ordinal VC-dimension and predictive complexity fails. More precisely, we prove that the effective ordinal VC-dimension of a paradigm can be defined when all other effective notions of complexity are undefined. On a better note, when W is compact, all effective notions of complexity are defined, though they are not related as in the noncomputable version of the framework.
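As background on the classical notion being generalized, here is a small, self-contained sketch (illustrative only; the hypothesis class and all names are ours, and the paper's ordinal and effective versions are not captured) of brute-force VC-dimension computation for a finite class over a finite domain:

```python
# Brute-force classical VC dimension of a finite hypothesis class over a finite
# domain. Hypotheses are represented as sets of points they label 1.

from itertools import combinations

def shatters(hypotheses, subset):
    """The class shatters `subset` if every labelling of it is realized
    by the restriction of some hypothesis to `subset`."""
    realized = {frozenset(h & set(subset)) for h in hypotheses}
    return len(realized) == 2 ** len(subset)

def vc_dimension(hypotheses, domain):
    """Largest d such that some d-element subset of `domain` is shattered."""
    dim = 0
    for d in range(1, len(domain) + 1):
        if any(shatters(hypotheses, s) for s in combinations(domain, d)):
            dim = d
    return dim

if __name__ == "__main__":
    domain = list(range(6))
    # Threshold-like hypotheses {0, ..., k-1}: no 2-point set is shattered,
    # so the VC dimension is 1.
    thresholds = [frozenset(range(k)) for k in range(7)]
    print(vc_dimension(thresholds, domain))  # -> 1
```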
Abstract:
The primary aims of scoliosis surgery are to halt the progression of the deformity and to reduce its severity (cosmesis). Currently, deformity correction is measured in terms of posterior parameters (Cobb angles and rib hump), even though the cosmetic concern for most patients is anterior chest wall deformity. In this study, we propose a new measure for assessing anterior chest wall deformity and examine the correlation between rib hump and the new measure. 22 sets of CT scans were retrieved from the QUT/Mater Paediatric Spinal Research Database. The ImageJ software (NIH) was used to manipulate formatted CT scans into 3-dimensional anterior chest wall reconstructions. A 'chest wall angle' was then measured in relation to the first sacral vertebral body. The chest wall angle was found to be a reliable tool in the analysis of chest wall deformity. No correlation was found between the new measure and rib hump angle. Since rib hump has been shown to correlate with vertebral rotation on CT, this suggests that there may be no correlation between anterior and posterior deformity measures. While most surgical procedures will adequately address the coronal imbalance and posterior rib hump elements of scoliosis, they do not reliably alter the anterior chest wall shape. This implies that anterior chest wall deformity is to a large degree an intrinsic deformity, not directly related to vertebral rotation.
Error, Bias, and Long-Branch Attraction in Data for Two Chloroplast Photosystem Genes in Seed Plants
Abstract:
Sequences of two chloroplast photosystem genes, psaA and psbB, together comprising about 3,500 bp, were obtained for all five major groups of extant seed plants and several outgroups among other vascular plants. Strongly supported, but significantly conflicting, phylogenetic signals were obtained in parsimony analyses from partitions of the data into first and second codon positions versus third positions. In the former, both genes agreed on monophyletic gymnosperms, with Gnetales closely related to certain conifers. In the latter, Gnetales are inferred to be the sister group of all other seed plants, with gymnosperms paraphyletic. None of the data supported the modern "anthophyte hypothesis," which places Gnetales as the sister group of flowering plants. A series of simulation studies were undertaken to examine the error rate for parsimony inference. Three kinds of errors were examined: random error, systematic bias (both properties of finite data sets), and statistical inconsistency owing to long-branch attraction (an asymptotic property). Parsimony reconstructions were extremely biased for third-position data for psbB. Regardless of the true underlying tree, a tree in which Gnetales are sister to all other seed plants was likely to be reconstructed for these data. None of the combinations of genes or partitions permits the anthophyte tree to be reconstructed with high probability. Simulations of progressively larger data sets indicate the existence of long-branch attraction (statistical inconsistency) for third-position psbB data if either the anthophyte tree or the gymnosperm tree is correct. This is also true for the anthophyte tree using either psaA third positions or psbB first and second positions. A factor contributing to bias and inconsistency is extremely short branches at the base of the seed plant radiation, coupled with extremely high rates in Gnetales and non-seed plant outgroups. M. J. Sanderson, M. F. Wojciechowski, J.-M. Hu, T. Sher Khan, and S. G. Brady
Abstract:
Objectives. Considerable evidence suggests that enforcement efforts cannot fully explain the high degree of tax compliance. To resolve this puzzle of tax compliance, several researchers have argued that citizens' attitudes toward paying taxes, defined as tax morale, help to explain the high degree of tax compliance. However, most studies have treated tax morale as a black box, without discussing which factors shape it. Additionally, the tax compliance literature provides little empirical research that investigates attitudes toward paying taxes in Europe. Methods. Thus, this article is unique in its examination of citizens' tax morale within three multicultural European countries, Switzerland, Belgium, and Spain, a choice that allows a far more detailed examination of the impact of culture and institutions, using data sets from the World Values Survey and the European Values Survey. Results. The results indicate that cultural and regional differences tend to affect tax morale. Conclusion. The findings suggest that higher legitimacy of political institutions leads to higher tax morale.
Abstract:
Jacques Rancière's work on aesthetics has received a great deal of attention recently. Given that his work has an enormous range – taking in art and literature, political theory, historiography, pedagogy and workers' history – Andrew McNamara and Toni Ross (UNSW) seek to explore his wider project in this interview, while showing how it leads to his alternative insights into aesthetics. Rancière sets aside the core suppositions linking the medium to aesthetic judgment, which have informed many definitions of modernism. Rancière is emphatic in freeing aesthetic judgment from issues of medium-specificity. He argues that the idea of autonomy associated with medium-specificity – or 'truth to the medium' – was 'a very late one' in modernism, and that post-medium trends were already evident in early modernism. While not stressing a simple continuity between early modernism and contemporary art, Rancière nonetheless emphasizes the ethical and political ramifications of maintaining an a-disciplinary stance.
Abstract:
Principal topic: Effectuation theory suggests that entrepreneurs develop their new ventures in an iterative way, selecting possibilities through flexibility and interactions with the market, a focus on affordability of loss rather than maximal return on the capital invested, and the development of pre-commitments and alliances with stakeholders (Sarasvathy, 2001, 2008; Sarasvathy et al., 2005, 2006). In contrast, causation may be described as a rationalistic reasoning method for creating a company: after a comprehensive market analysis to discover opportunities, the entrepreneur selects the alternative with the highest expected return and implements it through the use of a business plan. However, little is known about the consequences of following either of these two processes. One aspect that remains unclear is the relationship between newness and effectuation. On the one hand, it can be argued that a means-centered, interactive (through pre-commitments and alliances with stakeholders from the early phases of venture creation) and open-minded process (through the flexible exploitation of contingencies) should encourage and facilitate the development of innovative solutions. On the other hand, having a close relationship with their "future first customers" and focusing too much on the resources and knowledge already within the firm may be a constraint that is not conducive to innovation, or at least not to radical innovation. While it has been suggested that an effectuation strategy is more likely to be used by innovative entrepreneurs (Sarasvathy, 2001), this hypothesis has not yet been demonstrated (Sarasvathy, 2001). Method: In our attempt to capture newness in its different aspects, we considered the following four domains where newness may occur: new product/service; new methods for promotion and sales; new production methods/sourcing; and market creation. We identified how effectuation may be differently associated with these four domains of newness. To test our four sets of hypotheses, a dataset of 1,329 firms (702 nascent and 627 young firms) randomly selected in Australia was examined using ANOVA with Tukey's HSD test. Results and implications: The results indicate the existence of a curvilinear relationship between effectuation and newness, in which low and high levels of newness are associated with low levels of effectuation, while a medium level of newness is associated with a high level of effectuation. Implications for academia, practitioners and policy makers are also discussed.
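For illustration only, a group comparison of this general kind could be run as follows; this is a hypothetical sketch with synthetic data, and the variable names, group coding and scores are not taken from the study:

```python
# Hypothetical sketch: compare mean effectuation scores across newness groups
# with one-way ANOVA and Tukey's HSD, using synthetic data (not the study's).

import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(0)

# Synthetic effectuation scores for firms classified by level of newness.
groups = {
    "low_newness": rng.normal(2.8, 0.8, 60),
    "medium_newness": rng.normal(3.6, 0.8, 60),
    "high_newness": rng.normal(2.9, 0.8, 60),
}

scores = np.concatenate(list(groups.values()))
labels = np.repeat(list(groups.keys()), [len(v) for v in groups.values()])

# One-way ANOVA across the three groups.
f_stat, p_value = stats.f_oneway(*groups.values())
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

# Tukey's HSD pairwise comparisons between newness groups.
print(pairwise_tukeyhsd(endog=scores, groups=labels, alpha=0.05).summary())
```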
Abstract:
In James Rubin's account of the Kosovo war, he describes an exchange between Secretary Albright and Robin Cook (the British Foreign Secretary). Cook was explaining that it was difficult for Britain to commit to the war without UN Security Council approval, because the legal advice he had received was that such action would be illegal under international law. Albright's response was, simply, "get new lawyers". Rubin "credits" Blair with a "push" that swung the British to "finally agree" that a UN Security Council resolution was "not legally required". Robin Cook later stated in Parliament that the war was legal. Interestingly, Blair did not. This article does not examine whether or not such an exchange took place; rather, it looks at the ethical issues that such a situation would generate. The article suggests what the ethical obligations of the key legal players in such institutional dramas should be, including governments seeking advice, the lawyers giving it, the ministers reporting it and the opposition in Parliament. The article sets out the particular responsibilities of the lawyers and officials of a Westminster system. It also sets out some of the institutional mechanisms for making it more likely that those obligations are fulfilled, as always through the interaction of obligations by different players that makes it more risky for any player to breach his or her ethical obligations. Analogous duties would be faced by the relevant actors in other systems.