833 results for attribute grammars
Abstract:
This report describes research on flow graphs: labeled, directed, acyclic graphs that are abstract representations used in a variety of Artificial Intelligence applications. Flow graphs may be derived from flow grammars much as strings may be derived from string grammars; this derivation process forms a useful model for the stepwise refinement processes used in programming and other engineering domains. The central result of this report is a parsing algorithm for flow graphs. Given a flow grammar and a flow graph, the algorithm determines whether the grammar generates the graph and, if so, finds all possible derivations for it. The author has implemented the algorithm in LISP. The intent of this report is to make flow-graph parsing available as an analytic tool for researchers in Artificial Intelligence. The report explores the intuitions behind the parsing algorithm, contains numerous, extensive examples of its behavior, and provides some guidance for those who wish to customize the algorithm to their own uses.
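The report's parser itself is implemented in LISP and is not reproduced here. As a rough, hypothetical sketch of the derivation idea only (the node labels, the single "Sort" production, and the chain-shaped replacement body below are invented for illustration, not taken from the report), one refinement step might look like this:

```python
# Toy sketch of flow-graph derivation (hypothetical labels, not the report's LISP code).
# A flow graph is a labeled DAG; a production rewrites a nonterminal node into a small
# subgraph, reattaching the node's incoming edges to the subgraph's entry node and its
# outgoing edges to the subgraph's exit node.
from dataclasses import dataclass, field

@dataclass
class FlowGraph:
    labels: dict                               # node id -> label
    edges: set = field(default_factory=set)    # (src, dst) pairs

# Production: nonterminal label -> (replacement labels forming a chain, entry index, exit index)
PRODUCTIONS = {
    "Sort": (["Split", "SortHalves", "Merge"], 0, 2),
}

def expand(graph: FlowGraph, node: int, fresh: int) -> FlowGraph:
    """Apply the production for graph.labels[node], returning a refined graph."""
    body, entry, exit_ = PRODUCTIONS[graph.labels[node]]
    new_ids = list(range(fresh, fresh + len(body)))
    labels = {i: l for i, l in graph.labels.items() if i != node}
    labels.update(dict(zip(new_ids, body)))
    edges = set()
    for s, d in graph.edges:
        s = new_ids[exit_] if s == node else s    # outgoing edges now leave the exit node
        d = new_ids[entry] if d == node else d    # incoming edges now enter the entry node
        edges.add((s, d))
    edges |= {(new_ids[i], new_ids[i + 1]) for i in range(len(body) - 1)}  # chain the body
    return FlowGraph(labels, edges)

# Stepwise refinement: start from a single abstract "Sort" node and expand it.
g = FlowGraph({0: "Input", 1: "Sort", 2: "Output"}, {(0, 1), (1, 2)})
refined = expand(g, 1, fresh=3)
print(refined.labels)   # {0: 'Input', 2: 'Output', 3: 'Split', 4: 'SortHalves', 5: 'Merge'}
print(refined.edges)    # {(0, 3), (3, 4), (4, 5), (5, 2)}
```

Parsing, as described in the report, runs this process in reverse: given a grammar and a graph, it searches for a sequence of such expansions that produces the graph.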
Abstract:
X. Wang, J. Yang, R. Jensen and X. Liu, 'Rough Set Feature Selection and Rule Induction for Prediction of Malignancy Degree in Brain Glioma,' Computer Methods and Programs in Biomedicine, vol. 83, no. 2, pp. 147-156, 2006.
Abstract:
Z. Huang and Q. Shen. Fuzzy interpolative and extrapolative reasoning: a practical approach. IEEE Transactions on Fuzzy Systems, 16(1):13-28, 2008.
Abstract:
Murphy, L. and Thomas, L. 2008. Dangers of a fixed mindset: implications of self-theories research for computer science education. In Proceedings of the 13th Annual Conference on Innovation and Technology in Computer Science Education (Madrid, Spain, June 30 - July 02, 2008). ITiCSE '08. ACM, New York, NY, 271-275.
Abstract:
Faculty of Social Sciences: Institute of Philosophy
Abstract:
Form-focused instruction is usually based on traditional practical/pedagogical grammar descriptions of grammatical features. The comparison of such traditional accounts with cognitive grammar (CG) descriptions seems to favor CG as a basis of pedagogical rules. This is due to the insistence of CG on the meaningfulness of grammar and its detailed analyses of the meanings of particular grammatical features. The differences between traditional and CG rules/descriptions are exemplified by juxtaposing the two kinds of principles concerning the use of the present simple and present progressive to refer to situations happening or existing at speech time. The descriptions provided the bases for the instructional treatment in a quasi-experimental study exploring the effectiveness of using CG descriptions of the two tenses, and of their interplay with stative (imperfective) and dynamic (perfective) verbs, and comparing this effectiveness with the value of grammar teaching relying on traditional accounts found in standard pedagogical grammars. The study involved 50 participants divided into three groups, with one of them constituting the control group and the other two being experimental ones. One of the latter received treatment based on CG descriptions and the other on traditional accounts. CG-based instruction was found to be at least moderately effective in terms of fostering mostly explicit grammatical knowledge, and its effectiveness turned out to be comparable to that of teaching based on traditional descriptions.
Abstract:
Faculty of Modern Languages and Literatures: Institute of Applied Linguistics
Abstract:
Master's dissertation presented to Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Master in Public Relations.
Abstract:
Undergraduate project presented to Universidade Fernando Pessoa in partial fulfilment of the requirements for the Licentiate degree in Nursing.
Abstract:
Dissertation presented to Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Master in Communication Sciences, specialization in Marketing and Advertising.
Abstract:
We present results of calculations [1] that employ a new mixed quantum-classical iterative linearized density matrix propagation approach (ILDM, or so-called Is-Landmap) [2] to explore the survival of coherence in different photosynthetic models. Our model studies confirm the long-lived quantum coherence, while conventional theoretical tools (such as the Redfield equation) fail to describe this phenomenon [3,4]. Our ILDM method is a numerically exact propagation scheme and can serve as a benchmark calculation tool [2]. Results obtained from ILDM and from other recent methods have been compared and show agreement with each other [4,5]. The long-lived coherence plateau has been attributed to the shift of the harmonic potential due to the system-bath interaction, and the harvesting efficiency is a balance between coherence and dissipation [1]. We use this approach to investigate excitation energy transfer dynamics in various light-harvesting complexes, including the Fenna-Matthews-Olson light-harvesting complex [1] and Cryptophyte Phycocyanin 645 [6]. [1] P. Huo and D. F. Coker, J. Chem. Phys. 133, 184108 (2010). [2] E. R. Dunkel, S. Bonella, and D. F. Coker, J. Chem. Phys. 129, 114106 (2008). [3] A. Ishizaki and G. R. Fleming, J. Chem. Phys. 130, 234111 (2009). [4] A. Ishizaki and G. R. Fleming, Proc. Natl. Acad. Sci. 106, 17255 (2009). [5] G. Tao and W. H. Miller, J. Phys. Chem. Lett. 1, 891 (2010). [6] P. Huo and D. F. Coker, in preparation.
Abstract:
Recent work has shown the prevalence of small-world phenomena [28] in many networks. Small-world graphs exhibit a high degree of clustering, yet have typically short path lengths between arbitrary vertices. Internet AS-level graphs have been shown to exhibit small-world behaviors [9]. In this paper, we show that both Internet AS-level and router-level graphs exhibit small-world behavior. We attribute such behavior to two possible causes: the high variability of vertex degree distributions (which were found to follow approximately a power law [15]) and the preference of vertices for local connections. We show that both factors contribute, with different relative degrees, to the small-world behavior of AS-level and router-level topologies. Our findings underscore the inefficacy of the Barabasi-Albert model [6] in explaining the growth process of the Internet, and provide a basis for more promising approaches to the development of Internet topology generators. We present such a generator and show the resemblance of the synthetic graphs it generates to real Internet AS-level and router-level graphs. Using these graphs, we have examined how small-world behaviors affect the scalability of end-system multicast. Our findings indicate that lower variability of vertex degree and stronger preference for local connectivity in small-world graphs result in slower network neighborhood expansion and in longer average path lengths between two arbitrary vertices, which in turn results in better scaling of end-system multicast.
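As a quick, hedged illustration of the two metrics combined in the small-world definition above, the sketch below uses the networkx library with a Watts-Strogatz graph standing in for a measured topology; it is not the authors' topology generator, and the graph sizes and rewiring probability are arbitrary:

```python
# Illustrates the small-world signature: high clustering together with short
# average path lengths. A Watts-Strogatz graph is a stand-in for a measured
# Internet topology; parameters are illustrative only.
import networkx as nx

ring = nx.watts_strogatz_graph(n=1000, k=6, p=0.0)                 # regular ring lattice
small_world = nx.connected_watts_strogatz_graph(n=1000, k=6, p=0.05)  # a few random rewirings

for name, g in [("lattice", ring), ("small-world", small_world)]:
    print(name,
          "clustering:", round(nx.average_clustering(g), 3),
          "avg path length:", round(nx.average_shortest_path_length(g), 2))
# The rewired graph keeps most of the lattice's clustering, but its average
# path length drops sharply -- the small-world behavior described above.
```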
Abstract:
We present what we believe to be the first thorough characterization of live streaming media content delivered over the Internet. Our characterization of over five million requests spanning a 28-day period is done at three increasingly granular levels, corresponding to clients, sessions, and transfers. Our findings support two important conclusions. First, we show that the nature of interactions between users and objects is fundamentally different for live versus stored objects. Access to stored objects is user driven, whereas access to live objects is object driven. This reversal of active/passive roles of users and objects leads to interesting dualities. For instance, our analysis underscores a Zipf-like profile for user interest in a given object, which is to be contrasted with the classic Zipf-like popularity of objects for a given user. Also, our analysis reveals that transfer lengths are highly variable and that this variability is due to the stickiness of clients to a particular live object, as opposed to structural (size) properties of objects. Second, based on observations we make, we conjecture that the particular characteristics of live media access workloads are likely to be highly dependent on the nature of the live content being accessed. In our study, this dependence is clear from the strong temporal correlations we observed in the traces, which we attribute to the synchronizing impact of live content on access characteristics. Based on our analyses, we present a model for live media workload generation that incorporates many of our findings, and which we implement in GISMO [19].
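A minimal sketch of the kind of Zipf-driven synthetic workload such a model produces is given below. This is not the GISMO toolkit; the exponent, population sizes, and heavy-tailed session-length distribution are invented for illustration of the "Zipf-like user interest plus sticky, highly variable transfers" idea:

```python
# Minimal synthetic-workload sketch in the spirit of the abstract above.
# NOT the GISMO toolkit; all parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def zipf_ranks(n_requests: int, n_items: int, alpha: float = 1.1) -> np.ndarray:
    """Draw item ranks with a Zipf-like profile (probability ~ 1 / rank**alpha)."""
    weights = 1.0 / np.arange(1, n_items + 1) ** alpha
    return rng.choice(n_items, size=n_requests, p=weights / weights.sum())

# For a live object, the Zipf-like profile applies to user interest: a few sticky
# clients account for most of the viewing time, so session (transfer) lengths are
# drawn from a heavy-tailed distribution as well.
client_ranks = zipf_ranks(n_requests=10_000, n_items=500)
session_minutes = rng.pareto(a=1.5, size=10_000) + 1.0   # heavy-tailed transfer lengths

print("share of viewing time from the top 10 clients:",
      round(session_minutes[client_ranks < 10].sum() / session_minutes.sum(), 2))
```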
Abstract:
In many real-world situations, we make decisions in the presence of multiple, often conflicting and non-commensurate objectives. The process of optimizing systematically and simultaneously over a set of objective functions is known as multi-objective optimization. In multi-objective optimization, we have a (possibly exponentially large) set of decisions and each decision has a set of alternatives. Each alternative depends on the state of the world, and is evaluated with respect to a number of criteria. In this thesis, we consider decision-making problems in two scenarios. In the first scenario, the current state of the world, under which the decisions are to be made, is known in advance. In the second scenario, the current state of the world is unknown at the time of making decisions. For decision making under certainty, we consider the framework of multi-objective constraint optimization and focus on extending the algorithms that solve these models to the case where there are additional trade-offs. We focus especially on branch-and-bound algorithms that use a mini-buckets algorithm for generating the upper bound at each node of the search tree (in the context of maximizing values of objectives). Since the size of the guiding upper-bound sets can become very large during the search, we introduce efficient methods for reducing these sets while still maintaining the upper-bound property. We define a formalism for imprecise trade-offs, which allows the decision maker, during the elicitation stage, to specify a preference for one multi-objective utility vector over another, and to use such preferences to infer other preferences. The induced preference relation is then used to eliminate dominated utility vectors during the computation. For testing dominance between multi-objective utility vectors, we present three different approaches: the first is based on linear programming; the second uses a distance-based algorithm (which measures the distance between a point and a convex cone); the third uses matrix multiplication, which results in much faster dominance checks with respect to the preference relation induced by the trade-offs. Furthermore, we show that our trade-offs approach, which is based on a preference inference technique, can also be given an alternative semantics based on the well-known Multi-Attribute Utility Theory. Our comprehensive experimental results on common multi-objective constraint optimization benchmarks demonstrate that the proposed enhancements allow the algorithms to scale up to much larger problems than before. For decision-making problems under uncertainty, we describe multi-objective influence diagrams, based on a set of p objectives, where utility values are vectors in R^p and are typically only partially ordered. These can be solved by a variable elimination algorithm, leading to a set of maximal values of expected utility. If the Pareto ordering is used, this set can often be prohibitively large. We consider approximate representations of the Pareto set based on ϵ-coverings, allowing much larger problems to be solved. In addition, we define a method for incorporating user trade-offs, which also greatly improves the efficiency.
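As a minimal sketch of the plain componentwise Pareto-dominance test underlying these checks (maximization convention; the thesis's trade-off-induced preference relation, LP and cone-distance tests, and matrix-multiplication speed-up are not reproduced here), consider:

```python
# Componentwise Pareto dominance and undominated-set filtering (maximization).
# This is only the baseline Pareto relation, not the trade-off-induced relation
# or the faster dominance checks developed in the thesis.
import numpy as np

def dominates(u: np.ndarray, v: np.ndarray) -> bool:
    """u Pareto-dominates v: at least as good in every objective, strictly better in one."""
    return bool(np.all(u >= v) and np.any(u > v))

def pareto_front(points: np.ndarray) -> np.ndarray:
    """Keep only the utility vectors not dominated by any other vector."""
    keep = []
    for i, p in enumerate(points):
        if not any(dominates(q, p) for j, q in enumerate(points) if j != i):
            keep.append(p)
    return np.array(keep)

utilities = np.array([[3, 5], [4, 4], [2, 6], [3, 4], [1, 1]])
print(pareto_front(utilities))   # [[3 5] [4 4] [2 6]]: [3,4] and [1,1] are dominated
```

With many objectives this maximal set grows quickly, which is the motivation the abstract gives for ϵ-coverings and for pruning via additional trade-offs.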
“Something isn’t right here”: American exceptionalism and the creative nonfiction of the Vietnam War
Abstract:
In this thesis, I argue that few attempts were as effective in correcting the exceptionalist ethos of the United States as the creative nonfiction written by the veterans and journalists of the Vietnam War. Using critical works on creative nonfiction, I identify the characteristics of the genre that allowed Paul John Eakin to call it ‘a special kind of fiction.’ I summarise a brief history of creative nonfiction to demonstrate how it became a distinctly American form despite its Old World origins. I then claim that it was the genre most suited to the kind of ideological transformation that many hoped to instigate in U.S. society in the aftermath of Vietnam. Following this, the study explores how this “new” myth-making process occurred. I use Tim O’Brien’s If I Die in a Combat Zone and Philip Caputo’s A Rumor of War to illustrate how autobiography/memoir was able to demonstrate the detrimental effect that America’s exceptionalist ideology was having on its population. Utilising narrative and autobiographical theory, I contend that these accounts represented a collective voice which spoke for all Americans in the years after Vietnam. Using Neil Sheehan’s A Bright Shining Lie and C.D.B. Bryan’s Friendly Fire, I illustrate how literary journalism highlighted the hubris of the American government. I contend that while poiesis is an integral attribute of creative nonfiction, through the inclusion of extraneous bibliographic material authors of the genre could also be seen as creating a literary context that predisposes the reader towards an empirical interpretation of the events documented within. Finally, I claim that oral histories were in their essence a synthesis of “everyman” experiences very much in keeping with the American zeitgeist of the early Eighties. Focussing solely on Al Santoli’s Everything We Had, I demonstrate how such polyphonic narratives personalised the history of the Vietnam War.