718 results for Intuitive
Abstract:
With the rapid growth of information technology, there has been a steady increase in the number of websites aimed at direct contact with the consumer, where users can search for, select, and purchase a product or service directly from the supplier. Platforms that support searches for tourism products can return a large volume of data for each query. This information is mostly textual and is generally presented to the user (via a web page) as a long list or table. This work therefore concerns the proposal and implementation of a set of highly ergonomic interfaces that allow simple and intuitive interaction between the user and tourism data, specifically flight and hotel reservations. For this purpose, a Rich Internet Application was developed that lets the user adjust certain factors influencing these tourism products, in order to find more easily, within the wide range of search results, what best fits their tastes and objectives.
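As an illustration of the adjustable-factor idea described above, the following minimal Python sketch re-ranks a set of flight results as the user changes a few weights; the field names, weights and figures are hypothetical, not taken from the application itself.

```python
# A minimal sketch of the adjustable-factor idea: the user moves a few
# sliders (weights) and a large result set is re-ranked accordingly.
# Field names and figures are illustrative, not the application's data model.
flights = [
    {"airline": "A", "price": 120, "duration_h": 2.0, "stops": 0},
    {"airline": "B", "price": 80,  "duration_h": 5.5, "stops": 1},
    {"airline": "C", "price": 150, "duration_h": 1.5, "stops": 0},
]

def rank(results, w_price=0.5, w_duration=0.3, w_stops=0.2):
    # Normalise each factor to 0..1 (lower is better) and blend by weight.
    max_p = max(r["price"] for r in results)
    max_d = max(r["duration_h"] for r in results)
    max_s = max(r["stops"] for r in results) or 1
    def cost(r):
        return (w_price * r["price"] / max_p
                + w_duration * r["duration_h"] / max_d
                + w_stops * r["stops"] / max_s)
    return sorted(results, key=cost)

# A price-sensitive user: airline B (cheapest) rises to the top.
for f in rank(flights, w_price=0.7, w_duration=0.2, w_stops=0.1):
    print(f["airline"], f["price"], f["duration_h"], f["stops"])
```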
Abstract:
The search for patterns or motifs in data represents a problem area of key interest to finance and economics researchers. In this paper we introduce the Motif Tracking Algorithm, a novel immune-inspired pattern identification tool that is able to identify unknown motifs of unspecified length which repeat within time series data. The power of the algorithm comes from the fact that it uses a small number of parameters with minimal assumptions regarding the data being examined or the underlying motifs. Our interest lies in applying the algorithm to financial time series data to identify unknown patterns. The algorithm is tested using three separate data sets; its particular suitability to financial data is shown by applying it to oil price data. In all cases the algorithm identifies the presence of a motif population in a fast and efficient manner, owing to its use of an intuitive symbolic representation. The resulting population of motifs is shown to have considerable potential value for other applications such as forecasting and algorithm seeding.
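The abstract does not give the algorithm's actual encoding, but the following Python sketch illustrates the general idea behind a symbolic representation for motif discovery: discretise price moves into a small alphabet, then look for repeated words. The alphabet and threshold are assumptions for illustration only.

```python
import numpy as np

def symbolise(series):
    """Convert a numeric series into a symbol string.

    Each first difference is bucketed by sign and magnitude -- a crude
    stand-in for the paper's symbolic representation (the real encoding
    and thresholds differ).
    """
    diffs = np.diff(series)
    threshold = 0.5 * np.std(diffs)
    symbols = []
    for d in diffs:
        if d > threshold:
            symbols.append("u")   # up move
        elif d < -threshold:
            symbols.append("d")   # down move
        else:
            symbols.append("s")   # sideways
    return "".join(symbols)

def find_repeating_motifs(symbols, length):
    """Return substrings of a given length that occur more than once."""
    seen = {}
    for i in range(len(symbols) - length + 1):
        word = symbols[i:i + length]
        seen.setdefault(word, []).append(i)
    return {w: idx for w, idx in seen.items() if len(idx) > 1}

prices = np.cumsum(np.random.default_rng(0).normal(size=200))  # toy series
print(find_repeating_motifs(symbolise(prices), length=5))
```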
Abstract:
As anthropogenic activities push many ecosystems toward different functional regimes, the resilience of social-ecological systems is becoming a pressing concern. Local actors, involved in a wide diversity of groups (ranging from independent local initiatives to large formal institutions), can act on these issues by collaborating on the development, promotion or implementation of practices more in line with what the environment can provide. Complex networks emerge from these repeated collaborations, and it has been shown that the topology of these networks can improve the resilience of the social-ecological systems (SESs) in which they take part. The topology of actor networks that favours the resilience of their SES is characterised by a combination of several factors: the structure must be modular, to help the different groups develop and propose solutions that are both more innovative (by reducing the homogenisation of the network) and closer to their own interests; it must be well connected and easily synchronisable, to facilitate consensus and to increase social capital and learning capacity; finally, it must be robust, so that the first two characteristics do not suffer from the voluntary withdrawal or the sidelining of certain actors. These characteristics, which are relatively intuitive both conceptually and in their mathematical application, are often used separately to analyse the structural qualities of empirical actor networks. However, some of them are inherently incompatible with one another. For example, the degree of modularity of a network cannot increase at the same rate as its connectivity, and connectivity cannot be improved while also improving robustness. This obstacle makes it difficult to create a global measure, because the degree to which an actor network helps improve the resilience of its SES cannot be a simple sum of the characteristics listed above, but is rather the result of a subtle trade-off among them. The work presented here aims (1) to explore the trade-offs between these characteristics; (2) to propose a measure of the degree to which an empirical actor network contributes to the resilience of its SES; and (3) to analyse an empirical network in light of, among other things, these structural qualities. This thesis is organised around an introduction and four chapters numbered 2 to 5. Chapter 2 reviews the literature on SES resilience and identifies a series of structural characteristics (along with the corresponding network measures) linked to improved resilience in SESs. Chapter 3 is a case study of the Eyre Peninsula, a rural region of South Australia where land use, together with climate change, is contributing to the erosion of biodiversity. For this case study, fieldwork was carried out in 2010 and 2011, during which a series of interviews made it possible to compile a list of the actors involved in the co-management of biodiversity on the peninsula. The data collected were used to develop an online questionnaire documenting the interactions between these actors. These two steps allowed the reconstruction of a weighted, directed network of 129 individual actors and 1,180 relationships.
Chapter 4 describes a methodology for measuring the degree to which an actor network contributes to the resilience of the SES in which it is embedded. The method proceeds in two steps: first, an optimisation algorithm (simulated annealing) is used to construct a semi-random archetype corresponding to a trade-off between high levels of modularity, connectivity and robustness. Second, an empirical network (such as that of the Eyre Peninsula) is compared to the archetypal network using a structural distance measure. The shorter the distance, the closer the empirical network is to its optimal configuration. The fifth and final chapter improves on the simulated annealing algorithm used in Chapter 4. As is customary for this kind of algorithm, the simulated annealing used projected the dimensions of the multi-objective problem onto a single dimension (in the form of a weighted average). While this technique gives very good results for a single run, it produces only one solution among the multitude of possible trade-offs between the different objectives. To better explore these trade-offs, we propose a multi-objective simulated annealing algorithm that, rather than optimising a single solution, optimises a multidimensional surface of solutions. This study, which focuses on the social part of social-ecological systems, improves our understanding of the actor structures that contribute to SES resilience. It shows that while some resilience-enhancing characteristics are incompatible (modularity and connectivity, or, to a lesser extent, connectivity and robustness), others are more easily reconciled (connectivity and synchronisability, or, to a lesser extent, modularity and robustness). It also provides an intuitive method for quantitatively measuring empirical actor networks, thereby opening the way to, for example, case-study comparisons or the monitoring of actor networks over time. In addition, this thesis includes a case study that sheds light on the importance of certain institutional groups for coordinating collaborations and knowledge exchanges between actors with potentially divergent interests.
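To make the Chapter 4 procedure concrete, here is a minimal Python sketch (using networkx) of a single-objective simulated annealing that rewires a random graph toward a weighted blend of modularity, density and robustness. The metrics, weights and move set are illustrative stand-ins for the thesis's own choices, and this is exactly the single-dimension projection that the final chapter sets out to improve on.

```python
import math
import random
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities, modularity

def modularity_score(G):
    return modularity(G, greedy_modularity_communities(G))

def robustness_score(G):
    # Fraction of nodes still in the giant component after the hub is removed.
    H = G.copy()
    hub = max(H.degree, key=lambda kv: kv[1])[0]
    H.remove_node(hub)
    return len(max(nx.connected_components(H), key=len)) / G.number_of_nodes()

def objective(G, w=(1.0, 1.0, 1.0)):
    # Weighted average of modularity, density and robustness: the
    # single-dimension projection the thesis later generalises.
    parts = (modularity_score(G), nx.density(G), robustness_score(G))
    return sum(wi * p for wi, p in zip(w, parts)) / sum(w)

def anneal(n=60, m=150, steps=300, t0=0.05, cooling=0.99, seed=1):
    rng = random.Random(seed)
    current = nx.gnm_random_graph(n, m, seed=seed)
    while not nx.is_connected(current):
        current = nx.gnm_random_graph(n, m, seed=rng.randrange(10**6))
    cur_score = objective(current)
    best, best_score, t = current, cur_score, t0
    for _ in range(steps):
        cand = current.copy()
        u, v = rng.choice(list(cand.edges))      # move: rewire one edge
        cand.remove_edge(u, v)
        a, b = rng.sample(list(cand.nodes), 2)
        if cand.has_edge(a, b):
            continue
        cand.add_edge(a, b)
        if not nx.is_connected(cand):
            continue                              # keep the archetype connected
        score = objective(cand)
        # Metropolis rule: always accept improvements, sometimes accept losses.
        if score >= cur_score or rng.random() < math.exp((score - cur_score) / t):
            current, cur_score = cand, score
            if score > best_score:
                best, best_score = cand, score
        t *= cooling
    return best, best_score

archetype, score = anneal()
print(f"archetype objective: {score:.3f}")
```

An empirical network would then be compared against `archetype` with a structural distance measure, as the abstract describes.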
Abstract:
Maintaining accessibility to and understanding of digital information over time is a complex challenge that often requires contributions and interventions from a variety of individuals and organizations. The processes of preservation planning and evaluation are fundamentally implicit and share a similar complexity. Both demand comprehensive knowledge and understanding of every aspect of the to-be-preserved content and the contexts within which preservation is undertaken. Consequently, means are required for the identification, documentation and association of those properties of data, representation and management mechanisms that in combination lend value, facilitate interaction and influence the preservation process. These properties may be almost limitless in their diversity, but they are integral to the establishment of classes of risk exposure and to the planning and deployment of appropriate preservation strategies. We explore several research objectives within the course of this thesis. Our main objective is the conception of an ontology for risk management of digital collections. Incorporated within this are our aims to survey the contexts within which preservation has been undertaken successfully, to develop an appropriate methodology for risk management, to evaluate existing preservation evaluation approaches and metrics, to structure best-practice knowledge and, lastly, to demonstrate a range of tools that utilise our findings. We describe a mixed methodology that uses interviews and surveys, extensive content analysis, practical case studies, and iterative software and ontology development. We build on a robust foundation: the development of the Digital Repository Audit Method Based on Risk Assessment. We summarise the extent of the challenge facing the digital preservation community (and, by extension, users and creators of digital materials from many disciplines and operational contexts) and present the case for a comprehensive and extensible knowledge base of best practice. These challenges are manifested in the scale of data growth, in increasing complexity, and in the increasing onus on communities with no formal training to offer assurances of data management and sustainability. Collectively they imply a challenge that demands an intuitive and adaptable means of evaluating digital preservation efforts. The need for individuals and organisations to validate the legitimacy of their own efforts is a particular priority. We introduce our approach, based on risk management. Risk is an expression of the likelihood of a negative outcome combined with the impact of such an occurrence. We describe how risk management may be considered synonymous with preservation activity: a persistent effort to negate the dangers posed to information availability, usability and sustainability. Risks can be characterised according to their associated goals, activities, responsibilities and policies, in terms of both their manifestation and their mitigation. They can be deconstructed into their atomic units, and responsibility for their resolution can be delegated appropriately. We go on to describe how the manifestation of risks typically spans an entire organisational environment, and how taking risk as the focus of our analysis safeguards against omissions that may occur when pursuing functional, departmental or role-based assessment. We discuss the importance of relating risk factors, through the risks themselves or through associated system elements.
Doing so will yield the preservation best-practice knowledge base that is conspicuously lacking within the international digital preservation community. We present as research outcomes an encapsulation of preservation practice (and explicitly defined best practice) as a series of case studies, in turn distilled into atomic, related information elements. We conduct our analyses in the formal evaluation of memory institutions in the UK, the US and continental Europe. Furthermore, we showcase a series of applications that use the fruits of this research as their intellectual foundation. Finally, we document our results in a range of technical reports and conference and journal articles. We present evidence of preservation approaches and infrastructures from a series of case studies conducted in a range of international preservation environments. We then aggregate this into a linked data structure entitled PORRO, an ontology relating preservation repository, object and risk characteristics, intended to support preservation decision making and evaluation. The methodology leading to this ontology is outlined, and lessons are drawn by revisiting legacy studies and exposing the resource and its associated applications to evaluation by the digital preservation community.
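As a toy illustration of risks being deconstructed into atomic units with delegated responsibility, the sketch below models a composite risk whose exposure combines likelihood and impact; the field names are assumptions for illustration, not the PORRO ontology's actual vocabulary.

```python
from dataclasses import dataclass, field

@dataclass
class Risk:
    """A single preservation risk, loosely in the spirit of the thesis.

    Field names and the aggregation rule are illustrative assumptions,
    not terms or semantics taken from the PORRO ontology itself.
    """
    name: str
    likelihood: float          # probability of occurrence, 0..1
    impact: float              # severity if it occurs, 0..1
    owner: str = "unassigned"  # responsibility can be delegated
    children: list = field(default_factory=list)  # atomic sub-risks

    def exposure(self) -> float:
        # A composite risk's exposure is driven by its worst atomic unit.
        own = self.likelihood * self.impact
        return max([own] + [c.exposure() for c in self.children])

format_obsolescence = Risk(
    "format obsolescence", likelihood=0.4, impact=0.9, owner="repository team",
    children=[
        Risk("no registered renderer", 0.3, 0.9, owner="format specialist"),
        Risk("specification unavailable", 0.1, 0.7, owner="format specialist"),
    ],
)
print(f"exposure: {format_obsolescence.exposure():.2f}")
```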
Abstract:
Interest in the internationalization process of small and medium-sized enterprises (SMEs) is increasing along with the growth of SMEs' contribution to GDP. The Internet provides an opportunity to offer a variety of services online and to reach market niches worldwide. The overlap of SMEs' internationalization and online services is the main focus of this research. Most SMEs internationalize based on the intuitive decisions of the company's CEO and waste limited resources on worthless attempts. The purpose of this research is to define effective approaches to online service internationalization and to the selection of the first international market. The research is a single holistic case study of a local massive open online courses (MOOC) platform going global. It considers internationalization costs and the internationalization theories applicable to online services. The research includes a preliminary screening of the markets, an in-depth analysis based on macro parameters of the market and specific characteristics of the customers, and expert evaluation of the results. Specific issues such as the GILT (Globalization, Internationalization, Localization and Translation) approach and Internet-enabled internationalization are considered. The research results include recommendations on an international market selection methodology for online services and on effective internationalization strategy development.
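A minimal sketch of the kind of weighted market screen the abstract alludes to might look as follows; the criteria, weights and scores are entirely hypothetical and only illustrate the mechanics of preliminary screening.

```python
# Hypothetical weighted-score market screen. All criteria, weights and
# normalised figures (0..1) are invented, not the thesis's actual data.
markets = {
    "Germany": {"gdp_per_capita": 0.8, "internet_penetration": 0.9,
                "language_fit": 0.6, "low_competition": 0.4},
    "Brazil":  {"gdp_per_capita": 0.4, "internet_penetration": 0.6,
                "language_fit": 0.3, "low_competition": 0.7},
    "India":   {"gdp_per_capita": 0.2, "internet_penetration": 0.4,
                "language_fit": 0.5, "low_competition": 0.5},
}
weights = {"gdp_per_capita": 0.2, "internet_penetration": 0.3,
           "language_fit": 0.2, "low_competition": 0.3}

def score(profile):
    # Higher is more attractive for every criterion, including
    # "low_competition" (a less crowded market scores higher).
    return sum(weights[k] * v for k, v in profile.items())

for name, profile in sorted(markets.items(), key=lambda kv: -score(kv[1])):
    print(f"{name}: {score(profile):.2f}")
```

In-depth analysis and expert evaluation would then be applied to the top-ranked candidates only.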
Abstract:
I approach my practice through the truth that art is inseparable from reality. Reducing art to a single idea is an unnatural limitation, because the creative process and its manifestations result from many parallel ideas, instincts, emotions and reflections. In the following, I trace the central sources of inspiration for my work and attempt to bridge the experiential and intuitive processes that concurrently fuel my creative process.
Abstract:
We investigate the Becker-Döring model of nucleation with three generalisations: an input of monomer, an input of inhibitor, and finally allowing the monomers to form two morphologies of cluster. We assume size-independent aggregation and fragmentation rates. Initially we consider the problem of constant monomer input and determine the steady-state solution approached in the large-time limit, and the manner in which it is approached. Secondly, in addition to a constant input of monomer we allow a constant input of inhibitor, which prevents clusters from growing any larger and removes them from the kinetics of the process; the inhibitor is consumed in the act of poisoning a cluster. We determine a critical ratio of poison to monomer input below which the cluster concentrations tend to a non-zero steady-state solution and the poison concentration tends to a finite value. Above the critical input ratio, the concentrations of all cluster sizes tend to zero and the poison concentration grows without limit. In both cases the solution in the large-time limit is determined. Finally we consider a model where monomers form two morphologies, but the inhibitor acts on only one morphology. Four cases are identified, depending on the relative poison and monomer input rates and the relative thermodynamic stability. In each case we determine the final cluster distribution and poison concentration. We find that poisoning the less stable cluster type can have a significant impact on the structure of the more stable cluster distribution, a counter-intuitive result. All results are shown to agree with numerical simulation.
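The model lends itself to numerical exploration; the following sketch integrates a truncated Becker-Döring system with constant monomer input and size-independent rates, under illustrative parameter values rather than the paper's.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Truncated Becker-Doring system with constant monomer input Q and
# size-independent aggregation (A) and fragmentation (B) rates.
# c[0] is the monomer concentration c_1; c[r-1] is c_r for r <= N.
A, B, Q, N = 1.0, 0.5, 0.1, 50   # illustrative values, not the paper's

def fluxes(c):
    # J_r = A*c_1*c_r - B*c_{r+1}, for r = 1..N-1
    return A * c[0] * c[:-1] - B * c[1:]

def rhs(t, c):
    J = fluxes(c)
    dc = np.zeros_like(c)
    dc[1:-1] = J[:-1] - J[1:]           # interior clusters: J_{r-1} - J_r
    dc[-1] = J[-1]                       # largest tracked cluster size
    dc[0] = Q - 2 * J[0] - J[1:].sum()   # monomers: input minus consumption
    return dc

c0 = np.zeros(N)
c0[0] = 1.0
sol = solve_ivp(rhs, (0, 200), c0, method="LSODA")
print("late-time cluster concentrations c_1..c_6:", sol.y[:6, -1].round(4))
```

The truncation at size N is a numerical convenience; the paper's analysis treats the infinite system.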
Abstract:
My thesis explores the formation of the subject in Faulkner's Go Down, Moses, Toni Morrison's Song of Solomon, and Gloria Naylor's Mama Day. I examine the concept of property in terms of how male protagonists are obsessed with materialistic ownership and with the subordination of women who, as property, consolidate their manhood. The three novelists, despite their racial, gendered, and literary differences, share the view that identity and truth are mere social and cultural constructs. I incorporate the work of Judith Butler and other poststructuralist figures who see identity as a matter of performance rather than a natural entity. My thesis explores the theme of freedom, which I attach to the ways characters use their bodies either to confine themselves or to emancipate themselves from the restricting world of race, class, and gender. The three novelists deconstruct any system of belief that promulgates the objectivity of truth in historical documents. History in the three novels, as with the protagonists' perception of identity, remains a social construct laden with distortions serving particular political or ideological agendas. My thesis gives voice to African American female characters who are associated with love and with racial and gender resistance. They become the reservoirs of the African American legacy through their association with the oral and intuitionist mode of knowing, which subverts the male characters' obsession with property and with the mainstream empiricist world. In this dissertation, I use the concept of hybridity as a literary and theoretical device that African American writers employ. In effect, I draw on the postcolonial studies of Henry Louis Gates, Paul Gilroy, W. E. B. Du Bois, James Clifford, and Arjun Appadurai in order to reflect upon the fluidity of Morrison's and Naylor's works. I show how these two novelists subvert Faulkner's essentialist perception of truth and of racial and gendered identity. They associate the myth of the Flying African with the notion of hybridity by making their male protagonists criss-cross Northern and Southern regions. I refer to Mae Gwendolyn Henderson's article "Speaking in Tongues" in my analysis of how Naylor subverts the patriarchal text of both Faulkner and Morrison by embarking on a more feminine version of the Flying African, which she relates to an ex-slave, Sapphira Wade, a volatile female character who resists any fixed claim over her story and identity. In dealing with the concept of hybridity, I show that Naylor rewrites both authors' South by making Willow Springs a more fluid space, an assumption that unsettles the scores of critics who associate the island with authenticity and exclusive rootedness.
Abstract:
In the context of computer numerical control (CNC) and computer-aided manufacturing (CAM), the capabilities of programming languages such as symbolic and intuitive programming, program portability and geometrical portfolio have special importance. They allow programmers to save time, to avoid errors during part programming, and to re-use code. Our updated literature review indicates that the current state of the art presents voids in parametric programming, program portability and programming flexibility. In response to this situation, this article presents a compiler implementation for EGCL (Extended G-code Language), a new, enriched CNC programming language which allows the use of descriptive variable names, geometrical functions and flow-control statements (if-then-else, while). Our compiler produces low-level, generic, elementary ISO-compliant G-code, thus allowing flexibility in the choice of the executing CNC machine and in portability. Our results show that readable variable names and flow-control statements allow simplified and intuitive part programming and permit program re-use. Future work includes allowing the programmer to define their own functions in EGCL, in contrast to the current status of having them as built-in library functions.
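EGCL's concrete syntax is not given in the abstract, so the Python sketch below uses an invented surface form to show the flavour of one compiler pass: resolving descriptive variable names into the numeric literals that plain ISO G-code requires. The real compiler also handles geometrical functions and flow control.

```python
import re

# Toy source in an invented EGCL-like notation (the actual EGCL syntax
# is not shown in the abstract): named values feed later motion commands.
source = """
let safe_height = 5.0
let cut_depth = -1.5
G0 Z{safe_height}
G1 Z{cut_depth} F100
G0 Z{safe_height}
"""

def compile_to_gcode(src):
    env, out = {}, []
    for line in filter(None, (l.strip() for l in src.splitlines())):
        m = re.match(r"let (\w+) = (.+)", line)
        if m:
            env[m.group(1)] = float(m.group(2))   # bind a named value
            continue
        # Replace {name} references with resolved numeric literals,
        # yielding plain, machine-independent G-code.
        out.append(re.sub(r"\{(\w+)\}",
                          lambda g: f"{env[g.group(1)]:g}", line))
    return "\n".join(out)

print(compile_to_gcode(source))
# G0 Z5
# G1 Z-1.5 F100
# G0 Z5
```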
Abstract:
According to a traditional rationalist proposal, it is possible to attain knowledge of certain necessary truths by means of insight—an epistemic mental act that combines the 'presentational' character of perception with the a priori status usually reserved for discursive reasoning. In this dissertation, I defend the insight proposal in relation to a specific subject matter: elementary Euclidean plane geometry, as set out in Book I of Euclid's Elements. In particular, I argue that visualizations and visual experiences of diagrams allow human subjects to grasp truths of geometry by means of visual insight. In the first two chapters, I provide an initial defense of the geometrical insight proposal, drawing on a novel interpretation of Plato's Meno to motivate the view and to reply to some objections. In the remaining three chapters, I provide an account of the psychological underpinnings of geometrical insight, a task that requires considering the psychology of visual imagery alongside the details of Euclid's geometrical system. One important challenge is to explain how basic features of human visual representations can serve to ground our intuitive grasp of Euclid's postulates and other initial assumptions. A second challenge is to explain how we are able to grasp general theorems by considering diagrams that depict only special cases. I argue that both of these challenges can be met by an account that regards geometrical insight as based in visual experiences involving the combined deployment of two varieties of 'dynamic' visual imagery: one that allows the subject to visually rehearse spatial transformations of a figure's parts, and another that allows the subject to entertain alternative ways of structurally integrating the figure as a whole. It is the interplay between these two forms of dynamic imagery that enables a visual experience of a diagram, suitably animated in visual imagination, to justify belief in the propositions of Euclid’s geometry. The upshot is a novel dynamic imagery account that explains how intuitive knowledge of elementary Euclidean plane geometry can be understood as grounded in visual insight.
Abstract:
This thesis examines the short-term impact of credit rating announcements on the daily stock returns of 41 European banks indexed in the STOXX Europe 600 Banks index. The time period of this study is 2002–2015, and the ratings are long-term issuer ratings provided by S&P, Moody's and Fitch. Bank ratings matter for a bank's operating costs, so it is interesting to investigate how investors react to changes in creditworthiness. The study objective is achieved by conducting an event study, extended with a cross-sectional linear regression to investigate other potential determinants surrounding rating changes. The research hypotheses and the motivation for additional tests are derived from prior research. The main hypotheses explore whether rating changes have an effect on stock returns, when this possible reaction occurs, and whether it is asymmetric between upgrades and downgrades. The findings provide evidence that rating announcements have an impact on stock returns in the context of European banks. The results also support the existence of an asymmetry in the capital market reaction to rating upgrades and downgrades. Rating downgrades are associated with statistically significant negative abnormal returns on the event day, although the reaction is rather modest. No statistically significant reaction is found for rating upgrades on the event day. These results hold for both rating changes and rating watches. No anticipation is observed in the case of rating changes, but there is a statistically significant cumulative negative (positive) price reaction before the event day for negative (positive) watch announcements. The regression provides evidence that the stock price reaction is stronger for rating downgrades occurring within the below-investment-grade class than within the investment-grade class. This is intuitive, as investors are more concerned about their investments in lower-rated companies. In addition, the price reaction of larger banks to rating downgrades is more muted than that of smaller banks. The reason may be that larger banks are usually more widely followed by the public. However, the results may also provide evidence of the existence of the so-called "too big to fail" subsidy that dampens the negative returns of larger banks.
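The event-study machinery described above can be sketched in a few lines: fit a market model over an estimation window, then measure abnormal returns around the announcement. The data below are simulated and the window lengths are illustrative, not the thesis's actual sample or specification.

```python
import numpy as np

# Simulated daily returns standing in for a bank stock and its benchmark.
rng = np.random.default_rng(0)
market = rng.normal(0.0003, 0.01, 280)
stock = 0.0001 + 0.9 * market + rng.normal(0, 0.008, 280)
event_day = 260                                  # rating announcement
est_r, est_m = stock[:250], market[:250]         # estimation window

# Market model: R_i = alpha + beta * R_m + eps, fit by OLS.
beta, alpha = np.polyfit(est_m, est_r, 1)
abnormal = stock - (alpha + beta * market)

# Abnormal return on the event day and a short cumulative window [-2, +2].
ar = abnormal[event_day]
car = abnormal[event_day - 2: event_day + 3].sum()
t_stat = ar / abnormal[:250].std(ddof=2)
print(f"AR={ar:.4f}, CAR[-2,+2]={car:.4f}, t={t_stat:.2f}")
```

The cross-sectional step would then regress such abnormal returns on bank characteristics (size, rating class, and so on).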
Abstract:
Sub-wavelength structures are enabling the design of devices based on dielectric waveguides with unprecedented performance in both the near-infrared and mid-infrared wavelength regions. These devices include fiber-to-chip grating couplers with sub-decibel efficiency, waveguide couplers with bandwidths of several hundred nanometers, and low-loss suspended waveguides. Here we report our progress in the electromagnetic modelling and simulation of sub-wavelength structures, providing at the same time an intuitive view of their fundamental optical properties. Furthermore, we address design strategies for several integrated optical devices based on these structures, and present the latest experimental results for structures operating at both near- and mid-infrared wavelengths.
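One source of the intuition the abstract mentions is the effective-medium view of deeply sub-wavelength gratings: below the diffraction limit, the structure behaves like a homogeneous anisotropic medium. The sketch below applies the standard zeroth-order mixing formulas with typical (assumed) material indices; it is a first approximation that the full electromagnetic simulation reported here would refine.

```python
import numpy as np

# Zeroth-order effective-medium estimate for a deeply sub-wavelength
# grating alternating silicon and silica; indices are typical textbook
# values near 1550 nm, assumed for illustration.
n_si, n_clad = 3.476, 1.444
duty_cycle = np.linspace(0.1, 0.9, 9)   # fraction of the period filled with Si

eps_si, eps_clad = n_si**2, n_clad**2
# Field parallel to the grating segments (TE-like polarisation):
eps_par = duty_cycle * eps_si + (1 - duty_cycle) * eps_clad
# Field perpendicular to the segments (TM-like polarisation):
eps_perp = 1.0 / (duty_cycle / eps_si + (1 - duty_cycle) / eps_clad)

for dc, n_par, n_perp in zip(duty_cycle, np.sqrt(eps_par), np.sqrt(eps_perp)):
    print(f"duty={dc:.1f}  n_par={n_par:.3f}  n_perp={n_perp:.3f}")
```

Sweeping the duty cycle thus tunes the effective index continuously between the two materials, which is the basic design freedom these devices exploit.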
Abstract:
The amount of information contained within the Internet has exploded in recent decades. As more and more news stories, blog posts, and many other kinds of articles are published on the Internet, the categorization of articles and documents is increasingly desired. Among the approaches to categorizing articles, labeling is one of the most common methods; it provides a relatively intuitive and effective way to separate articles into different categories. However, manual labeling is limited by its efficiency, even though manually selected labels have relatively high quality. This report explores the topic modeling approach of Online Latent Dirichlet Allocation (Online-LDA). Additionally, a method to automatically label articles with their latent topics, combining the Online-LDA posterior with a probabilistic automatic labeling algorithm, is implemented. The goal of this report is to examine the accuracy of the labels generated automatically by a topic model and probabilistic relevance algorithm for a set of real-world, dynamically updated articles from an online Rich Site Summary (RSS) service.
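A minimal sketch of the Online-LDA step, using scikit-learn's online variational implementation plus a naive top-words labeler, might look as follows; the report's actual probabilistic labeling algorithm and RSS ingestion pipeline are more elaborate than this.

```python
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

# Two toy mini-batches standing in for successive pulls from an RSS feed.
batches = [
    ["the court ruled on the new tax law", "parliament passed the budget bill"],
    ["the team won the final match", "the striker scored two goals"],
]
vectorizer = CountVectorizer(stop_words="english")
vectorizer.fit([doc for batch in batches for doc in batch])  # fix the vocabulary
vocab = vectorizer.get_feature_names_out()

lda = LatentDirichletAllocation(n_components=2, learning_method="online",
                                random_state=0)
for batch in batches:                 # mini-batch updates, as in Online-LDA
    lda.partial_fit(vectorizer.transform(batch))

# Naive labeling: tag each topic with its three highest-probability terms.
for k, topic in enumerate(lda.components_):
    top = [vocab[i] for i in topic.argsort()[-3:][::-1]]
    print(f"topic {k}: {' / '.join(top)}")
```

New articles would be assigned the label of their dominant topic under `lda.transform`.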
Abstract:
Traditional decision-making research has often focused on one's ability to choose from a set of prefixed options, ignoring the process by which decision makers generate courses of action (i.e., options) in situ (Klein, 1993). In complex and dynamic domains, this option generation process is particularly critical to understanding how successful decisions are made (Zsambok & Klein, 1997). When generating response options for oneself to pursue (i.e., during the intervention phase of decision making), previous research has supported quick and intuitive heuristics, such as the Take-The-First heuristic (TTF; Johnson & Raab, 2003). When generating predictive options for others in the environment (i.e., during the assessment phase of decision making), previous research has supported the situational-model-building process described by Long-Term Working Memory theory (LTWM; see Ward, Ericsson, & Williams, 2013). In the first three experiments, the claims of TTF and LTWM are tested during assessment- and intervention-phase tasks in soccer. To test what other environmental constraints may dictate the use of these cognitive mechanisms, the claims of these models are also tested in the presence and absence of time pressure. In addition to understanding the option generation process, it is important that researchers in complex and dynamic domains also develop tools that can be used by real-world professionals. For this reason, three more experiments were conducted to evaluate the effectiveness of a new online assessment of perceptual-cognitive skill in soccer. This test differentiated between skill groups, predicted performance on a previously established test, and predicted option generation behavior. The test also outperformed domain-general cognitive tests, but not a domain-specific knowledge test, when predicting skill-group membership. Implications for theory and training, and future directions for the development of applied tools, are discussed.