994 results for "Langage de balisage XML"


Relevance:

20.00%

Publisher:

Abstract:

On the 15th of April, 1897, a 19-year-old European resident of Baghdad, named Alexander Richard Svoboda, set out on a long journey to Europe by caravan, boat and train. From a large and influential family of merchants, artists, and explorers settled in Ottoman Iraq since the end of the 18th century, Alexander traveled in the company of his parents and a departing British diplomat accompanied by his retinue. They followed a circuitous route through the Middle East to Cairo and thence to Europe on a three-and-a-half-month journey which Alexander described day by day in a journal written in the Iraqi Arabic of his time.

Relevance:

20.00%

Publisher:

Abstract:

Activities of daily living are essential for physical and mental well-being. For people with a traumatic brain injury (TBI) to carry them out successfully, technological assistance is often necessary, since cognitive impairments can prevent people with a TBI from remaining in their own homes. The cooking assistant is a tool composed of three subsystems: assistance, supervision and communication. It enables people with a TBI to cook a hot meal by offering them cognitive-assistance strategies. This thesis presents the communication subsystem, which establishes communication between the person with a TBI and the cooking assistant. We used the LUCID method to design the communication subsystem. We began with the envisioning and exploration stage to define the team members and their roles and to understand the behaviour of people with a TBI. To understand these behaviours, we used personas and scenarios. We designed three primary personas (severe, moderate and mild TBI) and two secondary ones (family caregiver and occupational therapist). After validating the personas, we built two scenarios: one without assistance, to understand the graduated assistance provided by the occupational therapist during assessment, and one with assistance, to design the cooking assistant's features and establish communication between the three subsystems and the user. We then designed the cooking assistant's interfaces for each activities-of-daily-living profile. Finally, we developed the communication subsystem, using speech acts to define the communication between the three subsystems and the gradation of assistance, and REST web services as a technology that keeps each subsystem independent. The cooking assistant is under development. Its features will later be tested in the DOMUS laboratory apartment and then deployed with people with a TBI for regular use in their homes.
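As a rough illustration of the REST-based, speech-act-style exchange the abstract describes, the sketch below shows one possible message endpoint. The endpoint name, message fields and assistance levels are assumptions for illustration, not the actual assistant culinaire implementation.

```python
# Minimal sketch (Flask) of a speech-act message passed between subsystems
# over REST. All names and fields are hypothetical.
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/messages", methods=["POST"])
def receive_message():
    msg = request.get_json()
    # A speech act carries a performative (the sender's intent) plus content.
    performative = msg.get("performative")   # e.g. "request", "inform", "remind"
    content = msg.get("content")             # e.g. "turn_off_stove"
    level = msg.get("assistance_level", 1)   # gradation of assistance (1 = lightest)
    # The communication subsystem would decide how to present this to the user
    # (text prompt, audio cue, ...) based on the user's profile.
    return jsonify({"status": "delivered", "performative": performative,
                    "content": content, "level": level})

if __name__ == "__main__":
    app.run(port=5000)
```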

Relevance:

10.00%

Publisher:

Abstract:

“When cultural life is re-defined as a perpetual round of entertainments, when serious public conversation becomes a form of baby talk, when, in short, a people become an audience and their public business a vaudeville act, then a nation finds itself at risk.” (Postman) The dire tones of Postman, quoted in Janet Cramer’s Media, History, Society: A Cultural History of US Media, introduce one view that she canvasses, in the debate of the moment, as to where popular culture is heading in the digital age. The same debate is canvassed, less systematically, in Thinking Popular Culture: War, Terrorism and Writing by Tara Brabazon, who, for example, refers to concerns about a “crisis of critical language” troubling professionals, whether journalists, academics or others, and deplores the advent of the Internet as a “flattening of expertise in digital environments”.

Relevance:

10.00%

Publisher:

Abstract:

Objective: The review addresses two distinct sets of issues: (1) specific functionality, interface, and calculation problems that presumably can be fixed or improved; and (2) the more fundamental question of whether the system is close to being ready for ‘commercial prime time’ in the North American market.

Findings: Many of our comments relate to the first set of issues, especially in sections B and C; sections D and E deal with the second set. Overall, we feel that LCADesign represents a very impressive step forward in the ongoing quest to link CAD with LCA tools and, more importantly, to link the world of architectural practice with that of environmental research. From that perspective, it deserves continued financial support as a research project. However, if the decision is whether or not to continue the development program from a purely commercial perspective, we are less bullish. In terms of the North American market, there are no regulatory or other drivers pressing design teams to use a tool of this nature. There is certainly interest in this area, but the tools must be very easy to use with little or no training, and understanding the results is as important in this regard as knowing how to apply the tool. Our comments are fairly negative on that aspect. Our opinion might change to some degree when the ‘fixes’ are made and the functionality is improved. However, as discussed in more detail in the following sections, we feel that the multi-step process — CAD to IFC to LCADesign — could pose a serious problem for market acceptance. The CAD-to-IFC part is impossible for us to judge with the information provided, and we cannot even begin to answer the question about the ease of importing designs into the software, but it appears cumbersome from what we do know. There does appear to be a developing North American market for 3D CAD, with a recent survey indicating that about 50% of firms use some form of 3D modeling for about 75% of their projects; however, this does not mean that full 3D CAD is always being used. Our information suggests that AutoDesk accounts for about 75 to 80% of the 3D CAD market, and they are very cautious about any links that do not serve a latent demand. Finally, other systems that link CAD to energy simulation use XML data transfer protocols rather than IFC files, and it is our understanding that the market served by AutoDesk currently tends in that direction. This is a subject outside our area of expertise, so please take these comments as suggestions for more intensive market research rather than as definitive findings.
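Purely as an illustration of the kind of lightweight XML data transfer mentioned above: the element names below are hypothetical and follow neither gbXML, IFC, nor LCADesign's actual formats; the point is only that a flat XML quantity take-off is a much lighter exchange than a full IFC model round-trip.

```python
# Hypothetical building-quantity payload assembled with the standard library.
import xml.etree.ElementTree as ET

building = ET.Element("Building", name="ExampleOffice")
wall = ET.SubElement(building, "Element", type="ExternalWall")
ET.SubElement(wall, "Material").text = "concrete"
ET.SubElement(wall, "Area", unit="m2").text = "420"

print(ET.tostring(building, encoding="unicode"))
```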

Relevance:

10.00%

Publisher:

Abstract:

Computational biology increasingly demands the sharing of sophisticated data and annotations between research groups. Web 2.0-style sharing and publication requires that biological systems be described in well-defined yet flexible and extensible formats which enhance exchange and re-use. In contrast to many of the standards for exchange in the genomic sciences, descriptions of biological sequences show great diversity in format and function, impeding the definition and exchange of sequence patterns. In this presentation, we introduce BioPatML, an XML-based pattern description language that supports a wide range of patterns and allows the construction of complex, hierarchically structured patterns and pattern libraries. BioPatML unifies the diversity of current pattern description languages and fills a gap in the set of XML-based description languages for biological systems. We discuss the structure and elements of the language, and demonstrate its advantages on a series of applications, showing lightweight integration between the BioPatML parser and search engine and the SilverGene genome browser. We conclude by describing our web site for large-scale pattern sharing, and our efforts to seed this repository.
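To give a feel for what a hierarchically structured XML pattern definition looks like and how it can be traversed, here is a small sketch. The element names and attributes are illustrative assumptions only and do not reproduce the actual BioPatML schema.

```python
# Walk a hypothetical, hierarchically structured XML sequence-pattern definition.
import xml.etree.ElementTree as ET

PATTERN_XML = """
<Definition name="promoter-like">
  <Series>
    <Motif name="box1" motif="TATAAT" threshold="0.8"/>
    <Gap minimum="15" maximum="19"/>
    <Motif name="box2" motif="TTGACA" threshold="0.8"/>
  </Series>
</Definition>
"""

def walk(element, depth=0):
    """Recursively print the pattern hierarchy with its attributes."""
    attrs = ", ".join(f"{k}={v}" for k, v in element.attrib.items())
    print("  " * depth + f"{element.tag}({attrs})")
    for child in element:
        walk(child, depth + 1)

walk(ET.fromstring(PATTERN_XML))
```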

Relevance:

10.00%

Publisher:

Abstract:

This paper describes the approach taken to the XML Mining track at INEX 2008 by a group at the Queensland University of Technology. We introduce the K-tree clustering algorithm in an Information Retrieval context by adapting it for document clustering. Document clustering involves many large-scale problems, and K-tree scales well with large inputs owing to its low complexity, offering promising results in terms of both efficiency and quality. Document classification was completed using Support Vector Machines.
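A toy sketch of the cluster-then-classify pipeline described above follows. It uses scikit-learn's k-means and LinearSVC as stand-ins; the paper's actual K-tree algorithm and the INEX 2008 collection are not reproduced here, and the documents and labels are invented.

```python
# Illustrative pipeline: TF-IDF vectors, clustering, then SVM classification.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans
from sklearn.svm import LinearSVC

docs = ["xml retrieval with structure", "clustering large document sets",
        "support vector machines for text", "scalable tree based clustering"]
labels = [0, 1, 0, 1]  # toy category labels

vectors = TfidfVectorizer().fit_transform(docs)

# Clustering step (K-tree in the paper; k-means here purely for illustration).
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

# Classification step, as in the paper's SVM-based document classification.
classifier = LinearSVC().fit(vectors, labels)
print(clusters, classifier.predict(vectors))
```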

Relevance:

10.00%

Publisher:

Abstract:

Over the last decade, system integration has grown in popularity as it allows organisations to streamline business processes. Traditionally, system integration has been conducted through point-to-point solutions: as a new integration scenario requirement arises, a custom solution is built between the relevant systems. Bus-based solutions are now preferred, whereby all systems communicate via an intermediary system such as an enterprise service bus, using a common data exchange model. This research investigates the use of a common data exchange model based on open standards, specifically MIMOSA OSA-EAI, for asset management system integration. A case study is conducted that involves the integration of processes between a SCADA system, a maintenance decision support system and a work management system. A diverse set of software platforms is employed in developing the final solution, all tied together through MIMOSA OSA-EAI-based XML web services. The lessons learned from the exercise are presented throughout the paper.
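The sketch below suggests what an XML message exchange of this kind can look like in practice. The element names, endpoint URL and work-order fields are assumptions for illustration, not the actual MIMOSA OSA-EAI schema or the case study's services.

```python
# Hypothetical XML work-order message posted from one system to another.
import urllib.request
import xml.etree.ElementTree as ET

def build_work_order(asset_id: str, description: str) -> bytes:
    root = ET.Element("WorkOrder")
    ET.SubElement(root, "AssetID").text = asset_id
    ET.SubElement(root, "Description").text = description
    return ET.tostring(root, encoding="utf-8", xml_declaration=True)

def send(payload: bytes, url: str = "http://localhost:8080/workmgmt") -> str:
    """POST the XML payload to the (hypothetical) work management endpoint."""
    req = urllib.request.Request(url, data=payload,
                                 headers={"Content-Type": "application/xml"})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")

# Example: a decision-support rule fires and raises a work request.
payload = build_work_order("PUMP-101", "Vibration above threshold; inspect bearing")
print(payload.decode("utf-8"))
```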

Relevance:

10.00%

Publisher:

Abstract:

Home Automation (HA) has emerged as a prominent field for researchers and investors confronting the challenge of penetrating the average home user market with products and services emerging from a technology-based vision. In spite of many technology contributions, there is a latent demand for affordable and pragmatic assistive technologies for pro-active handling of complex lifestyle-related problems faced by home users. This study has pioneered the development of an Initial Technology Roadmap for HA (ITRHA) that formulates a need-based vision of 10-15 years, identifying market, product and technology investment opportunities and focusing on those aspects of HA that contribute to efficient management of home and personal life. The concept of the Family Life Cycle is developed to understand the temporal needs of the family. In order to formally describe a coherent set of family processes, their relationships, and their interaction with external elements, a reference model named Family System is established that identifies External Entities, 7 major Family Processes, and 7 subsystems: Finance, Meals, Health, Education, Career, Housing, and Socialisation. Analysis of these subsystems reveals Soft, Hard and Hybrid processes. Rectifying the lack of formal methods for eliciting future user requirements and reassessing evolving market needs, this study has developed a novel method called Requirement Elicitation of Future Users by Systems Scenario (REFUSS), integrating process modelling and the scenario technique within the framework of roadmapping. REFUSS is used to systematically derive process automation needs, relating the process knowledge to future user characteristics identified from scenarios created to visualise different futures with richly detailed information on lifestyle trends, thus enabling learning about future requirements. Revealing an addressable market size estimated in billions of dollars per annum, this research has developed innovative ideas for software-based products, including a Document Management System facilitating automated collection and easy retrieval of all documents, an Information Management System automating information services, and a Ubiquitous Intelligent System empowering highly mobile home users with ambient intelligence. Other product ideas include versatile robotic devices, the Kitchen Hand and the Cleaner Arm, that can save time. Materialisation of these products requires technology investment, initiating further research in the areas of data extraction and information integration, as well as manipulation and perception, sensor-actuator systems, tactile sensing, odour detection, and robotic controllers. This study recommends new policies on electronic data delivery from service providers as well as new standards for XML-based document structure and format.

Relevance:

10.00%

Publisher:

Abstract:

The loosely-coupled and dynamic nature of web services architectures has many benefits, but also leads to an increased vulnerability to denial of service attacks. While many papers have surveyed and described these vulnerabilities, they are often theoretical, lack experimental data to validate them, and assume an obsolete state of web services technologies. This paper describes experiments involving several denial of service vulnerabilities in well-known web services platforms, including Java Metro, Apache Axis, and Microsoft .NET. The results confirm the presence of some of the most well-known vulnerabilities in web services technologies and refute others. Specifically, major web services platforms appear to cope well with attacks that target memory exhaustion. However, attacks targeting CPU-time exhaustion are still effective, regardless of the victim’s platform.
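The following self-contained sketch, which is not the paper's test harness, illustrates why CPU-time exhaustion remains a concern for XML-based services: parse time grows with pathological input such as deeply nested elements. It measures a local parse only and contacts no service.

```python
# Measure how XML parse time grows with nesting depth (local, harmless input).
import time
import xml.etree.ElementTree as ET

def nested_payload(depth: int) -> str:
    """Build a benign but deeply nested XML document."""
    return "<a>" * depth + "x" + "</a>" * depth

for depth in (1_000, 5_000, 25_000):
    doc = nested_payload(depth)
    start = time.perf_counter()
    ET.fromstring(doc)
    print(f"depth={depth:>6}  parse time={time.perf_counter() - start:.4f}s")
```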

Relevance:

10.00%

Publisher:

Abstract:

The Denial of Service Testing Framework (dosTF), being developed as part of the joint India-Australia research project ‘Protecting Critical Infrastructure from Denial of Service Attacks’, allows for the construction, monitoring and management of emulated Distributed Denial of Service attacks using modest hardware resources. The purpose of the testbed is to study the effectiveness of different DDoS mitigation strategies and to allow for the testing of defense appliances. Experiments are saved and edited in XML as abstract descriptions of an attack/defense strategy that are only mapped to real resources at run-time. The framework also provides a web-application portal interface that can start, stop and monitor an attack remotely. Rather than monitoring a service under attack indirectly, by observing traffic and general system parameters, the framework monitors the target application directly in real time via a customised SNMP agent.
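As a sketch of the idea of abstract, XML-described experiments that are bound to real resources only at run-time, consider the following. The element names, attributes and address pool are hypothetical and are not the actual dosTF schema.

```python
# Bind an abstract XML experiment description to concrete hosts at run-time.
import xml.etree.ElementTree as ET

EXPERIMENT = """
<experiment name="http-flood-baseline">
  <attacker role="bot" count="4" tool="http-flood"/>
  <target role="webserver" service="http" port="80"/>
  <monitor interval="1" metric="requests-per-second"/>
</experiment>
"""

RUNTIME_POOL = {"bot": ["10.0.0.11", "10.0.0.12", "10.0.0.13", "10.0.0.14"],
                "webserver": ["10.0.0.50"]}

def bind(xml_text: str, pool: dict) -> dict:
    """Map abstract roles in the experiment description to concrete hosts."""
    root = ET.fromstring(xml_text)
    attacker = root.find("attacker")
    return {
        "name": root.get("name"),
        "attackers": pool[attacker.get("role")][: int(attacker.get("count"))],
        "target": pool[root.find("target").get("role")][0],
    }

print(bind(EXPERIMENT, RUNTIME_POOL))
```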

Relevance:

10.00%

Publisher:

Abstract:

Originally launched in 2005 with a focus on user-generated content, YouTube has become the dominant platform for online video worldwide, and an important location for some of the most significant trends and controversies in the contemporary new-media environment. Throughout its very short history, it has also intersected with and been the focus of scholarly debates related to the politics, economics, and cultures of the new media—in particular, the “participatory turn” associated with “Web 2.0” business models’ partial reliance on amateur content and social networking. Given the slow pace of traditional scholarly publishing, the body of media and cultural studies literature substantively dedicated to describing and critically understanding YouTube’s texts, practices, and politics is still small, but it is growing steadily. At the same time, since its inception scholars from a wide range of disciplines and critical perspectives have found YouTube useful as a source of examples and case studies, some of which are included here; others have experimented directly with the scholarly and educational potential of the platform itself. For these reasons, although primarily based around the traditional publishing outlets for media, Internet, and cultural studies, this bibliography draws eclectically on a wide range of sources—including sources very closely associated with the web business literature and with the YouTube community itself.

Relevance:

10.00%

Publisher:

Abstract:

This paper gives an overview of the INEX 2009 Ad Hoc Track. The main goals of the Ad Hoc Track were three-fold. The first goal was to investigate the impact of collection scale and markup, by using a new collection that is again based on the Wikipedia but is over four times larger, with longer articles and additional semantic annotations. For this reason the Ad Hoc Track tasks stayed unchanged, and the Thorough Task of INEX 2002–2006 returned. The second goal was to study the impact of more verbose queries on retrieval effectiveness, by using the available markup as structural constraints—now using both the Wikipedia’s layout-based markup and the enriched semantic markup—and by the use of phrases. The third goal was to compare different result granularities by allowing systems to retrieve XML elements, ranges of XML elements, or arbitrary passages of text. This investigates the value of the internal document structure (as provided by the XML markup) for retrieving relevant information. The INEX 2009 Ad Hoc Track featured four tasks. For the Thorough Task, a ranked list of results (elements or passages) by estimated relevance was required. For the Focused Task, a ranked list of non-overlapping results (elements or passages) was required. For the Relevant in Context Task, non-overlapping results (elements or passages) were returned grouped by the article from which they came. For the Best in Context Task, a single starting point (element start tag or passage start) for each article was required. We discuss the setup of the track and the results for the four tasks.
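To make the notion of element-level (rather than whole-article) retrieval concrete, here is a toy sketch that scores individual XML elements of one article against a query. The markup, scoring function and article are simplified assumptions, not the track's collection or evaluation software.

```python
# Rank XML elements of a single article by a crude term-overlap score.
import xml.etree.ElementTree as ET

ARTICLE = """
<article id="12">
  <title>Markup languages</title>
  <sec><p>XML is a markup language for structured documents.</p></sec>
  <sec><p>Wikipedia articles carry layout-based and semantic markup.</p></sec>
</article>
"""

def score(text: str, query_terms: set) -> int:
    """Count how many query terms occur in the element's text."""
    return sum(term in text.lower() for term in query_terms)

def element_run(xml_text: str, query: str, top_k: int = 3):
    root = ET.fromstring(xml_text)
    terms = set(query.lower().split())
    ranked = []
    for elem in root.iter():
        text = "".join(elem.itertext())
        ranked.append((score(text, terms), elem.tag, text.strip()))
    ranked.sort(reverse=True)
    return ranked[:top_k]

print(element_run(ARTICLE, "xml markup"))
```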