897 results for "web application"


Relevance: 30.00%

Abstract:

Web services are software components, accessible via the Internet, that provide functionality to be used by applications. Today, it is natural to reuse third-party services to compose new services. This composition process can follow two styles, called orchestration and choreography. In a choreography, services collaborate as peers, each knowing its partners in the composition, to achieve the service's desired functionality. In an orchestration, by contrast, a central process (the orchestrator) coordinates all application operations. Our work is placed in this latter context: we propose an abstract model for running service orchestrations. For this purpose, a graph reduction machine is defined for executing service orchestrations specified in a variant of the PEWS composition language. Moreover, a prototype of this machine (in Java) is built as a proof of concept.
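The orchestration style this abstract contrasts with choreography can be sketched in a few lines. The two service functions below are hypothetical stand-ins for remote Web service calls, invented for illustration; they are not part of the PEWS machine described above.

```python
# Sketch of the orchestration style: a single central process invokes
# partner services and composes their results. (Hypothetical services.)

def geocode_service(address):
    # Stand-in for a remote geocoding Web service call.
    return {"lat": -5.79, "lon": -35.21}

def weather_service(lat, lon):
    # Stand-in for a remote weather Web service call.
    return {"forecast": "sunny"}

def orchestrator(address):
    """Central coordinator of all operations (the orchestration style).

    In a choreography, by contrast, geocode_service and weather_service
    would interact with each other directly, with no central process.
    """
    location = geocode_service(address)
    return weather_service(location["lat"], location["lon"])
```

The key point of the contrast is that only `orchestrator` knows the overall flow; the services themselves are unaware of the composition.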

Relevance: 30.00%

Abstract:

In the context of Software Engineering, web accessibility is gaining ground, establishing itself as an important quality attribute. This is due to initiatives of institutions such as the W3C (World Wide Web Consortium) and to the introduction of norms and laws such as Section 508, which underline the importance of developing accessible Web sites and applications. Despite these advances, the lack of web accessibility is still a persistent problem, and it may be related to the moment or phase in which this requirement is addressed within the development process: Web accessibility is generally regarded as a programming problem, or is treated only after the application has been developed entirely. Thus, considering accessibility already during the analysis and requirements specification activities proves to be a strategy that facilitates project progress, avoiding rework in advanced phases of software development caused by errors or omissions in the elicitation. The objective of this research is to develop a method and a tool to support the elicitation of web accessibility requirements. The elicitation strategy of this method is grounded in the Goal-Oriented NFR Framework approach and in the use of NFR catalogs created from the guidelines of WCAG 2.0 (Web Content Accessibility Guidelines), proposed by the W3C.
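A catalog-driven elicitation step of the kind described above can be sketched as a lookup from guideline identifiers to candidate requirements. The guideline numbers below are real WCAG 2.0 guideline numbers with paraphrased summaries; the catalog structure and the `elicit` helper are illustrative assumptions, not the method's actual tool.

```python
# Hypothetical NFR catalog derived from WCAG 2.0 guidelines
# (identifiers are real WCAG 2.0 guidelines; texts are paraphrases).
WCAG_CATALOG = {
    "1.1": "Provide text alternatives for any non-text content.",
    "1.4": "Make it easier for users to see and hear content.",
    "2.1": "Make all functionality available from a keyboard.",
}

def elicit(selected_guidelines):
    """Turn selected catalog entries into candidate accessibility requirements."""
    return [
        f"The system shall satisfy WCAG 2.0 guideline {g}: {WCAG_CATALOG[g]}"
        for g in selected_guidelines
        if g in WCAG_CATALOG
    ]
```

Treating each guideline as a catalog entry is what lets accessibility enter the process at elicitation time rather than after implementation.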

Relevance: 30.00%

Abstract:

This work presents a User Interface (UI) prototype generation process for software that has a Web browser as its platform. This process uses UI components that are more complex than HTML elements. To describe these more complex components, this work proposes the use of XICL (eXtensible user Interface Components Language). XICL is a language, based on XML syntax, for describing UI components and UIs. XICL promotes extensibility and reusability in the user interface development process. We have developed two compilers. The first compiles IMML (Interactive Message Modeling Language) code and generates XICL code. The second compiles XICL code and generates DHTML code.
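The second compilation step, from an XML-based component description to DHTML, can be sketched as a small tree transformation. The element and attribute names below are invented for illustration; they are not the actual XICL vocabulary.

```python
# Sketch of compiling a hypothetical XML UI description into DHTML.
import xml.etree.ElementTree as ET

def compile_to_dhtml(ui_source):
    """Map each hypothetical <button> component to an HTML button element."""
    root = ET.fromstring(ui_source)
    parts = []
    for comp in root:
        if comp.tag == "button":
            label = comp.get("label", "")
            action = comp.get("onpress", "")
            parts.append(f'<button onclick="{action}">{label}</button>')
    return "".join(parts)

src = '<ui><button label="Save" onpress="save()"/></ui>'
```

A real component language would also support composition and event wiring, but the essential compiler shape, walking the component tree and emitting host-language markup, is the same.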

Relevance: 30.00%

Abstract:

The capture web of N. clavipes presents viscous droplets, which play important roles in web mechanics and prey capture. Using scanning and transmission electron microscopy, it was demonstrated that the web droplets comprise different chemical environments, provided by the existence of both an aqueous and a lipid layer, which in turn hold a suspension of tens of vesicles containing polypeptides and/or lipids. GC/EI-MS analysis of the contents of these vesicles led to the identification of some saturated fatty acids, such as decanoic acid, undecanoic acid, dodecanoic acid, tetradecanoic acid, octadecanoic acid, and icosanoic acid, while other components were unsaturated fatty acids, such as (Z)-tetradec-9-enoic acid, (Z)-octadec-9-enoic acid, and (Z)-icosa-11-enoic acid, and polyunsaturated fatty acids like (9Z,12Z)-octadeca-9,12-dienoic acid, (9Z,12Z,15Z)-octadeca-9,12,15-trienoic acid, and (11Z,14Z)-icosa-11,14-dienoic acid. Toxic proteins such as a calcium-activated proteinase and a metalloproteinase jararhagin-like precursor were also identified using a proteomic approach, indicating the possible involvement of these enzymes in the pre-digestion of the spiders' web-captured prey. Apparently, the mixture of fatty acids is relatively toxic to insects by topical application (LD50 64.3 +/- 7.6 ng mg(-1) honeybee), while the proteins alone present no topical effect; however, when injected into prey insects, these proteins presented moderate toxicity (LD50 40.3 +/- 4.8 ng mg(-1) honeybee). The mixture of fatty acids and proteins is very toxic to the prey captured by the web droplets of the viscid spiral of Nephila clavipes when topically applied (LD50 14.3 +/- 1.8 ng mg(-1) honeybee).

Relevance: 30.00%

Abstract:

A novel chemical subclass of toxin, [1-(3-diazenylphenyl)ethanol]iron, was identified among the compounds present in the web of the spider Nephila clavipes. This type of compound is not common among natural products, especially among spider-venom toxins; it was shown to be a potent paralytic and/or lethal toxin, applied by the spider over its web to ensure prey capture, acting by topical application alone. The structure was elucidated by means of ESI mass spectrometry, 1H-NMR spectroscopy, high-resolution (HR) mass spectrometry, and ICP spectrometry. The structure of [1-(3-diazenylphenyl)ethanol]iron and the study of its insecticidal action may serve as a starting point for the development of new drugs for pest control in agriculture.

Relevance: 30.00%

Abstract:

Service-oriented architectures (SOA) based on Simple Object Access Protocol (SOAP) Web services have attracted the attention of enterprises, mainly for business-to-business integration and for creating composite applications that execute business processes. An existing problem is the lack of attention to non-technical users: to create a composite application that fulfills their needs, such users must rely on IT staff. To overcome this issue, enterprises can take advantage of Web 2.0, introducing at the development stage technologies like mashups and concepts like user empowerment, collaborative work, and collective intelligence. Some results [3] [13] have shown how Web 2.0 concepts can help non-technical users produce relatively complex business processes. However, traditional enterprise requirements go beyond typical Web 2.0 solutions in several aspects: (1) traditional enterprise systems are based on a heterogeneous stack of technologies that is not directly exploitable from a web-based client (where SOAP web services play an important role); (2) web browsers impose cross-domain security constraints that make it difficult to integrate services from diverse domains. In this paper, a contribution to two Web 2.0 research projects [14] [15] partially solves the problems described: it provides a way to invoke cross-domain backend services (based on SOAP technologies) directly, using only client-side languages, without the need for any adaptation layer. © 2010 ACM.
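Part of what "invoking SOAP services without an adaptation layer" entails is that the client itself must build the SOAP envelope a middleware stack would otherwise produce. The envelope namespace below is the standard SOAP 1.1 one; the operation name, target namespace, and parameter are hypothetical, and a real client would still need the cross-domain transport the paper addresses.

```python
# Sketch of constructing a SOAP 1.1 request envelope by hand,
# as a client-side caller without middleware would have to do.
def soap_envelope(operation, namespace, params):
    """Build a SOAP 1.1 envelope invoking `operation` with `params`."""
    body = "".join(f"<{k}>{v}</{k}>" for k, v in params.items())
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
        "<soap:Body>"
        f'<m:{operation} xmlns:m="{namespace}">{body}</m:{operation}>'
        "</soap:Body>"
        "</soap:Envelope>"
    )

# Hypothetical operation and service namespace, for illustration only.
env = soap_envelope("GetQuote", "http://example.com/stock", {"symbol": "ACM"})
```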

Relevance: 30.00%

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance: 30.00%

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance: 30.00%

Abstract:

Graduate Program in Computer Science - IBILCE

Relevance: 30.00%

Abstract:

The traditional characteristics of, and challenges in, organizing and searching information on the World Wide Web are outlined and reviewed. The classification features of two such methods are analyzed: Google, in the case of automated search engines, and the Yahoo! Directory, in the case of subject directories. Recent advances in the Semantic Web, particularly the growing application of ontologies and Linked Data, are also reviewed. Finally, some problems and prospects related to the use of classification and indexing on the World Wide Web are discussed, emphasizing the need to rethink the role of classification in the organization of these resources and outlining the possibilities of applying Ranganathan's facet theories of classification.
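The connection drawn above between facet theory and Linked Data can be sketched concretely: each facet of a resource becomes one subject-predicate-object triple. The facet names, predicate prefix, and example URI below are illustrative assumptions, not a standard vocabulary.

```python
# Sketch of expressing faceted classification as Linked-Data-style triples.
def to_triples(resource_uri, facets):
    """Express each facet/value pair of a resource as one triple."""
    return [
        (resource_uri, f"facet:{name}", value)
        for name, value in facets.items()
    ]

# Hypothetical resource classified along two of Ranganathan-style facets.
triples = to_triples(
    "http://example.org/doc/42",
    {"personality": "Classification", "matter": "Web resources"},
)
```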

Relevance: 30.00%

Abstract:

With the rapid growth of the use of Web applications in various fields of knowledge, the term Web service has come into evidence; it refers to services of different origins and purposes, offered through local networks and, in some cases, also available on the Internet. The architecture of this type of application offers data processing on the server side, which makes it very attractive for running complex and slow processes, as is the case with most algorithms involving visualization. VTK is a library intended for visualization, and it features a large variety of methods and algorithms for this purpose, but its graphics engine demands considerable processing capacity. The union of these two resources can bring interesting results and contribute to performance improvements for applications using the VTK library. This combination is studied in this project through testing and communication overhead analysis.
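The trade-off behind that overhead analysis can be captured in a toy cost model: server-side rendering pays a data-transfer cost in exchange for the server's processing capacity. The model and all figures below are illustrative assumptions, not measurements or methods from the project itself.

```python
# Toy model of the offloading trade-off: server processing + result
# transfer vs. doing everything on the client. (Illustrative only.)

def server_side_time(server_secs, payload_bytes, bandwidth_bps):
    """Server path: processing on the server plus transferring the result."""
    return server_secs + payload_bytes * 8 / bandwidth_bps

def offloading_pays_off(client_secs, server_secs, payload_bytes, bandwidth_bps):
    """Offloading wins only when the speed-up outweighs communication overhead."""
    return server_side_time(server_secs, payload_bytes, bandwidth_bps) < client_secs
```

Under this model, a rendering that takes 10 s on the client and 2 s on the server still pays off after adding 0.8 s to ship a 1 MB result over a 10 Mbit/s link; with a 1 s client-side time it no longer does.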

Relevance: 30.00%

Abstract:

The exponential growth of the Internet, coupled with the increasing popularity of dynamically generated content on the World Wide Web, has created the need for more and faster Web servers capable of serving over 100 million Internet users. Server clustering has emerged as a promising technique for building scalable Web servers. In this article we examine the seminal work, early products, and a sample of contemporary commercial offerings in the field of transparent Web server clustering. We broadly classify transparent server clustering into three categories.
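The "transparent" part of transparent clustering is that clients address one logical server while a front end spreads their requests across the back ends. One common dispatching policy, round-robin, can be sketched as follows; the server names are placeholders, and real dispatchers operate at the network or application layer rather than as a Python object.

```python
# Sketch of round-robin dispatching, one policy a cluster front end may use.
import itertools

class RoundRobinDispatcher:
    def __init__(self, servers):
        # Cycle endlessly over the back-end server pool.
        self._cycle = itertools.cycle(servers)

    def dispatch(self, request):
        """Forward each request to the next server in rotation.

        The client never sees the pool: it talks to the dispatcher's
        single address, which is what makes the clustering transparent.
        """
        return next(self._cycle)
```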