896 results for Web log analysis
Abstract:
With the increasing importance of digital communication and its distinct characteristics, the marketing tools and strategies adopted by companies have changed dramatically. Among the many digital marketing tools and new media channels available to marketers, the phenomenon known as social media is one of the most complex and enigmatic. Its reach is still largely unexplored, and it deeply transforms the present view of the promotion mix (Mangold & Faulds, 2009). Conversations among users on social media directly affect their perceptions of products, services and brands. More than that, a wide range of other subjects can also become topics of conversation on social media. Hit songs, sporting events, celebrity news and even natural disasters and politics often go viral on the web. Companies must grasp this and, in order to become more interesting and relevant, take part in these conversations, inserting their brands into these dynamic online dialogues. This paper focuses on how these social interactions are manifested on the web in two distinct cultures, Brazil and China. By understanding the similarities and differences between these cultures, this study helps firms better adjust their marketing efforts across regions, targeting and positioning themselves not only geographically and culturally, but also across different web platforms (Facebook and RenRen). By examining how companies should focus their efforts on each segment in social media, firms can also maximize their communication results and mitigate risks. The findings suggest that differences in the cultural dimensions of these two countries directly affect their virtual social networking behavior along many dimensions (Identity, Presence, Relationships, Reputation, Groups, Conversations and Sharing). Accordingly, marketing efforts must be tailored to the behavior and expectations of each culture.
Abstract:
VANTI, Nadia. Mapeamento das Instituições Federais de Ensino Superior da Região Nordeste do Brasil na Web. Informação & informação, Londrina, v. 15, p. 55-67, 2010
Abstract:
We carried out a numerical investigation of the propagation of short light pulses in the 1.55 μm region and of the conversion efficiency (CE) of four-wave mixing (FWM) generation in ordinary and dispersion-decreasing fibers for use in wavelength division multiplexing (WDM) systems. Our simulations studied three different dispersion profiles: linear, hyperbolic and constant. We conclude that for all profiles the conversion efficiency decreases as the channel separation increases. The hyperbolic profile presents an efficiency around three orders of magnitude higher than the other profiles at 0.2 nm of channel separation. We calculated the conversion efficiency versus fiber length for the three profiles; the conversion efficiency for the hyperbolic profile is higher than for the constant and linear profiles. Another interesting point of the hyperbolic profile is that the increase of the CE at the beginning of the fiber does not show any oscillation in the CE value (log η), which was observed for the constant and linear profiles. For all profiles the conversion efficiency increases with the pump power. The compression factor C_i for the generated FWM signal at ω₃ was measured along the dispersion-decreasing fibers (DDFs) and the constant-profile fibers. One can conclude that with the use of DDFs one can control the conversion efficiency and the compression factor of FWM generation in WDM systems. (c) 2005 Elsevier B.V. All rights reserved.
Abstract:
This paper results from a Master's dissertation studying the role and history of scientific communication, especially the changes that have occurred after the appearance of electronic communication and computer networks. The study showed that hypertext systems are increasingly being used in the scientific and academic world in the production of electronic journals, making it possible for users to rapidly access information in their area. However, these systems need to be improved to help the user during search and access to information. Both printed journals migrating to electronic media and exclusively electronic journals should present the current quality indicators. An attempt was made to discover whether characteristics of printed journals are being maintained in their electronic counterparts. For this, a prototype model was developed to analyze the structure of electronic scientific journals; it comprises 14 criteria expressing aspects of quality for these journals. It includes elements of website information architecture and those already in place in printed scientific journals, in order to ensure that the basic functions of archiving and dissemination are maintained in electronic publishing. Each criterion consists of variables which measure the maintenance of these functions both in migrated printed journals and in exclusively electronic ones. This prototype model was used to analyze Ciência da Informação On-line and DataGramaZero - Revista de Ciência da Informação. Results indicate that the model can determine whether the basic functions of archiving and dissemination are being maintained in electronic journals; its use in electronic journals is therefore justified. The model can help librarians, authors and users of electronic journals to identify quality journals, and assist editors in developing their projects.
The material from the study may be included in the pre-service and in-service education of Information Science professionals and may support editors of scientific journals.
Abstract:
Soil aggregation is an index of soil structure measured by mean weight diameter (MWD) or by scaling factors often interpreted as fragmentation fractal dimensions (D_f). However, the MWD provides a biased estimate of soil aggregation due to spurious correlations among aggregate-size fractions and to scale dependency. The scale-invariant D_f is based on weak assumptions to allow particle counts, is sensitive to the selection of the fractal domain, and may frequently exceed a value of 3, implying that D_f is a biased estimate of aggregation. Aggregation indices based on mass may be computed without bias using compositional analysis techniques. Our objective was to elaborate compositional indices of soil aggregation and to compare them to MWD and D_f using a published dataset describing the effect of 7 cropping systems on aggregation. Six aggregate-size fractions were arranged into a sequence of D-1 balances of building blocks that portray the process of soil aggregation. Isometric log-ratios (ilrs) are scale-invariant and orthogonal log contrasts, or balances, that possess the Euclidean geometry necessary to compute a distance between any two aggregation states, known as the Aitchison distance (A(x,y)). Close correlations (r > 0.98) were observed between MWD, D_f, and the ilr contrasting large and small aggregate sizes. Several unbiased embedded ilrs can characterize the heterogeneous nature of soil aggregates and be related to soil properties or functions. Soil bulk density and penetration resistance were closely related to A(x,y) with reference to bare fallow. The A(x,y) is easy to implement as an unbiased index of soil aggregation using standard sieving methods and may allow comparisons between studies. (C) 2012 Elsevier B.V. All rights reserved.
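The balance and distance computations described in this abstract follow the standard definitions from compositional data analysis; the sketch below is a minimal illustration of those general formulas, not the authors' code, and the six aggregate-size fractions shown are hypothetical.

```python
import numpy as np

def clr(x):
    """Centered log-ratio transform of a composition (all parts must be > 0)."""
    x = np.asarray(x, dtype=float)
    return np.log(x) - np.log(x).mean()

def aitchison_distance(x, y):
    """Aitchison distance A(x,y): Euclidean distance between clr-transformed parts."""
    return np.linalg.norm(clr(x) - clr(y))

def balance(x, num_idx, den_idx):
    """Isometric log-ratio (ilr) balance contrasting two groups of parts,
    e.g. large vs. small aggregate-size fractions."""
    x = np.asarray(x, dtype=float)
    r, s = len(num_idx), len(den_idx)
    g_num = np.exp(np.log(x[num_idx]).mean())  # geometric mean of numerator group
    g_den = np.exp(np.log(x[den_idx]).mean())  # geometric mean of denominator group
    return np.sqrt(r * s / (r + s)) * np.log(g_num / g_den)

# Hypothetical mass proportions in six aggregate-size fractions (largest first)
state_a = np.array([0.30, 0.25, 0.20, 0.12, 0.08, 0.05])
state_b = np.array([0.10, 0.15, 0.20, 0.25, 0.18, 0.12])

d = aitchison_distance(state_a, state_b)
b = balance(state_a, [0, 1, 2], [3, 4, 5])  # large vs. small fractions
```

Because the clr transform removes the geometric mean, A(x,y) is unchanged by rescaling either composition, which is the scale invariance the abstract attributes to ilrs.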
Abstract:
The spread of the Web boosted the dissemination of Web-based Information Systems (IS). To support the implementation of these systems, several technologies emerged or evolved for this purpose, notably programming languages. The Technology Acceptance Model (TAM; Davis, 1986) was conceived to evaluate the acceptance and use of information technologies by their users. Many studies and applications have used the TAM; however, no mention was found in the literature of the use of this model in relation to programming languages. This study investigates which factors influence developers' use of programming languages in the development of Web systems, applying an extension of the TAM proposed in this work. To this end, a survey was conducted with Web developers in two Yahoo groups, java-br and python-brasil, in which 26 Java questionnaires and 39 Python questionnaires were fully answered. The questionnaire contained general questions and questions measuring intrinsic and extrinsic factors of the programming languages, perceived usefulness, perceived ease of use, attitude toward use, and programming language use. Most respondents were men with university degrees, between 20 and 30 years old, working in the southeast and south regions. The research was descriptive with respect to its objectives. Descriptive statistics, principal components analysis, and linear regression were used for the data analysis.
The foremost research results were: Java and Python have machine independence, extensibility, generality and reliability; Java and Python are more used by corporations and international organizations than supported by the government or educational institutions; there are more Java programmers than Python programmers; perceived usefulness is influenced by perceived ease of use; generality and extensibility are intrinsic factors of programming languages that influence perceived ease of use; and perceived ease of use influences the attitude toward use of the programming language.
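TAM relationships of the kind reported here (e.g. perceived ease of use influencing perceived usefulness) are typically tested with simple linear regression; the sketch below illustrates that step on hypothetical 7-point Likert scores, not the survey data from this study.

```python
import numpy as np

# Hypothetical Likert-scale scores for eight respondents (not the study's data)
peou = np.array([2, 3, 3, 4, 5, 5, 6, 7], dtype=float)  # perceived ease of use
pu   = np.array([3, 3, 4, 4, 5, 6, 6, 7], dtype=float)  # perceived usefulness

# Ordinary least-squares fit: pu = slope * peou + intercept
slope, intercept = np.polyfit(peou, pu, 1)
r = np.corrcoef(peou, pu)[0, 1]  # strength of the linear association
```

A positive, significant slope would support the hypothesis that perceived ease of use influences perceived usefulness; in a full analysis this is done per construct after aggregating the questionnaire items.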
Abstract:
This Master's dissertation investigated the performance and quality of web sites. The goal of the research was to propose an integrated model for evaluating digital information services on educational web sites. The research universe consisted of eighteen Brazilian universities offering graduate programs, at master's and doctoral levels, in the area of Production Engineering. The adopted methodology was descriptive and exploratory research, using the techniques of systematic observation and focus groups for data collection, with dependent and independent variables, through the application of two research instruments. An analysis protocol was the instrument adopted for evaluating and obtaining the qualitative results, and an analysis grid was applied for evaluating and obtaining the quantitative results. The qualitative results identified a lack of standardization of the web sites with respect to content, information hierarchy, and the design of colors and typefaces. The absence of accessibility for people with hearing and visual impairments was observed, as well as a lack of convergence of media and assistive technologies. The language of the sites was also evaluated, and all presented Portuguese as their only language. The general result is shown in a graph and tables ranking the universities, with the rating "Good" predominating. For the quantitative results, statistical analysis was used in order to obtain descriptive and inferential results relating the dependent and independent variables. As a category of analysis of the services of the evaluated sites, scores and a weighted general index were obtained. These results served as the basis for a ranking of the universities regarding the presence or absence of service information on their web sites.
In the inferential analysis, tests of correlation or association were run between the independent variables (level, CAPES concept, and period of existence of the program) and the characteristics, called service categories. The statistical methods used for this analysis were Spearman's coefficient and Fisher's test. Only the disciplines category of the Master's programs showed a significant association with the independent variable CAPES concept. The main conclusion of this study was the absence of standardization with respect to the subjective aspects of design, information hierarchy, navigability and content precision, and the absence of accessibility and convergence. Regarding the quantitative aspects, the information services offered by the web sites of the evaluated universities still do not present satisfactory and comprehensive quality. An absence of strategies, web tools, institutional marketing techniques, and services that would make the sites more interactive, navigable, and value-adding was noted.
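Spearman's coefficient, used in the inferential analysis above, is simply the Pearson correlation of the ranks; the sketch below is a minimal illustration on made-up scores (it applies no tie correction, which the usual formula adds for tied ranks), not the dissertation's data.

```python
import numpy as np

def ranks(a):
    """Ranks 1..n of the values in a (ties broken by position; no tie correction)."""
    order = np.argsort(a)
    r = np.empty(len(a))
    r[order] = np.arange(1, len(a) + 1)
    return r

def spearman(x, y):
    """Spearman's rho: Pearson correlation of the rank vectors."""
    return np.corrcoef(ranks(x), ranks(y))[0, 1]

# Hypothetical data: CAPES concept of a program vs. a service-category score
capes_concept = np.array([3, 4, 5, 3, 6, 7, 4, 5], dtype=float)
service_score = np.array([2.1, 3.0, 3.9, 2.5, 4.8, 5.5, 2.9, 4.1])
rho = spearman(capes_concept, service_score)
```

A rho near 1 indicates a strong monotone association; in practice the coefficient is reported together with a p-value to judge significance.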
Abstract:
The use of Geographic Information Systems (GIS) has become very important in fields where detailed and precise study of earth surface features is required. Applications in environmental protection are one example that requires GIS tools for analysis and decision making by managers and by the community involved in protected areas. In this specific field, a remaining challenge is to build a GIS that can be dynamically fed with data, allowing researchers and other agents to retrieve current, up-to-date information. In some cases, data is acquired in several ways and comes from different sources. To address this problem, tools were implemented that include a model for spatial data treatment on the Web. The research issues involved start with the feeding and processing of environmental control data collected in loco, such as biotic and geological variables, and finish with the presentation of all information on the Web. For this dynamic processing, tools were developed that make MapServer more flexible and dynamic, allowing data to be uploaded by the users themselves. Furthermore, a module was also developed that uses interpolation to support spatial data analysis. A complex application that validated this research feeds the system with data coming from coral reef regions located in the northeast of Brazil. The system was implemented following the interactivity concept provided by the AJAX model and resulted in a substantial contribution to efficient information access, being an essential mechanism for controlling events in environmental monitoring.
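The abstract does not say which interpolation method the module uses; inverse distance weighting (IDW), sketched below on made-up monitoring stations, is a common choice for this kind of spatial estimation and is shown only as an illustration.

```python
import numpy as np

def idw(known_xy, known_z, query_xy, power=2.0, eps=1e-12):
    """Inverse distance weighting: estimate values at query points as a
    distance-weighted average of the known sample values."""
    known_xy = np.asarray(known_xy, dtype=float)
    query_xy = np.asarray(query_xy, dtype=float)
    # Pairwise distances, shape (n_query, n_known)
    d = np.linalg.norm(query_xy[:, None, :] - known_xy[None, :, :], axis=2)
    w = 1.0 / (d + eps) ** power  # eps avoids division by zero at sample points
    return (w @ np.asarray(known_z, dtype=float)) / w.sum(axis=1)

# Hypothetical monitoring stations (x, y) and a measured environmental variable
stations = [(0, 0), (10, 0), (0, 10), (10, 10)]
values = [1.0, 2.0, 3.0, 4.0]
estimates = idw(stations, values, [(5, 5), (0, 0)])
```

At the center point (5, 5) all stations are equidistant, so the estimate is the plain average of the four values; at a station location the estimate collapses to that station's own value.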
Abstract:
This study aims to analyze the graphic communication of hypermedia interface layouts oriented to distance education via the Internet. The proposal is justified by the widening offer of courses in that modality and the consequent application of hypermedia elements for teaching and learning. The method of analysis involved netnographic research, addressed to the intermediate student cycle of the Continuing Training Program Media in Education, and heuristic evaluation of the interfaces of the virtual learning environment "E-Proinfo" and of the modules of the cycle. In this evaluation we observed the implementation of usability attributes and the degree of interactivity of each interface. The results revealed an inefficient implementation of the usability attributes, which implied a consequent reduction in the levels of interactivity. We then present the Virtual Learning Design proposal, a hypermedia layout model designed to generate usability for virtual learning environments and to extend the acquisition of literacy by students and tutors. This hypermedia design proposal does not aim at the demarcation of preconceived models, but at a layout proposal in which each hypermedia element is applied with a view to generating intuitive, more agile and efficient navigation in these environments.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
In the context of Software Engineering, web accessibility is gaining ground, establishing itself as an important quality attribute. This is due to initiatives of institutions such as the W3C (World Wide Web Consortium) and to the introduction of norms and laws, such as Section 508, that underline the importance of developing accessible web sites and applications. Despite these advances, the lack of web accessibility is still a persistent problem, and it may be related to the moment or phase in which this requirement is addressed within the development process, since web accessibility is generally regarded as a programming problem or is treated only when the application is already entirely developed. Thus, considering accessibility during the analysis and requirements specification activities is a strategy to facilitate project progress, avoiding rework in advanced phases of software development caused by errors or omissions in the elicitation. The objective of this research is to develop a method and a tool to support the elicitation of web accessibility requirements. The elicitation strategy of this method is grounded in the goal-oriented NFR Framework approach and in the use of NFR catalogs created from the guidelines of WCAG 2.0 (Web Content Accessibility Guidelines) proposed by the W3C.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Three bradykinin-related peptides (nephilakinins-I to -III) and bradykinin itself were isolated from the aqueous washing extract of the capture web of the spider Nephila clavipes by gel permeation chromatography on a Sephacryl S-100 column, followed by chromatography on a Hi-Trap Sephadex G-25 Superfine column. The novel peptides occurred in low concentrations and were sequenced through ESI-MS/MS analysis: nephilakinin-I (G-P-N-P-G-F-S-P-F-R-NH2), nephilakinin-II (E-A-P-P-G-F-S-P-F-R-NH2) and nephilakinin-III (P-S-P-P-G-F-S-P-F-R-NH2). Synthetic peptides replicated the novel bradykinin-related peptides and were submitted to biological characterization. Nephilakinins were shown to cause constriction of isolated rat ileum preparations and relaxation of rat duodenum muscle preparations at amounts higher than bradykinin; apparently these peptides constitute B2-type agonists of ileal and duodenal smooth muscles. All the peptides, including bradykinin, were moderately lethal to honeybees. These bradykinin peptides may be related to the predation of insects by the webs of N. clavipes. (c) 2005 Elsevier B.V. All rights reserved.
Abstract:
The capture web of N. clavipes presents viscous droplets, which play important roles in web mechanics and prey capture. Using scanning and transmission electron microscopy, it was demonstrated that the web droplets comprise different chemical environments, provided by the existence of both an aqueous and a lipid layer, which in turn present a suspension of tens of vesicles containing polypeptides and/or lipids. GC/EI-MS analysis of the contents of these vesicles led to the identification of some saturated fatty acids, such as decanoic acid, undecanoic acid, dodecanoic acid, tetradecanoic acid, octadecanoic acid, and icosanoic acid, while other components were unsaturated fatty acids, such as (Z)-tetradec-9-enoic acid, (Z)-octadec-9-enoic acid, and (Z)-icosa-11-enoic acid, and polyunsaturated fatty acids like (9Z,12Z)-octadeca-9,12-dienoic acid, (9Z,12Z,15Z)-octadeca-9,12,15-trienoic acid, and (11Z,14Z)-icosa-11,14-dienoic acid. Toxic proteins such as a calcium-activated proteinase and a metalloproteinase jararhagin-like precursor were also identified using a proteomic approach, indicating the possible involvement of these enzymes in the pre-digestion of prey captured by the spider's web. Apparently, the mixture of fatty acids is relatively toxic to insects by topical application (LD50 64.3 +/- 7.6 ng mg(-1) honeybee), while the proteins alone present no topical effect; however, when injected into the prey insects, these proteins presented moderate toxicity (LD50 40.3 +/- 4.8 ng mg(-1) honeybee); the mixture of fatty acids and proteins is very toxic to the prey captured by the web droplets of the viscid spiral of Nephila clavipes when topically applied (LD50 14.3 +/- 1.8 ng mg(-1) honeybee).