946 results for 2.0 Web
Abstract:
A paper included in the proceedings of the National Conference "Education in the Information Society", Plovdiv, May 2011.
Abstract:
Our modular approach to data hiding is an innovative concept in the data hiding research field. It enables the creation of modular digital watermarking methods that have extendable features and are designed for use in web applications. The methods consist of two types of modules – a basic module and an application-specific module. The basic module mainly provides features which are connected with the specific image format. As JPEG is a preferred image format on the Internet, we have put a focus on the achievement of a robust and error-free embedding and retrieval of the embedded data in JPEG images. The application-specific modules are adaptable to user requirements in the concrete web application. The experimental results of the modular data watermarking are very promising. They indicate excellent image quality, satisfactory size of the embedded data and perfect robustness against JPEG transformations with prespecified compression ratios. ACM Computing Classification System (1998): C.2.0.
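A minimal sketch of how the two-module split described above might look in code; all class and method names here are illustrative assumptions, not the authors' API:

```python
# Sketch of the modular design: a format-specific basic module plus a
# pluggable application-specific module. Names are illustrative only.
from abc import ABC, abstractmethod

class ApplicationModule(ABC):
    """Application-specific module: maps user data to/from a payload."""
    @abstractmethod
    def encode(self, user_data: str) -> bytes: ...
    @abstractmethod
    def decode(self, payload: bytes) -> str: ...

class CopyrightNotice(ApplicationModule):
    """Example application module: carries a plain-text copyright string."""
    def encode(self, user_data: str) -> bytes:
        return user_data.encode("utf-8")
    def decode(self, payload: bytes) -> str:
        return payload.decode("utf-8")

class ModularWatermarker:
    """Basic module: owns the JPEG-specific embedding (omitted here) and
    is parameterized by an application-specific module."""
    def __init__(self, app_module: ApplicationModule, max_jpeg_ratio: float):
        self.app_module = app_module
        self.max_jpeg_ratio = max_jpeg_ratio  # robustness target

    def embed(self, jpeg_bytes: bytes, user_data: str) -> bytes:
        payload = self.app_module.encode(user_data)  # application layer
        # ... JPEG-domain embedding robust up to max_jpeg_ratio goes here ...
        return jpeg_bytes  # placeholder: a real method alters coefficients
```

Swapping in a different `ApplicationModule` adapts the method to a new web application without touching the format-specific layer.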
Abstract:
The Internet and the Web have changed the way companies communicate with their publics, improving relations between them and providing substantial benefits for organizations. This has led small and medium enterprises (SMEs) to develop corporate sites to establish relationships with their audiences. Applying the methodology of content analysis, this paper analyzes the main factors and tools that make websites usable and intuitive, promoting better relations between SMEs and their audiences. It also develops an index to measure the effectiveness of websites from the perspective of usability. The results indicate that the websites analyzed have, in general, appropriate levels of usability.
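An index of this kind is usually a weighted aggregate of criteria scored during content analysis; the criteria and weights below are invented placeholders for illustration, not the instrument developed in the paper:

```python
# Illustrative usability index: criteria and weights are invented
# placeholders, not the paper's actual instrument.
CRITERIA_WEIGHTS = {
    "has_search_box": 0.2,
    "consistent_navigation_menu": 0.3,
    "visible_contact_information": 0.2,
    "loads_in_under_3_seconds": 0.3,
}

def usability_index(site_scores: dict) -> float:
    """Weighted sum of binary criteria (1 = present, 0 = absent); range [0, 1]."""
    return sum(weight * site_scores.get(criterion, 0)
               for criterion, weight in CRITERIA_WEIGHTS.items())

print(usability_index({"has_search_box": 1, "consistent_navigation_menu": 1,
                       "visible_contact_information": 0,
                       "loads_in_under_3_seconds": 1}))  # 0.8
```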
Abstract:
Thanks to the growth, expansion, and popularization of the World Wide Web, its technological development has a growing importance in society. The symbiosis between these two environments has fostered a greater social influence on the platform's innovations and a much more practical focus. Our aim in this article is to describe, characterize, and analyze the emergence and diffusion of the new hypertext standard that governs the Web: HTML5. At the same time, we explore this process in light of several theories that bring technology and society together. We devote special attention to the users of the World Wide Web and to their generic use of Social Media. We suggest that the development of web standards is influenced by the everyday use of this new kind of technologies and applications.
Abstract:
In order to improve the efficiency of managing and carrying out its responsibilities towards residents, the Municipal Council of Angra do Heroísmo (CMAH), located on Terceira Island (Autonomous Region of the Azores), distributes its functions across several departments and specialized staff. Despite this segmentation, there are circumstances in which they work together and cross-reference information, for example in licensing processes. However, this necessary exchange of data is deficient when scheduling events, whether or not they are organized by the institution itself. Consequently, this failure often results in overlapping events, something considered unsustainable in a relatively small community such as Angra do Heroísmo (35,109 inhabitants in 2013). The municipality intends to solve the problem by drawing on the capabilities of Web 2.0 platforms which, among other things, allow user participation and the easy insertion and management of information by people without deep technical knowledge. This dissertation determines the specifications that should be present in a Web platform for scheduling and publicizing the cultural programme serving the Municipality of Angra do Heroísmo, and conceptualizes a functional prototype that validates the identified specifications and supports the construction of the final platform to be developed in the future. The aim of this research is to improve the process of scheduling and publicizing cultural events in the municipality of Angra. This goal required an in-depth understanding of how the institution works, identifying and distinguishing the roles of the various actors and processes, so part of the research took place at the Municipal Council of Angra do Heroísmo. Among the challenges of this research were the collection and understanding of information about the process under study and the planning of an intuitive digital system that respects the decision structures and hierarchy of the municipality and has the degree of rigour required in government organizations.
Abstract:
Background: Understanding transcriptional regulation by genome-wide microarray studies can contribute to unravel complex relationships between genes. Attempts to standardize the annotation of microarray data include the Minimum Information About a Microarray Experiment (MIAME) recommendations, the MAGE-ML format for data interchange, and the use of controlled vocabularies or ontologies. The existing software systems for microarray data analysis implement the mentioned standards only partially and are often hard to use and extend. Integration of genomic annotation data and other sources of external knowledge using open standards is therefore a key requirement for future integrated analysis systems. Results: The EMMA 2 software has been designed to resolve shortcomings with respect to full MAGE-ML and ontology support and makes use of modern data integration techniques. We present a software system that features comprehensive data analysis functions for spotted arrays, and for the most common synthesized oligo arrays such as Agilent, Affymetrix and NimbleGen. The system is based on the full MAGE object model. Analysis functionality is based on R and Bioconductor packages and can make use of a compute cluster for distributed services. Conclusion: Our model-driven approach for automatically implementing a full MAGE object model provides high flexibility and compatibility. Data integration via SOAP-based web-services is advantageous in a distributed client-server environment as the collaborative analysis of microarray data is gaining more and more relevance in international research consortia. The adequacy of the EMMA 2 software design and implementation has been proven by its application in many distributed functional genomics projects. Its scalability makes the current architecture suited for extensions towards future transcriptomics methods based on high-throughput sequencing approaches which have much higher computational requirements than microarrays.
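As a hedged sketch of what client-side access to such SOAP-based web services can look like (the WSDL URL and operation name are invented for illustration and are not EMMA 2's actual interface), using the Python zeep library:

```python
# Hypothetical SOAP client call; the endpoint and operation are invented
# for illustration, not EMMA 2's documented interface.
from zeep import Client

client = Client("https://example.org/emma2/AnalysisService?wsdl")
# Submit a normalization job for one experiment and print the response
result = client.service.normalizeExperiment(experimentId="E-0001",
                                            method="quantile")
print(result)
```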
Abstract:
The technological and informational transformations that society is undergoing, especially in the last decade, are producing exponential growth of data in all areas of society. The data generated in these different areas are primary elements of information that, on their own, are irrelevant as support for decision making. For these data to be useful in any decision process, they must be converted into information, that is, into a set of processed data with meaning, to help create knowledge. These processes of transforming data into information consist of different phases, such as locating the sources of information, capture, analysis, and measurement. This technological and, in turn, social change has caused an increase in the number of information sources, so that any person, company, or organization can generate information that may be relevant to the business of companies or governments. Locating these sources, identifying relevant information in each source, and storing the information they generate, which may come in different formats, is the first step of the whole process described above, and it has to be executed correctly since the remaining phases depend on the sources and data collected. To identify relevant information in the sources, so-called search robots have been created, which automatically examine an information source, locating and collecting data that may be of interest. In this work, a knowledge robot is designed and implemented, together with online information capture systems for hypertext sources and social networks.
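A minimal sketch of the kind of search robot (crawler) described above, assuming the common requests/BeautifulSoup stack; the seed URL and limits are illustrative:

```python
# Minimal breadth-first crawler sketch; seed URL and limits are illustrative.
from collections import deque
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def crawl(seed_url: str, max_pages: int = 50) -> dict:
    """Visit pages breadth-first, collecting each page's title by URL."""
    seen, queue, collected = {seed_url}, deque([seed_url]), {}
    while queue and len(collected) < max_pages:
        url = queue.popleft()
        try:
            response = requests.get(url, timeout=10)
        except requests.RequestException:
            continue  # skip unreachable sources
        soup = BeautifulSoup(response.text, "html.parser")
        collected[url] = soup.title.string if soup.title else ""
        for anchor in soup.find_all("a", href=True):
            link = urljoin(url, anchor["href"])  # resolve relative links
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return collected

pages = crawl("https://example.org/")
```

A production robot would add politeness (robots.txt, rate limiting), content-type checks, and per-source extractors for social networks.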
Abstract:
Networked learning and the potential of social software have brought new and stimulating challenges to education systems and their professionals. One of the main challenges is the need to conceive a "new" didactics for teaching on the social web, which must be based not only on scientific, technological, curricular, and pedagogical knowledge, but also on a scientific and pedagogical knowledge of technology that makes it possible to plan, design, and use social networks such as Facebook effectively in the teaching-learning process. Against this background, in this exploratory study we sought to understand to what extent the use of Facebook in the teaching-learning process promotes the learning competences of postgraduate students, in terms of the capacity to learn, initiative, and autonomy. The results suggest that students accept Facebook as a new context for learning, one that does not prevent critical reflection on the concepts and topics assigned for work, enabling the development of learning communities as long as there is an explicit educational intent.
Abstract:
Benefitting from Web 2.0 features, Social Media allows organisations to be where the users are, creating proximity, talking to them, and knowing what they want. Going viral and word-of-mouth become easier, as these platforms support sharing, liking, multimedia, and convergence, interacting with each other and communicating on a large scale. Given that online portals operate in a highly competitive environment, players strive for more visits and better search rankings, and even aspire to be the homepage of the Web universe. We discuss the integration of Social Media tools in a Web Portal and explore how using them together may improve a Web Portal's competitiveness. A large Web Portal was selected for this case study. We found that, although conditions were created for this particular Web Portal to accommodate and integrate the chosen Social Media platforms, this was done in an organic and fluid way, with great focus on community construction and less focus on absorptive capacity. Based on the findings of this case study, we propose a dynamic cycle of benefits for integrating Social Media tools in a Web Portal.
Abstract:
This paper presents a web-based expert system application that carries out an initial assessment of the feasibility of a web project. The system detects inconsistency problems before design starts and suggests corrective actions to solve them. The developed system offers important advantages, not only by determining the feasibility of a web project but also by acting as a means of communication between the client company and the web development team, making the requirements specification clearer.
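A minimal sketch of the consistency-checking idea; the rules and project fields are invented for illustration, not the paper's knowledge base:

```python
# Toy rule base for feasibility checking; rules and fields are illustrative.
def assess_feasibility(project: dict) -> list:
    """Return (problem, suggested correcting action) pairs."""
    findings = []
    if project["budget"] < project["estimated_cost"]:
        findings.append(("budget below estimated cost",
                         "reduce scope or increase budget"))
    if project["deadline_weeks"] < project["estimated_weeks"]:
        findings.append(("deadline shorter than estimated duration",
                         "extend the deadline or add developers"))
    if "e-commerce" in project["features"] and not project["payment_provider"]:
        findings.append(("e-commerce requested without a payment provider",
                         "choose a payment provider before design starts"))
    return findings

for problem, action in assess_feasibility({
        "budget": 8000, "estimated_cost": 12000,
        "deadline_weeks": 6, "estimated_weeks": 10,
        "features": ["e-commerce"], "payment_provider": None}):
    print(f"- {problem}: {action}")
```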
Abstract:
Accelerated stability tests are indicated to assess, within a short time, the degree of chemical degradation that may affect an active substance, either alone or in a formulation, under normal storage conditions. The method is based on increased stress conditions that accelerate the rate of chemical degradation. Based on the equation of the straight line obtained as a function of the reaction order (at 50 and 70 ºC) and using the Arrhenius equation, the reaction rate was calculated for a temperature of 20 ºC (normal storage conditions). This model of accelerated stability testing makes it possible to predict the chemical stability of any active substance at any given moment, as long as a method to quantify the chemical substance is available. As an example of the applicability of the Arrhenius equation in accelerated stability tests, a 2.5% sodium hypochlorite solution was analyzed because of its chemical instability. Iodometric titration was used to quantify free residual chlorine in the solutions. Based on data obtained by keeping this solution at 50 and 70 ºC, using the Arrhenius equation, and taking 2.0% free residual chlorine as the minimum acceptable threshold, the shelf-life was 166 days at 20 ºC. This model, however, makes it possible to calculate the shelf-life at any other given temperature.
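The extrapolation behind the 166-day figure can be written out explicitly. A hedged sketch, with temperatures converted to kelvin: the rate constants k50 and k70 come from the straight-line fits at 50 and 70 ºC, and the first-order decay law in the last line is one possible outcome of that fit, not a claim about the paper's data.

```latex
\begin{aligned}
k &= A\,e^{-E_a/(RT)} && \text{Arrhenius equation}\\[2pt]
E_a &= R\,\frac{\ln(k_{70}/k_{50})}{1/T_{50} - 1/T_{70}}
  && T_{50}=323.15\,\mathrm{K},\ T_{70}=343.15\,\mathrm{K}\\[2pt]
k_{20} &= k_{50}\exp\!\left[-\frac{E_a}{R}\!\left(\frac{1}{T_{20}}-\frac{1}{T_{50}}\right)\right]
  && T_{20}=293.15\,\mathrm{K}\\[2pt]
t_{\text{shelf}} &= \frac{\ln(C_0/C_{\min})}{k_{20}}
  && \text{if the fit indicates first-order decay}
\end{aligned}
```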
Abstract:
Collagen XVIII can generate two fragments: NC11-728, containing a frizzled motif which possibly acts in Wnt signaling, and Endostatin, which is cleaved from the NC1 region and is a potent inhibitor of angiogenesis. Collagen XVIII and Wnt signaling have recently been associated with adipogenic differentiation and obesity in some animal models, but not in humans. In the present report, we show that COL18A1 expression increases during human adipogenic differentiation. We also tested whether polymorphisms in the Frizzled (c.1136C>T; Thr379Met) and Endostatin (c.4349G>A; Asp1437Asn) regions contribute to susceptibility to obesity in patients with type 2 diabetes (113 obese, BMI ≥ 30; 232 non-obese, BMI < 30) of European ancestry. No evidence of association was observed between the allele c.4349G>A and obesity, but we observed a significantly higher frequency of c.1136TT homozygotes in obese (19.5%) than in non-obese individuals (10.9%) [P = 0.02; OR = 2.0 (95% CI: 1.07-3.73)], suggesting that the allele c.1136T is associated with obesity in a recessive model. This genotype, after controlling for cholesterol, LDL cholesterol, and triglycerides, was independently associated with obesity (P = 0.048) and increases the odds of obesity 2.8 times. Therefore, our data suggest the involvement of collagen XVIII in human adipogenesis and susceptibility to obesity.
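The reported odds ratio can be checked directly from the homozygote frequencies quoted above:

```latex
\mathrm{OR}
= \frac{0.195/(1-0.195)}{0.109/(1-0.109)}
= \frac{0.2422}{0.1223}
\approx 2.0
```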
Abstract:
Solid-phase microextraction, using on-line bis(trimethylsilyl)trifluoroacetamide derivatisation, gas chromatography, and mass spectrometry, was evaluated for the quantification of 3-chloro-4-(dichloromethyl)-5-hydroxy-2(5H)-furanone (MX) in water samples. Fibres covering a wide range of polarities were used with headspace and direct-immersion sampling. For the immersion procedure, various parameters affecting MX extraction, including pH, salinity, temperature, and extraction time, were evaluated. The optimised method (polyacrylate fibre; 20% Na2SO4; pH 2.0; 60 min; 20 °C) was applied to reservoir chlorinated water samples, either natural or spiked with MX (50 ng L⁻¹ and 100 ng L⁻¹). The recovery of MX ranged from 44 to 72%. Quantification of MX in water samples was done using an external standard and the selected ion monitoring mode. A correlation coefficient (0.98), relative standard deviation (5%), limit of detection (30 ng L⁻¹), and limit of quantification (50 ng L⁻¹) were obtained from the calibration curve.
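For orientation only, since the abstract does not state which convention the authors used: detection and quantification limits are commonly estimated from the calibration curve as

```latex
\mathrm{LOD} = \frac{3.3\,\sigma}{S},
\qquad
\mathrm{LOQ} = \frac{10\,\sigma}{S}
```

where $\sigma$ is the standard deviation of the response (e.g., of the intercept) and $S$ is the slope of the calibration curve.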
Abstract:
Using a sample of $68.3\times10^{6}$ $K_L \to \pi^0\pi^0\pi^0$ decays collected in 1996-1999 by the KTeV (E832) experiment at Fermilab, we present a detailed study of the $K_L \to \pi^0\pi^0\pi^0$ Dalitz plot density. We report the first observation of interference from $K_L \to \pi^+\pi^-\pi^0$ decays in which $\pi^+\pi^-$ rescatters to $\pi^0\pi^0$ in a final-state interaction. This rescattering effect is described by the Cabibbo-Isidori model, and it depends on the difference in pion scattering lengths between the isospin $I=0$ and $I=2$ states, $a_0-a_2$. Using the Cabibbo-Isidori model, and fixing $(a_0-a_2)m_{\pi^+} = 0.268 \pm 0.017$ as measured by the CERN-NA48 collaboration, we present the first measurement of the $K_L \to \pi^0\pi^0\pi^0$ quadratic slope parameter that accounts for the rescattering effect: $h_{000} = (+0.59 \pm 0.20_{\mathrm{stat}} \pm 0.48_{\mathrm{syst}} \pm 1.06_{\mathrm{ext}})\times10^{-3}$, where the uncertainties are from data statistics, KTeV systematic errors, and external systematic errors. Fitting for both $h_{000}$ and $a_0-a_2$, we find $h_{000} = (-2.09 \pm 0.62_{\mathrm{stat}} \pm 0.72_{\mathrm{syst}} \pm 0.28_{\mathrm{ext}})\times10^{-3}$ and $m_{\pi^+}(a_0-a_2) = 0.215 \pm 0.014_{\mathrm{stat}} \pm 0.025_{\mathrm{syst}} \pm 0.006_{\mathrm{ext}}$; our value for $a_0-a_2$ is consistent with that from NA48.
Abstract:
We report on $K^{*0}$ production at midrapidity in Au+Au and Cu+Cu collisions at $\sqrt{s_{NN}} = 62.4$ and 200 GeV collected by the Solenoidal Tracker at the Relativistic Heavy Ion Collider (STAR) detector. The $K^{*0}$ is reconstructed via the hadronic decays $K^{*0} \to K^+\pi^-$ and $\overline{K}^{*0} \to K^-\pi^+$. Transverse momentum, $p_T$, spectra are measured over a range of $p_T$ extending from 0.2 GeV/c up to 5 GeV/c. The center-of-mass energy and system size dependence of the rapidity density, $dN/dy$, and the average transverse momentum, $\langle p_T \rangle$, are presented. The measured $N(K^{*0})/N(K)$ and $N(\phi)/N(K^{*0})$ ratios favor the dominance of rescattering of the decay daughters of the $K^{*0}$ over hadronic regeneration for $K^{*0}$ production. In the intermediate $p_T$ region ($2.0 < p_T < 4.0$ GeV/c), the elliptic flow parameter, $v_2$, and the nuclear modification factor, $R_{CP}$, agree with the expectations from the quark coalescence model of particle production.