912 results for e-learning 2.0


Relevance:

90.00%

Publisher:

Abstract:

During the 2002-2003 and 2003-2004 growing seasons, trials were conducted in the municipality of Maringá, PR (Brazil), with the objective of evaluating the potential damage of low doses of 2,4-D to grape plants, simulating deposits resulting from spray drift. In the first experiment, the application was made about 30 days after winter pruning in a vineyard of 'Itália' grapes. The doses used were 6.72, 13.44, 26.88, 53.76, and 107.52 g acid equivalent (a.e.) per hectare of 2,4-D, corresponding to deposits of 1.0%, 2.0%, 4.0%, 8.0%, and 16.0%, assuming an application of 1 L ha-1 (670 g a.e. ha-1). On that date, the plants were at the cluster emission and flowering phase (stage 15). Visual symptoms of phytotoxicity appeared immediately and in proportion to the applied doses. Crop yield was affected by all doses applied at this growth stage. However, even with the severe injuries recorded at the highest dose, the affected plants recovered after two prunings under the regional management system (two harvests per year). In the second experiment, doses equivalent to drifts of 1.0 and 2.0% (6.72 and 13.44 g a.e. ha-1) were applied at three stages of the development cycle. Application of doses up to 13.44 g a.e. ha-1 (2.0% simulated drift) from the 'half-berry' stage onward had no negative repercussions in terms of visual injury or yield.
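
As a minimal illustration of the dose arithmetic above (ours, not part of the study), the Python sketch below converts the simulated drift percentages into application doses. A base rate of 672 g a.e. ha-1 is assumed here because it reproduces the listed doses exactly; the abstract quotes the nominal full application as roughly 670 g a.e. ha-1.

```python
# Hypothetical helper, not from the paper: convert simulated drift
# percentages into 2,4-D doses. The base rate of 672 g a.e. ha-1 is an
# assumption chosen to reproduce the doses listed in the abstract.
FULL_RATE_G_AE_HA = 672.0  # assumed full application, g acid equivalent/ha

for drift_pct in (1.0, 2.0, 4.0, 8.0, 16.0):
    dose = FULL_RATE_G_AE_HA * drift_pct / 100.0
    print(f"{drift_pct:4.1f}% drift -> {dose:6.2f} g a.e. ha-1")
# Output: 6.72, 13.44, 26.88, 53.76, 107.52 g a.e. ha-1
```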

Relevance:

90.00%

Publisher:

Abstract:

To determine the effects of combined therapy with gliclazide and bedtime insulin on glycemic control and C-peptide secretion, we studied 25 patients with type 2 diabetes and secondary sulfonylurea failure, aged 56.8 ± 8.3 years, with a duration of diabetes of 10.6 ± 6.6 years, fasting plasma glucose of 277.3 ± 64.6 mg/dl and a body mass index of 27.4 ± 4.8 kg/m². Patients underwent three therapeutic regimens lasting 2 months each: 320 mg gliclazide (phase 1), 320 mg gliclazide plus bedtime NPH insulin (phase 2), and insulin alone (phase 3). At the end of each period, glycemic and C-peptide curves in response to a mixed meal were determined. During combined therapy, there was a decrease in all glycemic curve values (P<0.01). Twelve patients (48%) reached fasting plasma glucose <140 mg/dl, with a significant weight gain (64.8 kg (43.1-98.8) vs 66.7 kg (42.8-101.4); P<0.05) but no increase in C-peptide secretion or decrease in HbA1. The C-peptide glucose score (C-peptide/glucose x 100) increased from 0.9 (0.2-2.1) to 1.3 (0.2-4.7) during combined therapy (P<0.01). Despite a 50% increase in insulin doses in phase 3 (12 U (9-30) vs 18 U (11-60); P<0.01), only 3 patients who had responded to combined therapy maintained fasting plasma glucose <140 mg/dl (P<0.02). A tendency to a higher absolute increase in C-peptide (0.99 (0.15-2.5) vs 0.6 (0-2.15); P = 0.08) and C-peptide incremental area (2.47 (0.22-6.2) vs 1.2 (0-3.35); P = 0.07) was observed among responders. We conclude that combined therapy resulted in a better glucose response to a mixed meal than insulin alone and should be tried in type 2 diabetic patients before starting insulin monotherapy, despite difficulties in predicting the response.
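
For clarity, the score used above is a simple ratio; the sketch below (our illustration, not from the paper) computes the C-peptide glucose score exactly as the abstract defines it, with hypothetical input values chosen to land near the reported baseline median of 0.9.

```python
def c_peptide_glucose_score(c_peptide, glucose):
    """C-peptide glucose score as defined in the abstract:
    (C-peptide / glucose) x 100."""
    return c_peptide / glucose * 100.0

# Hypothetical values (units assumed): a C-peptide of 2.5 at a glucose
# of 277.3 mg/dl gives ~0.9, matching the reported baseline median.
print(round(c_peptide_glucose_score(2.5, 277.3), 1))  # 0.9
```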

Relevance:

90.00%

Publisher:

Abstract:

Hypoxia activates endothelial cells through the action of reactive oxygen species generated in part by cyclooxygenase (COX) activity, enhancing leukocyte transmigration. We investigated the effect of specific COX inhibition on the function of endothelial cells exposed to hypoxia. Mouse immortalized endothelial cells were subjected to 30 min of oxygen deprivation by gas exchange. Acridine orange/ethidium bromide staining and lactate dehydrogenase activity were used to monitor cell viability. The mRNA of COX-1 and COX-2 was amplified and semi-quantified before and after hypoxia in cells treated or not with indomethacin, a non-selective COX inhibitor. Expression of RANTES (regulated upon activation, normal T cell expressed and secreted) protein and the protective role of heme oxygenase-1 (HO-1) were also investigated by PCR. Gas exchange decreased partial oxygen pressure (PaO2) by 45.12 ± 5.85% (from 162 ± 10 to 73 ± 7.4 mmHg). Thirty minutes of hypoxia decreased cell viability and enhanced lactate dehydrogenase levels compared to control (73.1 ± 2.7 vs 91.2 ± 0.9%, P < 0.02; 35.96 ± 11.64 vs 22.19 ± 9.65%, P = 0.002, respectively). COX-2 and HO-1 mRNA were up-regulated after hypoxia. Indomethacin (300 µM) decreased COX-2, HO-1, hypoxia-inducible factor-1alpha and RANTES mRNA and increased cell viability after hypoxia. We conclude that blockade of COX up-regulation can ameliorate endothelial injury, resulting in reduced production of chemokines.

Relevance:

90.00%

Publisher:

Abstract:

(E)-2-Nonenal is considered an important off-flavor in beer, associated with beer staling. In this study, a new method for the determination of (E)-2-nonenal in beer using headspace solid-phase microextraction coupled to gas chromatography-mass spectrometry (HS-SPME-GC-MS) was developed and applied to Brazilian beer samples. Extractions were carried out with a CAR-PDMS (carboxen-polydimethylsiloxane) fiber, and the best results were obtained with 15 minutes of equilibration and 90 minutes of extraction at 50 °C. The method was linear in the range from 0.02 to 4.0 μg.L-1, with a correlation coefficient of 0.9994. The limits of detection and quantification were 0.01 and 0.02 μg.L-1, respectively. A recovery of 96.5% and a precision of 4% (RSD) were obtained when beer samples were fortified with 2.0 μg.L-1 of (E)-2-nonenal. The developed method proved to be simple, efficient, and highly sensitive for the determination of this analyte, and it can easily be applied in brewery quality control. (E)-2-Nonenal was found in all beer samples analyzed, at levels between 0.17 and 0.42 μg.L-1.
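
The validation figures above follow from standard formulas; the sketch below (ours, with hypothetical replicate values chosen to land near the reported 96.5% recovery and ~4% RSD) shows how spike recovery and precision are typically computed.

```python
import statistics

def recovery_pct(measured_mean, spiked):
    """Spike recovery as a percentage of the fortification level."""
    return measured_mean / spiked * 100.0

def rsd_pct(values):
    """Relative standard deviation (precision), in percent."""
    return statistics.stdev(values) / statistics.mean(values) * 100.0

# Hypothetical replicate results (ug/L) for a 2.0 ug/L fortification:
replicates = [1.93, 1.85, 2.01, 1.93]
print(round(recovery_pct(statistics.mean(replicates), 2.0), 1))  # 96.5
print(round(rsd_pct(replicates), 1))                             # ~3.4
```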

Relevance:

90.00%

Publisher:

Abstract:

Low levels of ionizing radiation induce two translocation responses in soybean: a reduction in photoassimilate export from leaves and a change in the distribution pattern of exported photoassimilate within the plant. In this investigation these responses have been further studied, specifically to ascertain the site of radiation damage and to better understand the physiological responses observed. Experimentally, the primary data were obtained from studies in which a mature trifoliate leaf of a young soybean plant (Glycine max L. cultivar Harosoy '63) was isolated in a closed transparent chamber and allowed to photoassimilate 14CO2 for 15 minutes. This was followed by an additional 45-minute period before the plant was sectioned and 14C-radioactivity determined in all parts. Such 14C data provide the magnitude and distribution pattern of translocation. Further analyses were conducted to determine the relative levels of the major photosynthetic products using the techniques of paper chromatography and autoradiography. Since differences between control and irradiated plants were not observed in the partitioning of 14C between the 80% ethanol-soluble and -insoluble fractions, or in the relative amounts of 14C-products of photosynthesis, the reduction in export in irradiated plants is not likely due to reduced availability of translocatable materials. Data presented in this thesis show that photoassimilate export was not affected by gamma radiation until a threshold dose between 2.0 and 3.0 krads was reached. It was also observed that radiation-induced damage to the export process was capable of recovery within 1 to 2 hours provided high light intensity was supplied. In contrast, the distribution pattern was shown to be extremely radiosensitive, with a low threshold dose between 0.25 and 0.49 krads. Although this process was also capable of recovery, it occurred much earlier and was followed by a secondary effect which lasted at least for the duration of the experiments. The data presented in this thesis are interpreted to suggest that the sites of radiation action for the two translocation responses are different. With regard to photoassimilate export, the site of action of ionizing radiation is the leaf, quite possibly the process of photophosphorylation, which may provide energy directly for phloem loading and for membrane integrity of the phloem tissue. With regard to the pattern of distribution of exported photoassimilate, the site is likely the apical sink, possibly as a result of changes in the levels of endogenous hormones. By selecting the radiation exposure dose and the time post-irradiation, it is possible to affect these two processes independently, suggesting that each may be regulated independently of the other and involves a distinct site.

Relevance:

90.00%

Publisher:

Abstract:

Affiliation: Claudia Kleinman, Nicolas Rodrigue & Hervé Philippe: Département de biochimie, Faculté de médecine, Université de Montréal

Relevance:

90.00%

Publisher:

Abstract:

The bibliography includes web addresses that can be consulted online.

Relevance:

90.00%

Publisher:

Abstract:

Today's higher education system and R&D in science and technology have undergone tremendous changes from the traditional classroom learning system and scholarly communication. Huge volumes of academic output and scientific communication are now produced in electronic format. Knowledge management is a key challenge in the current century. With the advancement of ICT, the open access movement, scholarly communication, institutional repositories, ontologies, the semantic web, Web 2.0, etc. have revolutionized knowledge transactions and knowledge management in the field of science and technology. Today higher education has moved into a stage where competitive advantage is gained not just through access to information but, more importantly, from new knowledge creation. This paper examines the role of the institutional repository in knowledge transactions in the current scenario of higher education.

Relevance:

90.00%

Publisher:

Abstract:

Slide presentation: "Towards Web 3.0". Topics covered: Web 1.0, Web 2.0, and Web 3.0; technology hype; the internet as seen by our kids; random trivia ("Brazil has more Orkut users than citizens"); "The war is over. Platforms have won. Applications have lost"; the blogosphere (made up of all blogs and their interconnections); social bookmarking; Wolfram Alpha.

Relevance:

90.00%

Publisher:

Abstract:

Multiconfiguration relativistic Dirac-Fock (MCDF) values have been computed for the first four ionization potentials (IPs) of element 104 (unnilquadium) and of the other group 4 elements (Ti, Zr, and Hf). Factors were calculated that allowed correction of the systematic errors between the MCDF IPs and the experimental IPs. Single "experimental" IPs evaluated in eV (to ±0.1 eV) for element 104 are: [104(0), 6.5]; [104(1+), 14.8]; [104(2+), 23.8]; [104(3+), 31.9]. Multiple experimental IPs evaluated in eV for element 104 are: [(0-2+), 21.2 ± 0.2]; [(0-3+), 45.1 ± 0.2]; [(0-4+), 76.8 ± 0.3]. Our MCDF results track 11 of the 12 experimental single IPs studied for group 4 atoms and ions. The exception is Hf(2+). We submit our calculated IP of 22.4 ± 0.2 eV as much more accurate than the value of 23.3 eV derived from experiment.
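
As a small arithmetic check (ours, not the paper's), a cumulative multiple IP is simply the sum of the successive single IPs; the sketch below reproduces the quoted multiple IPs for element 104 from the single values, within the stated ±0.2-0.3 eV uncertainties.

```python
# Single "experimental" IPs for element 104, in eV, from the abstract,
# keyed by the charge state being ionized: 0 -> 1+, 1+ -> 2+, etc.
single_ips = {0: 6.5, 1: 14.8, 2: 23.8, 3: 31.9}

# A multiple IP (0 -> n+) is the sum of the first n single IPs.
for n in (2, 3, 4):
    total = sum(single_ips[q] for q in range(n))
    print(f"(0-{n}+): {total:.1f} eV")
# (0-2+): 21.3 vs quoted 21.2 ± 0.2; (0-3+): 45.1; (0-4+): 77.0 vs 76.8 ± 0.3
```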

Relevance:

90.00%

Publisher:

Abstract:

Distributed systems are one of the most vital components of the economy. The most prominent example is probably the internet, a constituent element of our knowledge society. During the recent years, the number of novel network types has steadily increased. Amongst others, sensor networks, distributed systems composed of tiny computational devices with scarce resources, have emerged. The further development and heterogeneous connection of such systems imposes new requirements on the software development process. Mobile and wireless networks, for instance, have to organize themselves autonomously and must be able to react to changes in the environment and to failing nodes alike. Researching new approaches for the design of distributed algorithms may lead to methods with which these requirements can be met efficiently. In this thesis, one such method is developed, tested, and discussed in respect of its practical utility. Our new design approach for distributed algorithms is based on Genetic Programming, a member of the family of evolutionary algorithms. Evolutionary algorithms are metaheuristic optimization methods which copy principles from natural evolution. They use a population of solution candidates which they try to refine step by step in order to attain optimal values for predefined objective functions. The synthesis of an algorithm with our approach starts with an analysis step in which the wanted global behavior of the distributed system is specified. From this specification, objective functions are derived which steer a Genetic Programming process where the solution candidates are distributed programs. The objective functions rate how close these programs approximate the goal behavior in multiple randomized network simulations. The evolutionary process step by step selects the most promising solution candidates and modifies and combines them with mutation and crossover operators. This way, a description of the global behavior of a distributed system is translated automatically to programs which, if executed locally on the nodes of the system, exhibit this behavior. In our work, we test six different ways for representing distributed programs, comprising adaptations and extensions of well-known Genetic Programming methods (SGP, eSGP, and LGP), one bio-inspired approach (Fraglets), and two new program representations called Rule-based Genetic Programming (RBGP, eRBGP) designed by us. We breed programs in these representations for three well-known example problems in distributed systems: election algorithms, the distributed mutual exclusion at a critical section, and the distributed computation of the greatest common divisor of a set of numbers. Synthesizing distributed programs the evolutionary way does not necessarily lead to the envisaged results. In a detailed analysis, we discuss the problematic features which make this form of Genetic Programming particularly hard. The two Rule-based Genetic Programming approaches have been developed especially in order to mitigate these difficulties. In our experiments, at least one of them (eRBGP) turned out to be a very efficient approach and in most cases, was superior to the other representations.
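
To make the described evolutionary loop concrete, here is a minimal, deliberately generic Genetic Programming sketch in Python (our illustration, not the thesis's RBGP/eRBGP implementation): a population of candidate programs is scored by an objective function over randomized simulations and refined by selection, crossover, and mutation.

```python
import random

def evolve(random_program, fitness, crossover, mutate,
           pop_size=50, generations=100, tournament=3, p_mut=0.2):
    """Generic GP loop. fitness(program) is assumed to run the candidate
    in randomized network simulations and return a score to maximize."""
    population = [random_program() for _ in range(pop_size)]
    for _ in range(generations):
        scored = [(fitness(p), p) for p in population]

        def select():
            # Tournament selection: best of a few random candidates.
            pool = random.sample(scored, tournament)
            return max(pool, key=lambda fp: fp[0])[1]

        best = max(scored, key=lambda fp: fp[0])[1]
        next_gen = [best]  # elitism: carry the best program over unchanged
        while len(next_gen) < pop_size:
            child = crossover(select(), select())
            if random.random() < p_mut:
                child = mutate(child)
            next_gen.append(child)
        population = next_gen
    # Re-score the final population and return the best program found.
    return max(population, key=fitness)
```

The caller supplies the program representation through the four callables, which is exactly where the six representations compared in the thesis (SGP, eSGP, LGP, Fraglets, RBGP, eRBGP) would differ.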

Relevance:

90.00%

Publisher:

Abstract:

The ongoing growth of the World Wide Web, catalyzed by the increasing possibility of ubiquitous access via a variety of devices, continues to strengthen its role as our prevalent information and communication medium. However, although tools like search engines facilitate retrieval, the task of finally making sense of Web content is still often left to human interpretation. The vision of supporting both humans and machines in such knowledge-based activities led to the development of different systems which allow Web resources to be structured by metadata annotations. Interestingly, two major approaches which gained a considerable amount of attention address the problem from nearly opposite directions: on the one hand, the idea of the Semantic Web suggests formalizing the knowledge within a particular domain by means of the "top-down" approach of defining ontologies. On the other hand, Social Annotation Systems, as part of the so-called Web 2.0 movement, implement a "bottom-up" style of categorization using arbitrary keywords. Experience as well as research into the characteristics of both systems has shown that their strengths and weaknesses seem to be inverse: while Social Annotation suffers from problems like, e.g., ambiguity or lack of precision, ontologies were especially designed to eliminate those. Conversely, ontologies suffer from a knowledge acquisition bottleneck, which is successfully overcome by the large user populations of Social Annotation Systems. Instead of being regarded as competing paradigms, the obvious potential synergies from a combination of both motivated approaches to "bridge the gap" between them. These were fostered by the evidence of emergent semantics, i.e., the self-organized evolution of implicit conceptual structures, within Social Annotation data. While several techniques to exploit the emergent patterns have been proposed, a systematic analysis - especially regarding paradigms from the field of ontology learning - is still largely missing. This also includes a deeper understanding of the circumstances which affect the evolution processes. This work aims to address this gap by providing an in-depth study of methods and influencing factors to capture emergent semantics from Social Annotation Systems. We focus hereby on the acquisition of lexical semantics from the underlying networks of keywords, users and resources. Structured along different ontology learning tasks, we use a methodology of semantic grounding to characterize and evaluate the semantic relations captured by different methods. In all cases, our studies are based on datasets from several Social Annotation Systems. Specifically, we first analyze semantic relatedness among keywords, and identify measures which detect different notions of relatedness. These constitute the input of concept learning algorithms, which then focus on the discovery of synonymous and ambiguous keywords. Here, we assess the usefulness of various clustering techniques. As a prerequisite to inducing hierarchical relationships, our next step is to study measures which quantify the level of generality of a particular keyword. We find that comparatively simple measures can approximate the generality information encoded in reference taxonomies. These insights inform the final task, namely the creation of concept hierarchies. For this purpose, generality-based algorithms exhibit advantages compared to clustering approaches.
To complement the identification of suitable methods to capture semantic structures, we next analyze several factors which influence their emergence. Empirical evidence is provided that the amount of available data plays a crucial role in determining keyword meanings. From a different perspective, we examine pragmatic aspects by considering different annotation patterns among users. Based on a broad distinction between "categorizers" and "describers", we find that the latter produce more accurate results. This suggests a causal link between pragmatic and semantic aspects of keyword annotation. As a special kind of usage pattern, we then look at system abuse and spam. While observing a mixed picture, we suggest that an individual decision should be taken instead of disregarding spammers as a matter of principle. Finally, we discuss a set of applications which operationalize the results of our studies to enhance both Social Annotation and semantic systems. These comprise, on the one hand, tools which foster the emergence of semantics and, on the other hand, applications which exploit the socially induced relations to improve, e.g., searching, browsing, or user profiling facilities. In summary, the contributions of this work highlight viable methods and crucial aspects for designing enhanced knowledge-based services of a Social Semantic Web.
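
As an illustrative sketch (ours, not the thesis's exact measures), the snippet below computes one simple notion of keyword relatedness from tagging data, cosine-normalized co-occurrence counts, plus raw tag frequency as a crude generality proxy; both are stand-ins for the families of measures the work evaluates.

```python
from collections import Counter
from itertools import combinations
from math import sqrt

# Toy tag assignments: each resource with the keywords users gave it.
tagged_resources = [
    {"web2.0", "social", "tagging"},
    {"web2.0", "semantic", "ontology"},
    {"tagging", "folksonomy", "social"},
    {"ontology", "semantic", "web"},
]

cooc = Counter()   # co-occurrence counts between keyword pairs
freq = Counter()   # raw frequency, a crude proxy for keyword generality
for tags in tagged_resources:
    freq.update(tags)
    for pair in combinations(sorted(tags), 2):
        cooc[pair] += 1

def relatedness(a, b):
    """Co-occurrence count, cosine-normalized by keyword frequencies."""
    shared = cooc[tuple(sorted((a, b)))]
    return shared / sqrt(freq[a] * freq[b]) if freq[a] and freq[b] else 0.0

print(relatedness("web2.0", "social"))     # related keywords score higher
print(relatedness("tagging", "ontology"))  # unrelated keywords score ~0
```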

Relevance:

90.00%

Publisher:

Abstract:

This business plan was designed with the aim of creating the company Virtual Gnosis. The company will operate in the computing sector and the software industry, within the provision of IT services. It will focus on managing the information and knowledge of the human talent within organizations through the creation of AVAI (Ambientes Virtuales de Aprendizaje Inmersivo, immersive virtual learning environments). The target market consists of educational institutions and SME clusters. The operational, marketing, financial, and administrative aspects of the proposed company are presented in detail below.

Relevance:

90.00%

Publisher:

Abstract:

Web 2.0 is sometimes described as the read/write web, giving everyday users the chance to create and share information as well as to consume information created by others. Social media systems are built on this foundation of participation and sharing, but what is the mindset of these users, and are they quite so everyday as we might suppose? The skills and attitudes held by users can be described as their literacy, and there has been a lot of debate over the last few years about how to describe these literacies, and design for them. One field that has been changed radically by this notion is Technology Enhanced Learning (TEL) where a fierce debate has raged about the potential of a new generation of highly literate digital natives, and Edupunks have argued for open and personal systems that challenge traditional models of institutional control. In this session we look at the arguments surrounding digital literacy and examine TEL as an example of how social media can change an application domain.