939 results for On-line newspaper
Abstract:
The files accompanying the document include a .jar archive of the zoom editor (which can be launched from a browser) and examples of z-texts produced with this software.
Abstract:
Graduate Program in Education - FFC
Abstract:
Amongst the opportunities for cross-cultural contact created by the burgeoning use of the Internet are those provided by electronic discussion lists. This study looks at what happens when language students venture out of the classroom (virtual or otherwise) to participate in on-line discussion groups with native speakers. Responses to messages, and commentary by moderators and other participants on the (in)appropriateness of contributions, allow us to determine what constitutes successful participation and to make suggestions regarding effective teaching strategies for this medium. A case study examines the threads started by four anglophone students of French when they post messages to a forum on the Web site of the French newspaper Le Monde. Investigation of these examples points to the ways in which electronic discussion inflects and is inflected by cultural and generic expectations. We suggest that successful participation in Internet fora depends on awareness of such cultural and generic mores and an ability to work within and/or with them. Teachers therefore need to find ways of sensitizing students to these issues, so that participation in electronic discussion is seen not merely as linguistic training, but as engagement with a cultural practice.
Abstract:
Geographic Data Warehouses (GDW) are one of the main technologies used in decision-making processes and spatial analysis, and the literature proposes several conceptual and logical data models for GDW. However, little effort has been focused on studying how spatial data redundancy affects SOLAP (Spatial On-Line Analytical Processing) query performance over GDW. In this paper, we investigate this issue. First, we compare redundant and non-redundant GDW schemas and conclude that redundancy is associated with high performance losses. We also analyze the issue of indexing, aiming at improving SOLAP query performance on a redundant GDW. Comparisons of the SB-index approach, the star-join aided by R-tree and the star-join aided by GiST indicate that the SB-index significantly reduces the elapsed time in query processing, by 25% up to 99%, for SOLAP queries defined over the spatial predicates of intersection, enclosure and containment and applied to roll-up and drill-down operations. We also investigate the impact of an increase in data volume on performance. The increase did not impair the performance of the SB-index, which still greatly reduced the elapsed time in query processing. Performance tests also show that the SB-index is far more compact than the star-join, requiring at most 0.20% of its volume. Moreover, we propose a specific enhancement of the SB-index to deal with spatial data redundancy. This enhancement improved performance by 80% to 91% for redundant GDW schemas.
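As a rough illustration of the kind of query the abstract benchmarks (not the SB-index itself), the sketch below runs a SOLAP-style roll-up over a toy star schema in Python, filtering the spatial dimension with an intersection predicate. Geometries are reduced to axis-aligned bounding boxes, and every table name and value is made up for the example:

```python
def intersects(a, b):
    """Axis-aligned bounding-box intersection test (a stand-in for a
    real spatial predicate over polygon geometries)."""
    (ax1, ay1, ax2, ay2), (bx1, by1, bx2, by2) = a, b
    return ax1 <= bx2 and bx1 <= ax2 and ay1 <= by2 and by1 <= ay2

# Spatial dimension: city id -> (state, bounding box). Hypothetical data.
cities = {
    "c1": ("SP", (0, 0, 2, 2)),
    "c2": ("SP", (5, 5, 7, 7)),
    "c3": ("RJ", (1, 1, 3, 3)),
}

# Fact table rows: (city_id, sales).
facts = [("c1", 10.0), ("c2", 20.0), ("c3", 5.0)]

def rollup_by_state(facts, cities, window):
    """Roll-up: sum sales per state, keeping only facts whose spatial
    member intersects the query window."""
    totals = {}
    for city_id, sales in facts:
        state, box = cities[city_id]
        if intersects(box, window):
            totals[state] = totals.get(state, 0.0) + sales
    return totals

print(rollup_by_state(facts, cities, (0, 0, 4, 4)))  # c1 and c3 qualify
```

An index such as the SB-index exists precisely to avoid the linear scan over the spatial predicate that this naive version performs.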
Abstract:
Glyceraldehyde-3-phosphate dehydrogenase (GAPDH) plays an important role in the life cycle of Trypanosoma cruzi, and an immobilized enzyme reactor (IMER) has been developed for use in on-line screening for GAPDH inhibitors. An IMER containing human GAPDH has been previously reported; however, those conditions produced a T. cruzi GAPDH-IMER with poor activity and stability. The factors affecting the stability of the human and T. cruzi GAPDHs in the immobilization process, and the influence of pH and buffer type on the stability and activity of the IMERs, have been investigated. The resulting T. cruzi GAPDH-IMER was coupled to an analytical octyl column, which was used to achieve chromatographic separation of NAD+ from NADH. The production of NADH stimulated by D-glyceraldehyde-3-phosphate was used to investigate the activity and kinetic parameters of the immobilized T. cruzi GAPDH. The Michaelis-Menten constants (K_m) determined for D-glyceraldehyde-3-phosphate and NAD+ were 0.5 ± 0.05 mM and 0.648 ± 0.08 mM, respectively, consistent with the values obtained using the non-immobilized enzyme.
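The K_m values reported above are parameters of the standard Michaelis-Menten rate law, which for substrate concentration [S] reads:

```latex
v = \frac{V_{\max}\,[S]}{K_m + [S]}
```

At [S] = K_m the rate is half of V_max, which is how K_m is read off a measured rate curve; agreement of the immobilized-enzyme K_m with the free-enzyme value indicates that immobilization did not substantially distort substrate binding.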
Abstract:
We introduce the Coupled Aerosol and Tracer Transport model to the Brazilian developments on the Regional Atmospheric Modeling System (CATT-BRAMS). CATT-BRAMS is an on-line transport model fully consistent with the simulated atmospheric dynamics. Emission sources of trace gases from biomass burning and urban-industrial-vehicular activities, and of aerosol particles from biomass burning, are obtained from several published datasets and remote sensing information. The tracer and aerosol mass concentration prognostics include the effects of sub-grid-scale turbulence in the planetary boundary layer, convective transport by shallow and deep moist convection, wet and dry deposition, and plume rise associated with vegetation fires, in addition to the grid-scale transport. The radiation parameterization takes into account the interaction between the simulated biomass burning aerosol particles and short- and long-wave radiation. The atmospheric model BRAMS is based on the Regional Atmospheric Modeling System (RAMS), with several improvements associated with the cumulus convection representation, soil moisture initialization, and a surface scheme tuned for the tropics, among others. In this paper the CATT-BRAMS model is used to simulate carbon monoxide and particulate material (PM2.5) surface fluxes and atmospheric transport during the 2002 LBA field campaigns, conducted during the transition from the dry to the wet season in the southwest Amazon Basin. Model evaluation is addressed with comparisons between model results and near-surface, radiosonde and airborne measurements performed during the field campaign, as well as remote-sensing-derived products. We show that the emission strengths match the carbon monoxide observed in the LBA campaign. A relatively good comparison with the MOPITT data is also obtained, in spite of the difficulties implied by the MOPITT a priori assumptions.
Abstract:
An (n, d)-expander is a graph G = (V, E) such that for every X ⊆ V with |X| ≤ 2n − 2 we have |Γ_G(X)| ≥ (d + 1)|X|. A tree T is small if it has at most n vertices and maximum degree at most d. Friedman and Pippenger (1987) proved that any (n, d)-expander contains every small tree. However, their elegant proof does not seem to yield an efficient algorithm for obtaining the tree. In this paper, we give an alternative result that does admit a polynomial-time algorithm for finding the embedding of any small tree in subgraphs G of (N, D, λ)-graphs Λ, as long as G contains a positive fraction of the edges of Λ and λ/D is small enough. In several applications of the Friedman-Pippenger theorem, including the ones in the original paper of those authors, the (n, d)-expander G is a subgraph of an (N, D, λ)-graph as above. Therefore, our result suffices to provide efficient algorithms for such previously non-constructive applications. As an example, we discuss a recent result of Alon, Krivelevich, and Sudakov (2007) concerning embedding nearly spanning bounded-degree trees, the proof of which makes use of the Friedman-Pippenger theorem. We shall also show a construction, inspired by Wigderson-Zuckerman expander graphs, for which any sufficiently dense subgraph contains all trees of sizes and maximum degrees achieving essentially optimal parameters. Our algorithmic approach is based on a reduction of the tree embedding problem to a certain on-line matching problem for bipartite graphs, solved by Aggarwal et al. (1996).
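For tiny graphs, the expansion condition in the first sentence can be checked directly by enumeration. The Python sketch below only illustrates the definition (it is not the paper's embedding algorithm, and enumeration is infeasible beyond toy sizes; real expanders are certified spectrally):

```python
from itertools import combinations

def neighborhood(adj, X):
    """Γ_G(X): all vertices adjacent to at least one vertex of X."""
    return {v for u in X for v in adj[u]}

def is_expander(adj, n, d):
    """Brute-force check of the (n, d)-expander condition:
    |Γ_G(X)| >= (d + 1)|X| for every X with |X| <= 2n - 2."""
    vertices = list(adj)
    for size in range(1, min(2 * n - 2, len(vertices)) + 1):
        for X in combinations(vertices, size):
            if len(neighborhood(adj, X)) < (d + 1) * len(X):
                return False
    return True

# K_{3,3}: every vertex set of size <= 2 has at least as many neighbours,
# so the complete bipartite graph passes the (2, 0)-expander test.
k33 = {0: {3, 4, 5}, 1: {3, 4, 5}, 2: {3, 4, 5},
       3: {0, 1, 2}, 4: {0, 1, 2}, 5: {0, 1, 2}}
print(is_expander(k33, 2, 0))  # True
```

A graph with an isolated vertex fails immediately, since Γ_G of that singleton is empty.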
Abstract:
This work deals with neural network (NN)-based gait pattern adaptation algorithms for an active lower-limb orthosis. Stable trajectories with different walking speeds are generated during an optimization process considering the zero-moment point (ZMP) criterion and the inverse dynamics of the orthosis-patient model. Additionally, a set of NNs is used to decrease the time-consuming analytical computation of the model and ZMP. The first NN approximates the inverse dynamics, including the ZMP computation, while the second NN works in the optimization procedure, giving an adapted desired trajectory according to the orthosis-patient interaction. This trajectory adaptation is added directly to the trajectory generator, which is also reproduced by a set of NNs. With this strategy, it is possible to adapt the trajectory during the walking cycle in an on-line procedure, instead of changing the trajectory parameters after each step. The dynamic model of the actual exoskeleton, with interaction forces included, is used to generate simulation results. Also, an experimental test is performed with an active ankle-foot orthosis, where the dynamic variables of this joint are replaced in the simulator by actual values provided by the device. It is shown that the final adapted trajectory follows the patient's intention of increasing the walking speed, thus changing the gait pattern. (C) Koninklijke Brill NV, Leiden, 2011
Abstract:
Data mining is the process of identifying valid, implicit, previously unknown, potentially useful and understandable information from large databases. It is an important step in the process of knowledge discovery in databases (Olaru & Wehenkel, 1999). In a data mining process, input data can be structured, semi-structured, or unstructured. Data can be text, categorical or numerical values. One of the important characteristics of data mining is its ability to deal with data that are large in volume, distributed, time-variant, noisy, and high-dimensional. A large number of data mining algorithms have been developed for different applications. For example, association rule mining can be useful for market basket problems, clustering algorithms can be used to discover trends in unsupervised learning problems, classification algorithms can be applied to decision-making problems, and sequential and time series mining algorithms can be used in predicting events, fault detection, and other supervised learning problems (Vapnik, 1999). Classification is among the most important tasks in data mining, particularly for data mining applications in engineering fields. Together with regression, classification is used mainly for predictive modelling. So far, a number of classification algorithms have been put into practice. According to Sebastiani (2002), the main classification algorithms can be categorized as: decision tree and rule-based approaches such as C4.5 (Quinlan, 1996); probability methods such as the Bayesian classifier (Lewis, 1998); on-line methods such as Winnow (Littlestone, 1988) and CVFDT (Hulten, 2001); neural network methods (Rumelhart, Hinton & Williams, 1986); and example-based methods such as k-nearest neighbors (Duda & Hart, 1973) and SVM (Cortes & Vapnik, 1995). Other important techniques for classification tasks include Associative Classification (Liu et al., 1998) and Ensemble Classification (Tumer, 1996).
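As a minimal illustration of the example-based methods mentioned above, the sketch below implements k-nearest-neighbors classification from scratch; the two-cluster data set and its labels are made up for the example:

```python
from collections import Counter
from math import dist

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training
    points under Euclidean distance (the example-based approach
    credited to Duda & Hart in the survey above)."""
    neighbors = sorted(train, key=lambda p: dist(p[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Hypothetical 2-D training set: two well-separated labelled clusters.
train = [((0.0, 0.0), "A"), ((0.1, 0.2), "A"), ((0.2, 0.1), "A"),
         ((1.0, 1.0), "B"), ((1.1, 0.9), "B"), ((0.9, 1.1), "B")]

print(knn_predict(train, (0.15, 0.1)))  # near the "A" cluster -> A
print(knn_predict(train, (1.05, 1.0)))  # near the "B" cluster -> B
```

The same skeleton (store examples, compare by distance, vote) underlies more elaborate example-based classifiers; the on-line methods in the taxonomy differ in that they update the model incrementally as each example arrives.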
Abstract:
To determine the effect of slurry rheology on industrial grinding performance, 45 surveys were conducted on 16 full-scale grinding mills at five sites. Four operating variables - mill throughput, slurry density, slurry viscosity and feed fines content - were investigated. The rheology of the mill discharge slurries was measured either on-line or off-line, and the data were processed using a standard procedure to obtain a full range of flow curves. Multi-linear regression was employed as a statistical analysis tool to determine whether or not rheological effects exert an influence on industrial grinding, and to assess the influence of the four mill operating conditions on mill performance in terms of the Grinding Index, a criterion describing the overall breakage of particles across the mill. The results show that slurry rheology does influence industrial grinding. The trends of these effects on the Grinding Index depend upon the rheological nature of the slurry - whether the slurries are dilatant or pseudoplastic, and whether they exhibit a high or low yield stress. The interpretation of the regression results is discussed, the observed effects are summarised, and the potential for incorporating rheological principles into process control is considered. Guidelines are established to improve industrial grinding operations based on knowledge of the rheological effects. This study confirms some trends in the effect of slurry rheology on grinding reported in the literature, and extends these to a broader understanding of the relationship between slurry properties and rheology, and their effects on industrial milling performance. (C) 2002 Elsevier Science B.V. All rights reserved.
Abstract:
Ten surveys of the ball milling circuit at the Mt Isa Mines (MIM) Copper Concentrator were conducted, aiming to identify any changes in slurry rheology caused by the use of a chrome ball charge, and the associated effect on grinding performance. Slurry rheology was measured using an on-line viscometer. The data were mass balanced and analysed with statistical tools. Comparison of the rheograms demonstrated that slurry density and fines content affected slurry rheology significantly, while the effect of the chrome ball charge was negligible. Statistical analysis showed the effects of mill throughput and cyclone efficiency on the Grinding Index (a term describing the overall breakage). There was no difference in the Grinding Index between using the chrome ball charge and the ordinary steel ball charge. (C) 2002 Elsevier Science Ltd. All rights reserved.
Abstract:
The increasing availability of mobility data, and the awareness of its importance and value, have been motivating many researchers to develop models and tools for analyzing movement data. This paper presents a brief survey of significant research on the modeling, processing and visualization of data about moving objects. We identify some key research fields that will provide better features for the online analysis of movement data. As a result of the literature review, we suggest a generic multi-layer architecture for the development of an online analysis processing software tool, which will be used to define the future work of our team.
Abstract:
Article translated into Mandarin, published in Nature and Human Life E-Academic Magazine, 6 (2015), pp. 19-32. http://www.ziranyurensheng.org/current-2961621002.html.
Abstract:
This study evaluated the use of additives (Mg/P and the nitrification inhibitor dicyandiamide - DCD) on nitrous oxide emission during swine slurry composting. The experiment was run in duplicate; the gas was monitored for 30 days in different treatments (control, DCD, Mg/P and DCD + Mg/P). The nitrous oxide emission rate (mg of N2O-N per day) and the accumulated emissions were calculated to compare the treatments. Results showed that N2O-N emissions were reduced by approximately 70%, 46% and 96% through the additions of DCD, MgCl2.6H2O + H3PO4, and both additives, respectively, compared to the control. Keywords: composting; swine slurry; additives; nitrous
Abstract:
The report addresses the question of what the preferences of broadband consumers are on the Portuguese telecommunications market. A triple-play bundle is investigated. The discrete choice analysis adopted in the study is based on 110 responses, mainly from NOVA students. The data for the analysis were collected via a manually designed on-line survey. The results show that the price attribute is relatively the most important one, while the television attribute is overlooked in the decision-making process. The main effects examined in the research are robust. In addition, "extras" components are tested in terms of users' preferences.
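Discrete choice analysis of the kind described above is commonly built on the multinomial logit model, in which each alternative's choice probability is a softmax over its estimated utility. A minimal sketch, with made-up bundle names and utility values (not the study's actual estimates):

```python
from math import exp

def choice_probabilities(utilities):
    """Multinomial logit: P(i) = exp(V_i) / sum_j exp(V_j)."""
    m = max(utilities.values())  # subtract the max for numerical stability
    expu = {alt: exp(v - m) for alt, v in utilities.items()}
    total = sum(expu.values())
    return {alt: e / total for alt, e in expu.items()}

# Hypothetical utilities for three triple-play bundles:
bundles = {"basic": 1.0, "standard": 1.5, "premium": 0.5}
probs = choice_probabilities(bundles)
print(max(probs, key=probs.get))  # the highest-utility bundle wins
```

In a real study, each utility V_i would itself be a linear function of the bundle's attributes (price, television package, extras), with coefficients estimated from the survey responses; the relative size of the price coefficient is what supports the claim that price is the most important attribute.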