805 results for Translation technologies
An electronic lifeline: Information and communication technologies in a teacher education internship
Abstract:
Four emerging high-energy non-thermal technologies may replace or augment heating for producing sterile low-acid food products. High pressure, high-voltage pulsed electric field, high-energy ultrasound and high-intensity pulsed light are all capable of reducing bacterial spore counts under certain conditions. However, only non-continuous high pressure treatments, at temperatures higher than ambient, are currently capable of completely inactivating spores and producing sterile food products. The first three technologies also reduce the resistance of spores to inactivation by heat.
Abstract:
The paper explores the development of learning behaviours in a virtual management course and the factors that influenced this development. Data suggest that most teams experienced three kinds of learning behaviours – social, operational and content learning. We propose that the need for technical expertise and team participation will vary across these stages of learning. Addressing the characteristics of these stages, we comment on the emergence of a 'completion phase' of team development. We argue that the extent to which teams demonstrate the different learning stages has a significant impact on the development of on-line learning behaviours. In discussing these results, we suggest why different teams develop distinct learning behaviours, with a corresponding emphasis on teaching as a moderating and coordinating role, despite current virtual-team pedagogical expectations.
Abstract:
In an overview of some of the central issues concerning the impact and effects of new technology in adolescence, this article questions the reality of the net generation before considering the interplay of new and old technologies, the internet as both communication and lifestyle resource, and newer technologies like text messaging and webcams.
Abstract:
Sensitivity of output of a linear operator to its input can be quantified in various ways. In Control Theory, the input is usually interpreted as disturbance and the output is to be minimized in some sense. In stochastic worst-case design settings, the disturbance is considered random with imprecisely known probability distribution. The prior set of probability measures can be chosen so as to quantify how far the disturbance deviates from the white-noise hypothesis of Linear Quadratic Gaussian control. Such deviation can be measured by the minimal Kullback-Leibler informational divergence from the Gaussian distributions with zero mean and scalar covariance matrices. The resulting anisotropy functional is defined for finite power random vectors. Originally, anisotropy was introduced for directionally generic random vectors as the relative entropy of the normalized vector with respect to the uniform distribution on the unit sphere. The associated a-anisotropic norm of a matrix is then its maximum root mean square or average energy gain with respect to finite power or directionally generic inputs whose anisotropy is bounded above by a≥0. We give a systematic comparison of the anisotropy functionals and the associated norms. These are considered for unboundedly growing fragments of homogeneous Gaussian random fields on multidimensional integer lattice to yield mean anisotropy. Correspondingly, the anisotropic norms of finite matrices are extended to bounded linear translation invariant operators over such fields.
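For orientation, the anisotropy functional and the a-anisotropic norm referred to above are commonly written as follows. This is a sketch using the standard conventions of anisotropy-based control theory; the symbols m, f, h and λ are introduced here for illustration and are not taken from the abstract:

\[
\mathbf{A}(w) \;=\; \min_{\lambda>0} D\big(f \,\|\, p_{m,\lambda}\big)
\;=\; \frac{m}{2}\,\ln\!\left(\frac{2\pi e}{m}\,\mathbf{E}|w|^{2}\right) - h(w),
\qquad
\|F\|_{a} \;=\; \sup\left\{ \frac{\|Fw\|}{\|w\|} \;:\; \mathbf{A}(w) \le a \right\},
\]

where w is an m-dimensional finite-power random vector with probability density f, p_{m,λ} is the Gaussian density with zero mean and covariance λI_m, h(·) is differential entropy, and ‖·‖ denotes the root-mean-square value (E|·|²)^{1/2}.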
Abstract:
One of the most important advantages of database systems is that the underlying mathematics is rich enough to specify very complex operations with a small number of statements in the database language. This research covers an aspect of biological informatics, the marriage of information technology and biology, involving the study of real-world phenomena using virtual plants derived from L-system simulation. L-systems were introduced by Aristid Lindenmayer as a mathematical model of multicellular organisms. Little consideration has been given to the problem of persistent storage for these simulations, and current procedures for querying data generated by L-systems for scientific experiments, simulations and measurements are also inadequate. To address these problems, this paper presents a generic data-modeling process (L-DBM) between L-systems and database systems. It shows how L-system productions can be generically and automatically represented in database schemas and how a database can be populated from the L-system strings. It further describes the idea of pre-computing recursive structures in the data into derived attributes using compiler generation. A method is supplied that allows a correspondence between biologists' terms and compiler-generated terms in a biologist's computing environment. Once the L-DBM is given a specific set of L-system productions and their declarations, it can generate the corresponding schema, covering both simple terminology correspondences and complex recursive-structure attributes and relationships.
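As a rough illustration of the kind of mapping the abstract describes, the sketch below expands a toy L-system and stores its productions and derivation strings in a relational database. The two-table layout, the algae grammar A→AB, B→A and the helper derive() are illustrative assumptions only; they are not the L-DBM schema or terminology from the paper.

# Hypothetical sketch: expand a toy L-system and persist its productions and
# derivation strings in a relational store (illustrative schema, not L-DBM's).
import sqlite3

productions = {"A": "AB", "B": "A"}  # assumed example grammar (Lindenmayer's algae system)
axiom = "A"

def derive(axiom, productions, steps):
    # Apply the productions in parallel to every symbol, `steps` times,
    # yielding the string produced after each derivation step.
    s = axiom
    for _ in range(steps):
        s = "".join(productions.get(ch, ch) for ch in s)
        yield s

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE production (predecessor TEXT PRIMARY KEY, successor TEXT);
    CREATE TABLE derivation (step INTEGER PRIMARY KEY, string TEXT);
""")
# Populate the schema from the grammar and from the generated L-system strings.
conn.executemany("INSERT INTO production VALUES (?, ?)", productions.items())
conn.executemany("INSERT INTO derivation VALUES (?, ?)",
                 enumerate(derive(axiom, productions, 5), start=1))

# Query the stored strings, e.g. retrieve the deepest derivation so far.
print(conn.execute("SELECT step, string FROM derivation ORDER BY step DESC LIMIT 1").fetchone())

A mapping in the spirit of L-DBM would go further than storing flat strings: the abstract indicates that recursive structure in the data is pre-computed into derived attributes, so branch or parent-child relationships would appear as additional schema elements rather than being re-parsed at query time.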
Abstract:
A self-modulating mechanism by the hepatitis C virus (HCV) core protein has been suggested to influence the level of HCV replication, but current data on this subject are contradictory. We examined the effect of wild-type and mutated core protein on HCV IRES- and cap-dependent translation. The wild-type core protein was shown to inhibit both IRES- and cap-dependent translation in an in vitro system. This effect was duplicated in a dose-dependent manner with a synthetic peptide representing amino acids 1-20 of the HCV core protein. This peptide was able to bind to the HCV IRES as shown by a mobility shift assay. In contrast, a peptide derived from the hepatitis B virus (HBV) core protein that contained a similar proportion of basic residues was unable to inhibit translation or bind the HCV IRES. A recombinant vaccinia-HCV core virus was used to examine the effect of the HCV core protein on HCV IRES-dependent translation in cells and this was compared with the effects of an HBV core-recombinant vaccinia virus. In CV-1 and HuH7 cells, the HCV core protein inhibited translation directed by the IRES elements of HCV, encephalomyocarditis virus and classical swine fever virus as well as cap-dependent translation, whereas in HepG2 cells, only HCV IRES-dependent translation was affected. Thus, the ability of the HCV core protein to selectively inhibit HCV IRES-dependent translation is cell-specific. Truncation of the N-terminal amino acids 1-20 of the HCV core protein, expressed in cells from a novel recombinant vaccinia virus, abrogated the inhibitory phenotype of the core protein in vivo, consistent with the above in vitro data.
Abstract:
Novel nonthermal processes, such as high hydrostatic pressure (HHP), pulsed electric fields (PEFs), ionizing radiation and ultrasonication, are able to inactivate microorganisms at ambient or sublethal temperatures. Many of these processes require very high treatment intensities, however, to achieve adequate microbial destruction in low-acid foods. Combining nonthermal processes with conventional preservation methods enhances their antimicrobial effect so that lower process intensities can be used. Combining two or more nonthermal processes can also enhance microbial inactivation and allow the use of lower individual treatment intensities. For conventional preservation treatments, optimal microbial control is achieved through the hurdle concept, with synergistic effects resulting from different components of the microbial cell being targeted simultaneously. The mechanisms of inactivation by nonthermal processes are still unclear; thus, the bases of synergistic combinations remain speculative. This paper reviews the literature on the antimicrobial efficiencies of nonthermal processes combined with conventional and novel nonthermal technologies. Where possible, the proposed mechanisms of synergy are mentioned.
Abstract:
With the aim of lowering costs and making the requested information available to users without internet access, service companies have adopted automated interaction technologies in their call centers, which may or may not meet users' expectations. Drawing on different areas of knowledge (man-machine interaction, consumer behavior and IT use), 13 propositions are raised and a study is carried out in three parts: a focus group, a field study with users and interviews with experts. Eleven automated-service characteristics that help explain user satisfaction are listed, a preferences model is proposed, and evidence for or against each of the 13 propositions is presented. Using balanced scorecard concepts, a managerial assessment model is proposed for the use of automated call center technology. In future work, the propositions may become verifiable hypotheses through conclusive empirical research.