989 results for Processing Technologies


Relevance: 30.00%

Abstract:

The efficiency of combining high-pressure processing (HPP) and active packaging technologies to control Listeria monocytogenes growth during the shelf life of artificially inoculated cooked ham was assessed. Three lots of cooked ham were prepared: control, packaged with alginate films, and packaged with antimicrobial alginate films containing enterocins. After packaging, half of the samples were pressurized. Sliced cooked ham stored at 6 °C supported rapid growth of L. monocytogenes. Both antimicrobial packaging and pressurization delayed growth of the pathogen, but at 6 °C the combination of antimicrobial packaging and HPP was necessary to reduce inoculated levels without recovery during 60 days of storage. Extended storage at 6 °C of pressurized, antimicrobially packaged cooked ham brought L. monocytogenes below the detection limit by day 90. Storage at 1 °C, on the other hand, controlled growth of the pathogen until day 39 in non-pressurized ham, while antimicrobial packaging combined with storage at 1 °C exerted a bacteriostatic effect for 60 days. All HPP lots stored at 1 °C showed counts <100 CFU/g at day 60, with similar results when the two technologies were combined. After a cold-chain break, no growth of L. monocytogenes was observed in pressurized ham packaged with antimicrobial films, demonstrating the efficiency of combining the two technologies.

Relevance: 30.00%

Abstract:

The fast development of new technologies such as digital medical imaging has brought an expansion of brain functional studies. A key methodological issue in these studies is the comparison of neuronal activation across individuals, and in this context the great variability of brain size and shape is a major problem. Current methods allow inter-individual comparison by normalising subjects' brains to a standard brain; the most widely used standards are the proportional grid of Talairach and Tournoux and the Montreal Neurological Institute (MNI) brain (SPM99). These methods, however, are not precise enough to superpose the more variable portions of the cerebral cortex (e.g., the neocortex and the perisylvian zone) or brain regions that are highly asymmetric between the two hemispheres (e.g., the planum temporale). The aim of this thesis is to evaluate a new image processing technique based on non-rigid, model-based registration using anatomical landmarks. Contrary to intensity-based registration, model-based registration uses spatial rather than intensity information to fit one image to another. Identifiable anatomical features (point landmarks) are extracted in both the deforming and the target image, and their correspondence determines the appropriate deformation in 3D. Six control points are used as landmarks, situated bilaterally: one on Heschl's gyrus, one on the motor hand area and one on the sylvian fissure. The evaluation of this model-based approach was performed on the MRI and fMRI images of nine of the eighteen subjects who participated in a previous study by Maeder et al. On the anatomical (MRI) images, the control points of the deforming brain moved to the locations of the reference-brain control points, and the distance between the deforming and the reference brain was smaller after registration than before. Registration of the functional (fMRI) images showed no significant variation: the small number of landmarks, six, is evidently not sufficient to produce significant modifications of the fMRI statistical maps. This thesis opens the way to a new computational technique for cortical registration, whose main direction will be to improve the registration algorithm by using not a single landmark point but many points representing a particular sulcus.
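The abstract does not specify the deformation model used to interpolate between the six correspondences. A common choice for turning a small set of landmark pairs into a smooth 3D deformation is thin-plate-spline interpolation; the Python sketch below illustrates that idea with invented landmark coordinates, and is not the thesis's actual algorithm or data.

```python
# A minimal sketch of landmark-based non-rigid registration in 3D.
# Landmark coordinates here are hypothetical placeholders.
import numpy as np
from scipy.interpolate import RBFInterpolator

# Six hypothetical control points in the deforming (source) brain and
# their corresponding positions in the reference (target) brain, in mm.
source = np.array([
    [-52.0, -19.0,  7.0],   # left Heschl's gyrus
    [ 54.0, -17.0,  8.0],   # right Heschl's gyrus
    [-38.0, -22.0, 52.0],   # left motor hand area
    [ 40.0, -20.0, 54.0],   # right motor hand area
    [-48.0,  -8.0, 14.0],   # left sylvian fissure
    [ 50.0,  -6.0, 15.0],   # right sylvian fissure
])
target = source + np.random.default_rng(0).normal(0, 2.0, source.shape)

# Fit a smooth displacement field from the six correspondences. A
# thin-plate-spline kernel interpolates the landmarks exactly and
# extrapolates a smooth deformation elsewhere.
warp = RBFInterpolator(source, target - source, kernel="thin_plate_spline")

# Apply the deformation to arbitrary voxel coordinates.
voxels = np.array([[0.0, 0.0, 0.0], [-50.0, -18.0, 8.0]])
deformed = voxels + warp(voxels)
print(deformed)
```

With only six correspondences the warp is very smooth away from the landmarks, which is consistent with the abstract's observation that six points were too few to change the functional statistical maps.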

Relevance: 30.00%

Abstract:

The use of information and communication technologies in the health and social service sectors, and the development of multi-centred and international research networks, present many benefits for society: for example, better follow-up of an individual's state of health, better quality of care, better control of expenses, and better communication between healthcare professionals. However, this approach raises issues relating to the protection of privacy, more specifically to the processing of individual health information.

Relevance: 30.00%

Abstract:

This study seeks to better understand, in the context of Beninese universities, whether a qualitative link can exist between ICT and academic performance, so that ICT can be harnessed to significantly improve learners' poor results, particularly at the undergraduate level. The study is especially relevant in a context where ICT is increasingly entering university teaching and where students use ICT in their own practice more than their instructors do in theirs. The frame of reference adopted is structured around the concepts of ICT-assisted learning, motivation in education and academic performance. To meet the research objective, a mixed quantitative and qualitative approach was used in a descriptive/explanatory study: a questionnaire survey of 156 students and 15 teachers, and interviews with 11 students and 6 teachers. The main results are presented as articles dealing, respectively, with the impact of ICT on motivation and achievement, with the ICT uses most frequently observed among learners, and with the place of ICT in the teaching practice of the law faculty of the Université d'Abomey-Calavi. More precisely, the results show that most participants have a generally positive perception of the motivational potential of ICT for learning. However, on a maximum rating of 7 ("corresponds very strongly"), respondents' perception of the positive impact of ICT use on academic performance averaged around 4 ("corresponds fairly well"), hence a generally mixed perception of the link between ICT-assisted learning and achievement. Crossing the quantitative data with the qualitative analysis nevertheless yields, on this point, a pronounced positive perception of the relationship between ICT and performance. The results also show that the most frequent ICT uses among these learners are email (first), followed by information searching and word processing, with an average frequency of "once a week"; these findings do not really attest to an academic use of ICT. Among the teachers, the results likewise showed no real application of ICT in teaching situations: their use of ICT remains personal rather than genuinely pedagogical. The logical consequence of these results is that no direct qualitative link yet exists between ICT and academic performance in the Beninese university context.

Relevance: 30.00%

Abstract:

This thesis examines the links between sleep, episodic memory and dreams. In a first study, virtual reality (VR) technology was combined with a paradigm of REM-sleep deprivation and dream collection to test the hypothesis that REM sleep and dreaming are involved in the consolidation of episodic memory. REM sleep was associated with recall of the spatial aspects of the emotional elements of the VR task; likewise, incorporation of the VR task into dreams was associated with recall of the spatial aspects of the task. In addition, recall of the factual and perceptual aspects of the episodic memories formed during the VR task was associated with slow-wave sleep. A second study examined the hypothesis that one possible function of dreaming is to create new associations between elements of various episodic memories. One participant was awakened 43 times at sleep onset to provide detailed dream reports. The results suggest that a single dream can bring together, within one spatiotemporal context, elements belonging to multiple episodic memories. A third study addressed cognition during REM sleep, in particular how the bizarre aspects of dreams, formed through novel combinations of episodic-memory elements, are perceived by the dreamer. The results demonstrate a dissociation in cognitive abilities during REM sleep, characterised by a selective deficit in the appreciation of bizarre dream elements. The results of the four studies suggest that slow-wave sleep and REM sleep are differently involved in episodic-memory processing: slow-wave sleep may be involved in consolidating episodic memory, while REM sleep, through dreaming, may serve to introduce flexibility into this memory system.

Relevance: 30.00%

Abstract:

Genetic studies, such as linkage and association studies, have greatly advanced our understanding of the aetiology of many diseases affecting human populations. Yet even though some ten thousand genetic studies have been carried out on hundreds of diseases and other traits, a large part of their heritability remains unexplained. Over the past decade, several breakthroughs in genomics have been achieved. For example, high-density comparative genomic hybridisation microarrays demonstrated the genome-wide existence of copy-number variations and polymorphisms, which can now be detected with DNA microarrays or high-throughput sequencing. Moreover, recent studies using high-throughput sequencing have shown that most of the variants present in an individual's exome are rare or even private to that individual. This made possible the design of a new DNA microarray that can genotype, quickly and at low cost, several thousand rare variants in a large set of individuals at once. In this context, the general objective of this thesis is the development of new methodologies and new high-performance bioinformatics tools for detecting, with high quality criteria, copy-number variations and rare nucleotide variants in genetic studies. In the long term, these advances will help explain a larger share of the missing heritability of complex traits, pushing forward our knowledge of their aetiology. An algorithm for partitioning copy-number polymorphisms was designed, making it possible to use these structural variants in genetic linkage studies on family data. Next, an exploratory study, conducted in collaboration with the Wellcome Trust Centre for Human Genetics at the University of Oxford, characterised the problems associated with genetic studies using rare copy-number variants in unrelated individuals. The performance of genotype-calling algorithms was then compared on a new DNA microarray containing mostly rare markers. Finally, a bioinformatics tool for efficient and fast filtering of genetic data was implemented; it produces higher-quality data with better reproducibility of results while reducing the chances of obtaining a false association.
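The abstract does not detail the filtering criteria implemented in the thesis's tool. As a purely illustrative sketch of what marker-level filtering of genetic data can look like, the following Python fragment drops variants with a low call rate or a low minor-allele frequency; thresholds, data and function names are invented, and a study focused on rare variants would tune or replace the frequency criterion.

```python
# Illustrative marker-level QC filter: drop variants with a low call
# rate or a low minor-allele frequency. Thresholds and data are
# hypothetical, not those of the thesis's tool.
import numpy as np

def qc_filter(genotypes, min_call_rate=0.95, min_maf=0.01):
    """genotypes: (n_samples, n_markers) array of 0/1/2 allele counts,
    with np.nan for missing calls. Returns a boolean mask of kept markers."""
    n_samples = genotypes.shape[0]
    called = ~np.isnan(genotypes)
    call_rate = called.sum(axis=0) / n_samples
    # Allele frequency computed over called genotypes only.
    freq = np.nansum(genotypes, axis=0) / (2 * called.sum(axis=0))
    maf = np.minimum(freq, 1 - freq)
    return (call_rate >= min_call_rate) & (maf >= min_maf)

rng = np.random.default_rng(1)
g = rng.choice([0.0, 1.0, 2.0, np.nan], size=(100, 5),
               p=[0.49, 0.3, 0.2, 0.01])
print(qc_filter(g))  # boolean mask, one entry per marker
```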

Relevance: 30.00%

Abstract:

Information and communication technologies (ICTs) are the tools that underpin the emerging "Knowledge Society". Exchange of information or knowledge between people and through networks of people has always taken place, but ICT has radically changed the magnitude of this exchange, making factors such as timeliness of information and information dissemination patterns more important than ever. Since information and knowledge are so vital for all-round human development, the libraries and institutions that manage these resources are invaluable, and Library and Information Centres have a key role in the acquisition, processing, preservation and dissemination of information and knowledge. In the modern context, libraries provide services based on different types of documents (manuscripts, printed, digital, etc.), while the acquisition, access, processing and servicing of these resources have become more complicated than ever before. ICT has been instrumental in extending libraries beyond the physical walls of a building and in helping users navigate and analyse tremendous amounts of knowledge with a variety of digital tools. Modern libraries are thus increasingly being redefined as places offering unrestricted access to information in many formats and from many sources. The research was conducted in the university libraries of Kerala State, India. It found that even though information resources are flooding in worldwide and several technologies have emerged to manage the situation and provide effective services, most of the university libraries in Kerala were unable to exploit these technologies to the full. Though the libraries have automated many of their functions, a wide gap remains between the possible services and the services provided. There are many good examples worldwide of the application of ICTs in libraries to maximise services, and many such libraries have adopted the principles of re-engineering and re-defining as a management strategy. This study therefore examined how effectively modern ICTs have been adopted in these libraries to maximise the efficiency of operations and services, and whether the principles of re-engineering and re-defining can be applied to that end. Data were collected from library users (students as well as faculty), library professionals and university librarians using structured questionnaires, supplemented by observation of the working of the libraries, discussions and interviews with the different types of users and staff, and a review of the literature. Personal observations were made of the organisational set-up, management practices, functions, facilities, resources, and utilisation of information resources and facilities by users of the university libraries in Kerala. Statistical techniques such as percentage, mean, weighted mean, standard deviation, correlation and trend analysis were used to analyse the data. All the libraries could exploit only a very few of the possibilities of modern ICTs, and hence could not achieve effective Universal Bibliographic Control or the desired efficiency and effectiveness in services, leaving both users and professionals dissatisfied.
Functional effectiveness in the acquisition, access and processing of information resources in various formats; development and maintenance of OPACs and WebOPACs; digital document delivery to remote users; web-based clearing of library counter services and resources; development of full-text databases, digital libraries and institutional repositories; consortium-based operations for e-journals and databases; user education and information literacy; professional development with stress on ICTs; network administration and website maintenance; and marketing of information are the major areas needing special attention to improve the situation. Finance, the level of ICT knowledge among library staff, professional dynamism and leadership, the vision and support of administrators and policy makers, and the prevailing educational set-up and social environment in the state are among the major hurdles to reaping the maximum possibilities of ICTs in the university libraries of Kerala. The principles of Business Process Re-engineering were found suitable for restructuring and redefining the operations and service systems of the libraries. Most of the conventional departments or divisions in the university libraries were functioning as watertight compartments, and their existing management system was too rigid to adopt the principles of change management; a thorough restructuring of the divisions was therefore indicated. Consortium-based activities and the pooling and sharing of information resources were advocated to meet the varied needs of users on the main and off campuses of the universities, in affiliated colleges and at remote stations. The study proposed a uniform staff policy, similar to that prevailing in CSIR, DRDO, ISRO, etc., not only for the university libraries in Kerala but for the entire country. Restructuring of LIS education and the integrated, planned development of school, college, research and public library systems were also justified for reaping the maximum benefits of modern ICTs.

Relevance: 30.00%

Abstract:

A Kalman filter algorithm has been applied to interpret the optical reflectance excursions observed during vacuum deposition of infrared coatings and multilayer thin-film filters. The application has been described in detail elsewhere; this paper reports on-line experience with estimating deposition rate and thickness. The estimation proved sufficiently reliable first to 'navigate' regular manufacture (as controlled by a skilled operator) and subsequently to reproduce that skill without interpretation or intervention while maintaining exemplary product quality. Optical control by means of this Kalman filter application is therefore considered a suitable basis for the automated manufacture of infrared coatings and multilayer thin-film filters.
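The abstract does not give the filter's state-space model. A minimal sketch of the general idea, assuming a two-state constant-rate model (thickness and deposition rate) driven by noisy thickness estimates derived from the reflectance signal, might look as follows in Python; all model and noise parameters are hypothetical.

```python
# Minimal constant-rate Kalman filter tracking film thickness and
# deposition rate from noisy thickness measurements. Parameters are
# hypothetical, not the paper's.
import numpy as np

dt = 1.0                                   # time step (s)
F = np.array([[1.0, dt], [0.0, 1.0]])      # transition: x = [thickness, rate]
H = np.array([[1.0, 0.0]])                 # we measure thickness only
Q = np.diag([1e-4, 1e-6])                  # process noise covariance
R = np.array([[0.5]])                      # measurement noise covariance

x = np.array([[0.0], [0.0]])               # initial state estimate
P = np.eye(2) * 10.0                       # initial state covariance

def kalman_step(x, P, z):
    # Predict the state forward one step.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with measurement z (thickness inferred from reflectance).
    y = z - H @ x                          # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

rng = np.random.default_rng(0)
true_rate = 0.8                            # nm/s, hypothetical
for t in range(1, 61):
    z = np.array([[true_rate * t * dt + rng.normal(0, 0.7)]])
    x, P = kalman_step(x, P, z)
print(f"estimated rate: {x[1, 0]:.3f} nm/s")  # approaches 0.8
```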

Relevance: 30.00%

Abstract:

The fast increase in the size and number of databases demands data mining approaches that are scalable to large amounts of data. This has led to the exploration of parallel computing technologies in order to perform data mining tasks concurrently using several processors. Parallelization seems to be a natural and cost-effective way to scale up data mining technologies. One of the most important of these data mining technologies is the classification of newly recorded data. This paper surveys advances in parallelization in the field of classification rule induction.
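No algorithms are reproduced in this summary, but the flavour of data-parallel classification rule induction is easy to sketch: candidate rules are evaluated concurrently over partitions of the training data and their coverage counts merged afterwards. The rule representation, data and function names below are invented for illustration.

```python
# Illustrative data-parallel rule evaluation: each worker counts how
# well candidate rules cover its partition of the data; counts are
# then merged. Rule format and data are invented for this sketch.
from multiprocessing import Pool
from functools import partial

# A rule is a dict of attribute -> required value, plus a predicted class.
RULES = [
    ({"outlook": "sunny"}, "no"),
    ({"outlook": "overcast"}, "yes"),
]

def evaluate_partition(rules, partition):
    """Return (covered, correct) counts per rule for one data partition."""
    counts = []
    for conditions, predicted in rules:
        covered = [r for r in partition
                   if all(r.get(a) == v for a, v in conditions.items())]
        correct = sum(1 for r in covered if r["class"] == predicted)
        counts.append((len(covered), correct))
    return counts

if __name__ == "__main__":
    data = [
        {"outlook": "sunny", "class": "no"},
        {"outlook": "sunny", "class": "yes"},
        {"outlook": "overcast", "class": "yes"},
        {"outlook": "rain", "class": "yes"},
    ]
    partitions = [data[:2], data[2:]]          # split across workers
    with Pool(2) as pool:
        results = pool.map(partial(evaluate_partition, RULES), partitions)
    # Merge per-partition counts rule by rule.
    merged = [tuple(map(sum, zip(*rule_counts)))
              for rule_counts in zip(*results)]
    print(merged)   # [(covered, correct), ...] per rule
```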

Relevance: 30.00%

Abstract:

It is well known that atmospheric concentrations of carbon dioxide (CO2) and other greenhouse gases have increased markedly as a result of human activity since the industrial revolution. It is perhaps less appreciated that natural and managed soils are an important source and sink for atmospheric CO2 and that, primarily as a result of the activities of soil microorganisms, there is a soil-derived respiratory flux of CO2 to the atmosphere that overshadows by tenfold the annual CO2 flux from fossil fuel emissions. Small changes in the soil carbon cycle could therefore have large impacts on atmospheric CO2 concentrations. Here we discuss the role of soil microbes in the global carbon cycle and review the main methods that have been used to identify the microorganisms responsible for processing plant photosynthetic carbon inputs to soil. We discuss whether application of these techniques can provide the information required to underpin the management of agro-ecosystems for carbon sequestration and increased agricultural sustainability. We conclude that, although crucial in enabling the identification of plant-derived-carbon-utilising microbes, current technologies lack the high-throughput ability to quantitatively apportion carbon use among phylogenetic groups, or to determine its use efficiency and destination within the microbial metabolome. It is this information that is required to inform rational manipulation of the plant–soil system to favour the organisms or physiologies most important for promoting soil carbon storage in agricultural soil.

Relevance: 30.00%

Abstract:

This article presents a detailed study of the application of different additive manufacturing technologies (sintering, three-dimensional printing, extrusion and stereolithography) to the design process of a complex-geometry model and its moving parts. The fabrication sequence was evaluated in terms of pre-processing conditions (model generation and conversion to STL/SLI formats), generation strategy, and post-processing operations on the physical model. Dimensional verification of the obtained models was undertaken by structured-light projection (optical scanning), a relatively new technology of central importance for metrology and reverse engineering. Manufacturing time and production costs were also studied, which allowed the definition of a more comprehensive evaluation matrix for additive technologies.
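The paper's evaluation matrix itself is not reproduced in the abstract. As a hypothetical illustration of how such a matrix can combine criteria into an overall score, consider the following sketch; the criteria, weights and scores are invented placeholders, not the paper's measurements.

```python
# Illustrative weighted evaluation matrix for additive manufacturing
# processes. Criteria, weights and scores are invented placeholders.
import numpy as np

criteria = ["dimensional accuracy", "build time", "cost", "surface finish"]
weights = np.array([0.4, 0.2, 0.2, 0.2])        # must sum to 1

processes = ["sintering", "3D printing", "extrusion", "stereolithography"]
# Normalised scores in [0, 1]: one row per process, one column per criterion.
scores = np.array([
    [0.8, 0.6, 0.5, 0.7],
    [0.6, 0.8, 0.8, 0.5],
    [0.5, 0.7, 0.9, 0.4],
    [0.9, 0.5, 0.4, 0.9],
])

# Overall score is the weighted sum across criteria.
overall = scores @ weights
for name, s in sorted(zip(processes, overall), key=lambda p: -p[1]):
    print(f"{name:18s} {s:.2f}")
```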

Relevance: 30.00%

Abstract:

Patients with congenital malformations, traumatic or pathological mutilation, and maxillofacial developmental disorders can be rehabilitated aesthetically and emotionally through the production and use of facial prostheses. The aim of this study was to review the literature on the retention and processing methods of facial prostheses and to discuss their characteristics. A literature review of the Medline (PubMed) database was performed using the keywords maxillofacial prosthesis, silicone, resin, pigment, cosmetic and prosthetic nose, covering articles published from 1956 to 2010. Several methods of retention, from adhesives to the placement of implants, and different processing methods, such as laser, CAD/CAM and rapid prototyping technologies, have been reported. Each procedure has advantages and disadvantages, and none can be classified as superior to the others.

Relevance: 30.00%

Abstract:

There is growing interest in technologies that favour the use of co-products in animal nutrition. The effect of adding two enzyme mixtures to dog diets formulated with wheat bran (WB) was evaluated. Two foods with similar compositions were formulated: a negative control (NC; without WB) and a test diet (25% WB). The test diet was divided into four treatments: without enzyme (positive control); enzyme mixture 1 (ENZ1; β-glucanase, xylanase, cellulase, glucoamylase and phytase added before extrusion); enzyme mixture 2 (ENZ2; ENZ1 plus α-amylase, added before extrusion); and enzyme mixture 2 added after extrusion (ENZ2ex). ENZ1 and ENZ2 were used to evaluate the enzyme effect in the extruder pre-conditioner (as a processing additive), and ENZ2ex to evaluate the effect of enzyme supplementation for the animal. Digestibility was measured through total collection of faeces and urine. The experiment followed a randomized block design with five treatments (diets) and six dogs per diet, totalling 30 dogs (7.0 ± 1.2 years old and 11.0 ± 2.2 kg of body weight). Data were submitted to analysis of variance, and means were compared by Tukey's test and orthogonal contrasts (p < 0.05). Reducing sugars decreased markedly after extrusion, suggesting the formation of carbohydrate complexes. The apparent total tract digestibility (ATTD) of dry matter, organic matter, crude protein, acid-hydrolysed fat and energy was higher in NC than in the WB diets (p < 0.001), with no effect of enzyme addition. The WB diets resulted in higher faecal production and short-chain fatty acid (SCFA) concentration and in reduced pH and ammonia concentration (p < 0.01), again with no effect of enzyme addition. Enzyme addition did not improve the digestibility of a diet high in non-starch polysaccharides; however, only ATTD was measured, and nutrient fermentation in the large intestine may have interfered with the results obtained. WB modified fermentation product formation in the colon of dogs. © 2013 Blackwell Verlag GmbH.
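The statistical pipeline described (analysis of variance followed by Tukey's test across the five diets) can be sketched with standard Python tooling; the digestibility values below are simulated stand-ins, not the study's data.

```python
# Sketch of the reported analysis: one-way ANOVA across diets followed
# by Tukey's HSD for pairwise comparison of means. Values are simulated.
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(42)
diets = ["NC", "PC", "ENZ1", "ENZ2", "ENZ2ex"]
# Hypothetical mean ATTD (%) per diet, six dogs per diet as in the design.
true_means = {"NC": 86.0, "PC": 81.0, "ENZ1": 81.5, "ENZ2": 81.3, "ENZ2ex": 81.1}

values, groups = [], []
for d in diets:
    values.extend(rng.normal(true_means[d], 1.2, size=6))
    groups.extend([d] * 6)
values, groups = np.array(values), np.array(groups)

# Omnibus test, then post-hoc pairwise comparisons at alpha = 0.05.
print(f_oneway(*[values[groups == d] for d in diets]))
print(pairwise_tukeyhsd(values, groups, alpha=0.05))
```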

Relevance: 30.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 30.00%

Abstract:

The web services (WS) technology provides a comprehensive solution for representing, discovering, and invoking services in a wide variety of environments, including Service Oriented Architectures (SOA) and grid computing systems. At the core of WS technology lie a number of XML-based standards, such as the Simple Object Access Protocol (SOAP), that have successfully ensured WS extensibility, transparency, and interoperability. Nonetheless, there is an increasing demand to enhance WS performance, which is severely impaired by XML's verbosity. SOAP communications produce considerable network traffic, making them unfit for distributed, loosely coupled, and heterogeneous computing environments such as the open Internet. They also introduce higher latency and processing delays than other technologies, like Java RMI and CORBA. WS research has recently focused on SOAP performance enhancement. Many approaches build on the observation that SOAP message exchange usually involves highly similar messages: those created by the same implementation usually have the same structure, and those sent from a server to multiple clients tend to show similarities in structure and content. Similarity evaluation and differential encoding have thus emerged as SOAP performance enhancement techniques. The main idea is to identify the common parts of SOAP messages, to be processed only once, avoiding a large amount of overhead. Other approaches investigate nontraditional processor architectures, including micro- and macro-level parallel processing solutions, so as to further increase the processing rates of SOAP/XML software toolkits. This survey paper provides a concise yet comprehensive review of the research efforts aimed at SOAP performance enhancement. A unified view of the problem is provided, covering almost every phase of SOAP processing: message parsing, serialization, deserialization, compression, multicasting, security evaluation, and data- and instruction-level processing.
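The differential-encoding idea is easy to picture with a toy example: when sender and receiver share a reference message, only a compact delta of a new, similar message needs to cross the wire. The Python sketch below uses a line-level diff and invented messages; real SOAP systems use more refined, XML-aware differencing.

```python
# Toy illustration of differential encoding for similar SOAP messages:
# transmit a compact delta against a shared reference message instead
# of the full document. Messages are invented; production systems use
# XML-aware diffs rather than line diffs.
import difflib

reference = (
    "<soap:Envelope><soap:Body>\n"
    "<getQuote><symbol>ACME</symbol><volume>100</volume></getQuote>\n"
    "</soap:Body></soap:Envelope>\n"
).splitlines(keepends=True)

message = (
    "<soap:Envelope><soap:Body>\n"
    "<getQuote><symbol>ACME</symbol><volume>250</volume></getQuote>\n"
    "</soap:Body></soap:Envelope>\n"
).splitlines(keepends=True)

# Encode only the changed lines against the shared reference (n=0
# suppresses context lines). The receiver applies the delta to its
# copy of the reference to recover the full message.
delta = list(difflib.unified_diff(reference, message, n=0))
print("".join(delta))
print("delta bytes:", sum(len(l) for l in delta),
      "/ full message bytes:", sum(len(l) for l in message))
```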