859 results for Business enterprises -- Electronic data processing -- Study and teaching (Higher) -- Chile
Abstract:
This study aimed to investigate the phenomenology of obsessive compulsive disorder (OCD), addressing specific questions about the nature of obsessions and compulsions, and to contribute to the World Health Organization's (WHO) revision of OCD diagnostic guidelines. Data from 1001 patients from the Brazilian Research Consortium on Obsessive Compulsive Spectrum Disorders were used. Patients were evaluated by trained clinicians using validated instruments, including the Dimensional Yale Brown Obsessive Compulsive Scale, the University of Sao Paulo Sensory Phenomena Scale, and the Brown Assessment of Beliefs Scale. The aims were to compare the types of sensory phenomena (SP, subjective experiences that precede or accompany compulsions) in OCD patients with and without tic disorders and to determine the frequency of mental compulsions, the co-occurrence of obsessions and compulsions, and the range of insight. SP were common in the whole sample, but patients with tic disorders were more likely to have physical sensations and urges only. Mental compulsions occurred in the majority of OCD patients. It was extremely rare for OCD patients to have obsessions without compulsions. A wide range of insight into OCD beliefs was observed, with a small subset presenting no insight. The data generated from this large sample will help practicing clinicians appreciate the full range of OCD symptoms and confirm, as prior studies in smaller samples have found, the degree to which insight varies. These findings also support specific revisions to the WHO's diagnostic guidelines for OCD, such as describing sensory phenomena, mental compulsions and level of insight, so that worldwide recognition of this disabling disorder is increased. (C) 2014 Elsevier Ltd. All rights reserved.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
In 1997, the National Curriculum Parameters (BRASIL, 1997a) proposed the incorporation of the Transversal Themes into all knowledge areas. In School Physical Education specifically, a teacher working with transversality must build bridges between body culture practices and these themes. Promoting this relationship is not a simple task, and many problems hinder the advancement of this proposal. Teachers must therefore formulate didactic teaching strategies and reflect on the development of their practice. Another challenge is to take the students' knowledge and experience as the starting point of pedagogical activity, listening to them during the process of knowledge construction. The aim of this research is thus to investigate the conceptions of 8th- and 9th-grade primary school students about Cultural Plurality. Based on the information gathered from the students, we then present subsidies for a textbook, for teachers and for students, with Physical Education activity possibilities related to this Transversal Theme. With this aim in mind, semi-structured interviews were used as the data collection instrument. The interviews were conducted with 30 students in the 8th and 9th grades of primary school from two state schools in the city of Rio Claro. The procedures followed the ethics committee's requirements. To associate Cultural Plurality with Physical Education, dance and adventure physical activities were selected as contents. The results showed that the students know little about this Transversal Theme, their knowledge being poor relative to the National Curriculum Parameters. In the school environment and in Physical Education classes, there are many cases of prejudice and discrimination, mostly due to lack of ability. Nonetheless, most of the students are able to...(Complete abstract: click electronic access below)
Abstract:
The current work aims to analyze the contributions of PIBID to initial teacher education, from the perspective of undergraduates of the Physics Course at UNESP Guaratinguetá. A brief literature survey on the national situation of teacher education was performed, highlighting the main difficulties and challenges as well as the solutions put in place by the government to address them. A description of the history of PIBID at UNESP was presented, from the institutional project to the subproject developed by the Physics Course of Guaratinguetá. To characterize the development of the subproject, a survey was performed on the activities carried out by PIBID's scholarship holders, from its implementation on campus in 2010 until the end of 2013, in order to map the experiences lived by program participants. These data were obtained from the analysis of reports by the scholarship holders themselves, from video records of the weekly meetings held by the group, from reading electronic messages exchanged in a dedicated e-mail group, and from written evaluations by members of the program. To complete the data collection, eleven undergraduate scholarship holders in the program were interviewed, and the results were classified by topics defined from recurrences in the interviewees' speech. The global analysis of the data was based on theoretical references commonly used in research on teacher education, such as Nóvoa (1992), Mizukami (2005, 2006) and Gatti (2008). The results indicate that the undergraduates see in PIBID a differentiated opportunity for initial teacher education, for adding practical learning experiences with basic education (EB) students within the school context, and especially for creating a space for reflection on their experiences with the support of more experienced teachers committed to the training of all involved.
Abstract:
Autologous fibrin gel is commonly used as a scaffold for filling defects in articular cartilage. This biomaterial can also be used as a sealant to control small hemorrhages and is especially helpful in situations where tissue reparation capacity is limited. In particular, fibrin can act as a scaffold for various cell types because it can accommodate cell migration, differentiation, and proliferation. Despite knowledge of the advantages of this biomaterial and mastery of the techniques required for its application, the durability of several types of sealant at the site of injury remains questionable. Given the importance of such data for evaluating the quality and efficiency of fibrin gel formulations for use as a scaffold, this study analyzed the heterologous fibrin sealant developed from the venom of Crotalus durissus terrificus in ovine experimental models. The fibrin gel developed from the venom of this snake was shown to act as a safe, stable, and durable scaffold for up to seven days, without causing adverse side effects. Fibrin gel produced from the venom of the Crotalus durissus terrificus snake possesses many clinical and surgical uses. It has the potential to be used as a biomaterial to help repair skin lesions or control bleeding, and it may also be used as a scaffold when applied together with various cell types. The intralesional use of the fibrin gel from the venom of this snake may improve surgical and clinical treatments, in addition to being inexpensive and adequately consistent, durable, and stable. In this study, the new heterologous fibrin sealant proved to be a candidate scaffold for cartilage repair.
Abstract:
Pós-graduação em Educação Matemática - IGCE
Abstract:
The web services (WS) technology provides a comprehensive solution for representing, discovering, and invoking services in a wide variety of environments, including Service Oriented Architectures (SOA) and grid computing systems. At the core of WS technology lie a number of XML-based standards, such as the Simple Object Access Protocol (SOAP), that have successfully ensured WS extensibility, transparency, and interoperability. Nonetheless, there is an increasing demand to enhance WS performance, which is severely impaired by XML's verbosity. SOAP communications produce considerable network traffic, making them unfit for distributed, loosely coupled, and heterogeneous computing environments such as the open Internet. Also, they introduce higher latency and processing delays than other technologies, like Java RMI and CORBA. WS research has recently focused on SOAP performance enhancement. Many approaches build on the observation that SOAP message exchange usually involves highly similar messages (those created by the same implementation usually have the same structure, and those sent from a server to multiple clients tend to show similarities in structure and content). Similarity evaluation and differential encoding have thus emerged as SOAP performance enhancement techniques. The main idea is to identify the common parts of SOAP messages, to be processed only once, avoiding a large amount of overhead. Other approaches investigate nontraditional processor architectures, including micro- and macro-level parallel processing solutions, so as to further increase the processing rates of SOAP/XML software toolkits. This survey paper provides a concise yet comprehensive review of the research efforts aimed at SOAP performance enhancement. A unified view of the problem is provided, covering almost every phase of SOAP processing, ranging over message parsing, serialization, deserialization, compression, multicasting, security evaluation, and data/instruction-level processing.
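The differential-encoding idea described above can be illustrated with a small, hypothetical sketch (not taken from the survey): sender and receiver share a cached reference message, and only the copy/insert operations needed to rebuild a new, similar message are transmitted.

```python
from difflib import SequenceMatcher

# Hypothetical reference SOAP message shared by sender and receiver.
REFERENCE = (
    '<soap:Envelope><soap:Body>'
    '<getQuote symbol="ACME" currency="USD"/>'
    '</soap:Body></soap:Envelope>'
)

def encode_delta(reference: str, message: str):
    """Encode `message` as copy/insert operations against `reference`."""
    ops = []
    sm = SequenceMatcher(a=reference, b=message, autojunk=False)
    for tag, i1, i2, j1, j2 in sm.get_opcodes():
        if tag == 'equal':
            ops.append(('copy', i1, i2))            # reuse reference bytes
        elif tag in ('replace', 'insert'):
            ops.append(('insert', message[j1:j2]))  # literal new content
        # 'delete': reference text absent from the message -> emit nothing
    return ops

def decode_delta(reference: str, ops):
    """Rebuild the full message from the reference and the delta."""
    parts = []
    for op in ops:
        parts.append(reference[op[1]:op[2]] if op[0] == 'copy' else op[1])
    return ''.join(parts)

# A similar message: same structure, different attribute values.
new_msg = REFERENCE.replace('ACME', 'INIT').replace('USD', 'EUR')
delta = encode_delta(REFERENCE, new_msg)
assert decode_delta(REFERENCE, delta) == new_msg
```

Only the literal `insert` payloads travel over the wire in full; the common structure, which dominates typical SOAP messages, is referenced rather than retransmitted.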
Abstract:
The objective of this study was to identify the socioeconomic and demographic characteristics of children and adolescents who study and work outside their home. This non-experimental, correlational, cross-sectional study was performed using questionnaires applied to primary education students enrolled in public schools in Ribeirao Preto (Brazil). Two schools were selected through a draw. Data analysis was performed using the Statistical Package for the Social Sciences, version 14.0. Of the 133 students who answered the questionnaire, 36 (27.7%) reported working outside their home; 20.6% were between 11 and 13 years of age, 66.7% were male (p=0.000), and most had started working early to help with the family income (p=0.003). The salary they received helped compose the family income, and it was found that as the family income increased, the need for the youngsters to work decreased. Many factors were found to contribute to these subjects' early start at work, including family size, structure and poverty.
Abstract:
Defining pharmacokinetic parameters and depletion intervals for antimicrobials used in fish represents important guidelines for future regulation by Brazilian agencies of the use of these substances in fish farming. This article presents a depletion study for oxytetracycline (OTC) in tilapias (Oreochromis niloticus) farmed under tropical conditions during the winter season. A high performance liquid chromatography method, with fluorescence detection for the quantitation of OTC in tilapia fillets and medicated feed, was developed and validated. The depletion study with fish was carried out under monitored environmental conditions. OTC was administered in the feed for five consecutive days at daily dosages of 80 mg/kg body weight. Groups of ten fish were slaughtered at 1, 2, 3, 4, 5, 8, 10, 15, 20, and 25 days after medication. After the 8th day posttreatment, OTC concentrations in the tilapia fillets were below the limit of quantitation (13 ng/g) of the method. Linear regression of the mathematical model of data analysis presented a coefficient of 0.9962. The elimination half-life for OTC in tilapia fillet and the withdrawal period were 1.65 and 6 days, respectively, considering a percentile of 99% with 95% confidence and a maximum residue limit of 100 ng/g. Even though the study was carried out in the winter under practical conditions where water temperature varied, the results obtained are similar to others from studies conducted under controlled temperature.
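The calculation behind an elimination half-life of this kind can be sketched as follows; the concentration values below are synthetic illustrations, not the study's raw data. The elimination rate constant k is the (negated) slope of ln(concentration) versus time during the depletion phase, and t1/2 = ln 2 / k.

```python
import numpy as np

# Synthetic depletion-phase data (days, ng/g) built to have t1/2 = 1.65 d;
# not the study's measured values.
t = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
conc = 400.0 * np.exp(-np.log(2) / 1.65 * t)

# Log-linear regression: ln C = ln C0 - k*t
slope, intercept = np.polyfit(t, np.log(conc), 1)
k = -slope
half_life = np.log(2) / k
print(f"elimination half-life: {half_life:.2f} days")  # -> 1.65

# Deterministic time for residues to fall below the MRL (100 ng/g). The
# study's 6-day withdrawal period is longer because it also covers a
# 99th-percentile tolerance limit at 95% confidence, not just the mean curve.
t_mrl = np.log(400.0 / 100.0) / k
```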
Abstract:
A new method for the analysis of scattering data from lamellar bilayer systems is presented. The method employs a form-free description of the cross-section structure of the bilayer, and the fit is performed directly to the scattering data, also introducing a structure factor when required. The cross-section structure (the electron density profile in the case of X-ray scattering) is described by a set of Gaussian functions, and the technique is termed Gaussian deconvolution. The coefficients of the Gaussians are optimized using a constrained least-squares routine that induces smoothness of the electron density profile. The optimization is coupled with the point-of-inflection method for determining the optimal weight of the smoothness. With the new approach, it is possible to optimize simultaneously the form factor, the structure factor and several other parameters in the model. The applicability of this method is demonstrated in a study of a multilamellar system composed of lecithin bilayers, where the form factor and structure factor are obtained simultaneously, and the results provide new insight into this very well known system.
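The core idea, a Gaussian basis fitted by least squares with a smoothness penalty on the coefficients, can be sketched minimally as below. This is an illustration only: the actual method fits the scattering data through the form factor and chooses the penalty weight by the point-of-inflection criterion, whereas here a synthetic real-space profile is fitted directly and the weight is fixed by hand.

```python
import numpy as np

# Grid across the bilayer cross-section (hypothetical units)
z = np.linspace(-3, 3, 200)
true_profile = (np.exp(-(z - 1.8)**2) + np.exp(-(z + 1.8)**2)
                - 0.5 * np.exp(-z**2))          # headgroup peaks, chain trough
rng = np.random.default_rng(0)
data = true_profile + rng.normal(0, 0.05, z.size)  # synthetic noisy profile

# Basis: Gaussians with fixed centers and width; only coefficients are fitted
centers = np.linspace(-3, 3, 15)
width = 0.6
A = np.exp(-(z[:, None] - centers[None, :])**2 / (2 * width**2))

# Smoothness penalty: second differences of neighbouring coefficients
D = np.diff(np.eye(len(centers)), n=2, axis=0)
weight = 0.5   # in the paper this weight comes from the point-of-inflection method

# Regularized least squares: minimize |A c - data|^2 + weight^2 |D c|^2
A_aug = np.vstack([A, weight * D])
b_aug = np.concatenate([data, np.zeros(D.shape[0])])
coeffs, *_ = np.linalg.lstsq(A_aug, b_aug, rcond=None)
fit = A @ coeffs
rms = np.sqrt(np.mean((fit - data)**2))
```

The penalty rows push neighbouring Gaussian coefficients toward a locally linear trend, which is what makes the recovered profile smooth without prescribing its functional form.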
Abstract:
Conjugated polymers have attracted tremendous academic and industrial research interest over the past decades due to the appealing advantages that organic/polymeric materials offer for electronic applications and devices such as organic light emitting diodes (OLED), organic field effect transistors (OFET), organic solar cells (OSC), photodiodes and plastic lasers. The optimization of organic materials for applications in optoelectronic devices requires detailed knowledge of their photophysical properties, for instance the energy levels of excited singlet and triplet states, excited state decay mechanisms and charge carrier mobilities. In the present work a variety of different conjugated (co)polymers, mainly polyspirobifluorene- and polyfluorene-type materials, was investigated using time-resolved photoluminescence spectroscopy in the picosecond to second time domain to study their elementary photophysical properties and to gain deeper insight into structure-property relationships. The experiments cover fluorescence spectroscopy using streak camera techniques as well as time-delayed gated detection techniques for the investigation of delayed fluorescence and phosphorescence. All measurements were performed in the solid state, i.e. on thin polymer films, and in dilute solutions. Starting from the elementary photophysical properties of conjugated polymers, the experiments were extended to studies of singlet and triplet energy transfer processes in polymer blends, polymer-triplet emitter blends and copolymers. The phenomenon of photon energy upconversion was investigated in blue light-emitting polymer matrices doped with metallated porphyrin derivatives, assuming a bimolecular annihilation upconversion mechanism that was experimentally verified on a series of copolymers. This mechanism allows for more efficient photon energy upconversion than previously reported for polyfluorene derivatives.
In addition to the spectroscopic experiments described above, amplified spontaneous emission (ASE) in thin-film polymer waveguides was studied employing a fully-arylated poly(indenofluorene) as the gain medium. It was found that the material exhibits a very low threshold value for amplification of blue light combined with excellent oxidative stability, which makes it interesting as an active material for organic solid state lasers. Apart from spectroscopic experiments, transient photocurrent measurements on conjugated polymers were also performed to elucidate the charge carrier mobility in the solid state, an important material parameter for device applications. A modified time-of-flight (TOF) technique using a charge carrier generation layer made it possible to study hole transport in a series of spirobifluorene copolymers and to unravel the structure-mobility relationship by comparison with the homopolymer. Not only could the charge carrier mobility be determined for the series of polymers, but field- and temperature-dependent measurements, analyzed in the framework of the Gaussian disorder model, showed that the results coincide very well with the predictions of the model. Thus, the validity of the disorder concept for charge carrier transport in amorphous glassy materials could be verified for the investigated series of copolymers.
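The field and temperature dependence predicted by the Gaussian disorder model can be sketched with the standard empirical Bässler expression; the parameter values below are illustrative defaults, not those measured in this work.

```python
import math

def gdm_mobility(E, T, mu0=1e-3, sigma=0.08, Sigma=2.0, C=2.9e-4):
    """Empirical Gaussian disorder model mobility (Bässler form).

    E: electric field (V/cm), T: temperature (K),
    mu0: disorder-free prefactor mobility (cm^2/Vs),
    sigma: energetic disorder (eV), Sigma: positional disorder parameter,
    C: empirical constant ((cm/V)^0.5). All values here are illustrative.
    """
    kB = 8.617e-5                     # Boltzmann constant, eV/K
    s_hat = sigma / (kB * T)          # reduced energetic disorder
    pos = Sigma**2 if Sigma >= 1.5 else 2.25   # standard case distinction
    return (mu0
            * math.exp(-(2 * s_hat / 3)**2)            # temperature dependence
            * math.exp(C * (s_hat**2 - pos) * math.sqrt(E)))  # Poole-Frenkel-like field dependence
```

The first exponential gives the characteristic ln(mu) proportional to 1/T^2 behaviour; the second reproduces the ln(mu) proportional to sqrt(E) field dependence observed in TOF experiments.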
Abstract:
Data deduplication describes a class of approaches that reduce the storage capacity needed to store data or the amount of data that has to be transferred over a network. These approaches detect coarse-grained redundancies within a data set, e.g. a file system, and remove them.

One of the most important applications of data deduplication are backup storage systems, where these approaches are able to reduce the storage requirements to a small fraction of the logical backup data size. This thesis introduces multiple new extensions of so-called fingerprinting-based data deduplication. It starts with the presentation of a novel system design, which allows using a cluster of servers to perform exact data deduplication with small chunks in a scalable way.

Afterwards, a combination of compression approaches for an important, but often overlooked, data structure in data deduplication systems, the so-called block and file recipes, is introduced. Using these compression approaches, which exploit unique properties of data deduplication systems, the size of these recipes can be reduced by more than 92% in all investigated data sets. As file recipes can occupy a significant fraction of the overall storage capacity of data deduplication systems, this compression enables significant savings.

Next, a technique to increase the write throughput of data deduplication systems, based on the aforementioned block and file recipes, is introduced. The novel Block Locality Caching (BLC) uses properties of block and file recipes to overcome the chunk lookup disk bottleneck of data deduplication systems, which limits either their scalability or their throughput. The presented BLC overcomes the disk bottleneck more efficiently than existing approaches. Furthermore, it is shown to be less prone to aging effects.

Finally, it is investigated whether large HPC storage systems exhibit redundancies that can be found by fingerprinting-based data deduplication. Over 3 PB of HPC storage data from different data sets have been analyzed. In most data sets, between 20% and 30% of the data can be classified as redundant. According to these results, future work should further investigate how data deduplication can be integrated into future HPC storage systems.

This thesis presents important novel work in different areas of data deduplication research.
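Fingerprinting-based deduplication with file recipes, as described above, can be sketched minimally as follows. Fixed-size chunking and SHA-256 fingerprints are assumptions chosen for brevity; production systems typically use content-defined chunking and maintain the chunk index on disk, which is exactly where the lookup bottleneck addressed by the BLC arises.

```python
import hashlib

class DedupStore:
    """Minimal fingerprinting-based deduplication store (illustrative sketch)."""

    def __init__(self, chunk_size=8):
        self.chunk_size = chunk_size
        self.chunks = {}    # fingerprint -> chunk bytes (the chunk index)
        self.recipes = {}   # file name -> list of fingerprints (the "file recipe")

    def write(self, name, data: bytes):
        recipe = []
        for i in range(0, len(data), self.chunk_size):
            chunk = data[i:i + self.chunk_size]
            fp = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(fp, chunk)   # store each unique chunk once
            recipe.append(fp)
        self.recipes[name] = recipe             # the recipe reconstructs the file

    def read(self, name) -> bytes:
        return b''.join(self.chunks[fp] for fp in self.recipes[name])

store = DedupStore()
store.write("backup1", b"AAAAAAAABBBBBBBBAAAAAAAA")
store.write("backup2", b"AAAAAAAABBBBBBBBCCCCCCCC")
assert store.read("backup1") == b"AAAAAAAABBBBBBBBAAAAAAAA"
print(len(store.chunks))   # 3 unique chunks instead of 6 written
```

The recipes here are plain fingerprint lists; the recipe-compression results quoted above exploit regularities in exactly this structure.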
Abstract:
Data sets describing the state of the earth's atmosphere are of great importance in the atmospheric sciences. Over the last decades, the quality and sheer amount of the available data increased significantly, resulting in a rising demand for new tools capable of handling and analysing these large, multidimensional sets of atmospheric data. The interdisciplinary work presented in this thesis covers the development and the application of practical software tools and efficient algorithms from the field of computer science, aiming at the goal of enabling atmospheric scientists to analyse and to gain new insights from these large data sets. For this purpose, our tools combine novel techniques with well-established methods from different areas such as scientific visualization and data segmentation. In this thesis, three practical tools are presented. Two of these tools are software systems (Insight and IWAL) for different types of processing and interactive visualization of data; the third tool is an efficient algorithm for data segmentation implemented as part of Insight.

Insight is a toolkit for the interactive, three-dimensional visualization and processing of large sets of atmospheric data, originally developed as a testing environment for the novel segmentation algorithm. It provides a dynamic system for combining at runtime data from different sources, a variety of different data processing algorithms, and several visualization techniques. Its modular architecture and flexible scripting support led to additional applications of the software, from which two examples are presented: the usage of Insight as a WMS (web map service) server, and the automatic production of a sequence of images for the visualization of cyclone simulations.
The core application of Insight is the provision of the novel segmentation algorithm for the efficient detection and tracking of 3D features in large sets of atmospheric data, as well as for the precise localization of the occurring genesis, lysis, merging and splitting events. Data segmentation usually leads to a significant reduction of the size of the considered data. This enables a practical visualization of the data, statistical analyses of the features and their events, and the manual or automatic detection of interesting situations for subsequent detailed investigation. The concepts of the novel algorithm, its technical realization, and several extensions for avoiding under- and over-segmentation are discussed. As example applications, this thesis covers the setup and the results of the segmentation of upper-tropospheric jet streams and cyclones as full 3D objects. Finally, IWAL is presented, a web application providing easy interactive access to meteorological data visualizations, primarily aimed at students. As a web application, it avoids the need to retrieve all input data sets and to install and handle complex visualization tools on a local machine. The main challenge in the provision of customizable visualizations to large numbers of simultaneous users was to find an acceptable trade-off between the available visualization options and the performance of the application. Besides the implementation details, benchmarks and the results of a user survey are presented.
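The starting point of such a feature segmentation, thresholding a 3D field and extracting connected components, can be sketched with SciPy's labelling routine. The data below are synthetic and the threshold is a typical jet-stream wind-speed criterion chosen for illustration; the thesis algorithm additionally tracks features over time and handles split/merge events.

```python
import numpy as np
from scipy import ndimage

# Synthetic 3D wind-speed field (m/s) with two strong-wind blobs
field = np.zeros((20, 20, 20))
field[2:5, 2:5, 2:5] = 40.0        # feature 1 (3x3x3 voxels)
field[12:16, 12:16, 12:16] = 55.0  # feature 2 (4x4x4 voxels)

# Threshold-based segmentation: voxels exceeding the feature criterion
mask = field > 30.0                # e.g. wind speed above 30 m/s

# 3D connected-component labelling assigns one id per contiguous feature
labels, n_features = ndimage.label(mask)
sizes = ndimage.sum(mask, labels, index=range(1, n_features + 1))
print(n_features)                  # -> 2
```

Replacing the full field by a handful of labelled objects is the size reduction that makes the subsequent visualization and per-feature statistics practical.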
Abstract:
This work is focused on the study of saltwater intrusion in coastal aquifers, and in particular on the construction of conceptual schemes to evaluate the associated risk. Saltwater intrusion depends on different natural and anthropic factors, both exhibiting strongly random behaviour, which should be considered for an optimal management of the territory and of water resources. Given the uncertainty of the problem parameters, the risk associated with salinization needs to be cast in a probabilistic framework. On the basis of a widely adopted sharp-interface formulation, key hydrogeological problem parameters are modeled as random variables, and global sensitivity analysis is used to determine their influence on the position of the saltwater interface. The analyses presented in this work rely on an efficient model reduction technique, based on Polynomial Chaos Expansion, able to provide an accurate description of the model without a great computational burden. When the assumptions of classical analytical models are not respected, as often occurs in applications to real case studies, including the area analyzed in the present work, one can adopt data-driven techniques based on the analysis of the data characterizing the system under study. A model can then be defined on the basis of connections between the system state variables, with only a limited number of assumptions about the "physical" behaviour of the system.
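A crude Monte Carlo sketch of global sensitivity for a sharp-interface position is shown below. It uses the classical Ghyben-Herzberg relation and squared linear correlations as sensitivity measures; this is a simplification for illustration, not the Polynomial Chaos Expansion approach used in the work, and the input distributions are hypothetical, not site-specific.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Uncertain inputs (illustrative distributions)
h = rng.uniform(0.5, 2.0, n)            # freshwater head above sea level (m)
rho_s = rng.uniform(1022.0, 1029.0, n)  # seawater density (kg/m^3)
rho_f = 1000.0                          # freshwater density (kg/m^3)

# Ghyben-Herzberg sharp-interface depth below sea level:
# z = rho_f / (rho_s - rho_f) * h  (roughly 40*h for typical densities)
z = rho_f / (rho_s - rho_f) * h

# Crude global sensitivity: squared linear correlation of output with each input
sens = {name: np.corrcoef(x, z)[0, 1]**2
        for name, x in [("head h", h), ("seawater density", rho_s)]}
for name, r2 in sens.items():
    print(f"{name}: R^2 = {r2:.2f}")
```

With these ranges, the freshwater head dominates the interface depth, which is the kind of ranking a variance-based global sensitivity analysis formalizes.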