799 results for Management - Data processing - Study and teaching (Higher) - Victoria
Abstract:
This study aimed to investigate the phenomenology of obsessive-compulsive disorder (OCD), addressing specific questions about the nature of obsessions and compulsions, and to contribute to the World Health Organization's (WHO) revision of OCD diagnostic guidelines. Data from 1001 patients from the Brazilian Research Consortium on Obsessive Compulsive Spectrum Disorders were used. Patients were evaluated by trained clinicians using validated instruments, including the Dimensional Yale-Brown Obsessive Compulsive Scale, the University of Sao Paulo Sensory Phenomena Scale, and the Brown Assessment of Beliefs Scale. The aims were to compare the types of sensory phenomena (SP, subjective experiences that precede or accompany compulsions) in OCD patients with and without tic disorders and to determine the frequency of mental compulsions, the co-occurrence of obsessions and compulsions, and the range of insight. SP were common in the whole sample, but patients with tic disorders were more likely to have physical sensations and urges only. Mental compulsions occurred in the majority of OCD patients. It was extremely rare for OCD patients to have obsessions without compulsions. A wide range of insight into OCD beliefs was observed, with a small subset presenting no insight. The data generated from this large sample will help practicing clinicians appreciate the full range of OCD symptoms and confirm, consistent with prior studies in smaller samples, the degree to which insight varies. These findings also support specific revisions to the WHO's diagnostic guidelines for OCD, such as describing sensory phenomena, mental compulsions, and level of insight, so that worldwide recognition of this disabling disorder is increased. (C) 2014 Elsevier Ltd. All rights reserved.
Abstract:
This study aimed to analyze the limits and possibilities of a didactic sequence addressing the concept of proportionality in the learning of 5th-year elementary school students. Based on the results of the Initial Diagnostic Assessment applied to a 5th-grade elementary school class, eight (8) students were selected to participate in the activities that made up the instructional sequence. The development of this sequence drew on theoretical elements of Genetic Epistemology (Piaget, 1990), Vergnaud's (1996) Conceptual Fields Theory, and the definition of instructional sequence proposed by Zabala (1998). Finally, after 20 days of applying these activities, a Final Diagnostic Assessment was administered to investigate which concepts related to proportionality had been built by the participating students. From the analysis of the collected data, the following can be concluded about the possibilities of the teaching sequence: 1) students who, in the Initial Diagnostic Assessment, used the multiplication algorithm but did not indicate the comparison between quantities are now doing so; 2) students who, in the Initial Diagnostic Assessment, used the addition algorithm are now gradually adopting the multiplication algorithm; 3) students who, in the Initial Diagnostic Assessment, could not solve problems involving the idea of proportionality are now solving them, although they still largely use the addition operation as a strategy for proportional thinking. Regarding the limits of the sequence, we noted: 1) insufficient time to propose a larger number of problem situations to be worked through by the students, which may have prevented the construction of proportional thinking through multiplication; 2) the proposition of situations distant from the students' assimilation schemes (interpretation), which may have caused a certain disequilibrium in them, preventing them from thinking about the problem situation; 3) the rapid passage of...
Abstract:
The influence of weight (W) category of the rainbow trout on processing yield and chemical composition of the whole eviscerated fish and fish fillet was analyzed. A completely randomized design was employed for the processing variables (W1 = 300 to 370 g and W2 = 371 to 440 g), coupled to a 2 x 2 factorial scheme for the chemical composition (W1 and W2, and forms of presentation: fillet and whole eviscerated fish). W1 showed higher yields for whole eviscerated fish (83.00%) and head (13.27%), but a lower yield for the viscera (17.00%), when compared to W2. Weight category did not affect abdominal muscle yield, fillet with or without skin, skin percentage, or residues. There were significant differences between weight categories for moisture (W1 = 72.30% and W2 = 71.15%) and lipid (W1 = 7.96% and W2 = 9.04%) contents. Fillet moisture (73.74%) and crude protein (19.05%) contents were higher (p < 0.01) than those of the whole eviscerated fish (69.71% and 17.81%, respectively). Ash (2.15%) and lipid (10.48%) contents were higher (p < 0.01) for the whole fish when compared to those of the fillets (1.16% and 6.52%, respectively). Slaughtering fish weighing between 300 and 370 g, whether sold whole or as fillets, is more suitable for the market.
Abstract:
The slick hair coat (SLICK) is a dominantly inherited trait typically associated with tropically adapted cattle of Criollo descent, introduced into the New World through Spanish colonization. The trait is of interest relative to climate change, due to its association with improved thermo-tolerance and subsequent increased productivity. Previous studies localized the SLICK locus to a 4 cM region on chromosome (BTA) 20 and identified signatures of selection in this region derived from Senepol cattle. The current study compares three slick-haired Criollo-derived breeds, including Senepol, Carora, and Romosinuano, and three additional slick-haired cross-bred lineages to non-slick ancestral breeds. Genome-wide association (GWA), haplotype analysis, signatures of selection, runs of homozygosity (ROH), and identity-by-state (IBS) calculations were used to identify a 0.8 Mb (37.7-38.5 Mb) consensus region for the SLICK locus on BTA20, which contains SKP2 and SPEF2 as possible candidate genes. Three specific haplotype patterns were identified in slick individuals, all with zero frequency in non-slick individuals. Admixture analysis identified common genetic patterns between the three slick breeds at the SLICK locus. Principal component analysis (PCA) and admixture results show Senepol and Romosinuano sharing a higher degree of genetic similarity to one another, with a much lesser degree of similarity to Carora. Variation in GWA, haplotype analysis, and IBS calculations, together with accompanying population structure information, supports potentially two mutations, one common to Senepol and Romosinuano and another in Carora, affecting genes contained within our refined location for the SLICK locus.
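As a concrete illustration of the identity-by-state (IBS) calculations mentioned above, the following minimal Python sketch computes pairwise IBS similarity from genotype vectors. The genotype data and breed labels are hypothetical, invented for illustration; the study's actual analyses used genome-wide SNP panels.

```python
import numpy as np

def ibs_similarity(g1: np.ndarray, g2: np.ndarray) -> float:
    """Mean identity-by-state between two genotype vectors.

    Genotypes are coded as minor-allele counts (0, 1, 2); per-locus
    IBS is 1 - |g1 - g2| / 2, i.e. 1.0 for identical genotypes,
    0.5 when one allele is shared, 0.0 for opposite homozygotes.
    """
    valid = (g1 >= 0) & (g2 >= 0)          # mask missing genotypes coded as -1
    diff = np.abs(g1[valid] - g2[valid])
    return float(np.mean(1.0 - diff / 2.0))

# Hypothetical genotypes at SNPs spanning the 37.7-38.5 Mb window on BTA20
senepol     = np.array([0, 1, 2, 2, 0, 1, 2, 0])
romosinuano = np.array([0, 1, 2, 2, 0, 0, 2, 0])
carora      = np.array([2, 1, 0, 2, 1, 0, 2, 1])

print(ibs_similarity(senepol, romosinuano))  # high similarity expected
print(ibs_similarity(senepol, carora))       # lower similarity expected
```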
Abstract:
Autologous fibrin gel is commonly used as a scaffold for filling defects in articular cartilage. This biomaterial can also be used as a sealant to control small hemorrhages and is especially helpful in situations where tissue reparation capacity is limited. In particular, fibrin can act as a scaffold for various cell types because it can accommodate cell migration, differentiation, and proliferation. Despite knowledge of the advantages of this biomaterial and mastery of the techniques required for its application, the durability of several types of sealant at the site of injury remains questionable. Given the importance of such data for evaluating the quality and efficiency of fibrin gel formulations used as scaffolds, this study sought to analyze the heterologous fibrin sealant developed from the venom of Crotalus durissus terrificus in ovine experimental models. The fibrin gel developed from the venom of this snake was shown to act as a safe, stable, and durable scaffold for up to seven days, without causing adverse side effects. Fibrin gel produced from the venom of the Crotalus durissus terrificus snake possesses many clinical and surgical uses. It has the potential to be used as a biomaterial to help repair skin lesions or control bleeding, and it may also be used as a scaffold when applied together with various cell types. The intralesional use of the fibrin gel from the venom of this snake may improve surgical and clinical treatments, in addition to being inexpensive and adequately consistent, durable, and stable. In this study, the new heterologous fibrin sealant proved to be a candidate scaffold for cartilage repair.
Abstract:
This study analyzes the environmental performance of the Municipal Solid Waste Management System (MSWMS) of Piedade, São Paulo, from a systemic perspective. A life cycle assessment (LCA) technique was applied according to an attributional approach to evaluate both the current operational situation and different prospective scenarios, which were devised based on the application of targets for recycling dry and wet waste suggested by the pre-draft version of the Brazilian Plan for Solid Waste. The life cycle impact assessment method EcoIndicator 99, in association with normalization and weighting procedures, was used to conduct the analysis. It was observed that the adoption of targets of 30%, 50%, and 70% for the recovery of recyclable dry waste resulted in improvements in the environmental performance of the waste management system under analysis of 10%, 15%, and 20%, respectively. It was also possible to detect an improvement on the order of 54% in impact reduction resulting from the adoption of targets for composting. LCA proved to be effective for the evaluation of the environmental performance of the MSWMS-Piedade. However, for future evaluations, the attributional approach should be replaced by the methodological practice of substitution, so that avoided burdens can be considered in estimations of the environmental performance of municipal solid waste management systems.
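To illustrate the normalization and weighting step that a single-score impact assessment method such as EcoIndicator 99 applies, here is a minimal sketch of that arithmetic. All damage values, normalization references, and weights below are hypothetical placeholders, not the method's actual factors.

```python
# Illustrative normalization + weighting step of a single-score LCIA method.
# Damage values, normalization references, and weights are hypothetical.
damages = {                    # characterized damage per functional unit
    "human_health": 2.1e-5,    # DALY
    "ecosystem_quality": 0.8,  # PDF*m2*yr
    "resources": 12.0,         # MJ surplus energy
}
normalization = {"human_health": 1.54e-2,
                 "ecosystem_quality": 5.13e3,
                 "resources": 8.41e3}
weights = {"human_health": 0.4, "ecosystem_quality": 0.4, "resources": 0.2}

# Each category is divided by its normalization reference, weighted,
# and summed into one comparable score per scenario.
single_score = sum(weights[c] * damages[c] / normalization[c] for c in damages)
print(f"single score: {single_score:.3e} points")
```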
Abstract:
The objective of this study was to identify the socioeconomic and demographic characteristics of children and adolescents who study and work outside their home. This non-experimental, correlational, cross-sectional study was performed using questionnaires applied to primary education students enrolled in public schools in Ribeirao Preto (Brazil). Two schools were selected by draw. Data analysis was performed using the Statistical Package for the Social Sciences, version 14.0. Of the 133 students who answered the questionnaire, 36 (27.7%) reported working outside their home; 20.6% were between 11 and 13 years of age, and 66.7% were male (p=0.000) and had started working early to help with the family income (p=0.003). The salary they received contributed to the family income, and it was found that as the family income increased, the need for the youngsters to work was reduced. Many factors contribute to these subjects' early start at work, including family size, structure, and poverty.
Abstract:
Defining pharmacokinetic parameters and depletion intervals for antimicrobials used in fish provides important guidelines for future regulation by Brazilian agencies of the use of these substances in fish farming. This article presents a depletion study for oxytetracycline (OTC) in tilapias (Oreochromis niloticus) farmed under tropical conditions during the winter season. A high-performance liquid chromatography method with fluorescence detection for the quantitation of OTC in tilapia fillets and medicated feed was developed and validated. The depletion study with fish was carried out under monitored environmental conditions. OTC was administered in the feed for five consecutive days at daily dosages of 80 mg/kg body weight. Groups of ten fish were slaughtered at 1, 2, 3, 4, 5, 8, 10, 15, 20, and 25 days after medication. After the 8th day post-treatment, OTC concentrations in the tilapia fillets were below the limit of quantitation (13 ng/g) of the method. Linear regression of the mathematical model used for data analysis presented a coefficient of 0.9962. The elimination half-life for OTC in tilapia fillet and the withdrawal period were 1.65 and 6 days, respectively, considering a percentile of 99% with 95% confidence and a maximum residue limit of 100 ng/g. Even though the study was carried out in the winter under practical conditions where water temperature varied, the results obtained are similar to those from studies conducted under controlled temperature.
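A minimal sketch of the depletion arithmetic described above: a log-linear (first-order) regression of concentration against time yields the elimination rate constant and half-life, and the withdrawal period follows from where the depletion curve crosses the maximum residue limit. The data points below are invented, and the simple mean-curve crossing shown here understates the regulatory tolerance-limit calculation (99th percentile with 95% confidence) used in the study.

```python
import numpy as np
from scipy import stats

# Hypothetical depletion data: days post-treatment vs. OTC in fillet (ng/g)
days = np.array([1, 2, 3, 4, 5])
conc = np.array([820.0, 540.0, 410.0, 250.0, 170.0])

# Log-linear (first-order) depletion: ln C(t) = ln C0 - k * t
slope, intercept, r, _, _ = stats.linregress(days, np.log(conc))
k = -slope
half_life = np.log(2) / k
print(f"elimination half-life: {half_life:.2f} days (r^2 = {r**2:.4f})")

# Crude withdrawal estimate: the day the *fitted mean* falls below the
# MRL of 100 ng/g.  The regulatory method instead uses the upper one-sided
# 95% tolerance limit on the 99th percentile, which yields a later (safer)
# withdrawal period, as in the 6-day figure reported above.
mrl = 100.0
t_mean = (intercept - np.log(mrl)) / k
print(f"fitted mean concentration crosses the MRL at t = {t_mean:.1f} days")
```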
Abstract:
Current scientific applications produce large amounts of data. The processing, handling, and analysis of such data require large-scale computing infrastructures such as clusters and grids. In this area, studies aim at improving the performance of data-intensive applications by optimizing data accesses. To achieve this goal, distributed storage systems have adopted techniques of data replication, migration, distribution, and access parallelism. However, the main drawback of those studies is that they do not take application behavior into account when performing data access optimization. This limitation motivated this paper, which applies strategies to support the online prediction of application behavior in order to optimize data access operations on distributed systems, without requiring any information on past executions. To accomplish this goal, the approach organizes application behaviors as time series and then analyzes and classifies those series according to their properties. Based on these properties, the approach selects modeling techniques to represent the series and perform predictions, which are later used to optimize data access operations. This new approach was implemented and evaluated using the OptorSim simulator, sponsored by the LHC-CERN project and widely employed by the scientific community. Experiments confirm this new approach reduces application execution time by about 50 percent, especially when handling large amounts of data.
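A minimal sketch of the core idea, under stated assumptions: treat an application's access behavior as a time series observed online and predict the next interval's demand so the storage layer can act ahead of it. The class below uses double exponential smoothing as a stand-in for the paper's richer classify-then-model pipeline; all names are illustrative.

```python
from collections import deque

class OnlineAccessPredictor:
    """Toy online predictor for an application's data-access series.

    Keeps a sliding window of recent observations (e.g. bytes read per
    interval) and predicts the next value by double exponential
    smoothing, so a storage layer could prefetch or replicate ahead of
    demand.  A real system would first classify the series (trend,
    seasonality, noise) and pick a model accordingly.
    """

    def __init__(self, alpha: float = 0.5, beta: float = 0.3, window: int = 64):
        self.alpha, self.beta = alpha, beta
        self.history = deque(maxlen=window)
        self.level = None
        self.trend = 0.0

    def observe(self, value: float) -> None:
        self.history.append(value)
        if self.level is None:
            self.level = value          # first observation seeds the level
            return
        prev_level = self.level
        self.level = self.alpha * value + (1 - self.alpha) * (self.level + self.trend)
        self.trend = self.beta * (self.level - prev_level) + (1 - self.beta) * self.trend

    def predict_next(self) -> float:
        return 0.0 if self.level is None else self.level + self.trend

# Feed a growing access pattern and ask for the next interval's demand.
predictor = OnlineAccessPredictor()
for mb_read in [10, 12, 15, 19, 24, 30]:
    predictor.observe(mb_read)
print(f"predicted next-interval demand: {predictor.predict_next():.1f} MB")
```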
Abstract:
Some properties of canna (Canna indica L.) and bore (Alocasia macrorrhiza) starches were evaluated and compared using cassava starch (Manihot esculenta Crantz) as a reference. Proximate analysis, differential scanning calorimetry (DSC), thermogravimetric analysis (TGA), Fourier transform infrared spectroscopy (FTIR), X-ray diffraction (XRD), and viscosity measurements were performed. Canna and bore starches showed a degree of purity similar to that of cassava starch. Canna starch exhibited higher thermal stability and solution viscosity values than bore and cassava starches. XRD patterns showed that canna starch crystallizes in a B-type structure, whereas bore and cassava starches crystallize in an A-type structure. The results showed that canna and bore starches, obtained from unconventional sources, are promising biomaterials for industrial applications, as their physicochemical properties are similar to those of cassava starch, which is known to have potential applications in this area.
Abstract:
A new method for the analysis of scattering data from lamellar bilayer systems is presented. The method employs a form-free description of the cross-section structure of the bilayer, and the fit is performed directly to the scattering data, also introducing a structure factor when required. The cross-section structure (the electron density profile in the case of X-ray scattering) is described by a set of Gaussian functions, and the technique is termed Gaussian deconvolution. The coefficients of the Gaussians are optimized using a constrained least-squares routine that induces smoothness of the electron density profile. The optimization is coupled with the point-of-inflection method for determining the optimal weight of the smoothness constraint. With the new approach, it is possible to optimize simultaneously the form factor, structure factor, and several other parameters in the model. The applicability of this method is demonstrated in a study of a multilamellar system composed of lecithin bilayers, where the form factor and structure factor are obtained simultaneously; the results provide new insight into this very well known system.
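A minimal sketch of the Gaussian-deconvolution idea, assuming a symmetric bilayer, fixed Gaussian centers and widths, and no structure factor: the form factor of a sum of Gaussians is analytic, so the profile amplitudes can be fitted to the intensity by least squares with a smoothness penalty. The synthetic data and the fixed smoothness weight stand in for the paper's point-of-inflection weight selection.

```python
import numpy as np
from scipy.optimize import least_squares

# Gaussian parametrization of a bilayer electron-density profile:
#   rho(z) = sum_i a_i * exp(-(z - z_i)^2 / (2 s^2)), fixed centers z_i.
# Its Fourier transform (form factor) is analytic:
#   F(q) = sum_i a_i * s * sqrt(2*pi) * exp(-q^2 s^2 / 2) * cos(q * z_i)
z_centers = np.linspace(-25.0, 25.0, 11)   # Angstrom, hypothetical grid
s = 3.0                                    # fixed Gaussian width

def form_factor(q, a):
    gauss = s * np.sqrt(2 * np.pi) * np.exp(-0.5 * (q[:, None] * s) ** 2)
    return (gauss * np.cos(q[:, None] * z_centers) * a).sum(axis=1)

def residuals(a, q, i_obs, weight):
    model = form_factor(q, a) ** 2 / q ** 2   # dilute lamellae: I ~ F^2 / q^2
    smooth = weight * np.diff(a, n=2)         # penalize curvature of profile
    return np.concatenate([model - i_obs, smooth])

# Synthetic "observed" intensity from a known headgroup/tail profile
q = np.linspace(0.05, 0.6, 120)
a_true = np.array([0, 0.2, 1.0, 0.3, -0.8, -1.0, -0.8, 0.3, 1.0, 0.2, 0])
i_obs = form_factor(q, a_true) ** 2 / q ** 2

fit = least_squares(residuals, x0=np.full(len(z_centers), 0.5),
                    args=(q, i_obs, 0.1))
print("recovered amplitudes:", np.round(fit.x, 2))  # overall sign of F is
                                                    # undetermined by I alone
```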
Abstract:
Turfgrasses are ubiquitous in urban landscapes, and their role in the carbon (C) cycle is increasingly important, also because of the considerable footprint related to their management practices. It is crucial to understand the mechanisms driving the C assimilation potential of these terrestrial ecosystems. Several approaches have been proposed to assess C dynamics: micro-meteorological methods, small-chamber enclosure systems (SC), the chrono-sequence approach, and various models. Natural and human-induced variables influence turfgrass C fluxes. Species composition, environmental conditions, site characteristics, former land use, and agronomic management are the most important factors considered in the literature as driving C sequestration potential. At the same time, different approaches seem to influence C budget estimates. In order to study the effect of different management intensities on turfgrass, we estimated net ecosystem exchange (NEE) through an SC approach in a hole of a golf course in the province of Verona (Italy) for one year. The SC approach presented several advantages, but also limits related to measurement frequency, timing, and duration over time, and to the methodological errors connected with the measuring system. Daily CO2 fluxes changed according to the intensity of maintenance, likely due to the different inputs and disturbances affecting biogeochemical cycles, combined with differences in leaf area index (LAI). The annual cumulative NEE decreased with increasing intensity of management. NEE was related to the seasonality of the turfgrass, following temperatures and physiological activity. Generally, over the growing season, CO2 fluxes towards the atmosphere exceeded the C sequestered. The cumulative NEE indicated a system near a steady state with respect to C dynamics. In the final part, greenhouse gas (GHG) emissions due to fossil fuel consumption for turfgrass upkeep were estimated, showing that turfgrass may constitute a considerable C source. The C potential of trees and shrubs needs to be considered to obtain a complete budget.
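For the arithmetic behind a cumulative NEE estimate from chamber measurements, a minimal sketch (all flux values hypothetical): periodic chamber CO2 fluxes are integrated over the year and converted to grams of carbon per square metre.

```python
import numpy as np

# Hypothetical small-chamber NEE measurements (micromol CO2 m^-2 s^-1),
# sampled on a handful of campaign days across the year (day of year).
doy = np.array([15, 74, 135, 196, 257, 318, 365])
nee = np.array([0.8, -0.5, -2.1, -1.2, 0.4, 0.9, 0.7])  # negative = uptake

# Trapezoidal integration of the flux over the year, converted to g C m^-2:
# micromol m^-2 s^-1 -> mol m^-2 day^-1 -> g C m^-2 over the year.
seconds_per_day = 86400
mol_per_day = nee * 1e-6 * seconds_per_day
annual_mol = np.trapz(mol_per_day, doy)   # mol CO2 m^-2 yr^-1
annual_gC = annual_mol * 12.011           # g C m^-2 yr^-1 (molar mass of C)
print(f"cumulative NEE: {annual_gC:+.0f} g C m^-2 yr^-1")
```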
Abstract:
Coral reefs are the most biodiverse ecosystems of the ocean and they provide notable ecosystem services. Nowadays, they face a number of local anthropogenic threats, and environmental change is threatening their survival on a global scale. Large-scale monitoring is necessary to understand environmental changes and to implement useful conservation measures. Governmental agencies are often underfunded and unable to sustain the necessary large-scale spatial and temporal monitoring. To overcome these economic constraints, scientists can in some cases engage volunteers in environmental monitoring. Citizen Science enables the collection and analysis of scientific data at larger spatial and temporal scales than otherwise possible, addressing issues that are otherwise logistically or financially unfeasible. “STE: Scuba Tourism for the Environment” was a volunteer-based Red Sea coral reef biodiversity monitoring program. SCUBA divers and snorkelers were involved in the collection of data for 72 taxa by completing survey questionnaires after their dives. In my thesis, I evaluated the reliability of the data collected by volunteers, comparing their questionnaires with those completed by professional scientists. Validation trials showed a sufficient level of reliability, indicating that non-specialists performed similarly to conservation volunteer divers on accurate transects. Using the data collected by volunteers, I developed a biodiversity index that revealed spatial trends across the surveyed areas. The project results provided important feedback to the local authorities on the current health status of Red Sea coral reefs and on the effectiveness of environmental management. I also analysed the spatial and temporal distribution of each surveyed taxon, identifying abundance trends related to anthropogenic impacts. Finally, I evaluated the effectiveness of the project in increasing the environmental education of volunteers, and showed that participation in the STE project significantly increased both knowledge of coral reef biology and ecology and awareness of the impacts of human behaviour on the environment.
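A toy sketch of the kind of reliability check described above, comparing a volunteer's presence/absence records against a professional scientist's for the same dive; taxa and records are invented for illustration.

```python
# Toy reliability check: per-taxon agreement between volunteer and expert
# presence/absence records from the same dive (all data hypothetical).
volunteer = {"butterflyfish": 1, "grouper": 0, "giant clam": 1, "turtle": 0}
expert    = {"butterflyfish": 1, "grouper": 1, "giant clam": 1, "turtle": 0}

# Fraction of taxa on which the two records agree.
agreement = sum(volunteer[t] == expert[t] for t in expert) / len(expert)
print(f"percent agreement: {agreement:.0%}")
```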
Abstract:
Data deduplication describes a class of approaches that reduce the storage capacity needed to store data or the amount of data that has to be transferred over a network. These approaches detect coarse-grained redundancies within a data set, e.g. a file system, and remove them.

One of the most important applications of data deduplication is backup storage systems, where these approaches are able to reduce the storage requirements to a small fraction of the logical backup data size. This thesis introduces multiple new extensions of so-called fingerprinting-based data deduplication. It starts with the presentation of a novel system design, which allows using a cluster of servers to perform exact data deduplication with small chunks in a scalable way.

Afterwards, a combination of compression approaches for an important, but often overlooked, data structure in data deduplication systems, so-called block and file recipes, is introduced. Using these compression approaches, which exploit unique properties of data deduplication systems, the size of these recipes can be reduced by more than 92% in all investigated data sets. As file recipes can occupy a significant fraction of the overall storage capacity of data deduplication systems, the compression enables significant savings.

A technique to increase the write throughput of data deduplication systems, based on the aforementioned block and file recipes, is introduced next. The novel Block Locality Caching (BLC) uses properties of block and file recipes to overcome the chunk lookup disk bottleneck of data deduplication systems, which limits either their scalability or their throughput. The presented BLC overcomes the disk bottleneck more efficiently than existing approaches. Furthermore, it is shown to be less prone to aging effects.

Finally, it is investigated whether large HPC storage systems exhibit redundancies that can be found by fingerprinting-based data deduplication. Over 3 PB of HPC storage data from different data sets have been analyzed. In most data sets, between 20 and 30% of the data can be classified as redundant. According to these results, future work should further investigate how data deduplication can be integrated into future HPC storage systems.

This thesis presents important novel work in different areas of data deduplication research.
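A minimal sketch of fingerprinting-based deduplication with fixed-size chunks and an in-memory chunk index; real systems like the one in this thesis use content-defined chunking and keep the chunk index on disk, and all names here are illustrative.

```python
import hashlib

class DedupStore:
    """Minimal fingerprinting-based deduplication (fixed-size chunking).

    Each chunk is identified by its SHA-1 fingerprint; a chunk is stored
    only once, and every file is represented by a "file recipe": the
    ordered list of its chunks' fingerprints.  Production systems use
    content-defined chunking and an on-disk chunk index, whose lookup
    cost is exactly the bottleneck the Block Locality Caching targets.
    """

    def __init__(self, chunk_size: int = 4096):
        self.chunk_size = chunk_size
        self.chunks: dict[bytes, bytes] = {}   # fingerprint -> chunk data

    def write(self, data: bytes) -> list[bytes]:
        recipe = []
        for off in range(0, len(data), self.chunk_size):
            chunk = data[off:off + self.chunk_size]
            fp = hashlib.sha1(chunk).digest()
            self.chunks.setdefault(fp, chunk)  # store only unseen chunks
            recipe.append(fp)
        return recipe

    def read(self, recipe: list[bytes]) -> bytes:
        return b"".join(self.chunks[fp] for fp in recipe)

store = DedupStore()
backup1 = store.write(b"A" * 8192 + b"B" * 4096)
backup2 = store.write(b"A" * 8192 + b"C" * 4096)   # shares the "A" chunks
print(len(store.chunks))                           # 3 unique chunks, not 6
assert store.read(backup1) == b"A" * 8192 + b"B" * 4096
```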
Abstract:
Data sets describing the state of the earth's atmosphere are of great importance in the atmospheric sciences. Over the last decades, the quality and sheer amount of the available data increased significantly, resulting in a rising demand for new tools capable of handling and analysing these large, multidimensional sets of atmospheric data. The interdisciplinary work presented in this thesis covers the development and the application of practical software tools and efficient algorithms from the field of computer science, with the goal of enabling atmospheric scientists to analyse and to gain new insights from these large data sets. For this purpose, our tools combine novel techniques with well-established methods from different areas such as scientific visualization and data segmentation. In this thesis, three practical tools are presented. Two of these tools are software systems (Insight and IWAL) for different types of processing and interactive visualization of data; the third tool is an efficient algorithm for data segmentation implemented as part of Insight.

Insight is a toolkit for the interactive, three-dimensional visualization and processing of large sets of atmospheric data, originally developed as a testing environment for the novel segmentation algorithm. It provides a dynamic system for combining at runtime data from different sources, a variety of different data processing algorithms, and several visualization techniques. Its modular architecture and flexible scripting support led to additional applications of the software, of which two examples are presented: the usage of Insight as a WMS (web map service) server, and the automatic production of a sequence of images for the visualization of cyclone simulations. The core application of Insight is the provision of the novel segmentation algorithm for the efficient detection and tracking of 3D features in large sets of atmospheric data, as well as for the precise localization of the occurring genesis, lysis, merging, and splitting events. Data segmentation usually leads to a significant reduction of the size of the considered data. This enables a practical visualization of the data, statistical analyses of the features and their events, and the manual or automatic detection of interesting situations for subsequent detailed investigation. The concepts of the novel algorithm, its technical realization, and several extensions for avoiding under- and over-segmentation are discussed. As example applications, this thesis covers the setup and the results of the segmentation of upper-tropospheric jet streams and cyclones as full 3D objects.

Finally, IWAL is presented, a web application providing easy interactive access to meteorological data visualizations, primarily aimed at students. As a web application, the need to retrieve all input data sets and to install and handle complex visualization tools on a local machine is avoided. The main challenge in the provision of customizable visualizations to large numbers of simultaneous users was to find an acceptable trade-off between the available visualization options and the performance of the application. Besides the implementation details, benchmarks and the results of a user survey are presented.
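A simplified stand-in for the segmentation-and-tracking idea: threshold a 3D field, label connected components as features, and match features across time steps by voxel overlap. This is not the thesis's algorithm, which additionally handles under-/over-segmentation and genesis, lysis, merging, and splitting events; names and data below are illustrative.

```python
import numpy as np
from scipy import ndimage

def segment(field: np.ndarray, threshold: float) -> np.ndarray:
    """Label connected 3D regions exceeding a threshold (e.g. wind speed
    for jet streams). Returns an integer label volume, 0 = background."""
    labels, _ = ndimage.label(field > threshold)
    return labels

def track(labels_t0: np.ndarray, labels_t1: np.ndarray) -> dict[int, int]:
    """Match features between consecutive time steps by voxel overlap.
    Splits/merges would show up as several labels sharing an ancestor."""
    matches = {}
    for lab in np.unique(labels_t0)[1:]:        # skip background label 0
        overlap = labels_t1[labels_t0 == lab]
        overlap = overlap[overlap > 0]
        if overlap.size:
            matches[lab] = int(np.bincount(overlap).argmax())
    return matches

# Synthetic 3D wind-speed fields at two time steps: one moving feature.
rng = np.random.default_rng(0)
t0 = rng.normal(20, 2, (16, 16, 16)); t0[4:8, 4:8, 4:8] = 45.0
t1 = rng.normal(20, 2, (16, 16, 16)); t1[5:9, 5:9, 5:9] = 45.0
print(track(segment(t0, 40.0), segment(t1, 40.0)))   # {1: 1}
```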