873 results for Data storage equipment
Abstract:
Graduate Program in Electrical Engineering - FEIS
Abstract:
Cassava leaves have been widely used as a protein source for ruminants in the tropics. However, these leaves contain high levels of hydrocyanic acid (HCN) and condensed tannins (CT). There is evidence that haymaking can eliminate more than 90% of the HCN and that long-term storage can reduce CT levels. A completely randomized design with four replicates was conducted to determine the effect of different storage times (0-control, 60, 90 and 120 days) on the chemical composition, in vitro rumen fermentation kinetics, digestibility and energy value of cassava leaf hay. Treatments were compared using the GLM procedure (SAS 9.1, SAS Institute, Inc., Cary, NC). Crude protein (CP) and ether extract (EE) of the cassava hay were not affected (P > 0.05) by storage time (17.7% and 3.0%, respectively). Neutral detergent fiber, acid detergent fiber, total carbohydrates and non-fiber carbohydrates were not affected either (P > 0.05) by storage time (47.5, 32.6, 72.3 and 25.8%, respectively). Other parameters, however, were influenced. CT was lower (P < 0.05) in hay after 120 days of storage than in the control (1.75% versus 3.75%). Lignin and neutral detergent insoluble nitrogen, analyzed without sodium sulfite, were higher (P < 0.01) after 120 days of storage than in the control (11.22 versus 13.57% and 1.65 versus 3.81%, respectively). This suggests that the CT bound to the fiber or CP and became inactive. Consequently, the in vitro digestibility of organic matter (50.36% in the control), total digestible nutrients (44.79%) and energy (1.61 Mcal/kg DM), obtained from gas production data at 72 h of incubation, increased (P < 0.05) with storage time (to 56.83%, 51.53% and 1.86 Mcal/kg DM, respectively). The chemical composition and fermentative characteristics of cassava hay varied during the storage period, with the best values obtained after 90 days of storage, probably due to the reduction in condensed tannins.
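The storage-time comparison above is a one-way analysis of variance (the abstract uses SAS PROC GLM). A minimal sketch of the same F-test computation in Python, using made-up illustrative values rather than the study's data:

```python
# One-way ANOVA F statistic, as used to compare storage times (0, 60, 90, 120 days).
# The group values below are illustrative placeholders, not the study's data.
from statistics import mean

def one_way_anova_f(groups):
    """Return the F statistic for a list of replicated treatment groups."""
    k = len(groups)                      # number of treatments
    n = sum(len(g) for g in groups)      # total observations
    grand = mean(v for g in groups for v in g)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((v - mean(g)) ** 2 for g in groups for v in g)
    ms_between = ss_between / (k - 1)    # treatment mean square
    ms_within = ss_within / (n - k)      # error mean square
    return ms_between / ms_within

# Hypothetical condensed-tannin values (%) for four storage times, four replicates each.
ct = [[3.7, 3.8, 3.6, 3.9],   # 0 days (control)
      [3.0, 2.9, 3.1, 3.2],   # 60 days
      [2.2, 2.4, 2.1, 2.3],   # 90 days
      [1.7, 1.8, 1.6, 1.9]]   # 120 days
f = one_way_anova_f(ct)
```

A large F relative to the F distribution's critical value at the chosen significance level corresponds to the abstract's P < 0.05 results.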
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
The search for more reliable security and information-management systems is driving growing investment in new technologies that allow equipment to be implemented with a high level of reliability while remaining agile and practical to operate. This has led homes, enterprises and industry to turn increasingly to automation systems and to the integration of their existing systems. Radio-frequency identification (RFID) is widespread today because it ensures both agility in handling record data and the reliability of identification systems, which are increasingly advanced and less susceptible to fraud. Alongside this technology, a database is essential for storing the collected information, an area in which the MySQL platform is widely used. By using the open-source Arduino platform to program and operate the RFID module, and LabVIEW software to integrate all of these technologies and to develop a user-friendly interface, it is possible to create a highly reliable and agile access-control system for places with a high turnover of people. This project aims to demonstrate the advantages of using all of these technologies together, thereby making a flawed system effectively safer, cheaper and quicker.
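The core of the access-control flow described above is a lookup of a scanned tag against a badge database. A minimal sketch, with an in-memory SQLite table standing in for the project's MySQL database so the example is self-contained; the tag IDs, table name and `check_access` helper are all hypothetical:

```python
# Minimal access-control sketch for the RFID workflow described above.
# In the project the tag ID arrives over serial from the Arduino and the
# lookup hits MySQL via LabVIEW; here an in-memory SQLite table stands in
# for MySQL. Tag IDs and the schema are illustrative assumptions.
import sqlite3

def build_badge_db():
    """Create a registry of authorized RFID tags."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE badges (tag_id TEXT PRIMARY KEY, owner TEXT)")
    conn.executemany("INSERT INTO badges VALUES (?, ?)",
                     [("04A1B2C3", "alice"), ("09F8E7D6", "bob")])
    return conn

def check_access(conn, tag_id):
    """Return the badge owner if the scanned tag is registered, else None."""
    row = conn.execute("SELECT owner FROM badges WHERE tag_id = ?",
                       (tag_id,)).fetchone()
    return row[0] if row else None

conn = build_badge_db()
```

In the real system the `tag_id` argument would come from the Arduino's RFID reader rather than a literal string.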
Abstract:
In laboratory tests, merely following the manufacturer's instructions for materials or equipment may not be enough to obtain reliable data, so intra- and inter-examiner studies may be required. The aim of this study was to illustrate the application of the intraclass correlation coefficient (r) in a study of the color change and hardness of composite resin conducted by two examiners. The color and hardness of 10 specimens of composite resin were evaluated with a colorimetric spectrophotometer and a hardness tester, respectively, at two different times. Two trained examiners performed two measurements with an interval of one week. Specimens were stored in artificial saliva at 37±1ºC for 7 days. The intraclass correlation was used to analyze intra- and inter-examiner reproducibility. Intra-examiner reproducibility was 0.79, 0.44 and 0.76 for the L, a and b color coordinates for examiner 1, and 0.84, 0.23 and 0.21 for examiner 2. For hardness, r-values were 0.00 and 0.23 for examiners 1 and 2, respectively, showing unsatisfactory agreement in both color and hardness evaluation for both examiners. Merely observing the equipment-use protocol and training the examiners was therefore not sufficient to collect reliable information. Thus, adjustments were made to the experimental protocol, devices were produced to standardize the measurements, and the reproducibility study was performed again. For examiner 1, values of 0.90, 0.59 and 0.79 were verified for the L, a and b coordinates, and for examiner 2, values of 0.90, 0.75 and 0.95 were obtained. For hardness, r-values of 0.75 and 0.71 were obtained for examiners 1 and 2, respectively. Inter-examiner reproducibility was 0.86, 0.87 and 0.91 for the L, a and b coordinates and 0.79 for hardness. It was concluded that intra- and inter-examiner calibration is essential for obtaining quality measurements in laboratory tests.
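The reproducibility statistic above can be computed from repeated measurements per specimen. A minimal sketch of the one-way intraclass correlation coefficient, ICC(1,1); the hardness-style readings below are illustrative, not the study's data:

```python
# One-way intraclass correlation coefficient ICC(1,1):
#   (MSB - MSW) / (MSB + (k - 1) * MSW)
# where MSB and MSW are the between- and within-subject mean squares and
# k is the number of repeated measurements per subject.
from statistics import mean

def icc_1_1(ratings):
    """ratings: list of per-subject lists, each with k repeated measurements."""
    n = len(ratings)
    k = len(ratings[0])
    grand = mean(v for subj in ratings for v in subj)
    msb = k * sum((mean(s) - grand) ** 2 for s in ratings) / (n - 1)
    msw = sum((v - mean(s)) ** 2 for s in ratings for v in s) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Two repeated hardness-style readings per specimen (hypothetical values).
readings = [[52.1, 52.4], [48.9, 49.2], [55.3, 55.0], [50.7, 50.9]]
r = icc_1_1(readings)
```

With identical repeated readings the within-subject mean square is zero and the coefficient is exactly 1; noisy repeats pull it toward 0, which is how the unsatisfactory values of 0.00 and 0.23 above arise.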
Abstract:
This study evaluated the efficiency of a toothbrush holder in preventing contamination of toothbrushes used by preschool children. The sample was composed of 6-year-old children enrolled in an educational and recreational center in Araraquara/SP, divided into 3 groups: G1, children who kept their usual toothbrush storage routine; G2, children who received only a new toothbrush holder for storage; and G3, children who received a new toothbrush holder for storage and a 0.12% chlorhexidine digluconate solution to apply to the toothbrush after use. After toothbrushing, the toothbrushes were collected for microbiological analysis. The data were analyzed using frequency distributions. In general, the most prevalent microorganism on the toothbrushes was Streptococcus viridans (58.97%), followed by Staphylococcus (35.90%), airborne bacilli (28.21%) and Neisseria mucosa (5.13%). Evaluating the frequencies, contamination by Streptococcus was higher in G1 than in G2 and G3, whereas the presence of Staphylococcus was more significant in G3. It was concluded that a new toothbrush holder that avoids direct contact between brushes and allows them to dry without being smothered could be an excellent alternative for educational institutions that require group storage.
Abstract:
This study aimed to assess measurements of temperature and relative humidity obtained with a HOBO data logger under various conditions of exposure to solar radiation, comparing them with those obtained with a temperature/relative humidity probe and a copper-constantan thermocouple psychrometer, which are considered the standards for such measurements. Data were collected over a 6-day period (25 March to 1 April 2010), during which the equipment was monitored continuously and simultaneously. We employed the following combinations of equipment and conditions: a HOBO data logger in full sunlight; a HOBO data logger shielded within a white plastic cup with windows for air circulation; a HOBO data logger shielded within a prototype gill-type (multi-plate plastic) shelter; a copper-constantan thermocouple psychrometer exposed to natural ventilation and protected from sunlight; and a temperature/relative humidity probe under a commercial multi-plate radiation shield. Comparisons between the measurements obtained with the various devices were made on the basis of statistical indicators: linear regression with its coefficient of determination, index of agreement, maximum absolute error and mean absolute error. The prototype multi-plate (gill-type) shelter used to protect the HOBO data logger was found to provide the best protection against the effects of solar radiation on measurements of temperature and relative humidity. The precision and accuracy of a device that measures temperature and relative humidity depend on an efficient shelter that minimizes the interference caused by solar radiation, thereby avoiding erroneous analysis of the data obtained.
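The comparison metrics named above are straightforward to compute from paired series. A minimal sketch of Willmott's index of agreement, mean absolute error and maximum absolute error; the temperature values are illustrative, not the study's readings:

```python
# Statistical indicators used to compare a device under test against the
# reference sensor. The series below are hypothetical temperatures in °C.
def agreement_index(obs, pred):
    """Willmott's index of agreement d: 1 means perfect agreement."""
    o_mean = sum(obs) / len(obs)
    num = sum((p - o) ** 2 for o, p in zip(obs, pred))
    den = sum((abs(p - o_mean) + abs(o - o_mean)) ** 2
              for o, p in zip(obs, pred))
    return 1 - num / den

def mean_abs_error(obs, pred):
    return sum(abs(p - o) for o, p in zip(obs, pred)) / len(obs)

def max_abs_error(obs, pred):
    return max(abs(p - o) for o, p in zip(obs, pred))

# Reference psychrometer vs. a sheltered HOBO logger (hypothetical values).
ref  = [21.0, 23.5, 26.2, 28.0, 25.1, 22.4]
hobo = [21.3, 23.9, 26.0, 28.6, 25.4, 22.1]
d = agreement_index(ref, hobo)
```

A well-sheltered logger tracks the reference closely, giving d near 1 and small absolute errors, which is the pattern the gill-type prototype showed.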
Abstract:
Hundreds of terabytes of CMS (Compact Muon Solenoid) data are accumulated for storage every day at the University of Nebraska-Lincoln, one of the eight US CMS Tier-2 sites. Managing this data includes retaining useful CMS data sets and clearing storage space for newly arriving data by deleting less useful data sets. This important task is currently done manually and requires a large amount of time. The overall objective of this study was to develop a methodology to help identify the data sets to be deleted when storage space is required. CMS data is stored using HDFS (Hadoop Distributed File System), whose logs record file access operations. Hadoop MapReduce was used to feed the information in these logs to Support Vector Machines (SVMs), a machine learning algorithm applicable to classification and regression, which is used in this thesis to develop a classifier. The time taken to classify data sets with this method depends on the size of the input HDFS log file, since the MapReduce algorithms used here have O(n) complexity. The SVM methodology produces a list of data sets for deletion along with their respective sizes. It was also compared with a heuristic called Retention Cost, calculated from the size of a data set and the time since its last access, to help decide how useful a data set is. The accuracies of both were compared by calculating the percentage of data sets predicted for deletion that were accessed at a later time. Our SVM methodology proved to be more accurate than using the Retention Cost heuristic. This methodology could be used to solve similar problems involving other large data sets.
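The Retention Cost baseline above combines a data set's size with the time since its last access; the abstract does not give the exact formula, so the product used in this sketch is an assumed form. The catalog entries are hypothetical:

```python
# Sketch of the Retention Cost heuristic used as a baseline above.
# Assumption: cost = size * staleness (the abstract only states that size
# and time since last access are combined). Data below are made up.
import time

def retention_cost(size_bytes, last_access_epoch, now=None):
    """Larger and longer-untouched data sets get a higher cost (delete first)."""
    now = time.time() if now is None else now
    return size_bytes * (now - last_access_epoch)

def deletion_candidates(datasets, bytes_needed, now=None):
    """Pick highest-cost data sets until the requested space is freed.
    datasets: list of (name, size_bytes, last_access_epoch) tuples."""
    ranked = sorted(datasets,
                    key=lambda d: retention_cost(d[1], d[2], now),
                    reverse=True)
    chosen, freed = [], 0
    for name, size, _ in ranked:
        if freed >= bytes_needed:
            break
        chosen.append(name)
        freed += size
    return chosen

# Hypothetical catalog: (name, size in bytes, last access as epoch seconds).
now = 1_700_000_000
catalog = [("RunA", 5_000, now - 100),       # small, recently used
           ("RunB", 50_000, now - 900_000),  # large, stale
           ("RunC", 20_000, now - 400_000)]  # medium, stale
```

The thesis's SVM classifier replaces this ranking with a model trained on access features extracted from the HDFS logs via MapReduce.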
Abstract:
Objectives: To determine the microhardness profile of two dual-cure resin cements (RelyX U100 (R), 3M-ESPE, and Panavia F 2.0 (R), Kuraray) used for cementing fiber-reinforced resin posts (Fibrekor (R), Jeneric Pentron) under three different curing protocols and two water storage times. Material and methods: Sixty 16-mm-long bovine incisor roots were endodontically treated and prepared for cementation of the Fibrekor posts. The cements were mixed as instructed and dispensed in the canal, the posts were seated, and curing was performed as follows: a) no light activation; b) light activation immediately after seating the post; and c) light activation delayed 5 minutes after seating the post. The teeth were stored in water and retrieved for analysis after 7 days and 3 months. The roots were sectioned longitudinally and the microhardness was determined at the cervical, middle and apical regions along the cement line. The data were analyzed by three-way ANOVA (curing mode, storage time and root third) for each cement, with the Tukey test used for post-hoc analysis. Results: Light activation resulted in a significant increase in microhardness, which was more evident in the cervical region and for the Panavia cement. Storage in water for 3 months reduced the microhardness of both cements. The U100 cement showed less variation in microhardness regardless of the curing protocol and storage time. Conclusions: The microhardness of the cements was affected by the curing and storage variables, and the effects were material-dependent.
Abstract:
Spatial data warehouses (SDWs) allow spatial analysis together with analytical multidimensional queries over huge volumes of data. The challenge is to retrieve data related to ad hoc spatial query windows according to spatial predicates while avoiding the high cost of joining large tables; mechanisms for efficient query processing over SDWs are therefore essential. In this paper, we propose two efficient indices for SDWs: the SB-index and the HSB-index. The proposed indices share the following characteristics: they enable multidimensional queries with spatial predicates over SDWs, they support predefined spatial hierarchies, and they compute the spatial predicate and transform it into a conventional one, which can be evaluated together with other conventional predicates by accessing a star-join Bitmap index. While the SB-index has a sequential data structure, the HSB-index uses a hierarchical data structure to enable spatial object clustering and a specialized buffer-pool to decrease the number of disk accesses. The advantages of the SB-index and the HSB-index over the DBMS resources for SDW indexing (i.e. star-join computation and materialized views) were investigated through performance tests, which issued roll-up operations extended with containment and intersection range queries. The results showed improvements ranging from 68% up to 99% over both the star-join computation and the materialized view. Furthermore, the proposed indices proved to be very compact, adding less than 1% to the storage requirements. Both the SB-index and the HSB-index are therefore excellent choices for SDW indexing. Choosing between them depends mainly on the query selectivity of the spatial predicates: low query selectivity benefits the HSB-index, while the SB-index provides better performance for higher query selectivity.
Abstract:
Current scientific applications produce large amounts of data, and processing, handling and analyzing such data require large-scale computing infrastructures such as clusters and grids. Studies in this area aim to improve the performance of data-intensive applications by optimizing data accesses, and to achieve this goal distributed storage systems have considered techniques of data replication, migration, distribution and access parallelism. However, the main drawback of those studies is that they do not take application behavior into account when optimizing data access. This limitation motivated this paper, which applies strategies to support the online prediction of application behavior in order to optimize data access operations on distributed systems, without requiring any information on past executions. To accomplish this, the approach organizes application behaviors as time series and then analyzes and classifies those series according to their properties. Knowing these properties, the approach selects modeling techniques to represent the series and perform predictions, which are later used to optimize data access operations. This new approach was implemented and evaluated using the OptorSim simulator, sponsored by the LHC-CERN project and widely employed by the scientific community. Experiments confirm that the new approach reduces application execution time by about 50 percent, especially when handling large amounts of data.
Abstract:
The aims of this study were to test (i) the effect of storing tissue and RNA extracts on ice over time and (ii) the effect of repeated freeze–thaw cycles on RNA integrity and gene expression in bovine reproductive tissues. Fragments of endometrium (ENDO), corpus luteum (CL) and ampulla (AMP) were subdivided and incubated on ice for 0, 1, 3, 6, 12 or 24 h. RNA extracts were incubated on ice for 0, 3, 12 or 24 h, or exposed to 1, 2, 4 or 6 freeze–thaw cycles. The RNA integrity number (RIN) was estimated. Expression of the progesterone receptor (PGR) and cyclophilin genes from RNA extracts stored on ice for 0 or 24 h, or subjected to 1 or 6 freeze–thaw cycles, was measured by qPCR. Incubation of tissue and RNA extracts on ice and repeated freeze–thaw cycles did not affect RIN values of RNA from ENDO, CL or AMP. Storage on ice and exposure to freeze–thaw cycles also did not affect Cq values for the PGR or cyclophilin genes. In conclusion, neither generalized nor gene-specific RNA degradation resulted from storage of tissue or RNA extracts on ice for up to 24 h, or from up to 6 freeze–thaw cycles of RNA extracts obtained from bovine ENDO, CL and AMP.
Abstract:
Society's increasing aversion to technological risks requires the development of inherently safer and environmentally friendlier processes, while also assuring the economic competitiveness of industrial activities. The different forms of impact (e.g. environmental, economic and societal) are frequently characterized by conflicting reduction strategies and must be taken into account holistically in order to identify the optimal solutions in process design. Though the literature reports an extensive discussion of strategies and specific principles, quantitative assessment tools are required to identify the marginal improvements of alternative design options, to allow trade-offs among contradictory aspects and to prevent "risk shift". In the present work, a set of integrated quantitative tools for design assessment (i.e. a design support system) was developed. The tools are specifically dedicated to the implementation of sustainability and inherent safety in process and plant design activities for chemical and industrial processes in which substances dangerous to humans and the environment are used or stored. They are mainly intended for application in the "conceptual" and "basic design" stages, when the project is still open to changes (owing to the large number of degrees of freedom) that may include strategies to improve sustainability and inherent safety. The set of developed tools covers different phases of design activity throughout the lifecycle of a project (inventories, process flow diagrams, preliminary plant layout plans). The development of these tools makes a substantial contribution to filling the present gap in sound supports for implementing safety and sustainability in the early phases of process design.
The proposed decision support system is based on a set of leading key performance indicators (KPIs) that assess the economic, societal and environmental impacts of a process (i.e. its sustainability profile). The KPIs are based on impact models (including complex ones) but are easy and swift to apply in practice, and they can be fully evaluated even from the limited data available during early process design. Innovative reference criteria were developed to compare and aggregate the KPIs on the basis of the actual site-specific impact burden and the sustainability policy. Particular attention was devoted to the development of reliable criteria and tools for the assessment of inherent safety at different stages of the project lifecycle. The assessment follows an innovative approach to the analysis of inherent safety, based on both the calculation of the expected consequences of potential accidents and the evaluation of the hazards related to equipment. The methodology overcomes several problems present in previously proposed methods for quantitative inherent safety assessment (use of arbitrary indexes, subjective judgement, built-in assumptions, etc.). A specific procedure was defined for assessing the hazards related to the formation of undesired substances in chemical systems undergoing "out of control" conditions. In the assessment of layout plans, "ad hoc" tools were developed to account for the hazard of domino escalation and for safety economics. The effectiveness and value of the tools were demonstrated by application to a large number of case studies concerning different kinds of design activities (choice of materials; design of the process, the plant and the layout) and different types of processes/plants (chemical industry, storage facilities, waste disposal).
An experimental survey (analysis of the thermal stability of isomers of nitrobenzaldehyde) provided the input data necessary to demonstrate the method for inherent safety assessment of materials.
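The KPI comparison and aggregation step described above can be sketched as a normalize-then-weight computation. The indicator names, reference burdens and policy weights below are hypothetical placeholders, not the thesis's actual KPI set or aggregation criteria:

```python
# Sketch of KPI aggregation into a sustainability profile: normalize each
# raw impact value against a site-specific reference burden (lower is
# better), then combine the normalized scores with policy weights.
# All names and numbers are illustrative assumptions.
def sustainability_profile(kpis, references, weights):
    """kpis/references: raw and reference impact values per indicator;
    weights: policy weights summing to 1.
    Returns (per-indicator normalized scores, weighted aggregate)."""
    scores = {k: kpis[k] / references[k] for k in kpis}
    aggregate = sum(weights[k] * scores[k] for k in kpis)
    return scores, aggregate

kpis = {"economic": 8.0, "societal": 2.0, "environmental": 12.0}
refs = {"economic": 10.0, "societal": 4.0, "environmental": 10.0}
wts  = {"economic": 0.3, "societal": 0.3, "environmental": 0.4}
scores, total = sustainability_profile(kpis, refs, wts)
```

A normalized score above 1 flags an indicator that exceeds its reference burden (here the environmental one), which is the kind of signal that supports trade-off decisions between alternative design options.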