117 results for Microcomputer
Abstract:
This work presents a no-break (UPS) system for microcomputers that uses ultracapacitors as a replacement for conventional chemical batteries. We analyzed the most relevant data on the average power consumption of microcomputers, the electrical and mechanical characteristics of ultracapacitors, and the operation of no-break power circuits, in order to propose a configuration capable of working properly with a microcomputer switched-mode power supply. Our solution is a sixteen-component ultracapacitor bank, with a total capacitance of 350 F and a voltage of 10.8 V, adequate for a low-capacity no-break system capable of feeding a 180 W load for 75 s. The proposed no-break increases the reliability of microcomputers by reducing the probability of user data loss in case of a power grid failure, thus offering a high benefit-cost ratio. Replacing the battery with ultracapacitors allows a quick no-break recharge and low maintenance costs, since these modern components have a longer lifetime than batteries. Moreover, this solution reduces environmental impact and eliminates the constant recharging of the energy storage device.
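The quoted sizing can be sanity-checked with the standard capacitor energy relation E = CV²/2. The sketch below uses only the figures given in the abstract; the comparison with the load energy is our check, not the authors' calculation.

```python
# Sanity check of the ultracapacitor bank sizing quoted in the abstract.
C = 350.0   # total bank capacitance, F
V = 10.8    # initial bank voltage, V
P = 180.0   # load power, W
t = 75.0    # required hold-up time, s

stored = 0.5 * C * V**2   # energy stored in a capacitor: E = C*V^2 / 2
required = P * t          # energy drawn by the load over the hold-up time
print(f"stored {stored/1e3:.1f} kJ vs required {required/1e3:.1f} kJ")
# -> stored 20.4 kJ vs required 13.5 kJ; the surplus is plausible, since a
#    converter cannot extract energy all the way down to zero volts.
```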
Abstract:
Graduate Program in Restorative Dentistry - ICT
Abstract:
The objective of this work is to analyze the feasibility of incorporating into a microcomputer case a no-break (UPS) that uses an ultracapacitor as its energy storage device, substituting the conventional chemical battery. One advantage of this integration is cost reduction, since a dedicated metallic or plastic frame is not needed to protect the no-break components; the microcomputer's metallic frame offers the necessary protection for both pieces of equipment. Moreover, a large amount of the internal space of a microcomputer case is unused and can accommodate the no-break. This work uses data on the average power consumption of microcomputers, the operation of switched-mode power supplies for microcomputers, the electrical and mechanical characteristics of ultracapacitors, and the operation of no-break power circuits, in order to determine the energy storage capacity an ultracapacitor bank must have to allow a safe shutdown of a microcomputer in case of a power grid failure. The use of ultracapacitors proved feasible for feeding a 180 W load for 75 s, using a capacitive bank of sixteen ultracapacitors with a total capacitance of 350 F and a voltage of 10.8 V. The proposed no-break increases the reliability of the microcomputer by reducing the probability of user data loss in case of a power grid failure, offering a high benefit-cost ratio. Substituting the battery with an ultracapacitor allows a quick no-break recharge and low maintenance costs, since ultracapacitors have a longer lifetime than batteries; it also reduces environmental impact, because they do not use potentially toxic chemical compounds.
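A companion check accounts for the fact that a converter has a minimum input voltage. The abstracts do not state one, so the 5.4 V cutoff below (half the initial voltage) is a hypothetical figure for illustration; the extractable energy above a cutoff is E = C(V₀² − V_min²)/2.

```python
# Usable energy above an assumed converter cutoff voltage (5.4 V is
# hypothetical; the abstract does not state the converter's minimum input).
C, V0, Vmin, P = 350.0, 10.8, 5.4, 180.0

usable = 0.5 * C * (V0**2 - Vmin**2)   # energy extractable above the cutoff
runtime = usable / P                   # seconds at constant load power
print(f"usable {usable/1e3:.1f} kJ -> {runtime:.0f} s at {P:.0f} W")
# -> usable 15.3 kJ -> 85 s, consistent with the 75 s figure quoted above.
```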
Abstract:
Food restriction reduces body weight, influences bone mass, and is correlated with bone mineral density (BMD). Mechanisms have been proposed for the loss of BMD after body weight reduction, including reduced energy intake. Growing 8-week-old male Wistar rats were randomly divided into a Control group and a calorie restriction with 30% sucrose (CRS) group. The CRS animals were subjected to intermittent food restriction for 8 weeks and had free access to tap water and 30% sucrose in distilled water. The rats were euthanized at the end of week 8 and blood was collected from the abdominal aorta; the femurs were cleaned of adherent soft tissue, scanned using dual-energy X-ray absorptiometry, their structural and material properties determined by three-point bending tests in the mid-diaphyseal region, their bone surface tested with a microhardness tester, and their microstructure assessed by microcomputed tomography. Body weight decreased significantly in CRS animals relative to Control animals, and there was a clear preference for the high-sucrose beverage in CRS animals. No difference was observed in the biochemical, densitometric and biomechanical analyses. Micro-CT results showed a significant difference only in the connectivity of the trabecular bone. It has been suggested that rats submitted to food restriction consumed sugar not because of its inherent palatability, but in order to alter their macronutrient balance, meeting their energy demands through the high-sucrose solution.
Abstract:
Tridacnid clams are conspicuous inhabitants of Indo-Pacific coral reefs and are traded and cultivated for the aquarium and food industries. In the present study, daily growth rates of larvae of the giant clam Tridacna crocea were determined in the laboratory during the first week of life. Adults were induced to spawn via intra-gonadal serotonin injection through the byssal orifice. After spawning, oocytes were collected, fertilized, and kept in 3 L glass beakers and raceways treated with antibiotics to avoid culture contamination. Larvae were fed twice with the microalga Isochrysis galbana, and zooxanthellae were also offered twice during the veliger stage (days 4 and 6). Larval length was measured using a digitizing tablet coupled to a microcomputer. Larval mortality was exponential during the first 48 hours of life, declining significantly afterwards. The mean growth rate was 11.3 μm day⁻¹, increasing to 18.0 μm day⁻¹ after the addition of symbionts. Survival increased to ca. 75% after the addition of zooxanthellae. The results describe the growth curve for T. crocea larvae and suggest that the acquisition of symbionts may benefit larval growth and survival even before the larvae have attained metamorphosis.
Abstract:
Methodological approaches for collecting data on nonverbal behavior usually involve interpretive methods in which raters must identify a set of defined categories of behavior. However, present knowledge about the qualitative aspects of head movement behavior calls for recording detailed transcriptions of behavior; such records are a prerequisite for investigating the function and meaning of head movement patterns. A method for directly collecting data on head movement behavior is introduced. Using small ultrasonic transducers attached to various parts of an index person's body (head and shoulders), a microcomputer measures the transducer-to-receiver distances. Three-dimensional positions are calculated by triangulation. These data are used for further calculations of the angular orientation of the head and the direction, size, and speed of head movements (in rotational, lateral, and sagittal dimensions).
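Position from measured distances (strictly, trilateration) admits a simple linear least-squares solution. The sketch below is a generic version of that step, not the authors' implementation, and the receiver layout and target point are illustrative.

```python
import numpy as np

def trilaterate(receivers: np.ndarray, dists: np.ndarray) -> np.ndarray:
    """Solve for a 3-D position from >= 4 receiver distances.

    Subtracting the first sphere equation |x - r_i|^2 = d_i^2 from the
    others yields a linear system A x = b, solved here in a
    least-squares sense.
    """
    r0, d0 = receivers[0], dists[0]
    A = 2.0 * (receivers[1:] - r0)
    b = (d0**2 - dists[1:]**2
         + np.sum(receivers[1:]**2, axis=1) - np.sum(r0**2))
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

# Illustrative layout: four receivers at known positions, one transducer.
receivers = np.array([[0.0, 0, 0], [1.0, 0, 0], [0, 1.0, 0], [0, 0, 1.0]])
p_true = np.array([0.3, 0.2, 0.5])
dists = np.linalg.norm(receivers - p_true, axis=1)   # measured distances
print(trilaterate(receivers, dists))                 # ~ [0.3, 0.2, 0.5]
```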
Abstract:
The dataset provides detailed information on a study conducted in seven major towns of Lahore. Samples were taken from 472 tubewells and analyzed for major cations and anions using APHA 2012 techniques, as explained herein. In addition, E. coli counts were determined to check for microbial contamination. The data include results from PHREEQC modeling of As(III)/As(V) species and saturation indices, as well as hydrochemical water facies computed with Aquachem. The WHO (2011) and EPA standards included in Aquachem identified the parameters that were in violation. Bicarbonates dominated the groundwater types, with 50.21% of the samples exceeding the EPA maximum permissible limit of 250 mg/L in drinking water. Similarly, 30.51% of the samples had TDS values greater than 500 mg/L, while 85.38% of the samples exceeded the 10 µg/L threshold limit value for arsenic. Instances of high magnesium hazard (MH) values were also observed, which require constant assessment if the groundwater is used for irrigation: MH values above 50% are detrimental to crops and may reduce expected yields. The membrane filtration technique using m-Endo Agar indicated that 3.59% of the samples had TNC (too numerous to count) values for E. coli, while 5.06% showed values higher than the acceptable value of 0 CFU/100 mL in drinking water. Any trace of E. coli in a groundwater sample indicates recent fecal contamination, and such outcomes signify the presence of enteric pathogens; if the groundwater is not properly dosed with disinfectants, it may harm human health. It is concluded that more studies are needed and proper groundwater management implemented to safeguard the lives of communities that depend solely on groundwater in the city.
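The magnesium hazard mentioned above is conventionally computed as MH = Mg²⁺/(Ca²⁺ + Mg²⁺) × 100 with both ions in meq/L. The dataset description does not spell out the formula, so the sketch below follows this standard definition, with illustrative concentrations.

```python
# Standard magnesium-hazard (MH) index; the 50% suitability threshold is the
# one cited in the dataset description.
CA_EQ_WT, MG_EQ_WT = 20.04, 12.15  # mg per meq (Ca 40.08/2, Mg 24.31/2)

def magnesium_hazard(ca_mg_l: float, mg_mg_l: float) -> float:
    """MH = Mg / (Ca + Mg) * 100, with both ions converted to meq/L."""
    ca, mg = ca_mg_l / CA_EQ_WT, mg_mg_l / MG_EQ_WT
    return 100.0 * mg / (ca + mg)

mh = magnesium_hazard(ca_mg_l=60.0, mg_mg_l=35.0)  # illustrative values
print(f"MH = {mh:.1f}% -> {'unsuitable' if mh > 50 else 'suitable'} for irrigation")
```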
Abstract:
Microcomputer systems consist mainly of hardware and software. Over time the hardware degrades, deteriorates and occasionally breaks down; the software evolves, requires maintenance and updates, and occasionally fails, having to be repaired or reinstalled. At the hardware level, this work analyzes the main components that are common to most of these systems, both desktop and laptop, independently of the operating system, together with the main peripherals; it also reviews and recommends some of the tools needed to assemble, maintain and repair this equipment. The main internal hardware components are the motherboard, RAM, processor, hard drive, case, power supply and graphics card. The most notable peripherals are the monitor, keyboard, mouse, printer and scanner. A section is included detailing the different types of BIOS and their main configuration parameters. For all these components, internal as well as peripheral, the features they offer have been analyzed, along with the details that deserve special attention when choosing one over another. Where different technologies exist, they have been compared, highlighting the advantages and drawbacks of each so that the end user can decide which best fits his or her needs in terms of performance and cost; examples are inkjet versus laser printers, and mechanical hard drives versus solid-state drives (SSDs). All these components are related, interconnected and dependent on one another, so a chapter is devoted exclusively to studying how they are assembled, highlighting the most common mistakes and failures, and indicating a series of preventive maintenance tasks that can prolong the useful life of the equipment and avoid breakdowns caused by misuse. Maintenance can be classified as predictive, perfective, adaptive, preventive or corrective. The focus here is mainly on two types: the preventive maintenance described above and corrective maintenance, both software and hardware. Corrective maintenance is concerned with the analysis, localization, diagnosis and repair of hardware and software faults and breakdowns. The main faults that occur in each component are described, together with how they manifest themselves and what symptoms they present, so that specific tests can be run to diagnose and narrow down the fault. Where repair is possible, the instructions to follow are detailed; otherwise, replacement of the part or component is recommended. A section is devoted to virtualization, a growing technology that is very useful for software testing, reducing testing time and cost. Another interesting aspect of virtualization is its use to run several virtual servers on a single physical server, which represents significant savings in hardware and maintenance costs, such as electricity consumption.
At the software level, a detailed study is made of the main security problems and vulnerabilities to which a microcomputer system is exposed, enumerating and describing the behavior of the different types of malicious elements that can infect a computer, the precautions that should be taken to minimize the risks, and the utilities that can be run to prevent infection or to clean an infected machine. Maintenance and technical assistance, especially of the software kind, do not always require the on-site attention of a qualified technician, so a chapter is devoted to the remote assistance tools that can be used in this field. Some of the most popular tools on the market are described, along with how they work, their features and their requirements. In this way the user can be attended to quickly, minimizing response times and reducing costs.
Abstract:
In the last several years there has been an increase in the amount of qualitative research using in-depth interviews and comprehensive content analyses in sport psychology. However, no explicit method has been provided to deal with the large amount of unstructured data. This article provides common guidelines for organizing and interpreting unstructured data. Two main operations are suggested and discussed: first, coding meaningful text segments, or creating tags; and second, regrouping similar text segments, or creating categories. Furthermore, software programs for the microcomputer are presented as a way to facilitate the organization and interpretation of qualitative data.
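A minimal sketch of the two operations follows; the interview segments, tags and categories are illustrative, not taken from the article.

```python
from collections import defaultdict

# Step 1: code meaningful text segments by attaching tags.
tagged_segments = [
    ("I felt calm before the race", "pre-competition emotion"),
    ("My coach reassured me on the bus", "coach support"),
    ("My stomach was in knots at the start line", "pre-competition emotion"),
    ("Teammates kept my spirits up", "teammate support"),
]

# Step 2: regroup similar tags into higher-level categories.
category_of = {
    "pre-competition emotion": "emotional states",
    "coach support": "social resources",
    "teammate support": "social resources",
}

categories = defaultdict(list)
for text, tag in tagged_segments:
    categories[category_of[tag]].append(text)

for category, segments in categories.items():
    print(f"{category}: {len(segments)} segment(s)")
```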
Abstract:
Operators can become confused while diagnosing faults in a process plant while it is in operation, which may prevent remedial actions from being taken before hazardous consequences occur. The work in this thesis proposes a method to aid plant operators in systematically finding the causes of any fault in the process plant. A computer-aided fault diagnosis package has been developed for use on the widely available IBM PC compatible microcomputer. The program displays a coloured diagram of a fault tree on the VDU of the microcomputer, so that the operator can see the link between the fault and its causes. The consequences of the fault and its causes are also shown, to warn of what may happen if the fault is not remedied. The cause and effect data needed by the package are obtained from a hazard and operability (HAZOP) study on the process plant. The result of the HAZOP study is recorded as cause and symptom equations, which are translated into a data structure and stored in the computer as a file for the package to access. Probability values are assigned to the events that constitute the basic causes of any deviation, and from these the a priori probabilities of occurrence of the other events are evaluated. A top-down recursive algorithm, called TDRA, has been developed for evaluating the probability of every event in a fault tree. From the a priori probabilities, the conditional probabilities of the causes of the fault are then evaluated using Bayes' theorem. The posterior probability values can then be used by the operators to check the causes of the fault in an orderly manner. The package has been tested using the results of a HAZOP study on a pilot distillation plant, and the results show how easy it is to trace the chain of events that leads to the primary cause of a fault. The method could be applied in a real process environment.
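The details of TDRA are not reproduced in the abstract, but the general scheme it describes, top-down recursive evaluation of event probabilities followed by Bayes' theorem, can be sketched as follows; the tree, the probability values and the independence assumption are all illustrative.

```python
# Sketch of top-down recursive fault-tree evaluation plus a Bayesian ranking
# of basic causes (not TDRA itself).
from math import prod

def prob(node):
    """A priori probability of an event, evaluated top-down and recursively.

    A node is either a float (basic-event probability) or a
    ("AND"|"OR", [children]) gate; basic events are assumed independent.
    """
    if isinstance(node, float):
        return node
    gate, children = node
    ps = [prob(c) for c in children]
    return prod(ps) if gate == "AND" else 1.0 - prod(1.0 - p for p in ps)

# Top event "loss of reflux": direct cause OR (pump fails AND spare unavailable).
tree = ("OR", [0.01, ("AND", [0.05, 0.2])])
p_top = prob(tree)

# Bayes' theorem: P(cause | top) = P(top | cause) * P(cause) / P(top);
# here the direct cause guarantees the top event, so P(top | cause) = 1.
print(f"P(top) = {p_top:.4f}, P(direct cause | top) = {0.01 / p_top:.3f}")
```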
Abstract:
The objective of this study was to design, construct, commission and operate a laboratory-scale gasifier system that could be used to investigate the parameters that influence the gasification process. The gasifier is of the open-core variety and is fabricated from 7.5 cm bore quartz glass tubing. Gas cleaning is performed by a centrifugal contacting scrubber, with the product gas being flared. The system employs an on-line dedicated gas analysis system that monitors the levels of H2, CO, CO2 and CH4 in the product gas. The gas composition data, as well as the gas flowrate, temperatures throughout the system and pressure data, are recorded using a BBC microcomputer based data-logging system. Ten runs have been performed using the system, of which six were predominantly commissioning runs; the main emphasis in these runs was placed on gas clean-up, product gas cleaning and reactor bed temperature measurement. The reaction was observed to occur in a narrow band, about 3 to 5 particle diameters thick. Initially the fuel was pyrolysed, with the volatiles produced being combusted to provide the energy that drives the process; the char product was then gasified by reaction with the pyrolysis gases. Normally the gasifier is operated with the reaction zone supported on a bed of char, although it has been operated for short periods without one. At steady state the depth of char remains constant, but by adjusting the air inlet rate it has been shown that the depth of char can be increased or decreased, and that increasing the depth of the char bed effects some improvement in the product gas quality.
Abstract:
This thesis describes the design and implementation of a new dynamic simulator called DASP, a computer program package written in standard Fortran 77 for the dynamic analysis and simulation of chemical plants. Its main uses include investigating a plant's response to disturbances, determining the optimal ranges and sensitivities of controller settings, and simulating the startup and shutdown of chemical plants. The design and structure of the program, and a number of features incorporated into it, combine to make DASP an effective tool for dynamic simulation. It is an equation-oriented dynamic simulator, but the model equations describing the user's problem are generated from an in-built model equation library. A combination of the structuring of the model subroutines, the concept of a unit module, and the use of the connection matrix of the problem given by the user has been exploited to achieve this objective. The Executive program has a structure similar to that of a CSSL-type simulator. DASP solves a system of differential equations coupled to nonlinear algebraic equations using an advanced mixed equation solver, and the strategy used in formulating the model equations makes it possible to obtain the steady-state solution of the problem using the same model equations. DASP can handle state and time events in an efficient way, including modification of the flowsheet. It is highly portable, as demonstrated by running it on a number of computers with only trivial modifications, and it runs on a microcomputer with 640 kbytes of memory. It is a semi-interactive program: the bulk of the input data is given in pre-prepared data files, with communication with the user via an interactive terminal. Using the features built into the package, the user can view or modify the values of any input data, variables and parameters in the model, and modify the structure of the flowsheet of the problem during a simulation session. The program has been demonstrated and verified using a number of example problems.
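The key idea, one set of model equations serving both the dynamic and the steady-state solution, can be sketched with a toy two-equation system; this is a generic equation-oriented formulation, not DASP's actual model library or solver.

```python
# One residual function serves both the dynamic (backward-Euler) solution
# and the steady-state solution. The model itself is a toy.
import numpy as np
from scipy.optimize import fsolve

def residuals(x, x_prev, dt):
    """Residuals of a differential equation coupled to an algebraic one:
       dx1/dt = -2*x1 + x2 + 1   (holdup with a feed term)
       0      = x2 - 0.5*x1      (algebraic constraint)"""
    x1, x2 = x
    r1 = (x1 - x_prev[0]) / dt - (-2.0 * x1 + x2 + 1.0)
    r2 = x2 - 0.5 * x1
    return [r1, r2]

# Dynamic simulation: implicit time-marching of the coupled system.
x, dt = np.array([0.0, 0.0]), 0.05
for _ in range(400):
    x = fsolve(residuals, x, args=(x, dt))

# Steady state from the SAME equations: a huge dt makes the accumulation
# term vanish, leaving the algebraic steady-state system.
x_ss = fsolve(residuals, [0.0, 0.0], args=(np.zeros(2), 1e12))
print(x, x_ss)   # both approach [2/3, 1/3]
```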
Abstract:
An investigation was carried out into the different approaches used by Expert Systems researchers to solve problems in the domain of Mechanical Design. The techniques used for conventional formal logic programming were compared with those used when applying Expert Systems concepts. A literature survey of design processes was also conducted with a view to adopting a suitable model of the design process; a model comprising a variation on two established ones was developed and applied to a problem within what are described as class 3 design tasks. The research explored the application of these concepts to Mechanical Engineering Design problems and their implementation on a microcomputer using an Expert System building tool. It was necessary to explore the use of Expert Systems in this manner so as to bridge the gap between their use as a control structure and their use for detailed analytical design; the former application is well researched, and this thesis addresses the latter. Some Expert System building tools available to the author at the beginning of his work were evaluated specifically for their suitability for Mechanical Engineering design problems. Microsynics was found to be the most suitable, because of its simple but powerful semantic net knowledge representation structure and its ability to use other types of representation schemes. Two major implementations were carried out: the first a design program for a helical compression spring, and the second a gear-pair system design. Two concepts were proposed in the thesis for the modelling and implementation of design systems involving many equations; the method proposed enables equation manipulation and analysis using a combination of frames, semantic nets and production rules. The use of semantic nets for purposes other than psychology and natural language interpretation is quite new and represents one of the author's major contributions to knowledge. The development of a purpose-built shell program for this type of design problem was recommended as an extension of the research, with Microsynics as a possible platform for the development.
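The equation-handling concept can be sketched as forward chaining over a small network of design equations. The spring-rate formula k = G·d⁴/(8·D³·n) is the standard one for helical compression springs, but the representation below is a plain-Python illustration, not Microsynics syntax.

```python
# Forward chaining over design equations, in the spirit of the
# frames / semantic-net / production-rule combination described above.
equations = [
    # (output, inputs, rule): fire the rule once all inputs are known.
    ("k", ("G", "d", "D", "n"),
     lambda v: v["G"] * v["d"]**4 / (8 * v["D"]**3 * v["n"])),  # spring rate
    ("C", ("D", "d"), lambda v: v["D"] / v["d"]),               # spring index
]

def propagate(known: dict) -> dict:
    """Repeatedly fire any equation whose inputs are all known."""
    known = dict(known)
    changed = True
    while changed:
        changed = False
        for out, inputs, rule in equations:
            if out not in known and all(i in known for i in inputs):
                known[out] = rule(known)
                changed = True
    return known

# Music-wire spring, SI units: shear modulus G in Pa, diameters in m.
facts = {"G": 79.3e9, "d": 2e-3, "D": 20e-3, "n": 8}
print(propagate(facts))  # adds spring rate k [N/m] and spring index C
```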
Abstract:
A mathematical model has been developed for predicting the spectral distribution of solar radiation incident on a horizontal surface. The solar spectrum in the wavelength range 0.29 to 4.0 micrometres has been divided into 144 intervals, and two variables in the model are the atmospheric water vapour content and the atmospheric turbidity. After allowing for absorption and scattering in the atmosphere, the spectral intensities of the direct and diffuse components of radiation are computed. When the predicted radiation levels are compared with measured values for total radiation, and with values obtained using glass filters RG715, RG630 and OG530, close agreement (±5%) is achieved under clear sky conditions. A solar radiation measuring facility close to the centre of Birmingham has been set up, utilising a microcomputer-based data-logging system, and a suite of computer programs in the BASIC programming language has been developed and extensively tested for solar radiation data logging, analysis and plotting. Two commonly used instruments, the Eppley PSP pyranometer and the Kipp and Zonen CM5 pyranometer, have been compared under different experimental conditions. Three models for computing the inclined-plane irradiation from total and diffuse radiation on a horizontal surface have been tested for Birmingham; the anisotropic all-sky model proposed by Klucher provides good agreement between the measured and the predicted radiation levels. Measurements of solar spectral distribution using glass filters are also reported for a number of south-facing inclines.
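The band-wise structure of such a model can be sketched with Beer-Lambert extinction over the 144 intervals; the extraterrestrial spectrum and optical depths below are toy values, standing in for the model's actual water vapour absorption and turbidity scattering treatment.

```python
# Band-wise direct-beam attenuation over 144 spectral intervals,
# I(band) = I0(band) * exp(-tau(band) * m). Toy inputs only.
import numpy as np

wavelengths = np.linspace(0.29, 4.0, 145)            # 144 interval edges, um
band_centres = 0.5 * (wavelengths[:-1] + wavelengths[1:])
I0 = 1900.0 * np.exp(-((band_centres - 0.5) / 0.9) ** 2)  # toy spectrum, W/m2/um
tau = 0.3 / band_centres                             # toy spectral optical depth

def direct_spectrum(zenith_deg: float) -> np.ndarray:
    """Direct-beam spectral irradiance after atmospheric extinction."""
    m = 1.0 / np.cos(np.radians(zenith_deg))         # plane-parallel air mass
    return I0 * np.exp(-tau * m)

widths = np.diff(wavelengths)
total = np.sum(direct_spectrum(30.0) * widths)       # integrate over the bands
print(f"broadband direct irradiance ~ {total:.0f} W/m^2")
```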