965 results for parallel optical data storage
Abstract:
For research purposes, a large quantity of anti-measles IgG working reference serum was needed. A pool of sera from five teenagers was prepared and named Alexandre Herculano (AH). In order to calibrate the AH serum, 18 EIA assays were performed, testing AH in parallel with the 2nd International Standard 1990, Anti-Measles Antibody, 66/202 (IS), over a range of dilutions (from 1/50 to 1/25600). A method comparing the parallel lines resulting from the graphical representation of the laboratory test results was used to estimate the potency of AH relative to IS. A computer programme written by one of the authors was used to analyze the data and produce potency estimates. A second method of analysis was also used, comparing logistic curves relating serum concentration to optical density by EIA; for that purpose an existing computer programme (WRANL) was used. The potency of AH relative to IS, by either method, was estimated to be 2.4. As IS has 5000 milli-international units (mIU) of anti-measles IgG per millilitre (ml), we concluded that AH has 12000 mIU/ml.
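As a rough illustration of the parallel-line calculation described above, the sketch below (in Python, with made-up optical-density readings rather than the study's data) fits both dilution series with a common slope and converts the horizontal offset between the two lines into a potency ratio.

```python
# Minimal sketch of a parallel-line potency estimate, assuming EIA optical
# densities respond linearly to log dilution over the working range.
# The dilutions and readings below are illustrative, not the study's data.
import numpy as np

def relative_potency(log_dose_std, resp_std, log_dose_test, resp_test):
    """Fit both assays with a common slope and return the potency ratio."""
    x = np.concatenate([log_dose_std, log_dose_test])
    is_test = np.concatenate([np.zeros_like(log_dose_std), np.ones_like(log_dose_test)])
    # Design matrix: shared slope, separate intercepts for standard and test.
    A = np.column_stack([x, np.ones_like(x), is_test])
    y = np.concatenate([resp_std, resp_test])
    slope, _intercept, delta = np.linalg.lstsq(A, y, rcond=None)[0]
    # Horizontal shift between the parallel lines gives log10(relative potency).
    return 10 ** (delta / slope)

# Illustrative dilution series expressed as log10 of relative concentration.
dilutions = np.log10(1.0 / np.array([50, 100, 200, 400, 800, 1600]))
od_standard = np.array([1.90, 1.65, 1.40, 1.15, 0.90, 0.65])   # hypothetical readings
od_test     = np.array([2.00, 1.75, 1.50, 1.25, 1.00, 0.75])   # hypothetical readings
print(relative_potency(dilutions, od_standard, dilutions, od_test))
```

With the study's readings in place of the placeholders, the same shift-over-slope calculation yields the reported factor of 2.4, which multiplied by the 5000 mIU/ml assigned to IS gives the 12000 mIU/ml assigned to AH.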
Abstract:
Harnessing the idle CPU cycles, storage space and other resources of networked computers for collaborative work is the main focus of all major grid computing research projects. Most university computer labs are nowadays equipped with powerful desktop PCs, and most of the time these machines sit idle, their computing power going to waste. However, complex problems and the analysis of very large amounts of data require substantial computational resources. For such problems, one may run the analysis algorithms on very powerful and expensive computers, which reduces the number of users that can afford such data analysis tasks. Instead of using single expensive machines, distributed computing systems offer the possibility of using a set of much less expensive machines to do the same task. The BOINC and Condor projects have been successfully used to support real scientific research around the world at a low cost. The main goal of this work is to explore both distributed computing platforms, Condor and BOINC, and to use them to harness idle PC resources for academic researchers to use in their research work. In this thesis, data mining tasks were performed by implementing several machine learning algorithms on the distributed computing environment.
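As a rough illustration of how such a data mining workload maps onto a Condor/BOINC-style pool (the script, dataset and parameter grid below are placeholders, not the experiments reported in the thesis), each independent job can train and score one model configuration selected by a command-line index:

```python
# Hypothetical job script: each Condor/BOINC-style job receives an index,
# trains one model configuration, and writes its score, so many configurations
# can be evaluated in parallel on idle lab PCs.
import sys
import json
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# The grid of configurations to explore; each job handles exactly one entry.
CONFIGS = [{"n_estimators": n, "max_depth": d}
           for n in (50, 100, 200) for d in (4, 8, None)]

def run_job(index: int) -> None:
    cfg = CONFIGS[index]
    X, y = load_digits(return_X_y=True)              # stand-in dataset
    model = RandomForestClassifier(random_state=0, **cfg)
    score = cross_val_score(model, X, y, cv=5).mean()
    # Each job writes an independent result file that is collected afterwards.
    with open(f"result_{index}.json", "w") as fh:
        json.dump({"config": cfg, "score": score}, fh)

if __name__ == "__main__":
    run_job(int(sys.argv[1]))      # e.g. "python job.py 3" runs one grid job
```

Submitting one such job per configuration index lets the pool schedule them on whatever idle machines are available, and the result files can be gathered once all jobs finish.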
Abstract:
We present a simple, low-cost and rapid solid-state optical probe for screening chlorpromazine (CPZ) in aquacultures. The method exploits the colourimetric reaction between CPZ and the Fe(III) ion that occurs at a solid/liquid interface, the solid layer consisting of ferric iron entrapped in a layer of plasticized PVC. If solutions containing CPZ are dropped onto such a layer, a colour change occurs from light yellow to dark pink or even light blue, depending on the concentration of CPZ. Visual inspection enables the concentration of CPZ to be estimated. The resulting colouration was also monitored by digital image collection for a more accurate quantification. The three coordinates of the hue, saturation and lightness system were obtained by standard image processing along with mathematical data treatment. The parameters affecting colour were assessed and optimized, with studies conducted by visible spectrophotometry and by digital image acquisition. The response of the optimized probe towards the concentration of CPZ was tested for several mathematical transformations of the colour coordinates, and a linear relation was found for the sum of hue and lightness. The limit of detection is 50 μM (corresponding to about 16 μg per mL). The probe enables quick screening for CPZ in real water samples with prior sample treatment.
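A minimal sketch of this kind of digital image colourimetry is given below, assuming a workflow in which the average colour of a photographed probe spot is converted to hue/saturation/lightness and the sum of hue and lightness is calibrated against concentration; the file names and readings are hypothetical, not the paper's processing chain.

```python
# Illustrative sketch (assumed workflow): average the RGB of each photographed
# probe spot, convert to hue/saturation/lightness, and calibrate
# (hue + lightness) against CPZ concentration with a straight line.
import colorsys
import numpy as np
from PIL import Image

def hue_plus_lightness(path: str) -> float:
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=float) / 255.0
    r, g, b = rgb.reshape(-1, 3).mean(axis=0)          # mean colour of the spot
    h, l, s = colorsys.rgb_to_hls(r, g, b)             # note the HLS order in colorsys
    return h + l

# Hypothetical calibration photos of the probe layer at known CPZ levels.
concentrations_uM = np.array([50, 100, 200, 400, 800], dtype=float)
signal = np.array([hue_plus_lightness(f"cpz_{int(c)}uM.png")
                   for c in concentrations_uM])
slope, intercept = np.polyfit(concentrations_uM, signal, 1)   # linear calibration
print(f"signal = {slope:.4g} * [CPZ] + {intercept:.4g}")
```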
Abstract:
Currently, due to the widespread use of computers and the internet, students are trading libraries for the World Wide Web and laboratories for simulation programs. In most courses, simulators are made available to students and can be used to verify theoretical results or to test hardware or products under development. Although this is an interesting solution (low cost, and an easy and fast way to carry out some course work), it has major disadvantages. As everything is currently done with or in a computer, students are losing the “feel” for the real values of the magnitudes involved. For instance, in engineering studies, and mainly in the first years, students need to learn electronics, algorithmics, mathematics and physics. All of these areas can use numerical analysis software, simulation software or spreadsheets, and in the majority of cases the data used are either simulated or random numbers, although real data could be used instead. For example, if a course uses numerical analysis software and needs a dataset, the students can learn to manipulate arrays. Also, when using spreadsheets to build graphics, instead of using a random table, students could use a real dataset based, for instance, on the room temperature and its variation across the day. In this work we present a framework with a simple interface, allowing it to be used by different courses where computers are part of the teaching/learning process, in order to give students a more realistic feeling by using real data. The framework is based on a set of low-cost sensors for different physical magnitudes, e.g. temperature, light, wind speed, which are either connected to a central server that the students access over Ethernet, or connected directly to the student's computer/laptop. These sensors use the communication ports available, such as serial ports, parallel ports, Ethernet or Universal Serial Bus (USB). Since a central server is used, students are encouraged to use the sensor values in their different courses and consequently in different types of software, such as numerical analysis tools, spreadsheets, or simply inside any programming language when a dataset is needed. To this end, small pieces of hardware were developed, each containing at least one sensor and using a different type of computer communication. As long as the sensors are attached to a server connected to the internet, these tools can also be shared between different schools: sensors that are not available in a given school can be used by getting the values from other places that share them. Another remark is that students in the more advanced years, with (theoretically) more know-how, can use the courses that have some affinity with electronics development to build new sensor modules and expand the framework further. The final solution is very interesting: low cost, simple to develop, and flexible, since the same materials can be reused across several courses, bringing real-world data into the students' computer work.
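Since the abstract does not specify the server's protocol, the sketch below assumes a simple HTTP/CSV interface as a hypothetical example of how a student could pull real sensor data into a programming exercise; the URL and data format are placeholders.

```python
# Hypothetical client sketch (the framework's actual protocol and endpoints are
# not specified in the abstract): fetch a day's temperature readings from the
# central sensor server over the network and load them for analysis.
import csv
import io
import urllib.request
import numpy as np

SERVER = "http://sensors.school.example/temperature?range=today"  # placeholder URL

def fetch_temperatures(url: str = SERVER) -> np.ndarray:
    """Download CSV rows of 'timestamp,celsius' and return the readings."""
    with urllib.request.urlopen(url) as resp:
        text = resp.read().decode("utf-8")
    rows = list(csv.reader(io.StringIO(text)))
    return np.array([float(celsius) for _timestamp, celsius in rows])

# Example use in a numerical-analysis exercise: basic statistics of real data.
temps = fetch_temperatures()
print(f"min={temps.min():.1f} °C  max={temps.max():.1f} °C  mean={temps.mean():.1f} °C")
```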
Abstract:
Adhesive bonding is an excellent alternative to traditional joining techniques such as welding, mechanical fastening or riveting. However, many factors have to be accounted for during joint design to accurately predict the joint strength. One of these is the adhesive layer thickness (tA). Most published results are for epoxy structural adhesives, tailored to perform best with small values of tA, and these show that the lap joint strength decreases as tA increases (the optimum joint strength is usually obtained with tA values between 0.1 and 0.2 mm). Recently, polyurethane adhesives designed to perform with larger tA values became available on the market, and their fracture behaviour has not yet been studied. In this work, the effect of tA on the tensile fracture toughness of a bonded joint is studied, considering a novel high-strength and ductile polyurethane adhesive for the automotive industry. The work consists of the fracture characterization of the bond by a conventional technique and by the J-integral technique, which accurately accounts for root rotation effects. An optical measurement method is used to evaluate the crack tip opening (δn) and the adherends' rotation at the crack tip (θo) during the test, supported by a Matlab® sub-routine for the automated extraction of these parameters. As output of this work, tensile fracture data is provided for the selected adhesive, enabling the subsequent strength prediction of bonded joints.
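To make the J-integral route concrete, the sketch below shows the commonly used "direct method", in which the tensile cohesive law is obtained by differentiating the measured J with respect to the crack tip opening; this is an assumed illustration with made-up data, not the thesis's Matlab® sub-routine.

```python
# Sketch of the "direct method" often used with J-integral fracture tests:
# given the J-integral and the crack tip opening measured optically during the
# test, the tensile cohesive law is recovered as t_n = dJ/d(delta_n).
import numpy as np

def cohesive_law(delta_n_mm: np.ndarray, J_N_per_mm: np.ndarray) -> np.ndarray:
    """Differentiate J with respect to the crack tip opening (direct method)."""
    return np.gradient(J_N_per_mm, delta_n_mm)   # traction in N/mm^2 (MPa)

# Illustrative, made-up test data: J rises and saturates at the toughness value.
delta_n = np.linspace(0.0, 1.0, 200)                   # crack tip opening [mm]
J = 4.0 * (1.0 - np.exp(-3.0 * delta_n))               # J-integral [N/mm]
traction = cohesive_law(delta_n, J)
print(f"peak traction ≈ {traction.max():.2f} MPa, toughness ≈ {J[-1]:.2f} N/mm")
```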
Abstract:
The study of chemical diffusion in biological tissues is a research field of high importance, with application in many clinical, research and industrial areas. The evaluation of the diffusion and viscosity properties of chemicals in tissues is necessary to characterize treatments or the inclusion of preservatives in tissues or organs for low-temperature conservation. Recently, we have demonstrated experimentally that the diffusion properties and dynamic viscosity of sugars and alcohols can be evaluated from optical measurements. Our studies were performed in skeletal muscle, but our results have revealed that the same methodology can be used with other tissues and different chemicals. Considering the significant number of studies that can be made with this method, it becomes necessary to make data processing and calculation easier. With this objective, we have developed a software application that integrates all processing and calculations, making the researcher's work easier and faster. Using the same experimental data previously used to estimate the diffusion and viscosity of glucose in skeletal muscle, we repeated the calculations with the new application. The comparison between the results obtained with the new application and with the previous independent routines showed great similarity, thereby validating the application. This new tool is now available to be used in similar research to obtain the diffusion properties of other chemicals in different tissues or organs.
Abstract:
Dissertation presented to obtain the PhD degree in Electrical and Computer Engineering - Electronics
Abstract:
Dissertation submitted to obtain the Master's degree in Biomedical Engineering
Abstract:
Optical immersion clearing is a technique that has been widely studied for more than two decades and that is used to create a temporary transparency effect in biological tissues. If applied in cooperation with clinical methods, it provides optimization of diagnosis and treatment procedures. This technique makes biological tissues more transparent through two main mechanisms: tissue dehydration and refractive index (RI) matching between tissue components. Such matching is obtained by partial replacement of the interstitial water by a biocompatible agent that presents a higher RI, and it can be completely reversed by natural rehydration in vivo or by assisted rehydration in ex vivo tissues. Experimental data are necessary to characterize and discriminate between the two mechanisms and to find new ones. Using a simple method, based on collimated transmittance and thickness measurements made on muscle samples under treatment, we have estimated the diffusion properties of glucose, ethylene glycol (EG) and water, which were used to perform such characterization and discrimination. Comparing these properties with literature data characterizing their diffusion in water, we have observed that muscle cell membrane permeability limits agent and water diffusion in the muscle. The same experimental data allowed us to calculate the optical clearing (OC) efficiency and to interpret the internal changes that occurred in the muscle during the treatments. The same methodology can now be used to perform similar studies with other agents and in other tissues, in order to solve engineering problems in the design of inexpensive and robust technologies for a considerable improvement of optical tomographic techniques, with better contrast and in-depth imaging.
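As a hedged sketch of how collimated transmittance kinetics can yield a diffusion coefficient (the exact expressions used in this study may differ; the slab relation D ≈ d²/(π²τ) and all numbers below are assumptions for illustration), one can fit the transmittance rise with a single exponential time constant and convert it using the measured sample thickness:

```python
# Hedged sketch of one common way to turn collimated-transmittance kinetics into
# a diffusion coefficient: fit the rise of Tc(t) during agent treatment with a
# single exponential time constant tau, then apply the slab approximation
# D ~ d^2 / (pi^2 * tau), where d is the measured sample thickness.
import numpy as np
from scipy.optimize import curve_fit

def tc_model(t, tc0, amplitude, tau):
    """Saturating-exponential model for collimated transmittance vs time."""
    return tc0 + amplitude * (1.0 - np.exp(-t / tau))

def diffusion_coefficient(t_s, tc, thickness_cm):
    (tc0, amp, tau), _ = curve_fit(tc_model, t_s, tc, p0=[tc[0], tc[-1] - tc[0], 60.0])
    return thickness_cm**2 / (np.pi**2 * tau)          # cm^2/s

# Made-up kinetics: transmittance rising with a 90 s time constant, 0.05 cm sample.
t = np.linspace(0, 600, 121)
tc = tc_model(t, 0.02, 0.10, 90.0) + np.random.normal(0, 0.002, t.size)
print(f"D ≈ {diffusion_coefficient(t, tc, 0.05):.2e} cm^2/s")
```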
Abstract:
Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.
Abstract:
Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.
Abstract:
Dissertation submitted to obtain the Master's degree in Informatics Engineering
Abstract:
Dissertation submitted to obtain the Master's degree in Chemical and Biochemical Engineering
Abstract:
Dissertation presented to obtain the Master's degree in Electrical and Computer Engineering, at Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia
Abstract:
Breast cancer is the most common cancer among women and a major public health problem. Worldwide, X-ray mammography is the current gold standard for medical imaging of breast cancer. However, it has some well-known limitations: its false-negative rates, up to 66% in symptomatic women, and its false-positive rates, up to 60%, are a continued source of concern and debate. These drawbacks prompt the development of other imaging techniques for breast cancer detection, among which is Digital Breast Tomosynthesis (DBT). DBT is a 3D radiographic technique that reduces the obscuring effect of tissue overlap and appears to address both the false-negative and the false-positive issues. The 3D images in DBT are only obtained through image reconstruction methods. These methods play an important role in a clinical setting, since the reconstruction process needs to be both accurate and fast. This dissertation deals with the optimization of iterative algorithms through parallel computing on Graphics Processing Units (GPUs), using the Compute Unified Device Architecture (CUDA), to make the 3D reconstruction faster. Iterative algorithms have been shown to produce the highest-quality DBT images and have the potential to reduce patient dose in DBT scans, but because they are computationally intensive their clinical use is currently rejected. A method of integrating CUDA in Interactive Data Language (IDL) is proposed in order to accelerate the DBT image reconstructions; this method has never been attempted before for DBT. In this work the system matrix calculation, the most computationally expensive part of the iterative algorithms, is accelerated. A speedup of 1.6 is achieved, showing that GPUs can accelerate the IDL implementation.
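For orientation, the sketch below illustrates one common iterative update (ML-EM) driven by a precomputed sparse system matrix; it is a schematic Python stand-in, not the thesis's IDL/CUDA implementation, and the point it makes is that building the system matrix A is the step the thesis accelerates on the GPU.

```python
# Schematic sketch: one ML-EM style iterative reconstruction, written with a
# precomputed sparse system matrix A (projections = A @ volume). Computing the
# geometric weights of A that link each voxel to each detector pixel is the
# costly step that the thesis offloads to the GPU through CUDA called from IDL.
import numpy as np
from scipy.sparse import csr_matrix

def mlem(A: csr_matrix, projections: np.ndarray, n_iter: int = 10) -> np.ndarray:
    """Multiplicative ML-EM updates starting from a uniform volume."""
    volume = np.ones(A.shape[1])
    sensitivity = np.asarray(A.sum(axis=0)).ravel() + 1e-12   # column sums
    for _ in range(n_iter):
        forward = A @ volume                                   # forward projection
        ratio = projections / np.maximum(forward, 1e-12)
        volume *= (A.T @ ratio) / sensitivity                  # back-project and normalise
    return volume

# Tiny made-up geometry: 200 detector measurements of a 100-voxel volume.
rng = np.random.default_rng(0)
A = csr_matrix(rng.random((200, 100)) * (rng.random((200, 100)) < 0.1))
true_volume = rng.random(100)
measured = A @ true_volume
print(mlem(A, measured)[:5])
```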