972 results for Automated sorting system
Abstract:
A novel design based on electric-field-free open microwell arrays for the automated continuous-flow sorting of single cells or small clusters of cells is presented. The main feature of the proposed device is the parallel analysis of cell-cell and cell-particle interactions in each microwell of the array. High-throughput sample recovery, with fast and separate transfer from the microsites to standard microtiter plates, is also possible thanks to flexible printed circuit board technology, which makes it possible to produce cost-effective, large-area arrays with geometries compatible with laboratory equipment. Particle isolation is performed via negative dielectrophoretic forces, which convey the particles into the microwells. Particles such as cells and beads flow in electrically active microchannels on whose substrate the electrodes are patterned. The introduction of particles into the microwells is performed automatically: the required feedback signal is generated by a microscope-based optical counting and detection routine. To isolate a controlled number of particles, two configurations of the electric field were created within the structure: the first permits isolation, whereas the second creates a net force that repels particles from the microwell entrance. To increase the parallelism of the cell-isolation function, a new technique based on coplanar electrodes was developed to detect particle presence. A lock-in amplification scheme was used to monitor the impedance of the channel as it is perturbed by flowing particles in high-conductivity suspension media. The impedance measurement module was also combined with a dielectrophoretic focusing stage situated upstream of the measurement stage, to limit the dispersion of the measured signal amplitude caused by variation of the particle position within the microchannel. In conclusion, the designed system complies with the initial specifications, making it suitable for cellomics and biotechnology applications.
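A minimal sketch (not the thesis's actual firmware or signal chain) of how a lock-in style impedance measurement can be turned into a particle-transit trigger for the feedback signal mentioned above: the digitized electrode current is demodulated at the excitation frequency, low-pass filtered, and dips in the demodulated magnitude are counted as passing particles. All signal parameters and thresholds are illustrative assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

fs, f_exc = 500e3, 50e3          # sampling and excitation frequency (assumed)
t = np.arange(0, 0.2, 1 / fs)

# Synthetic electrode current: a carrier whose amplitude dips when a particle
# perturbs the channel impedance (three transit events), plus noise.
envelope = 1.0 - 0.05 * sum(np.exp(-((t - tc) / 2e-4) ** 2) for tc in (0.05, 0.11, 0.16))
current = envelope * np.sin(2 * np.pi * f_exc * t) \
          + 0.01 * np.random.default_rng(1).normal(size=t.size)

# Lock-in demodulation: multiply by in-phase/quadrature references, low-pass filter.
ref_i, ref_q = np.sin(2 * np.pi * f_exc * t), np.cos(2 * np.pi * f_exc * t)
b, a = butter(3, 2e3 / (fs / 2))                 # 2 kHz demodulation bandwidth
x = filtfilt(b, a, current * ref_i)
y = filtfilt(b, a, current * ref_q)
magnitude = 2 * np.hypot(x, y)                   # tracks the channel admittance

# A transit appears as a dip in magnitude; count dips to trigger the DEP gate.
dips, _ = find_peaks(-magnitude, prominence=0.02)
print("detected particle transits:", len(dips), "at t =", t[dips])
```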
Abstract:
The first phase of the research activity concerned the state of the art of cycling infrastructure, bicycle use and evaluation methods. In this part, the candidate studied the "bicycle system" in countries with high bicycle use, in particular the Netherlands. An evaluation was carried out of the questionnaires from the survey on general mobility conducted within the European project BICY in 13 cities of the participating countries. The questionnaire was designed, tested and implemented, and was later validated by a test in Bologna. The results were corrected with demographic information and compared with official data. The cycling infrastructure analysis was conducted on the basis of information from the OpenStreetMap database. The activity consisted of programming algorithms in Python that extract infrastructure data for a region from the database, and sort and filter cycling infrastructure while calculating attributes such as the length of the path arcs; a simplified sketch of this kind of extraction is given below. The results obtained were compared with official data where available. The structure of the thesis is as follows: 1. Introduction: description of the state of cycling in several advanced countries, description of methods of analysis and their importance for implementing appropriate cycling policies, and supply and demand of bicycle infrastructure. 2. Survey on mobility: details of the survey developed and the evaluation method; the results obtained are presented and compared with official data. 3. Analysis of cycling infrastructure based on information from the OpenStreetMap database: describes the methods and algorithms developed during the PhD; the results obtained by the algorithms are compared with official data. 4. Discussion: the above results are discussed and compared; in particular, cycling demand is compared with the length of the cycle network within a city. 5. Conclusions.
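As referenced above, a minimal sketch of this kind of extraction: parse an OpenStreetMap XML extract, keep the ways tagged as cycling infrastructure, and sum their arc lengths with the haversine formula. The file name and the tag filter are illustrative assumptions; the thesis's algorithms are more elaborate (filtering, attribute calculation, comparison with official data).

```python
import math
import xml.etree.ElementTree as ET

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def cycleway_length_km(osm_file):
    root = ET.parse(osm_file).getroot()
    nodes = {n.get("id"): (float(n.get("lat")), float(n.get("lon")))
             for n in root.iter("node")}
    total = 0.0
    for way in root.iter("way"):
        tags = {t.get("k"): t.get("v") for t in way.iter("tag")}
        # Keep dedicated cycleways and roads with a mapped cycle lane/track.
        if tags.get("highway") == "cycleway" or tags.get("cycleway") in ("lane", "track"):
            refs = [nd.get("ref") for nd in way.iter("nd")]
            for a_id, b_id in zip(refs, refs[1:]):
                if a_id in nodes and b_id in nodes:
                    total += haversine_m(*nodes[a_id], *nodes[b_id])
    return total / 1000.0

if __name__ == "__main__":
    # "bologna.osm" is a placeholder for any regional OSM extract.
    print("cycleway length [km]:", round(cycleway_length_km("bologna.osm"), 1))
```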
Abstract:
The main goal of this thesis is to facilitate the development of industrial automated systems by applying formal methods to ensure system reliability. A new formulation of the distributed diagnosability problem in terms of Discrete Event Systems theory and the automata framework is presented, which is then used to enforce the desired property of the system rather than just verify it. This approach tackles the state explosion problem with modeling patterns and new algorithms aimed at verifying the diagnosability property in the context of the distributed diagnosability problem. The concepts are validated with a newly developed software tool.
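To make the diagnosability setting concrete, here is a minimal, hedged sketch (not the thesis's tool or algorithms) of the classical diagnoser-state construction for a finite automaton with observable and unobservable events, where some unobservable events are faults: each diagnoser state is a set of (plant state, fault flag) estimates, and a state containing both faulty and non-faulty estimates is fault-uncertain. Detecting indeterminate cycles over such states, the actual diagnosability test, is omitted. The toy plant below is invented for illustration.

```python
# Toy plant: transitions[state][event] = next state; 'f' is an unobservable fault.
transitions = {0: {"b": 1, "f": 2}, 1: {"b": 1}, 2: {"b": 3}, 3: {"b": 3}}
observable = {"b"}
faults = {"f"}

def unobservable_reach(estimates):
    """Close a set of (state, faulty) estimates under unobservable events."""
    frontier, closed = list(estimates), set(estimates)
    while frontier:
        state, faulty = frontier.pop()
        for ev, nxt in transitions.get(state, {}).items():
            if ev not in observable:
                est = (nxt, faulty or ev in faults)
                if est not in closed:
                    closed.add(est)
                    frontier.append(est)
    return frozenset(closed)

def diagnoser_step(est_set, event):
    """Observable step: advance every estimate that can take `event`, then close."""
    moved = {(transitions[s][event], f) for s, f in est_set
             if event in transitions.get(s, {})}
    return unobservable_reach(moved) if moved else frozenset()

start = unobservable_reach({(0, False)})
after_b = diagnoser_step(start, "b")
flags = {f for _, f in after_b}
print("estimates after observing 'b':", sorted(after_b))
print("fault-uncertain:", flags == {True, False})  # both possibilities remain here
```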
Abstract:
Volatile organic compounds (VOCs) are present only in trace amounts in the atmosphere, yet they play an important role in air chemistry: they influence tropospheric ozone, urban smog and oxidation capacity, and have direct and indirect effects on global climate change. An important class of VOCs are the non-methane hydrocarbons (NMHCs), which come predominantly from anthropogenic sources. For this reason, atmospheric chemists need an instrument that measures VOCs, including NMHCs, with a higher time resolution, especially for real-time measurements aboard a research aircraft. To this end, FOTOS, a system for the fast observation of trace organics, was designed and built for use on a new high-altitude, long-range scientific aircraft called HALO. FOTOS was subsequently tested in two ground-based measurement campaigns. FOTOS was designed and built with a custom-made, automated, cryogenic three-trap sampling system and an adapted, commercially acquired fast GC-MS. The aim of this design was to increase versatility and reduce the potential for interference; therefore, no chemical drying agents or adsorbent materials were used. FOTOS achieved a sampling frequency of 5.5 minutes while measuring at least 13 different C2- to C5-NMHCs. The three-sigma detection limits for n- and iso-pentane were determined to be 2.6 and 2.0 pptv, respectively. Laboratory tests confirmed that FOTOS is a versatile, robust, highly automated, precise, accurate and sensitive instrument, suitable for real-time measurements of VOCs at sampling frequencies appropriate for a research aircraft such as HALO. To confirm the performance of FOTOS, an intercomparison with the GC-FID system at the Meteorological Observatory Hohenpeißenberg, a WMO-GAW global station, was carried out from 26 January to 4 February 2010. Thirteen different NMHCs were analyzed and compared within the framework of the GAW Data Quality Objectives (DQOs). More than 80% of the measurements of six C3- to C5-NMHCs met these DQOs. This first field campaign highlighted the robustness and measurement accuracy of FOTOS, in addition to the advantage of the higher sampling frequency, even in a ground-based measurement. To demonstrate the capabilities of this instrument in the field, FOTOS measured selected light NMHCs during a boreal forest measurement campaign, HUMPPA-COPEC 2010. From 12 July to 12 August 2010, an international group of institutes and instruments took part in measurements of physical and chemical quantities of the gas and particle phases of the air above the boreal forest at the SMEAR II station near Hyytiälä, Finland. Several main points of interest were identified in the alkane mixing ratios and the pentane isomer ratio, in particular strongly contrasting periods of low and high variability, three biomass-burning plumes from Russian forest fires, and two days of extremely clean air from the polar region. Comparisons of the NMHCs with other anthropogenic tracers identified several sources of anthropogenic influence at the site and allowed local sources to be distinguished from more distant ones. From the correlation of NOx with alkanes, a minimal natural contribution to the 24-hour cycle of NOx was inferred. Estimates of air-mass age from the pentane isomer ratio were complicated by changing source ratios and by the peculiarities of high-latitude summer photochemistry. These measurements demonstrated the value of measuring light NMHCs, even in remote regions, as an additional specific marker of anthropogenic influence.
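A minimal numerical sketch of two quantities mentioned above, with made-up numbers: a three-sigma detection limit estimated from replicate low-level measurements, and the iso-/n-pentane isomer ratio used as a photochemical-age marker.

```python
import numpy as np

# Replicate peak areas of a low-level n-pentane standard (illustrative values)
# and the calibration slope in area units per pptv (also illustrative).
blank_areas = np.array([105.0, 98.0, 112.0, 101.0, 95.0, 108.0, 99.0])
cal_slope = 40.0  # area / pptv

lod_3sigma_pptv = 3 * blank_areas.std(ddof=1) / cal_slope
print(f"3-sigma detection limit: {lod_3sigma_pptv:.1f} pptv")

# Pentane isomer ratio: iso-pentane / n-pentane mixing ratios (pptv, illustrative).
iso_pentane = np.array([55.0, 60.0, 48.0, 52.0])
n_pentane = np.array([50.0, 57.0, 45.0, 49.0])
print("iso/n-pentane ratio:", np.round(iso_pentane / n_pentane, 2))
```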
Abstract:
Plasmons are the collective resonant excitation of conduction electrons. Plasmons excited by light in sub-wavelength nanoparticles are called particle plasmons and are promising candidates for future microsensors because of the strong dependence of the resonance on externally controllable parameters, such as the optical properties of the surrounding medium and the electric charge of the nanoparticles. The extremely high scattering efficiency of particle plasmons allows individual nanoparticles to be observed easily in a microscope. The need to collect a statistically relevant number of data points quickly, and the growing importance of plasmonic (above all gold) nanoparticles for applications in medicine, have pushed for the development of automated microscopes that can measure in the spectral window of biological tissue (the biological window) from 650 to 900 nm, which until then had been covered only partially. In this work I present the Plasmoscope, which was designed precisely with these requirements in mind, by (1) placing an adjustable slit at the entrance aperture of the spectrometer, which coincides with the image plane of the microscope, and (2) using a piezo scanning stage that makes it possible to raster-scan the sample through this narrow slit. This implementation avoids optical elements that absorb in the near infrared. With the Plasmoscope I investigate the plasmonic sensitivity of gold and silver nanorods, i.e., the plasmon resonance shift as a function of changes in the surrounding medium. The sensitivity is the measure of how well the nanoparticles can detect material changes in their environment, and it is therefore immensely important to know which parameters influence it. I show here that silver nanorods have a higher sensitivity than gold nanorods within the biological window and, moreover, that the sensitivity increases with the thickness of the rods. I present a theoretical discussion of the sensitivity, identify the material parameters that influence it, and derive the corresponding formulas. In a further approach I present experimental data supporting the theoretical finding that, for sensitivity measurement schemes that also take the linewidth into account, gold nanorods with an aspect ratio of 3 to 4 give the optimal result. Reliable sensors must show robust repeatability, which I investigate with gold and silver nanorods. The plasmon resonance wavelength depends on the following intrinsic material parameters: electron density, background polarizability and relaxation time. Based on my experimental results, I show that nanorods made of a copper-gold alloy have a red-shifted resonance compared with similarly shaped gold nanorods, and how the linewidth varies with the stoichiometric composition of the alloyed nanoparticles. The dependence of the linewidth on the material composition is also investigated using silver-coated and uncoated gold nanorods. Semiconductor nanoparticles are candidates for efficient photovoltaic devices. Energy conversion requires charge separation, which is measured experimentally with the Plasmoscope by following the light-induced growth dynamics of gold spheres on semiconductor nanorods in a gold-ion solution through the measurement of the scattered intensity.
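A minimal sketch of how the plasmonic sensitivity discussed above is typically extracted from single-particle spectra: fit the resonance wavelength against the refractive index of the surrounding medium, giving S = dλ/dn, and divide by the linewidth for a figure of merit. The data values below are illustrative, not measurements from the thesis.

```python
import numpy as np

# Resonance wavelength of one nanorod measured in media of known refractive
# index (illustrative numbers in the 650-900 nm biological window).
n_medium = np.array([1.333, 1.360, 1.385, 1.410])      # water / glucose solutions
lambda_res = np.array([742.0, 748.5, 755.2, 761.8])    # nm
linewidth = 48.0                                        # nm (FWHM, illustrative)

# Sensitivity S = dlambda/dn from a linear fit; figure of merit = S / linewidth.
slope, intercept = np.polyfit(n_medium, lambda_res, 1)
print(f"sensitivity S = {slope:.0f} nm/RIU")
print(f"figure of merit = {slope / linewidth:.1f} 1/RIU")
```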
Abstract:
The full blood cell (FBC) count is the most common indicator of disease. At present, hematology analyzers are used for blood cell characterization, but recently there has been interest in techniques that take advantage of microscale devices and intrinsic properties of cells for increased automation and decreased cost. Microfluidic technologies offer solutions for handling and processing small volumes of blood (2-50 µL taken by finger prick) for point-of-care (PoC) applications. Several PoC blood analyzers are in use and may have applications in telemedicine, outpatient monitoring and medical care in resource-limited settings. They have the advantage of being easy to move and much cheaper than traditional analyzers, which require bulky instruments and consume large amounts of reagents. The development of miniaturized point-of-care diagnostic tests may be enabled by chip-based technologies for cell separation and sorting. Many current diagnostic tests depend on fractionated blood components: plasma, red blood cells (RBCs), white blood cells (WBCs), and platelets. Specifically, white blood cell differentiation and counting provide valuable information for diagnostic purposes. For example, a low number of WBCs, called leukopenia, may be an indicator of bone marrow deficiency or failure, collagen-vascular diseases, or disease of the liver or spleen. Leukocytosis, a high number of WBCs, may be due to anemia, infectious diseases, leukemia or tissue damage. In the laboratory of hybrid biodevices at the University of Southampton, a functioning micro impedance cytometer technology was developed for WBC differentiation and counting. It is capable of classifying cells and particles on the basis of their dielectric properties, in addition to their size, without the need for labeling, in a flow format similar to that of a traditional flow cytometer. It was demonstrated that the micro impedance cytometer system can detect and differentiate monocytes, neutrophils and lymphocytes, the three major human leukocyte populations. The simplicity and portability of the microfluidic impedance chip offer a range of potential applications in cell analysis, including point-of-care diagnostic systems. The microfluidic device has been integrated into a sample preparation cartridge that semi-automatically performs erythrocyte lysis before leukocyte analysis. Erythrocytes are generally lysed manually according to a specific chemical lysis protocol, but this process has been automated in the cartridge. In this research work the chemical lysis protocol, defined in patent US 5155044 A, was optimized in order to improve the white blood cell differentiation and count performed by the integrated cartridge.
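A minimal sketch (with illustrative cluster centres and thresholds, not the Southampton system's actual gating) of how impedance-cytometry events can be differentiated: each cell yields a low-frequency impedance magnitude (roughly size-related) and an opacity (high-/low-frequency magnitude ratio, reflecting membrane and interior properties), and the three leukocyte populations are counted with simple rectangular gates in that plane.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic per-cell impedance features: |Z| at low frequency (a.u., size-related)
# and opacity = |Z_high| / |Z_low|.  Cluster centres are illustrative only.
def population(n, size_mu, opacity_mu):
    return np.column_stack([rng.normal(size_mu, 0.06, n),
                            rng.normal(opacity_mu, 0.03, n)])

events = np.vstack([population(300, 0.70, 0.95),   # lymphocytes: small
                    population(80, 1.00, 0.80),    # monocytes: large, low opacity
                    population(250, 0.95, 1.00)])  # neutrophils: large, higher opacity

size, opacity = events[:, 0], events[:, 1]

gates = {
    "lymphocytes": (size < 0.85),
    "monocytes":   (size >= 0.85) & (opacity < 0.90),
    "neutrophils": (size >= 0.85) & (opacity >= 0.90),
}
for name, mask in gates.items():
    print(f"{name}: {int(mask.sum())} events")
```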
Abstract:
Fine powders commonly have poor flowability and dispersibility due to interparticle adhesion, which leads to the formation of agglomerates. Knowledge of adhesion in particle collectives is indispensable for a deeper fundamental understanding of particle behavior in powders. Especially in the pharmaceutical industry, control of adhesion forces in powders is mandatory to improve the performance of inhalation products. Typically, the size of inhalable particles is in the range of 1-5 µm. In this thesis, a new method was developed to measure the adhesion forces of particles as an alternative to the established colloidal probe and centrifuge techniques, which are both experimentally demanding, time consuming and of limited practical applicability. The new method is based on the detachment of individual particles from a surface due to their inertia. The required acceleration, on the order of 500,000 g, is provided by a Hopkinson bar shock excitation system and measured via laser vibrometry. Particle detachment events are detected on-line by optical video microscopy. Subsequent automated data evaluation yields a statistical distribution of particle adhesion forces. To validate the new method, adhesion forces for ensembles of single polystyrene and silica microspheres on a polystyrene-coated steel surface were measured under ambient conditions. It was possible to investigate more than 150 individual particles in one experiment and to obtain adhesion values for particles in a diameter range of 3-13 µm. This enables a statistical evaluation while measurement effort and time are considerably lower than with the established techniques. Measured adhesion forces of the smaller particles agreed well with values from colloidal probe measurements and theoretical predictions. However, for the larger particles a stronger increase of adhesion with diameter was observed. This discrepancy might be induced by surface roughness and heterogeneity, which influence small and large particles differently. By measuring adhesion forces of corrugated dextran particles with sizes down to 2 µm, it was demonstrated that the Hopkinson bar method can also be used to characterize more complex sample systems. Thus, the new device will be applicable for studying a broad variety of particle-surface combinations on a routine basis, including strongly cohesive powders such as pharmaceutical drugs for inhalation.
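A minimal sketch of the force balance behind the Hopkinson-bar method: a particle detaches when its inertial force m·a exceeds the adhesion force, so the acceleration at which a detachment event is observed converts, via the particle mass, into an adhesion-force estimate. The density and the measured values below are illustrative, not data from the thesis.

```python
import numpy as np

rho = 1050.0                      # particle density, kg/m^3 (polystyrene, approx.)
g = 9.81

# Illustrative detachment data: particle diameter (um) and the surface
# acceleration (in units of g) at which each particle was seen to detach.
diameter_um = np.array([3.2, 4.1, 5.0, 6.3, 8.0, 9.7, 12.5])
a_detach_g = np.array([5.2e5, 3.9e5, 3.1e5, 2.4e5, 1.7e5, 1.3e5, 0.9e5])

mass = rho * np.pi / 6.0 * (diameter_um * 1e-6) ** 3   # sphere mass in kg
f_adhesion_nN = mass * a_detach_g * g * 1e9            # inertial force at detachment

for d, f in zip(diameter_um, f_adhesion_nN):
    print(f"d = {d:4.1f} um -> adhesion force ~ {f:6.1f} nN")
```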
Abstract:
The multi-target screening method described in this work allows the simultaneous detection and identification of 700 drugs and metabolites in biological fluids using a hybrid triple-quadrupole linear ion trap mass spectrometer in a single analytical run. After standardization of the method, the retention times of 700 compounds were determined and transitions for each compound were selected by a "scheduled" survey MRM scan, followed by an information-dependent acquisition using the sensitive enhanced product ion scan of a Q TRAP hybrid instrument. The identification of the compounds in the samples analyzed was accomplished by searching the tandem mass spectrometry (MS/MS) spectra against the library we developed, which contains electrospray ionization-MS/MS spectra of over 1,250 compounds. The multi-target screening method together with the library was included in a software program for routine screening and quantitation to achieve automated acquisition and library searching. With the help of this software application, the time for evaluation and interpretation of the results could be drastically reduced. This new multi-target screening method has been successfully applied for the analysis of postmortem and traffic offense samples as well as proficiency testing, and complements screening with immunoassays, gas chromatography-mass spectrometry, and liquid chromatography-diode-array detection. Other possible applications are analysis in clinical toxicology (for intoxication cases), in psychiatry (antidepressants and other psychoactive drugs), and in forensic toxicology (drugs and driving, workplace drug testing, oral fluid analysis, drug-facilitated sexual assault).
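A minimal sketch of the library-search step described above: an acquired MS/MS spectrum is compared against library spectra using a cosine (dot-product) similarity on binned, intensity-normalized peaks. The peak lists, entry names and the similarity threshold are illustrative assumptions; the actual screening software and 1,250-compound library are not reproduced here.

```python
import numpy as np

def spectrum_vector(peaks, max_mz=500, bin_width=1.0):
    """Bin an (m/z, intensity) peak list into a normalized vector."""
    vec = np.zeros(int(max_mz / bin_width) + 1)
    for mz, inten in peaks:
        vec[int(round(mz / bin_width))] += inten
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

def cosine_score(peaks_a, peaks_b):
    return float(spectrum_vector(peaks_a) @ spectrum_vector(peaks_b))

# Illustrative spectra: an acquired enhanced-product-ion spectrum vs. two library entries.
acquired = [(58.1, 999), (91.1, 350), (119.1, 120), (150.1, 60)]
library = {
    "candidate entry A": [(58.1, 999), (91.1, 400), (119.1, 100)],
    "candidate entry B": [(105.0, 999), (77.0, 500), (51.0, 150)],
}
for name, peaks in library.items():
    score = cosine_score(acquired, peaks)
    print(f"{name}: score = {score:.3f}", "-> hit" if score > 0.8 else "")
```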
Abstract:
Vertebroplasty is a minimally invasive procedure with many benefits; however, it is not without risks and potential complications, of which leakage of cement out of the vertebral body and into the surrounding tissues is one of the most serious. Cement can leak into the spinal canal, venous system, soft tissues, lungs and intradiscal space, causing serious neurological complications, tissue necrosis or pulmonary embolism. We present a method for automatic segmentation and tracking of bone cement during vertebroplasty procedures, as a first step towards developing a warning system to avoid cement leakage outside the vertebral body. We show that by using active contours based on level sets, the shape of the injected cement can be accurately detected. The segmentation model proposed in our previous work has been improved by including a term that restricts the level set function to the vertebral body. The method has been applied to a set of real intra-operative X-ray images, and the results show that the algorithm can successfully detect different shapes with blurred and poorly defined boundaries, where classical active contour segmentation is not applicable. The method has been positively evaluated by physicians.
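This is not the authors' model, but a minimal sketch of the underlying idea using the morphological Chan-Vese variant of level-set active contours available in scikit-image, with the restriction to the vertebral body approximated by a simple binary mask rather than by an extra term in the functional. The synthetic image stands in for an intra-operative fluoroscopy frame.

```python
import numpy as np
from skimage.segmentation import morphological_chan_vese

# Synthetic fluoroscopy-like frame: a bright cement blob with blurred edges plus noise.
yy, xx = np.mgrid[0:128, 0:128]
cement = np.exp(-(((yy - 64) / 12.0) ** 2 + ((xx - 60) / 18.0) ** 2))
frame = cement + 0.1 * np.random.default_rng(0).normal(size=cement.shape)

# Binary mask of the vertebral body (here just a box); the paper adds a term that
# keeps the level set inside the vertebra -- approximated here by masking the result.
vertebra_mask = np.zeros_like(frame, dtype=bool)
vertebra_mask[20:108, 20:108] = True

# Morphological Chan-Vese: a region-based active contour built on the level-set idea.
segmentation = morphological_chan_vese(frame, 80, init_level_set="checkerboard",
                                        smoothing=2)
# Which label corresponds to the bright cement may need checking on real data.
cement_region = segmentation.astype(bool) & vertebra_mask
print("segmented cement pixels:", int(cement_region.sum()))
```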
Abstract:
Hemisity refers to binary thinking and behavioral style differences between right and left brain-oriented individuals. The inevitability of hemisity became clear when it was discovered by magnetic resonance imaging (MRI) that an anatomical element of the executive system was unilaterally embedded in either the right or the left side of the ventral gyrus of the anterior cingulate cortex in an idiosyncratic manner that was congruent with an individual's inherent hemisity subtype. Based upon the MRI-calibrated hemisity of many individuals, a set of earlier biophysical and questionnaire hemisity assays was calibrated for accuracy and found appropriate for use in the investigation of the hemisity of individuals and groups. It had been reported that a partial sorting of individuals into hemisity right and left brain-oriented subgroups occurred during the process of higher education and professional development. Here, these results were extended by comparison of the hemisity of a putative unsorted population of 1,049 high school upper classmen, with that of 228 university freshmen. These hemisity outcomes were further compared with that of 15 university librarians, here found to be predominantly left brain-oriented, and 91 academically trained musicians, including 47 professional pianists, here found to be mostly right brainers. The results further supported the existence of substantial hemisity selection occurring during the process of higher education and in professional development.
Abstract:
Placing portal incisions during arthroscopic hip surgery presents challenges for surgeons in terms of anatomic accessibility and patient safety. Based on key anatomic landmarks and portal placement information from recent literature, suggested portal incisions were determined. Guidance in the placement of the three most common portal incision locations (anterior, anterolateral, and posterolateral) for arthroscopic surgery, in addition to visual feedback on tool trajectory to the hip joint, is provided in real time by a computer-aided system for hip arthroscopy. By simplifying the portal placement process, one of the most challenging aspects of arthroscopic hip surgery, increased use of this minimally invasive technique could be possible. In addition to portal information, improvements to an existing computer-aided system for arthroscopic hip surgery, including a new hip model and a redesigned mechanical tracking linkage, were completed.
Abstract:
The authors describe the use of the Cardica C-Port xA Distal Anastomosis System to perform an automated, high-flow extracranial-intracranial bypass. The C-Port system has been developed and tested in coronary artery bypass surgery for rapid distal coronary artery anastomoses. Air-powered, it performs an automated end-to-side anastomosis within seconds by nearly simultaneously making an arteriotomy and inserting 13 microclips into the graft and recipient vessel. Intracranial use of the device was first simulated in a cadaver prepared for microsurgical anatomical dissection. The authors used this system in a 43-year-old man who sustained a subarachnoid hemorrhage after being assaulted and was found to have a traumatic pseudoaneurysm of the proximal intracranial internal carotid artery. The aneurysm appeared to be enlarging on serial imaging studies and it was anticipated that a bypass would probably be needed to treat the lesion. An end-to-side bypass was performed with the C-Port system using a saphenous vein conduit extending from the common carotid artery to the middle cerebral artery. The bypass was demonstrated to be patent on intraoperative and postoperative arteriography. The patient had a temporary hyperperfusion syndrome and subsequently made a good neurological recovery. The C-Port system facilitates the performance of a high-flow extracranial-intracranial bypass with short periods of temporary arterial occlusion. Because of the size and configuration of the device, its use is not feasible in all anatomical situations that require a high-flow bypass; however, it is a useful addition to the armamentarium of the neurovascular surgeon.
Abstract:
Self-stabilization is a property of a distributed system such that, regardless of the legitimacy of its current state, the system behavior shall eventually reach a legitimate state and shall remain legitimate thereafter. The elegance of self-stabilization stems from the fact that it distinguishes distributed systems by a strong fault tolerance property against arbitrary state perturbations. The difficulty of designing and reasoning about self-stabilization has been witnessed by many researchers; most of the existing techniques for the verification and design of self-stabilization are either brute-force, or adopt manual approaches non-amenable to automation. In this dissertation, we first investigate the possibility of automatically designing self-stabilization through global state space exploration. In particular, we develop a set of heuristics for automating the addition of recovery actions to distributed protocols on various network topologies. Our heuristics equally exploit the computational power of a single workstation and the available parallelism on computer clusters. We obtain existing and new stabilizing solutions for classical protocols like maximal matching, ring coloring, mutual exclusion, leader election and agreement. Second, we consider a foundation for local reasoning about self-stabilization; i.e., study the global behavior of the distributed system by exploring the state space of just one of its components. It turns out that local reasoning about deadlocks and livelocks is possible for an interesting class of protocols whose proof of stabilization is otherwise complex. In particular, we provide necessary and sufficient conditions – verifiable in the local state space of every process – for global deadlock- and livelock-freedom of protocols on ring topologies. Local reasoning potentially circumvents two fundamental problems that complicate the automated design and verification of distributed protocols: (1) state explosion and (2) partial state information. Moreover, local proofs of convergence are independent of the number of processes in the network, thereby enabling our assertions about deadlocks and livelocks to apply on rings of arbitrary sizes without worrying about state explosion.
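Self-stabilization is easiest to see on a concrete protocol. The sketch below simulates Dijkstra's classic K-state token ring under a central scheduler (not the protocols or the synthesis heuristics of the dissertation): started from an arbitrary, possibly illegitimate state, the ring converges to states with exactly one privilege and stays there.

```python
import random

def privileged(x, i, K):
    """Process 0 is privileged iff x[0]==x[n-1]; process i>0 iff x[i]!=x[i-1]."""
    return x[i] == x[-1] if i == 0 else x[i] != x[i - 1]

def fire(x, i, K):
    """Apply the move of a privileged process i (Dijkstra's K-state rules)."""
    if i == 0:
        x[0] = (x[0] + 1) % K
    else:
        x[i] = x[i - 1]

def simulate(n=6, K=7, steps=60, seed=0):
    rng = random.Random(seed)
    x = [rng.randrange(K) for _ in range(n)]        # arbitrary initial state
    for t in range(steps):
        priv = [i for i in range(n) if privileged(x, i, K)]
        if t % 10 == 0:
            print(f"step {t:2d}: privileges = {len(priv)}  state = {x}")
        fire(x, rng.choice(priv), K)                # central daemon picks one move
    final_priv = [i for i in range(n) if privileged(x, i, K)]
    print("final   : privileges =", len(final_priv), " state =", x)

simulate()
```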
Abstract:
BACKGROUND: In this paper, we present a new method for the calibration of a microscope and its registration using an active optical tracker. METHODS: In practice, both operations are done simultaneously by moving an active optical marker within the field of view of the two devices. The IR LEDs composing the marker are first segmented from the microscope images. By knowing their corresponding three-dimensional (3D) positions in the optical tracker reference system, it is possible to find the transformation matrix between the reference frames of the two devices. Registration and calibration parameters can be extracted directly from that transformation. In addition, since the zoom and focus can be modified by the surgeon during the operation, we propose a spline-based method to update the camera model to the new setup. RESULTS: The proposed technique is currently being used in an augmented reality system for image-guided surgery in the fields of ear, nose and throat (ENT) and craniomaxillofacial surgery. CONCLUSIONS: The results have proved to be accurate, and the technique is a fast, dynamic and reliable way to calibrate and register the two devices in an OR environment.
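A minimal sketch (not the authors' implementation) of the core registration step: given corresponding 3D marker positions expressed in the optical-tracker frame and in the microscope frame, the rigid transformation between the two reference frames can be estimated in closed form with an SVD (Kabsch/Procrustes). The point coordinates below are synthetic.

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rotation R and translation t such that dst ~ R @ src + t."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
    R = Vt.T @ D @ U.T
    t = c_dst - R @ c_src
    return R, t

# Synthetic LED positions in the tracker frame, and the same points seen in the
# microscope frame (known ground-truth rotation/translation plus a little noise).
rng = np.random.default_rng(3)
tracker_pts = rng.uniform(-50, 50, size=(8, 3))
angle = np.radians(20)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
t_true = np.array([12.0, -4.0, 30.0])
microscope_pts = tracker_pts @ R_true.T + t_true + rng.normal(0, 0.05, (8, 3))

R_est, t_est = rigid_transform(tracker_pts, microscope_pts)
residual = microscope_pts - (tracker_pts @ R_est.T + t_est)
print("RMS registration error (same units as input):",
      round(float(np.sqrt((residual ** 2).mean())), 3))
```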
Abstract:
This study develops an automated analysis tool by combining total internal reflection fluorescence microscopy (TIRFM), an evanescent-wave microscopic imaging technique used to capture time-sequential images, with corresponding image-processing Matlab code that identifies the movements of single individual particles. The developed code enables the examination of two-dimensional hindered tangential Brownian motion of nanoparticles with sub-pixel (nanoscale) resolution. The measured mean square displacements of the nanoparticles are compared with theoretical predictions to estimate particle diameters and fluid viscosity using a nonlinear regression technique. These estimates are checked against the diameters and viscosities given by the manufacturers to validate the analysis tool. The nanoparticles used in these experiments are yellow-green polystyrene fluorescent nanospheres (200 nm, 500 nm and 1000 nm in nominal diameter; 505 nm excitation and 515 nm emission wavelengths). The solutions used are de-ionized (DI) water, 10% d-glucose and 10% glycerol. Mean square displacements obtained near the surface show significant deviation from theoretical predictions, which is attributed to DLVO forces in that region, but they conform to theoretical predictions beyond ~125 nm from the surface. Unlike traditional measurement techniques that require fixing the cells, the proposed automated analysis tool can be powerfully employed in bio-application fields such as single-protein (DNA and/or vesicle) tracking, drug delivery, and cytotoxicity studies. Furthermore, this tool can also be usefully applied in the microfluidic areas of non-invasive thermometry, particle tracking velocimetry (PTV), and non-invasive viscometry.
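The core of the analysis pipeline described above reduces to a few standard steps; here is a minimal Python sketch under simplified assumptions (free 2D Brownian motion, no near-wall hindrance correction, viscosity assumed known): compute the mean square displacement from a trajectory, fit MSD = 4DΔt, and invert the Stokes-Einstein relation to estimate the particle diameter. The trajectory is simulated rather than taken from TIRFM data.

```python
import numpy as np

kB, T = 1.380649e-23, 298.15          # Boltzmann constant (J/K), temperature (K)
eta = 0.89e-3                         # Pa*s, water at ~25 C (assumed known)
dt = 0.01                             # s per frame
d_true = 500e-9                       # m, nominal bead diameter for the synthetic track

# Simulate a 2D Brownian trajectory with the diffusion coefficient of a d_true bead.
D_true = kB * T / (3 * np.pi * eta * d_true)
steps = np.random.default_rng(4).normal(0, np.sqrt(2 * D_true * dt), size=(20000, 2))
xy = np.cumsum(steps, axis=0)

# Mean square displacement as a function of lag time.
lags = np.arange(1, 21)
msd = np.array([np.mean(np.sum((xy[l:] - xy[:-l]) ** 2, axis=1)) for l in lags])

# Fit MSD = 4*D*tau (slope through the origin) and invert Stokes-Einstein.
tau = lags * dt
D_fit = float(np.sum(msd * tau) / np.sum(tau ** 2)) / 4.0
d_est = kB * T / (3 * np.pi * eta * D_fit)
print(f"estimated diameter: {d_est * 1e9:.0f} nm (nominal {d_true * 1e9:.0f} nm)")
```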