884 results for "Fully automated"
Abstract:
This paper presents a robust finite element procedure for modelling the behaviour of postbuckling structures undergoing mode-jumping. Current non-linear implicit finite element solution schemes, found in most finite element codes, are discussed and their shortcomings highlighted. A more effective strategy is presented which combines a quasi-static and a pseudo-transient routine for modelling this behaviour. The switching between these two schemes is fully automated, eliminating the need for user intervention during the solution process. The quasi-static response is modelled using the arc-length constraint, while the pseudo-transient routine uses a modified explicit dynamic routine that is more computationally efficient than standard implicit and explicit dynamic schemes. The strategies for switching between the quasi-static and pseudo-transient routines are presented.
Abstract:
The analysis of alcoholic beverages for the important carcinogenic contaminant ethyl carbamate is very time-consuming and expensive. Due to possible matrix interferences, sample cleanup using a diatomaceous earth (Extrelut) column is required prior to gas chromatographic and mass spectrometric measurement. A limiting step in this process is the rotary evaporation of the eluate containing the analyte in organic solvents, which is currently conducted manually and requires approximately 20-30 min per sample. This paper introduces the use of a parallel evaporation device for ethyl carbamate analysis, which allows the simultaneous evaporation of 12 samples to a specified residual volume without manual intervention. A more efficient and less expensive analysis is therefore possible. Method validation showed no differences between the fully automated parallel evaporation and the manual operation. Applicability was proven by analyzing authentic spirit samples from Germany, Canada and Brazil. It is interesting to note that Brazilian cachaças had a relatively high incidence of ethyl carbamate contamination (55% of all samples were above 0.15 mg/l), which may be of public health relevance and requires further evaluation.
Abstract:
PURPOSE: Quantification of retinal layers using automated segmentation of optical coherence tomography (OCT) images allows for longitudinal studies of retinal and neurological disorders in mice. The purpose of this study was to compare the performance of automated retinal layer segmentation algorithms against manual segmentation in mice using the Spectralis OCT. METHODS: Spectral domain OCT images from 55 mice from three different strains were analyzed in total. The OCT scans from 22 C57Bl/6, 22 BALBc, and 11 C3A.Cg-Pde6b(+)Prph2(Rd2)/J mice were segmented automatically using three commercially available retinal segmentation algorithms and compared to manual segmentation. RESULTS: Fully automated segmentation performed well in mice, showing coefficients of variation (CV) below 5% for total retinal volume. However, all three automated segmentation algorithms yielded substantially thicker total retinal thickness values than manual segmentation (P < 0.0001) due to segmentation errors at the basement membrane. CONCLUSIONS: Whereas the automated retinal segmentation algorithms performed well for the inner layers, the retinal pigment epithelium (RPE) was delineated within the sclera, leading to consistently thicker measurements of the photoreceptor layer and the total retina. TRANSLATIONAL RELEVANCE: The introduction of spectral domain OCT allows for accurate imaging of the mouse retina. Exact quantification of retinal layer thicknesses in mice is important for studying layers of interest under various pathological conditions.
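As a side note on the metric used above: the coefficient of variation is simply the standard deviation expressed as a percentage of the mean. A minimal Python sketch, with hypothetical repeated volume measurements (values illustrative, not from the study):

    import numpy as np

    def coefficient_of_variation(values):
        """CV in percent: sample standard deviation relative to the mean."""
        values = np.asarray(values, dtype=float)
        return 100.0 * values.std(ddof=1) / values.mean()

    # Hypothetical repeated total-retinal-volume measurements (mm^3)
    print(round(coefficient_of_variation([2.10, 2.14, 2.08, 2.12]), 2))  # ~1.22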
Abstract:
We present tools for rapid and quantitative detection of sediment lamination. The BMPix tool extracts color and gray-scale curves from images at pixel resolution. The PEAK tool uses the gray-scale curve and performs, for the first time, fully automated counting of laminae based on three methods. The maximum count algorithm counts every bright peak of a couplet of two laminae (annual resolution) in a smoothed curve. The zero-crossing algorithm counts every positive and negative halfway passage of the curve through a wide moving average, separating the record into bright and dark intervals (seasonal resolution). The same is true for the frequency truncation method, which uses Fourier transformation to decompose the curve into its frequency components before counting positive and negative passages. We applied the new methods successfully to tree rings, to well-dated and previously manually counted marine varves from Saanich Inlet, and to marine laminae from the Antarctic continental margin. In combination with AMS ¹⁴C dating, we found convincing evidence that laminations at Weddell Sea sites represent varves, deposited continuously over several millennia during the Last Glacial Maximum. The new tools offer several advantages over previous methods. The counting procedures are based on a moving average generated from gray-scale curves instead of manual counting. Hence, results are highly objective and rely on reproducible mathematical criteria. Also, the PEAK tool measures the thickness of each year or season. Since all required information is displayed graphically, interactive optimization of the counting algorithms can be achieved quickly and conveniently.
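The zero-crossing method described above lends itself to a compact sketch: subtract a wide moving average from a lightly smoothed gray-scale curve and count sign changes, each marking a passage between a bright and a dark interval. This is a minimal illustration under assumed window sizes and a synthetic test curve, not the PEAK implementation:

    import numpy as np

    def count_zero_crossings(gray, baseline_window=51, smooth_window=9):
        """Count bright/dark interval boundaries (seasonal resolution) as
        passages of the gray-scale curve through its wide moving average."""
        smooth = np.convolve(gray, np.ones(smooth_window) / smooth_window,
                             mode="same")    # suppress spurious sign flips
        baseline = np.convolve(gray, np.ones(baseline_window) / baseline_window,
                               mode="same")  # slowly varying reference
        deviation = np.sign(smooth - baseline)
        return np.count_nonzero(np.diff(deviation) != 0)

    # Synthetic curve with 20 bright/dark couplets plus noise
    x = np.linspace(0, 20 * 2 * np.pi, 2000)
    gray = 128 + 40 * np.sin(x) + np.random.normal(0, 3, x.size)
    print(count_zero_crossings(gray))  # ~40 passages -> ~20 couplets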
Abstract:
Computed tomography (CT) is a valuable technology to the healthcare enterprise, as evidenced by the more than 70 million CT exams performed every year. As a result, CT has become the largest contributor to population doses amongst all medical imaging modalities that utilize man-made ionizing radiation. Because ionizing radiation poses a health risk, a balance must be struck between diagnostic benefit and radiation dose. Thus, to ensure that CT scanners are used optimally in the clinic, an understanding and characterization of image quality and radiation dose are essential.
The state of the art in both image quality characterization and radiation dose estimation in CT depends on phantom-based measurements reflective of systems and protocols. For image quality characterization, measurements are performed on inserts embedded in static phantoms and the results are ascribed to clinical CT images. However, the key objective for image quality assessment should be its quantification in clinical images themselves; that is the only characterization of image quality that clinically matters, as it is most directly related to the actual quality of clinical images. Moreover, for dose estimation, phantom-based dose metrics, such as the CT dose index (CTDI) and size-specific dose estimates (SSDE), are measured by the scanner and referenced as indicators of radiation exposure. However, CTDI and SSDE are surrogates for dose, rather than dose per se.
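To illustrate why SSDE remains a surrogate: it rescales the phantom-based CTDIvol by a patient-size-dependent conversion factor rather than measuring dose in the patient. A minimal Python sketch, assuming the exponential fit reported in AAPM Report 204 for the 32 cm reference phantom (coefficients quoted from that report and worth double-checking):

    import math

    def ssde_from_ctdivol(ctdivol_mGy, effective_diameter_cm):
        """Size-specific dose estimate from CTDIvol (AAPM Report 204,
        32 cm reference phantom): f(D) = a * exp(-b * D)."""
        a, b = 3.704369, 0.03671937  # published fit coefficients
        return ctdivol_mGy * a * math.exp(-b * effective_diameter_cm)

    # Example: 10 mGy CTDIvol for a patient of 25 cm effective diameter
    print(round(ssde_from_ctdivol(10.0, 25.0), 1))  # ~14.8 mGy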
Currently there are several software packages that track the CTDI and SSDE associated with individual CT examinations. This is primarily the result of two pressures. The first comes from regulators and governments, which press clinics and hospitals to monitor the radiation exposure of individuals in our society. The second comes from patients personally concerned about the health risks associated with the ionizing radiation they receive during their diagnostic procedures.
An idea that resonates with clinical imaging physicists is that patients come to the clinic to acquire quality images so they can receive a proper diagnosis, not to be exposed to ionizing radiation. Thus, while it is important to monitor the dose to patients undergoing CT examinations, it is equally, if not more, important to monitor the quality of the clinical images generated by the CT scanners throughout the hospital.
The purposes of the work presented in this thesis are threefold: (1) to develop and validate a fully automated technique to measure spatial resolution in clinical CT images, (2) to develop and validate a fully automated technique to measure image contrast in clinical CT images, and (3) to develop a fully automated technique to estimate radiation dose (not surrogates for dose) from a variety of clinical CT protocols.
Abstract:
This dissertation describes the development of a label-free, electrochemical immunosensing platform integrated into a low-cost microfluidic system for the sensitive, selective and accurate detection of cortisol, a steroid hormone correlated with many physiological disorders. Abnormal levels of cortisol are indicative of conditions such as Cushing’s syndrome, Addison’s disease, adrenal insufficiencies and, more recently, post-traumatic stress disorder (PTSD). Electrochemical detection of immuno-complex formation is utilized for the sensitive detection of cortisol using anti-cortisol antibodies immobilized on sensing electrodes. Electrochemical techniques such as cyclic voltammetry (CV) and electrochemical impedance spectroscopy (EIS) have been utilized for the characterization and label-free sensing of cortisol. The use of nanomaterials as the immobilizing matrix for anti-cortisol antibodies, which leads to improved sensor response, has been explored. A hybrid nanocomposite of polyaniline-Ag/AgO film has been fabricated onto an Au substrate using electrophoretic deposition for electrochemical immunosensing of cortisol. Using a conventional 3-electrode electrochemical cell, a linear sensing range of 1 pM to 1 µM, a sensitivity of 66 µA/M and a detection limit of 0.64 pg/mL have been demonstrated for the detection of cortisol. Alternatively, a self-assembled monolayer (SAM) of dithiobis(succinimidyl propionate) (DTSP) has been used to modify the sensing electrode for immobilization of anti-cortisol antibodies. To increase the sensitivity at a lower detection limit and to develop a point-of-care sensing platform, the DTSP-SAM has been fabricated on micromachined interdigitated microelectrodes (µIDEs). Detection of cortisol is demonstrated at a sensitivity of 20.7 µA/M and a detection limit of 10 pg/mL over a linear sensing range of 10 pM to 200 nM using the µIDEs. A simple, low-cost microfluidic system is designed using low-temperature co-fired ceramics (LTCC) technology for integration of the electrochemical cortisol immunosensor and automation of the immunoassay. For the first time, the non-specific adsorption of analyte on LTCC has been characterized for microfluidic applications. The design, fabrication technique and fluidic characterization of the immunoassay are presented. The DTSP-SAM-based electrochemical immunosensor on µIDEs is integrated into the LTCC microfluidic system, and cortisol detection is achieved in a fully automated assay. The fully automated microfluidic immunosensor holds great promise for accurate, sensitive detection of cortisol in point-of-care applications.
Abstract:
Transit agencies across the world are increasingly shifting their fare collection mechanisms towards fully automated systems such as the smart card. One of the objectives in implementing such a system is to reduce the boarding time per passenger and hence the overall dwell time of buses at bus stops and bus rapid transit (BRT) stations. TransLink, the transit authority responsible for public transport management in South East Queensland, has introduced ‘GoCard’ technology using the Cubic platform for fare collection on its public transport system. In addition, three inner-city BRT stations on the South East Busway spine operate as pre-paid platforms during the evening peak. This paper evaluates the effects of these multiple policy measures on the operation of the study busway station. The comparison between pre- and post-policy scenarios suggests that boarding time per passenger has decreased, while alighting time per passenger has increased slightly. However, a substantial reduction in operating efficiency was observed at the station.
Abstract:
Public key cryptography, and with it the ability to compute digital signatures, has made it possible for electronic commerce to flourish. It is thus unsurprising that the proposed Australian NECS will also utilise digital signatures in its system so as to provide a fully automated process, from the creation of electronic land title instruments to the digital signing and electronic lodgment of these instruments. This necessitates an analysis of the fraud risks raised by the use of digital signatures, because a compromise of the integrity of digital signatures will lead to a compromise of the Torrens system itself. This article will show that digital signatures may in fact offer greater security against fraud than handwritten signatures; but to achieve this, digital signatures require an infrastructure in which each component is properly implemented and managed.
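As background to the signing step discussed above, the sign/verify round trip can be sketched in Python with the cryptography package and Ed25519 keys. This illustrates the cryptographic primitive only, not the NECS infrastructure; any alteration of the instrument or the signature makes verification fail:

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    instrument = b"electronic land title instrument"

    private_key = Ed25519PrivateKey.generate()  # held only by the signer
    public_key = private_key.public_key()       # distributed via the PKI

    signature = private_key.sign(instrument)

    try:
        public_key.verify(signature, instrument)
        print("signature valid")
    except InvalidSignature:
        print("signature invalid: instrument or signature was altered")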
Abstract:
This paper presents an automated system for 3D assembly of tissue engineering (TE) scaffolds made from biocompatible microscopic building blocks with relatively large fabrication error. It focuses on the pin-into-hole force control developed for this demanding microassembly task. A beam-like gripper with integrated force sensing, offering 3 mN resolution over a 500 mN measuring range, is designed and used to implement an admittance force-controlled insertion using commercial precision stages. Vision-based alignment followed by insertion is complemented by a haptic exploration strategy using force and position information. The system demonstrates fully automated construction of TE scaffolds with 50 microparts whose dimensional error is larger than 5%.
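The admittance force control named above can be sketched as a simple law in which excess contact force is converted into a retract velocity, so the pin backs off when insertion resistance rises. Gains, feed rate and units below are illustrative assumptions, not the authors' tuned values:

    def admittance_insertion_step(f_measured_mN, f_target_mN, dt_s,
                                  compliance_mm_per_mN_s=0.002,
                                  feed_mm_s=0.05):
        """One control step of a force-limited pin-into-hole insertion:
        nominal downward feed minus an admittance term on excess force."""
        excess = f_measured_mN - f_target_mN
        v_cmd = feed_mm_s - compliance_mm_per_mN_s * excess
        return v_cmd * dt_s  # position increment commanded to the stage

    # Sensor reads 180 mN against a 100 mN target: the stage retracts
    print(admittance_insertion_step(180.0, 100.0, dt_s=0.01))  # -0.0011 mm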
Abstract:
Purpose: To analyze the repeatability of measuring nerve fiber length (NFL) from images of the human corneal subbasal nerve plexus using semiautomated software. Methods: Images were captured from the corneas of 50 subjects with type 2 diabetes mellitus who showed varying severity of neuropathy, using the Heidelberg Retina Tomograph 3 with Rostock Corneal Module. Semiautomated nerve analysis software was used independently by two observers to determine NFL from images of the subbasal nerve plexus. This procedure was undertaken on two occasions, 3 days apart. Results: The intraclass correlation coefficient values were 0.95 (95% confidence interval: 0.92–0.97) for individual subjects and 0.95 (95% confidence interval: 0.74–1.00) for observers. Bland-Altman plots of the NFL values indicated a reduced spread of data at lower NFL values. The overall spread of data was less for (a) the observer who was more experienced at analyzing nerve fiber images and (b) the second measurement occasion. Conclusions: Semiautomated measurement of NFL in the subbasal nerve fiber layer is highly repeatable. Repeatability can be enhanced by using more experienced observers. It may be possible to improve repeatability markedly when measuring this anatomic structure using fully automated image analysis software.
Abstract:
Acoustic sensors play an important role in augmenting the traditional biodiversity monitoring activities carried out by ecologists and conservation biologists. With this ability, however, comes the burden of analysing large volumes of complex acoustic data. Given this complexity, fully automated analysis for a wide range of species is still a significant challenge. This research investigates the use of citizen scientists to analyse large volumes of environmental acoustic data in order to identify bird species. Specifically, it investigates ways in which the efficiency of a user can be improved through the use of species identification tools, and the use of reputation models to predict the accuracy of users with unidentified skill levels. Initial experimental results are reported.
The use of virtual prototyping to rehearse the sequence of construction work involving mobile cranes
Abstract:
Purpose – Rehearsing practical site operations is without doubt one of the most effective methods for minimising planning mistakes, because of the learning that takes place during the rehearsal activity. However, real rehearsal is not a practical solution for on-site construction activities, as it not only involves considerable cost but can also have adverse environmental implications. One approach to overcoming this is the use of virtual rehearsals. The purpose of this paper is to investigate an approach to simulating the motion of cranes in order to test the feasibility of associated construction sequencing and generate construction schedules for review and visualisation. Design/methodology/approach – The paper describes a system involving two technologies, virtual prototyping (VP) and four-dimensional (4D) simulation, to assist construction planners in testing the sequence of construction activities when mobile cranes are involved. The system consists of five modules, comprising input, database, equipment, process and output, and is capable of detecting potential collisions. A real-world trial is described in which the system was tested and validated. Findings – Feedback from the planners involved in the trial indicated that they found the system useful in its present form and that they would welcome its further development into a fully automated platform for validating construction sequencing decisions. Research limitations/implications – The tool has the potential to provide a cost-effective means of improving construction planning. However, it is at present limited to the specific case of crane movement. Originality/value – This paper presents a large-scale, real-life case of applying VP technology in planning construction processes and activities.
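The abstract does not detail the collision-detection algorithm; one common conservative approach, given here purely as a hedged sketch, wraps crane parts and site objects in bounding spheres and flags any pair whose centre distance falls below the sum of the radii:

    import numpy as np

    def spheres_collide(centre_a, radius_a, centre_b, radius_b):
        """Conservative collision test between two bounding spheres."""
        gap = np.linalg.norm(np.asarray(centre_a) - np.asarray(centre_b))
        return gap < (radius_a + radius_b)

    # Hypothetical boom-tip envelope vs. scaffolding envelope (metres)
    print(spheres_collide((12.0, 5.0, 18.0), 1.5,
                          (13.0, 5.5, 17.0), 2.0))  # True -> flag for review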
Abstract:
Irrigation is known to stimulate soil microbial carbon and nitrogen turnover and potentially the emissions of nitrous oxide (N2O) and carbon dioxide (CO2). We conducted a study to evaluate the effect of three different irrigation intensities on soil N2O and CO2 fluxes and to determine whether irrigation management can be used to mitigate N2O emissions from irrigated cotton on black vertisols in South-Eastern Queensland, Australia. Fluxes were measured over the entire 2009/2010 cotton growing season with a fully automated chamber system that measured emissions on a sub-daily basis. Irrigation intensity had a significant effect on CO2 emissions. More frequent irrigation stimulated soil respiration, and seasonal CO2 fluxes ranged from 2.7 to 4.1 Mg-C ha−1 for the treatments with the lowest and highest irrigation frequency, respectively. N2O emissions were episodic, with the highest emissions occurring when heavy rainfall or irrigation coincided with elevated soil mineral N levels; seasonal emissions ranged from 0.80 to 1.07 kg N2O-N ha−1 across the treatments. Emission factors (EF = proportion of N fertilizer emitted as N2O) over the cotton cropping season, uncorrected for background emissions, ranged from 0.40 to 0.53% of total N applied for the different treatments. There was no significant effect of the irrigation treatments on soil N2O fluxes, because the highest emissions in all treatments followed heavy rainfall from a series of summer thunderstorms, which overrode the effect of the irrigation treatment. However, higher irrigation intensity increased cotton yield and therefore reduced the N2O intensity (N2O emission per unit lint yield) of this cropping system. Our data suggest that there is only limited scope to reduce absolute N2O emissions through different irrigation intensities in irrigated cotton systems with summer-dominated rainfall. However, the significant impact of the irrigation treatments on N2O intensity clearly shows that irrigation can be used to optimize the N2O intensity of such a system.
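For clarity, the two reported metrics are simple ratios: the emission factor relates seasonal N2O-N loss to the fertilizer N applied, and the N2O intensity relates it to lint yield. A minimal sketch, with an assumed application rate of 200 kg N ha−1 and lint yield of 2 t ha−1 (neither value is taken from the study):

    def emission_factor_pct(seasonal_n2o_n_kg_ha, n_applied_kg_ha):
        """EF: share of applied fertilizer N emitted as N2O-N,
        uncorrected for background emissions."""
        return 100.0 * seasonal_n2o_n_kg_ha / n_applied_kg_ha

    def n2o_intensity(seasonal_n2o_n_kg_ha, lint_yield_t_ha):
        """N2O intensity: seasonal emission per unit of lint yield."""
        return seasonal_n2o_n_kg_ha / lint_yield_t_ha

    print(emission_factor_pct(0.80, 200.0))  # 0.4 % of applied N
    print(n2o_intensity(0.80, 2.0))          # 0.4 kg N2O-N per t lint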
Abstract:
Background and Aims: Irrigation management affects soil water dynamics as well as soil microbial carbon and nitrogen turnover, and potentially the biosphere-atmosphere exchange of greenhouse gases (GHGs). We present a study on the effect of three irrigation treatments on the emissions of nitrous oxide (N2O) from irrigated wheat on black vertisols in South-Eastern Queensland, Australia. Methods: Soil N2O fluxes from wheat were monitored over one season with a fully automated system that measured emissions on a sub-daily basis. Measurements were taken from three subplots for each treatment within a randomized split-plot design. Results: The highest N2O emissions occurred after rainfall or irrigation, and the amount of irrigation water applied was found to influence the magnitude of these “emission pulses”. Daily N2O emissions varied from −0.74 to 20.46 g N2O-N ha−1 day−1, resulting in seasonal losses ranging from 0.43 to 0.75 kg N2O-N ha−1 season−1 for the different irrigation treatments. Emission factors (EF = proportion of N fertilizer emitted as N2O) over the wheat cropping season, uncorrected for background emissions, ranged from 0.2 to 0.4% of total N applied for the different treatments. The highest seasonal N2O emissions were observed in the treatment with the highest irrigation intensity; however, the N2O intensity (N2O emission per unit crop yield) was highest in the treatment with the lowest irrigation intensity. Conclusions: Our data suggest that the timing and amount of irrigation can effectively be used to reduce N2O losses from irrigated agricultural systems; however, in order to develop sustainable mitigation strategies, the N2O intensity of a cropping system is an important concept that needs to be taken into account.
Abstract:
Internet chatrooms are a common means of interaction and communication, and they carry valuable information about the formal or ad-hoc formation of groups with diverse objectives. This work presents a fully automated surveillance system for data collection and analysis in Internet chatrooms. The system has two components. First, an eavesdropping tool collects statistics on individual (chatter) and chatroom behavior; these data can be used to profile a chatroom and its chatters. Second, a computational discovery algorithm based on Singular Value Decomposition (SVD) locates hidden communities and communication patterns within a chatroom. The eavesdropping tool is used for fine-tuning the SVD-based discovery algorithm, which can be deployed in real time and requires no semantic information processing. Evaluation of the system on real data shows that (i) the statistical properties of different chatrooms vary significantly, so profiling is possible, and (ii) the SVD-based algorithm has up to 70-80% accuracy in discovering groups of chatters.
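As an illustration of the SVD-based discovery step, one can factor a chatter-by-time activity matrix and group chatters by the sign of their loading on the leading singular vector. The matrix below is hypothetical, and the one-component sign rule is a simplification of such algorithms:

    import numpy as np

    # Hypothetical activity matrix: rows = chatters, columns = time bins,
    # entries = message counts (illustrative values only).
    A = np.array([
        [5, 4, 6, 0, 0, 1],
        [4, 5, 5, 1, 0, 0],
        [0, 1, 0, 6, 5, 4],
        [1, 0, 0, 5, 6, 5],
    ], dtype=float)

    # Centre each chatter's activity, then factor into orthogonal components.
    U, s, Vt = np.linalg.svd(A - A.mean(axis=1, keepdims=True),
                             full_matrices=False)

    # Chatters with the same sign on the leading left singular vector load
    # on the same dominant component: a candidate hidden community.
    print(np.sign(U[:, 0]))  # e.g. [ 1.  1. -1. -1.]: two groups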