954 results for Computer software -- Reliability


Relevance: 80.00%

Abstract:

One of the challenges that neuroscience researchers pose to biomedical engineers is brain-machine interaction. The nervous system communicates through electrochemical signals, and implantable circuits make decisions in order to interact with the biological environment. It is well known that Parkinson's disease is related to a deficit of dopamine (DA). Different methods have been employed to control dopamine concentration, such as magnetic or electrical stimulators or drugs, but automatic control of neurotransmitter concentration is not currently employed; that was the goal of this work. To achieve it, four systems were designed and developed: deep brain stimulation (DBS), transcranial magnetic stimulation (TMS), infusion pump control (IPC) for drug delivery, and fast-scan cyclic voltammetry (FSCV), a sensing circuit that detects the varying concentrations of neurotransmitters such as dopamine produced by these stimulations. Software was also developed to display and analyse data in synchrony with events in the experiments. The system's flexibility is such that the infusion pumps, DBS, or TMS can be used alone or combined with other stimulation techniques such as lights and sounds. The developed system automatically controls the concentration of DA. Its resolution is around 0.4 µmol/L, with a concentration-correction interval adjustable between 1 and 90 seconds. The system controls DA concentrations between 1 and 10 µmol/L with an error of about ±0.8 µmol/L. Although designed to control DA concentration, it can also be used to control the concentration of other substances. It is proposed to continue the closed-loop development with FSCV and DBS (or TMS, or infusion) using parkinsonian animal models.
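The closed-loop idea described above can be sketched in a few lines. This is a hypothetical toy model, not the thesis implementation: the controller, gain, clearance rate, and dynamics are all illustrative stand-ins for the FSCV-sensing/infusion-actuation loop.

```python
def control_step(measured_umol_l, target_umol_l, gain=0.5):
    """Simple proportional rule: infuse in proportion to the shortfall;
    negative errors (overshoot) produce no infusion."""
    error = target_umol_l - measured_umol_l
    return max(0.0, gain * error)

def simulate(target=5.0, steps=60, clearance=0.1):
    """Toy simulation: each cycle the brain clears a fraction of DA and the
    controller infuses more. One cycle stands in for the 1-90 s correction
    interval mentioned in the abstract."""
    conc = 1.0
    history = []
    for _ in range(steps):
        conc += control_step(conc, target)
        conc *= (1.0 - clearance)   # first-order clearance of DA
        history.append(conc)
    return history

history = simulate()
# The concentration rises from its initial value and settles where the
# infusion added per cycle balances the amount cleared per cycle.
```

With these made-up constants the loop converges to a steady state slightly below the setpoint, which is the expected behaviour of a purely proportional controller with a constant disturbance.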

Relevance: 80.00%

Abstract:

Over the past several decades, thousands of otoliths, bivalve shells, and scales have been collected for the purposes of age determination and remain archived in European and North American fisheries laboratories. Advances in digital imaging and computer software, combined with techniques developed by tree-ring scientists, provide a means by which to extract additional levels of information from these calcified structures and generate annually resolved (one value per year), multidecadal time series of population-level growth anomalies. Chemical and isotopic properties may also be extracted to provide additional information regarding the environmental conditions these organisms experienced. Given that they are exactly placed in time, chronologies can be directly compared to instrumental climate records, chronologies from other regions or species, or time series of other biological phenomena. In this way, chronologies may be used to reconstruct historical ranges of environmental variability, identify climatic drivers of growth, establish linkages within and among species, and generate ecosystem-level indicators. Following the first workshop in Hamburg, Germany, in December 2014, the second workshop on Growth Increment Chronologies in Marine Fish: climate-ecosystem interactions in the North Atlantic (WKGIC2) met at the Mediterranean Institute for Advanced Studies headquarters in Esporles, Spain, on 18–22 April 2016, chaired by Bryan Black (USA) and Christoph Stransky (Germany). Thirty-six participants from fifteen different countries attended.
Objectives were to i) review the applications of chronologies developed from growth-increment widths in the hard parts (otoliths, shells, scales) of marine fish and bivalve species, ii) review the fundamentals of crossdating and chronology development, iii) discuss assumptions and limitations of these approaches, iv) measure otolith growth-increment widths in image analysis software, v) learn software to statistically check increment dating accuracy, vi) generate a growth-increment chronology and relate it to climate indices, and vii) initiate cooperative projects or training exercises to commence after the workshop. The workshop began with an overview of tree-ring techniques of chronology development, including a hands-on crossdating exercise. Next, we discussed the applications of fish and bivalve biochronologies and the range of issues that could be addressed. We then reviewed key assumptions and limitations, especially those associated with short-lived species, for which there are numerous and extensive otolith archives in European fisheries labs. Next, participants were provided with images of European plaice otoliths from the North Sea and taught to measure increment widths in image analysis software. Upon completion of measurements, techniques of chronology development were discussed and contrasted with those that have been applied for long-lived species. Plaice growth time series were then related to environmental variability using the KNMI Climate Explorer. Finally, potential future collaborations and funding opportunities were discussed, and there was a clear desire to meet again to compare various statistical techniques for chronology development using a range of existing fish, bivalve, and tree growth-increment datasets. Overall, we hope to increase the use of these techniques and, over the long term, develop networks of biochronologies for integrative analyses of ecosystem functioning and relationships to long-term climate variability and fishing pressure.
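The core chronology-building step described above can be sketched very simply. This is an illustrative outline only (not workshop code, and far simpler than real detrending): each fish's increment-width series is divided by its own mean to give dimensionless growth indices, and the indices are then averaged year by year into a master chronology. The fish IDs and widths below are made up.

```python
def detrend(widths):
    """Divide each increment width by the series mean, so every individual
    contributes anomalies centred on 1.0 regardless of absolute growth rate."""
    mean = sum(widths) / len(widths)
    return [w / mean for w in widths]

def build_chronology(series_by_fish):
    """series_by_fish maps fish id -> {year: increment width}. Returns
    {year: mean growth index over all fish measured in that year}."""
    by_year = {}
    for fish, series in series_by_fish.items():
        years = sorted(series)
        for year, index in zip(years, detrend([series[y] for y in years])):
            by_year.setdefault(year, []).append(index)
    return {y: sum(v) / len(v) for y, v in by_year.items()}

fish = {
    "A": {2000: 1.2, 2001: 0.8, 2002: 1.0},   # hypothetical widths (mm)
    "B": {2001: 2.4, 2002: 3.0, 2003: 3.6},
}
chron = build_chronology(fish)
```

Because the two fish are detrended individually, the fast-growing fish "B" does not dominate the average; a year-by-year series like `chron` is exactly what would then be correlated against a climate index.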

Relevance: 80.00%

Abstract:

Nanotechnology has revolutionised humanity's capability to build microscopic systems by manipulating materials on a molecular and atomic scale. Nanosystems are becoming increasingly smaller and more chemically complex, which increases the demand for microscopic characterisation techniques. Among others, transmission electron microscopy (TEM) is an indispensable tool that is increasingly used to study the structures of nanosystems down to the molecular and atomic scale. However, despite its effectiveness, this tool can only provide 2-dimensional projection (shadow) images of the 3D structure, leaving the 3-dimensional information hidden, which can lead to incomplete or erroneous characterisation. One very promising inspection method is electron tomography (ET), which is rapidly becoming an important tool to explore the 3D nano-world. ET provides (sub-)nanometre resolution in all three dimensions of the sample under investigation. However, the fidelity of the tomogram achieved by current ET reconstruction procedures remains a major challenge. This thesis addresses the assessment and advancement of electron tomographic methods to enable high-fidelity three-dimensional investigations. A quality assessment investigation was conducted to provide a quantitative analysis of the main established ET reconstruction algorithms and to study the influence of the experimental conditions on the quality of the reconstructed tomogram. Regularly shaped nanoparticles were used as a ground truth for this study. It is concluded that the fidelity of the post-reconstruction quantitative analysis and segmentation is limited mainly by the fidelity of the reconstructed tomogram. This motivates the development of an improved tomographic reconstruction process. In this thesis, a novel ET method is proposed, named dictionary learning electron tomography (DLET).
DLET is based on the recent mathematical theory of compressed sensing (CS), which exploits the sparsity of ET tomograms to enable accurate reconstruction from undersampled (S)TEM tilt series. DLET learns the sparsifying transform (dictionary) adaptively and simultaneously reconstructs the tomogram from highly undersampled tilt series. In this method, sparsity is applied to overlapping image patches, favouring local structures. Furthermore, the dictionary is adapted to the specific tomogram instance, thereby achieving better sparsity and consequently higher-quality reconstructions. The reconstruction algorithm is based on an alternating procedure that learns the sparsifying dictionary and employs it to remove artifacts and noise in one step, and then restores the tomogram data in the other step. Simulated and real ET experiments on several morphologies were performed with a variety of setups. The reconstruction results validate the method's efficiency in both noiseless and noisy cases and show that it yields improved reconstruction quality with fast convergence. The proposed method enables the recovery of high-fidelity information without the need to worry about which sparsifying transform to select, or whether the images used strictly satisfy the pre-conditions of a certain transform (e.g. strictly piecewise constant for Total Variation minimisation). It thereby also avoids artifacts that specific sparsifying transforms can introduce (e.g. the staircase artifacts that may result from Total Variation minimisation). Moreover, this thesis shows how reliable, elementally sensitive tomography using EELS is possible through appropriate use of dual electron energy loss spectroscopy (DualEELS) together with the DLET compressed sensing algorithm, making the best use of the limited data volume and signal-to-noise ratio inherent in core-loss electron energy loss spectroscopy (EELS) of nanoparticles of an industrially important material.
Taken together, the results presented in this thesis demonstrate how high-fidelity ET reconstructions can be achieved using a compressed sensing approach.
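The sparsity principle that CS-based methods such as DLET rely on can be shown in a toy one-dimensional example (this is an illustration of the idea, not the thesis algorithm): a piecewise-constant "patch" is dense in the pixel basis but sparse in a Haar wavelet basis, so keeping only a few of its largest coefficients reconstructs it exactly.

```python
def haar_forward(x):
    """Non-normalised Haar transform of a list whose length is a power of 2."""
    details = []
    while len(x) > 1:
        avgs = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]
        diffs = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x), 2)]
        details = diffs + details   # coarser detail levels go in front
        x = avgs
    return x + details              # [overall average] + detail coefficients

def haar_inverse(coeffs):
    """Invert haar_forward by successively re-expanding averages + details."""
    x = coeffs[:1]
    pos = 1
    while pos < len(coeffs):
        diffs = coeffs[pos:pos + len(x)]
        x = [v for a, d in zip(x, diffs) for v in (a + d, a - d)]
        pos += len(diffs)
    return x

def keep_largest(coeffs, k):
    """Hard thresholding: zero all but the k largest-magnitude coefficients."""
    top = set(sorted(range(len(coeffs)), key=lambda i: abs(coeffs[i]),
                     reverse=True)[:k])
    return [c if i in top else 0.0 for i, c in enumerate(coeffs)]

patch = [5.0, 5.0, 5.0, 5.0, 0.0, 0.0, 0.0, 0.0]   # piecewise constant
sparse = keep_largest(haar_forward(patch), 2)       # only 2 nonzero terms
recovered = haar_inverse(sparse)                    # exact reconstruction
```

An 8-sample patch is recovered from just 2 coefficients; DLET's contribution is to *learn* such a dictionary from the data instead of fixing it in advance, so the same economy applies to patches that are not piecewise constant.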

Relevance: 80.00%

Abstract:

The increased prevalence of iron deficiency among infants can be attributed, among other factors, to the consumption of an iron-deficient diet, or of a diet that interferes with iron absorption, at the critical time of infancy. The gradual shift from breast milk to other foods and liquids is a transition period which greatly contributes to iron deficiency anaemia (IDA). The purpose of this research was to assess iron deficiency anaemia among infants aged six to nine months in Keiyo South Sub-County. The specific objectives of this study were to establish the prevalence of iron deficiency anaemia and the dietary iron intake among infants aged 6 to 9 months. A cross-sectional study design was adopted in this survey, which was conducted in three health facilities in Keiyo South Sub-County. The infants were selected by a two-stage cluster sampling procedure. Systematic random sampling was then used to select a total of 244 mothers and their infants: eighty-two (82) infants from Kamwosor sub-district hospital and eighty-one (81) from each of the Nyaru and Chepkorio health facilities. Interview schedules, a 24-hour dietary recall, and food frequency questionnaires were used to collect dietary iron intake data. Biochemical tests were carried out with the Hemo-control photometer at the health facilities. Infants whose haemoglobin levels were below 11 g/dl were considered anaemic. Further, peripheral blood smears were examined to ascertain the type of nutritional anaemia. Data were analysed using the Statistical Package for the Social Sciences (SPSS) computer software, version 17 (2009). Dietary iron intake was analysed using the NutriSurvey 2007 computer software. Results indicated that the mean haemoglobin value was 11.3 ± 0.84 g/dl. 21.7% of the infants had anaemia, and all (100%) of their peripheral blood smears indicated iron deficiency anaemia.
Dietary iron intake was a predictor of iron deficiency anaemia in this study (t = -3.138; p = 0.01). Iron deficiency anaemia was evident among infants in Keiyo South Sub-County. The Ministry of Health should formulate and implement policies on screening for anaemia, and should ensure intensive nutrition education on iron-rich diets during child welfare clinics.
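The classification rule used in the study is simple enough to state as code. The cutoff (haemoglobin below 11 g/dl) is taken from the abstract; the sample readings below are hypothetical, not study data.

```python
ANAEMIA_CUTOFF_G_DL = 11.0

def is_anaemic(hb_g_dl):
    """An infant is classified anaemic when haemoglobin is below 11 g/dl."""
    return hb_g_dl < ANAEMIA_CUTOFF_G_DL

def prevalence(hb_values):
    """Percentage of infants falling below the cutoff."""
    flagged = sum(1 for hb in hb_values if is_anaemic(hb))
    return 100.0 * flagged / len(hb_values)

sample = [11.3, 10.2, 12.0, 9.8, 11.0]   # hypothetical Hb readings, g/dl
rate = prevalence(sample)                # two of five fall below 11 g/dl
```

Note that a reading of exactly 11.0 g/dl is not classified as anaemic, matching the "less than 11 g/dl" wording.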

Relevance: 80.00%

Abstract:

Predicting user behaviour enables user-assistant services to personalise their services to the user. This requires a comprehensive user model, which can be created by monitoring user interactions and activities. BaranC is a framework that performs user interface (UI) monitoring (and collects all associated context data), builds a user model, and supports services that make use of the user model. A prediction service, Next-App, was built to demonstrate the use of the framework and to evaluate the usefulness of such a prediction service. Next-App analyses a user's data, learns patterns, builds a model for the user, and finally predicts, based on the user model and the current context, which application(s) the user is likely to want to use. The prediction is proactive and dynamic, reflecting the current context, and it also responds to changes in the user model, as might occur over time as a user's habits change. Initial evaluation of Next-App indicates a high level of satisfaction with the service.
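The kind of context-conditioned prediction Next-App performs can be sketched with a simple frequency model. This is an illustrative stand-in, not the Next-App implementation: the class name, the context buckets, and the app names are all invented, and the real service learns far richer patterns.

```python
from collections import Counter, defaultdict

class AppPredictor:
    """Count app launches per context bucket; predict the most frequent."""

    def __init__(self):
        self.counts = defaultdict(Counter)   # context -> Counter of apps

    def observe(self, context, app):
        """Record that `app` was launched while `context` held."""
        self.counts[context][app] += 1

    def predict(self, context, n=1):
        """Return the n most likely apps for this context (may be empty)."""
        return [app for app, _ in self.counts[context].most_common(n)]

p = AppPredictor()
for app in ["mail", "mail", "news"]:
    p.observe("morning", app)
p.observe("evening", "music")
```

Because the model is updated on every observation, the top prediction drifts as habits change, which mirrors the dynamic behaviour described in the abstract.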

Relevance: 80.00%

Abstract:

A comprehensive user model, built by monitoring a user's current use of applications, can be an excellent starting point for building adaptive user-centred applications. The BaranC framework monitors all user interaction with a digital device (e.g. a smartphone), and also collects all available context data (such as from sensors in the digital device itself, in a smart watch, or in smart appliances) in order to build a full model of user application behaviour. The model built from the collected data, called the UDI (User Digital Imprint), is further augmented by analysis services, for example a service that produces activity profiles from smartphone sensor data. The enhanced UDI model can then be the basis for building an appropriate adaptive application that is user-centred, as it is based on an individual user model. As BaranC supports continuous user monitoring, an application can adapt dynamically, in real time, to the current context (e.g. time, location, or activity). Furthermore, since BaranC continuously augments the user model with newly monitored data, the user model changes over time, and the adaptive application can adapt gradually to changing user behaviour patterns. BaranC has been implemented as a service-oriented framework in which the collection of data for the UDI, and all sharing of UDI data, are kept strictly under the user's control. In addition, being service-oriented allows its monitoring and analysis services to be easily used (with the user's permission) by third parties in order to provide third-party adaptive assistant services. An example third-party service demonstrator, built on top of BaranC, proactively assists a user by dynamically predicting, based on the current context, which apps and contacts the user is likely to need. BaranC introduces an innovative, user-controlled, unified service model for monitoring and using personal digital activity data in order to provide adaptive user-centred applications. This aims to improve on the current situation, where the diversity of adaptive applications results in a proliferation of applications monitoring and using personal data, leading to a lack of clarity, a dispersal of data, and a diminution of user control.

Relevance: 50.00%

Abstract:

Axial vertebral rotation, an important parameter in the assessment of scoliosis, may be identified on X-ray images. In line with advances in the field of digital radiography, hospitals have been increasingly using this technique. The objective of the present study was to evaluate the reliability of computer-processed rotation measurements obtained from digital radiographs. A software program was therefore developed which digitally reproduces the methods of Perdriolle and Raimondi and semi-automatically calculates the degree of rotation of a vertebra on digital radiographs. Three independent observers estimated vertebral rotation employing both the digital and the traditional manual methods. Compared to the traditional method, the digital assessment showed a 43% smaller error and a stronger correlation. In conclusion, the digital method appears reliable and enhances the accuracy and precision of vertebral rotation measurements.
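The geometric core of such measurements can be sketched as follows. This is a hypothetical simplification, not the study's software: the Perdriolle and Raimondi methods read rotation from the position of the pedicle shadow within the vertebral body (Raimondi via lookup tables), and a crude geometric stand-in treats the pedicle displacement relative to the vertebral half-width as the sine of the rotation angle.

```python
import math

def rotation_degrees(pedicle_offset_mm, half_width_mm):
    """Crude estimate of axial rotation: asin(offset / half-width).
    Both distances are measured on the (digital) radiograph."""
    ratio = pedicle_offset_mm / half_width_mm
    ratio = max(-1.0, min(1.0, ratio))   # clamp against measurement noise
    return math.degrees(math.asin(ratio))

# Hypothetical digitised landmarks: a 10 mm pedicle shift on a vertebra
# with a 20 mm half-width corresponds to roughly 30 degrees of rotation.
angle = rotation_degrees(10.0, 20.0)
```

The advantage of digitising this on screen, as the study did, is that landmark picking and the table lookup are done at pixel precision rather than with a ruler and printed template.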

Relevance: 50.00%

Abstract:

PURPOSE: To prospectively evaluate the accuracy and reliability of "freehand" posttraumatic orbital wall reconstruction with AO (Arbeitsgemeinschaft Osteosynthese) titanium mesh plates by using computer-aided volumetric measurement of the bony orbits. METHODS: Bony orbital volume was measured in 12 patients from coronal CT scan slices using the OsiriX medical image software. After defining the volumetric limits of the orbit, segmentation of the bony orbital region of interest was performed on each slice. At the end of the segmentation process, all regions of interest were grouped and the volume was computed. The same procedure was performed on both orbits, and the volume of the contralateral uninjured orbit was then used as a control for comparison. RESULTS: In all patients, the volume of the reconstructed orbit matched that of the contralateral uninjured orbit to within 1.85 cm³ (7%). CONCLUSIONS: This preliminary study has demonstrated that posttraumatic orbital wall reconstruction using "freehand" bending and placement of AO titanium mesh plates results in a high success rate in re-establishing the preoperative bony volume, closely approximating that of the contralateral uninjured orbit.
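The volumetric computation described in the methods amounts to summing segmented slice areas multiplied by the slice spacing. The sketch below illustrates that arithmetic only; the function name, areas, and spacing are invented, and real tools such as OsiriX perform the segmentation and grouping interactively.

```python
def orbital_volume_cm3(roi_areas_mm2, slice_spacing_mm):
    """Sum ROI area * slice spacing over all coronal slices and convert
    the result from mm^3 to cm^3 (1 cm^3 = 1000 mm^3)."""
    volume_mm3 = sum(area * slice_spacing_mm for area in roi_areas_mm2)
    return volume_mm3 / 1000.0

areas = [300.0, 450.0, 500.0, 450.0, 300.0]   # hypothetical ROI areas, mm^2
vol = orbital_volume_cm3(areas, 3.0)          # 3 mm coronal slice spacing
```

Computing the same quantity for both orbits and differencing them gives the accuracy figure reported in the results (to within 1.85 cm³ here).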

Relevance: 40.00%

Abstract:

The XSophe-Sophe-XeprView® computer simulation software suite enables scientists to easily determine spin Hamiltonian parameters from isotropic, randomly oriented, and single-crystal continuous wave electron paramagnetic resonance (CW EPR) spectra of radicals and isolated paramagnetic metal ion centres or clusters found in metalloproteins, chemical systems, and materials science. XSophe provides an X-Windows graphical user interface to the Sophe programme and allows the creation of multiple input files, local and remote execution of Sophe, and the display of the Sophe log (the output from Sophe) and of input parameters/files. Sophe is a sophisticated computer simulation programme employing a number of innovative technologies, including the Sydney OPera HousE (SOPHE) partition and interpolation schemes, a field segmentation algorithm, the mosaic misorientation linewidth model, parallelisation, and spectral optimisation. In conjunction with the SOPHE partition scheme and the field segmentation algorithm, the SOPHE interpolation scheme and the mosaic misorientation linewidth model greatly increase the speed of simulations for most spin systems. Employing brute-force matrix diagonalisation in the simulation of an EPR spectrum of a high-spin Cr(III) complex with the spin Hamiltonian parameters g_e = 2.00, D = 0.10 cm⁻¹, E/D = 0.25, A_x = 120.0, A_y = 120.0, A_z = 240.0 × 10⁻⁴ cm⁻¹ requires a SOPHE grid size of N = 400 (to produce a good signal-to-noise ratio) and takes 229.47 s. In contrast, the use of either the SOPHE interpolation scheme or the mosaic misorientation linewidth model requires a SOPHE grid size of only N = 18 and takes 44.08 s and 0.79 s, respectively. Results from Sophe are transferred via the Common Object Request Broker Architecture (CORBA) to XSophe and subsequently to XeprView®, where the simulated CW EPR spectra (1D and 2D) can be compared to the experimental spectra. Energy level diagrams, transition roadmaps, and transition surfaces aid the interpretation of complicated randomly oriented CW EPR spectra and can be viewed with a web browser and an OpenInventor scene graph viewer.
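The matrix-diagonalisation step that such simulations rely on can be illustrated with the zero-field part of the S = 3/2 spin Hamiltonian for the Cr(III) example above, H = D(Sz² − S(S+1)/3) + E(Sx² − Sy²). This is a minimal sketch using numpy, not Sophe itself, and it omits the Zeeman, hyperfine, and lineshape machinery entirely; the D and E/D values are taken from the abstract.

```python
import numpy as np

def spin_matrices(s):
    """Return Sx, Sy, Sz for spin quantum number s, basis m = s ... -s."""
    m = np.arange(s, -s - 1, -1)
    sz = np.diag(m)
    # raising-operator matrix elements sqrt(s(s+1) - m(m+1)) on the
    # superdiagonal (basis ordered by descending m)
    ladder = np.sqrt(s * (s + 1) - m[1:] * (m[1:] + 1))
    sp = np.zeros((len(m), len(m)))
    sp[np.arange(len(ladder)), np.arange(1, len(m))] = ladder
    sx = (sp + sp.T) / 2
    sy = (sp - sp.T) / 2j
    return sx, sy, sz

def zero_field_levels(s, D, E):
    """Eigenvalues of D(Sz^2 - S(S+1)/3) + E(Sx^2 - Sy^2), ascending."""
    sx, sy, sz = spin_matrices(s)
    dim = int(2 * s + 1)
    H = D * (sz @ sz - s * (s + 1) / 3 * np.eye(dim)) + E * (sx @ sx - sy @ sy)
    return np.sort(np.linalg.eigvalsh(H))

# D = 0.10 cm^-1 and E/D = 0.25 from the Cr(III) example in the abstract
levels = zero_field_levels(1.5, 0.10, 0.25 * 0.10)
```

For S = 3/2 the spectrum consists of two Kramers doublets split by 2·sqrt(D² + 3E²), which is the standard zero-field-splitting result this toy calculation reproduces numerically.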

Relevance: 40.00%

Abstract:

BACKGROUND CONTEXT: The vertebral spine angle in the frontal plane is an important parameter in the assessment of scoliosis and may be obtained from panoramic X-ray images. Technological advances have allowed for an increased use of digital X-ray images in clinical practice. PURPOSE: In this context, the objective of this study was to assess the reliability of computer-assisted Cobb angle measurements taken from digital X-ray images. STUDY DESIGN/SETTING: Clinical investigation quantifying scoliotic deformity with the Cobb method to evaluate the intra- and interobserver variability of manual and digital techniques. PATIENT SAMPLE: Forty-nine patients diagnosed with idiopathic scoliosis were chosen by convenience, without predilection for gender, age, or the type, location, or magnitude of the curvature. OUTCOME MEASURES: Images were examined to evaluate Cobb angle variability, end plate selection, and intra- and interobserver errors. METHODS: Specific software was developed to digitally reproduce the Cobb method and semiautomatically calculate the degree of scoliotic deformity. During the study, three observers estimated the Cobb angle using both the digital and the traditional manual methods. RESULTS: The results showed that Cobb angle measurements may be reproduced on the computer as reliably as with the traditional manual method, under conditions similar to those found in clinical practice. CONCLUSIONS: The computer-assisted (digital) method is clinically advantageous and appropriate for assessing scoliotic curvature in the frontal plane using the Cobb method. (C) 2010 Elsevier Inc. All rights reserved.
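The measurement such software digitises is geometrically simple: the Cobb angle is the angle between the superior endplate of the upper-end vertebra and the inferior endplate of the lower-end vertebra, each marked by two points on the radiograph. The sketch below shows that geometry only; it is illustrative code, not the study's implementation, and the digitised points are invented.

```python
import math

def endplate_angle(p1, p2):
    """Inclination (degrees) of the line through two (x, y) image points."""
    return math.degrees(math.atan2(p2[1] - p1[1], p2[0] - p1[0]))

def cobb_angle(upper_plate, lower_plate):
    """Acute angle between the two digitised endplate lines."""
    diff = abs(endplate_angle(*upper_plate) - endplate_angle(*lower_plate))
    diff %= 180.0
    return min(diff, 180.0 - diff)

# hypothetical endplates inclined +15 and -20 degrees: Cobb angle = 35
angle = cobb_angle(((0, 0), (10, math.tan(math.radians(15)) * 10)),
                   ((0, 0), (10, -math.tan(math.radians(20)) * 10)))
```

Working from on-screen point picks at pixel resolution is what lets the digital method match or beat the protractor-on-film procedure that it reproduces.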

Relevance: 40.00%

Abstract:

The Test of Mouse Proficiency (TOMP) was developed to assist occupational therapists and education professionals assess computer mouse competency skills in children from preschool to upper primary (elementary) school age. The preliminary reliability and validity of TOMP are reported in this paper. Methods used to examine the internal consistency, test-retest reliability, and criterion- and construct-related validity of the test are elaborated. In the continuing process of test refinement, these preliminary studies support to varying degrees the reliability and validity of TOMP. Recommendations for further validation of the assessment are discussed along with indications for potential clinical application.
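Internal consistency of a test like TOMP is typically quantified with Cronbach's alpha (the abstract does not name the statistic, so this is an assumption). The sketch below implements the standard formula alpha = k/(k-1) · (1 − Σ item variances / variance of totals); the item scores are made up, not TOMP data.

```python
def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(item_scores):
    """item_scores: one list per test item, same respondents in each list.
    Returns alpha = k/(k-1) * (1 - sum(item variances) / variance(totals))."""
    k = len(item_scores)
    item_var = sum(variance(item) for item in item_scores)
    totals = [sum(col) for col in zip(*item_scores)]    # per-respondent totals
    return k / (k - 1) * (1 - item_var / variance(totals))

# hypothetical scores: 3 items x 4 children, highly consistent
items = [[1, 2, 3, 4], [2, 3, 4, 5], [1, 3, 3, 5]]
alpha = cronbach_alpha(items)
```

Values of alpha near 1 indicate that the items rank respondents consistently, which is the property the TOMP reliability studies set out to establish.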

Relevance: 40.00%

Abstract:

For products sold with warranty, the warranty servicing cost can be reduced by improving product reliability through a development process. However, this increases the unit manufacturing cost. Optimal development must achieve a trade-off between these two costs. The outcome of the development process is uncertain and needs to be taken into account in the determination of the optimal development effort. The paper develops a model where this uncertainty is taken into account. (C) 2003 Elsevier Ltd. All rights reserved.
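The trade-off the model formalises can be shown with a toy numeric example. The functional forms and constants here are hypothetical stand-ins, not the paper's model (which also treats the uncertain outcome of development): unit manufacturing cost rises with development effort while expected warranty cost falls, and the optimum minimises their sum.

```python
import math

def unit_cost(t, base=10.0, rate=0.8):
    """Manufacturing cost per unit grows with development effort t."""
    return base + rate * t

def warranty_cost(t, initial=15.0, decay=0.3):
    """Expected warranty servicing cost per unit falls as reliability
    improves with development effort t."""
    return initial * math.exp(-decay * t)

def optimal_effort(t_grid):
    """Grid-search the effort minimising total cost per unit."""
    return min(t_grid, key=lambda t: unit_cost(t) + warranty_cost(t))

grid = [i / 10 for i in range(0, 101)]
best = optimal_effort(grid)
```

With these constants the analytic optimum sits where marginal manufacturing cost equals the marginal warranty saving (rate = initial·decay·e^(−decay·t), i.e. t ≈ 5.76), and the grid search lands on the nearest grid point.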

Relevance: 40.00%

Abstract:

This paper proposes a new architecture targeting real-time and reliable Distributed Computer-Controlled Systems (DCCS). This architecture provides a structured approach to the integration of soft and/or hard real-time applications with Commercial Off-The-Shelf (COTS) components. The Timely Computing Base model is used as the reference model to deal with the heterogeneity of system components with respect to guaranteeing the timeliness of applications. The reliability and availability requirements of hard real-time applications are guaranteed by a software-based fault-tolerance approach.
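One duty a Timely Computing Base assigns to an architecture like this is detecting timing failures so an application can fail over instead of failing silently. The minimal sketch below is illustrative only (the names and the after-the-fact detection strategy are assumptions, not the paper's design): run a task, compare elapsed time against its deadline, and substitute a safe fallback result on overrun.

```python
import time

def run_with_deadline(task, deadline_s, fallback):
    """Execute task(); if it overruns deadline_s, report the timing
    failure and return the fallback result instead of the late one."""
    start = time.monotonic()
    result = task()
    elapsed = time.monotonic() - start
    if elapsed > deadline_s:
        return fallback, False   # timing failure detected
    return result, True          # deadline met

ok_result, met = run_with_deadline(lambda: 42, deadline_s=1.0, fallback=None)
```

Note this detects the overrun only after the task returns; preemptive enforcement (cancelling the late task) needs OS or scheduler support, which is exactly the kind of guarantee the Timely Computing Base model is used to reason about.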