959 results for Smart Vending Machine, Automation, Programmable Logic Controllers, Creativity, Innovation
Abstract:
Underbody plows can be very useful tools in winter maintenance, especially when compacted snow or hard ice must be removed from the roadway. With significant down-force and an appropriate cutting-edge angle, such plows remove compacted snow and ice very effectively, with much greater efficiency than any other tool under those circumstances. However, successful operation of an underbody plow requires considerable skill. If too little down pressure is applied, the plow will not cut the ice or compacted snow. If too much force is applied, the cutting edge may gouge the road surface, often damaging both the road and the plow, or the plow may ride up on the cutting edge so that the operator can no longer control it; in such situations the truck can easily spin. Excessive down force also causes rapid wear of the cutting edge. Given this need for a high level of operator skill, the operation of an underbody plow is a candidate for automation. To automate an underbody plow successfully, a control system must be developed that follows a set of rules representing appropriate operation of such a plow. These rules have been developed from earlier work in which operational underbody plows were instrumented to determine the loading upon them (both vertical and horizontal) and the angle at which the blade was operating. The rules have been successfully coded into two different computer programs, both using the MATLAB® software. In the first program, various load and angle inputs are analyzed to determine when, whether, and how they violate the rules of operation; this program is essentially deterministic in nature. In the second program, the Simulink® package in the MATLAB® software system was used to implement these rules using fuzzy logic.
Fuzzy logic essentially replaces a fixed, constant rule with one that varies in such a way as to improve operational control. The fuzzy logic in this simulation was implemented using the appropriate built-in routines of the software rather than being developed from scratch. The results of the computer testing and simulation indicate that a fully automated, computer-controlled underbody plow is indeed possible. The question of whether the next steps toward full automation should be taken (and by whom) has also been considered, and some form of joint venture between a Department of Transportation and a vendor has been suggested.
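The rule-blending idea behind such a fuzzy controller can be sketched in a few lines. The membership ranges and corrective actions below are invented for illustration and are not the rules developed in the study:

```python
# Illustrative sketch of fuzzy down-force control for an underbody plow.
# All breakpoints (kN) and corrective actions are hypothetical, not the
# rules derived from the instrumented-plow data described in the abstract.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def down_force_adjustment(vertical_load_kn):
    """Blend three fuzzy rules: raise force if too low, hold if acceptable,
    reduce if too high. Returns a signed correction in kN (centroid-style)."""
    too_low  = tri(vertical_load_kn, -1.0, 0.0, 20.0)   # hypothetical range
    ok       = tri(vertical_load_kn, 10.0, 25.0, 40.0)
    too_high = tri(vertical_load_kn, 30.0, 50.0, 51.0)
    weights = [too_low, ok, too_high]
    actions = [+5.0, 0.0, -5.0]          # rule consequents: kN corrections
    total = sum(weights)
    return sum(w * a for w, a in zip(weights, actions)) / total if total else 0.0
```

Unlike a fixed threshold, the blended output changes gradually as the measured load moves between regions, which is the "varying rule" behaviour described above.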
Abstract:
The present research reviews the analysis and modeling of Swiss franc interest rate curves (IRC) using unsupervised (SOM, Gaussian mixtures) and supervised (MLP) machine learning algorithms. IRC are considered as objects embedded in different feature spaces: maturities; maturity-date; and the parameters of the Nelson-Siegel model (NSM). Analysis of the NSM parameters and their temporal and clustering structures helps to assess the relevance of the model and its potential use for forecasting. A mapping of IRC in a maturity-date feature space is presented and analyzed for visualization and forecasting purposes.
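For reference, the Nelson-Siegel model mentioned above expresses the yield at maturity τ through four parameters: level β0, slope β1, curvature β2, and a decay parameter λ. A minimal sketch (the parameter values in the test are hypothetical, not fitted Swiss franc values):

```python
import math

def nelson_siegel(tau, beta0, beta1, beta2, lam):
    """Nelson-Siegel yield at maturity tau (years).
    beta0 ~ long-run level, beta1 ~ slope, beta2 ~ curvature, lam ~ decay."""
    x = tau / lam
    loading = (1.0 - math.exp(-x)) / x     # slope factor loading
    return beta0 + beta1 * loading + beta2 * (loading - math.exp(-x))
```

As τ grows the yield tends to β0, and as τ approaches zero it tends to β0 + β1, which is why the fitted parameters make natural clustering features for curve objects.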
Abstract:
Radioactive soil-contamination mapping and risk assessment is a vital issue for decision makers. Traditional approaches for mapping the spatial concentration of radionuclides employ various regression-based models, which usually provide a single-value prediction realization accompanied (in some cases) by estimation error. Such approaches do not provide the capability for rigorous uncertainty quantification or probabilistic mapping. Machine learning is a recent and fast-developing approach based on learning patterns and information from data. Artificial neural networks for prediction mapping have been especially powerful in combination with spatial statistics. A data-driven approach provides the opportunity to integrate additional relevant information about spatial phenomena into a prediction model for more accurate spatial estimates and associated uncertainty. Machine-learning algorithms can also be used for a wider spectrum of problems than before: classification, probability density estimation, and so forth. Stochastic simulations are used to model spatial variability and uncertainty. Unlike regression models, they provide multiple realizations of a particular spatial pattern that allow uncertainty and risk quantification. This paper reviews the most recent methods of spatial data analysis, prediction, and risk mapping, based on machine learning and stochastic simulations in comparison with more traditional regression models. The radioactive fallout from the Chernobyl Nuclear Power Plant accident is used to illustrate the application of the models for prediction and classification problems. This fallout is a unique case study that provides the challenging task of analyzing huge amounts of data ('hard' direct measurements, as well as supplementary information and expert estimates) and solving particular decision-oriented problems.
Abstract:
Avalanche forecasting is a complex process involving the assimilation of multiple data sources to make predictions over varying spatial and temporal resolutions. Numerically assisted forecasting often uses nearest neighbour (NN) methods, which are known to have limitations when dealing with high-dimensional data. We apply support vector machines (SVMs), a family of theoretically grounded machine learning techniques designed to deal with high-dimensional data, to a dataset from Lochaber, Scotland, to assess their applicability to avalanche forecasting. Initial experiments showed that SVMs gave results comparable with NN for categorical and probabilistic forecasts. Experiments utilising the ability of SVMs to deal with high dimensionality in producing a spatial forecast show promise, but require further work.
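The SVM idea can be illustrated with a toy linear variant trained by Pegasos-style sub-gradient descent on the hinge loss. The two features and the data points below are invented (say, new-snow depth and wind speed, labelled +1 for avalanche days); the Lochaber data and the kernel machinery of the actual study are not reproduced here:

```python
import random

def train_linear_svm(X, y, lam=0.01, epochs=2000):
    """Tiny linear SVM via sub-gradient descent on the regularized hinge loss.
    X: list of feature vectors, y: labels in {-1, +1}. Toy sketch only."""
    random.seed(0)
    w = [0.0] * len(X[0])
    b = 0.0
    for t in range(1, epochs + 1):
        eta = 1.0 / (lam * t)                 # Pegasos-style step size
        i = random.randrange(len(X))
        margin = y[i] * (sum(wj * xj for wj, xj in zip(w, X[i])) + b)
        if margin < 1:                        # point violates the margin
            w = [wj - eta * (lam * wj - y[i] * xj) for wj, xj in zip(w, X[i])]
            b += eta * y[i]
        else:                                 # only shrink w (regularization)
            w = [wj - eta * lam * wj for wj in w]
    return w, b

def predict(w, b, x):
    """Categorical forecast: sign of the decision function."""
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1
```

A real forecasting system would use a kernel SVM from an established library; this sketch only shows the max-margin training principle on separable data.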
Abstract:
Although cross-sectional diffusion tensor imaging (DTI) studies revealed significant white matter changes in mild cognitive impairment (MCI), the utility of this technique in predicting further cognitive decline is debated. Thirty-five healthy controls (HC) and 67 MCI subjects with DTI baseline data were neuropsychologically assessed at one year. Among them, there were 40 stable (sMCI; 9 single domain amnestic, 7 single domain frontal, 24 multiple domain) and 27 progressive (pMCI; 7 single domain amnestic, 4 single domain frontal, 16 multiple domain). Fractional anisotropy (FA) and longitudinal, radial, and mean diffusivity were measured using Tract-Based Spatial Statistics. Statistics included group comparisons and individual classification of MCI cases using support vector machines (SVM). FA was significantly higher in HC compared to MCI in a distributed network including the ventral part of the corpus callosum and right temporal and frontal pathways. There were no significant group-level differences between sMCI and pMCI or between MCI subtypes after correction for multiple comparisons. However, SVM analysis allowed for individual classification with accuracies up to 91.4% (HC versus MCI) and 98.4% (sMCI versus pMCI). When considering the MCI subgroups separately, the minimum SVM classification accuracy for stable versus progressive cognitive decline was 97.5%, in the multiple domain MCI group. SVM analysis of DTI data provided highly accurate individual classification of stable versus progressive MCI regardless of MCI subtype, indicating that this method may become an easily applicable tool for early individual detection of MCI subjects evolving to dementia.
Abstract:
The Iowa Department of Transportation is committed to improved management systems, which in turn has led to increased automation to record and manage construction data. A possible improvement to the current data management system may be found in pen-based computers. Pen-based computers coupled with user-friendly software have now reached the point where an individual's handwriting can be captured and converted to typed text for data collection. Pen-based computers would appear sufficiently advanced to be used by construction inspectors to record daily project data. The objective of this research was to determine: (1) whether pen-based computers are durable enough to allow maintenance-free operation for field work during Iowa's construction season; and (2) whether pen-based computers can be used effectively by inspectors with little computer experience. The pen-based computer's handwriting recognition proved neither fast nor accurate enough to be used successfully. The IBM ThinkPad with the pen pointing device did, however, prove useful for working in the Windows graphical environment: because of its intuitive nature, the pen was used for pointing, selecting, and scrolling in Windows applications.
Abstract:
Proponents of microalgae biofuel technologies often claim that the world demand of liquid fuels, about 5 trillion liters per year, could be supplied by microalgae cultivated on only a few tens of millions of hectares. This perspective reviews this subject and points out that such projections are greatly exaggerated, because (1) the productivities achieved in large-scale commercial microalgae production systems, operated year-round, do not surpass those of irrigated tropical crops; (2) cultivating, harvesting and processing microalgae solely for the production of biofuels is simply too expensive using current or prospective technology; and (3) currently available (limited) data suggest that the energy balance of algal biofuels is very poor. Thus, microalgal biofuels are no panacea for depleting oil or global warming, and are unlikely to save the internal combustion machine.
Abstract:
Ligament balancing is an important and subjective task performed during a total knee arthroplasty (TKA) procedure. Since excessive imbalance can accelerate prosthesis wear and lead to early surgical revision, it is desirable to develop instruments that quantitatively assess soft-tissue balance. The instrumented distractor proposed in this study can assist surgeons in performing ligament balancing by measuring the distraction gap and the applied load. The device also allows determination of ligament stiffness, which can contribute to a better understanding of the intrinsic mechanical behavior of the knee joint. Instrumentation of the device involved Hall sensors to measure the distractor displacement and strain gauges to transduce the force. The sensors were calibrated and tested to demonstrate their suitability for surgical use. Results show that the distraction gap can be measured reliably with 0.1 mm accuracy and that distractive loads can be assessed with an accuracy on the order of 4 N. These characteristics are consistent with those proposed in this work for a device that assists in ligament balancing while still permitting surgeons to rely on their own experience and judgment. Preliminary results from in vitro tests were in accordance with expected stiffness values for the medial collateral ligament (MCL) and lateral collateral ligament (LCL).
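Given paired gap and load readings from such a distractor, ligament stiffness can be estimated as the slope of a load-versus-displacement fit. A minimal sketch with made-up readings (the 0.1 mm and 4 N figures above are the device accuracies reported in the abstract, not inputs here):

```python
def ligament_stiffness(gaps_mm, loads_n):
    """Estimate stiffness (N/mm) as the least-squares slope of load vs gap.
    gaps_mm and loads_n are paired distractor readings (hypothetical data)."""
    n = len(gaps_mm)
    mx = sum(gaps_mm) / n
    my = sum(loads_n) / n
    num = sum((x - mx) * (y - my) for x, y in zip(gaps_mm, loads_n))
    den = sum((x - mx) ** 2 for x in gaps_mm)
    return num / den
```

A least-squares fit over several readings also averages out the per-sample measurement error of the gap and force sensors.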
Abstract:
OBJECTIVE: The major source of hemolysis during cardiopulmonary bypass remains the cardiotomy suction, primarily because of the interaction between air and blood. The Smart suction system uses automatically controlled aspiration designed to avoid mixing blood with air. This study was set up to compare this recently designed suction system with a Cell Saver system and to investigate their effects on blood elements during prolonged intrathoracic aspiration. METHODS: In a calf model (n=10; mean weight, 69.3±4.5 kg), a standardized hole was created in the right atrium allowing a blood loss of 100 ml/min, with a suction cannula placed in the chest cavity in a fixed position for 6 h. The blood was continuously aspirated with either the Smart suction system (five animals) or the Cell Saver system (five animals). Blood samples were taken hourly for blood cell counts and biochemistry. RESULTS: In the Smart suction group, red cell count, plasma protein, and free hemoglobin levels remained stable, while platelet count exhibited a significant drop from the fifth hour onwards (prebypass: 683±201×10⁹/l; 5 h: 280±142×10⁹/l; P=0.046). In the Cell Saver group, there was a significant drop in the red cell count from the third hour onwards (prebypass: 8.6±0.9×10¹²/l; 6 h: 6.3±0.4×10¹²/l; P=0.02), in the platelet count from the first hour onwards (prebypass: 630±97×10⁹/l; 1 h: 224±75×10⁹/l; P<0.01), and in the plasma protein level from the first hour onwards (prebypass: 61.7±0.6 g/l; 1 h: 29.3±9.1 g/l; P<0.01). CONCLUSIONS: In this experimental set-up, the Smart suction system avoids damage to red cells and affects platelet count less than the Cell Saver system, which, like any suction device that mixes air and blood, induces substantial blood cell destruction as well as severe hypoproteinemia with its metabolic, clotting, and hemodynamic consequences.
Abstract:
Automatic environmental monitoring networks supported by wireless communication technologies now provide large and ever-increasing volumes of data. The use of this information in natural hazard research is an important issue. Particularly useful for risk assessment and decision making are spatial maps of hazard-related parameters produced from point observations and available auxiliary information. The purpose of this article is to present and explore appropriate tools to process large amounts of available data and produce predictions at fine spatial scales. These are the algorithms of machine learning, which are aimed at non-parametric, robust modelling of non-linear dependencies from empirical data. The computational efficiency of the data-driven methods allows prediction maps to be produced in real time, which makes them superior to physical models for operational use in risk assessment and mitigation. This situation is encountered particularly in the spatial prediction of climatic variables (topo-climatic mapping). In the complex topography of mountainous regions, meteorological processes are strongly influenced by the relief. The article shows how these relations, possibly regionalized and non-linear, can be modelled from data using information from digital elevation models. The particular illustration of the developed methodology concerns the mapping of temperatures (including situations of Föhn and temperature inversion) given measurements taken from the Swiss meteorological monitoring network. The methods used in the study include data-driven feature selection, support vector algorithms, and artificial neural networks.
Abstract:
The atomic force microscope is not only a very convenient tool for studying the topography of different samples; it can also be used to measure specific binding forces between molecules. For this purpose, one type of molecule is attached to the tip and the other to the substrate. Approaching the tip to the substrate allows the molecules to bind together; retracting the tip breaks the newly formed bond. The rupture of a specific bond appears in the force-distance curve as a spike from which the binding force can be deduced. In this article we present an algorithm to automatically process force-distance curves in order to obtain bond-strength histograms. The algorithm is based on a fuzzy logic approach that assigns a "quality" to every event and makes the detection procedure much faster than manual selection. The software has been applied to measure the binding strength between tubulin and microtubule-associated proteins.
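The detection step can be caricatured as scanning the retraction trace for abrupt upward force jumps and attaching a soft score to each candidate. The thresholds and scoring function below are invented stand-ins for the fuzzy evaluation described in the abstract:

```python
def find_rupture_events(force, noise=0.05):
    """Scan a retraction force trace (nN, hypothetical units) for abrupt
    upward jumps that mark bond ruptures. Each event gets a crude soft
    'quality' in [0, 1]; real fuzzy scoring would combine several criteria
    (jump height, sharpness, baseline flatness), not just the jump size."""
    events = []
    for i in range(1, len(force)):
        jump = force[i] - force[i - 1]
        if jump > 2 * noise:                            # candidate rupture spike
            quality = min(1.0, jump / (10 * noise))     # soft confidence score
            events.append((i, jump, round(quality, 2)))
    return events
```

Scoring every candidate instead of applying a hard cut lets weak, ambiguous events be down-weighted in the bond-strength histogram rather than silently discarded, which is the advantage the fuzzy approach is claimed to bring over manual selection.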