977 results for Pattern Informatics Method
Abstract:
In recent years, an ever-increasing degree of automation has been observed in most industrial processes. This increase is driven by the demand for systems with high performance in terms of product/service quality, productivity and efficiency, together with low design, realization and maintenance costs. The trend towards more complex automation systems is spreading rapidly in automated manufacturing systems (AMS), where the integration of mechanical and electronic technology, typical of Mechatronics, merges with other technologies such as Informatics and communication networks. An AMS is a very complex system that can be viewed as a set of flexible working stations combined with one or more transportation systems. To appreciate how important these machines are in everyday life, consider that most of us daily use bottled water or soda and buy boxed products such as food or cigarettes. A further indication of their complexity is that the consortium of machine producers has estimated that around 350 types of manufacturing machines exist. A large share of the manufacturing-machine industry, and notably the packaging-machine industry, is located in Italy, with a particularly high concentration in the Bologna area; for this reason the Bologna area is known as the "packaging valley". The various parts of an AMS usually interact in a concurrent and asynchronous way, and coordinating them to obtain the desired overall behaviour is a hard task. This is often the case in large-scale systems organized in a modular and distributed manner. Even though the success of a modern AMS, from a functional and behavioural point of view, is still largely attributable to the design choices made for the mechanical structure and the electrical/electronic architecture, the system that governs the control of the plant is becoming crucial because of the large number of duties assigned to it. Apart from the automation of the machine cycles, the supervisory system is called on to perform other main functions, such as: emulating the behaviour of traditional mechanical members, thus allowing a drastic constructive simplification of the machine and crucial functional flexibility; dynamically adapting the control strategies to different productive needs and operational scenarios; achieving high quality of the final product through verification of the correctness of the processing; guiding the machine operator to promptly and carefully take the actions needed to establish or restore optimal operating conditions; and managing diagnostic information in real time as a support for machine maintenance. The facilities that designers can readily find on the market, in terms of software component libraries, do provide adequate support for implementing either top-level or bottom-level functionalities, typically pertaining to the domains of user-friendly HMIs, closed-loop regulation and motion control, and fieldbus-based interconnection of remote smart devices.
What is still lacking is a reference framework comprising a comprehensive set of highly reusable logic control components that, by focusing on the cross-cutting functionalities characterizing the automation domain, can help designers model and structure their applications according to their specific needs. Historically, the design and verification of complex automated industrial systems has been performed empirically, without a clear distinction between functional and technological/implementation concepts and without a systematic method for dealing organically with the complete system. In the field of analog and digital control, design and verification through formal and simulation tools have long been established, at least for multivariable and/or nonlinear controllers of complex time-driven dynamics, as in the fields of vehicles, aircraft, robots, electric drives and complex power electronics equipment. Moving to logic control, which is typical of industrial manufacturing automation, the design and verification process is approached in a completely different and usually very "unstructured" way: no clear distinction is drawn between functions and implementations, or between functional architectures and technological architectures and platforms. This difference is probably due to the different "dynamical framework" of logic control with respect to analog/digital control. As a matter of fact, in logic control discrete-event dynamics replace time-driven dynamics; hence most of the formal and mathematical tools of analog/digital control cannot be directly migrated to logic control to highlight the distinction between functions and implementations. In addition, in the common view of application technicians, logic control design is strictly tied to the adopted implementation technology (relays in the past, software nowadays), again blurring the functional and technological views. In industrial automation software engineering, concepts such as modularity, encapsulation, composability and reusability are strongly emphasized and profitably realized in so-called object-oriented methodologies. Industrial automation has lately been adopting this approach, as testified by IEC standards such as IEC 61131-3 and IEC 61499, which have only recently been considered in commercial products. Moreover, many contributions have already been proposed in the scientific and technical literature to establish a suitable modelling framework for industrial automation. In recent years there has been considerable growth in the exploitation of innovative concepts and technologies from the ICT world in industrial automation systems. As for logic control design, Model Based Design (MBD) is being imported into industrial automation from the software engineering field. Another key point in industrial automated systems is the growth of requirements in terms of availability, reliability and safety. In other words, the control system should not only handle nominal behaviour but should also deal with other important duties, such as diagnosis and fault isolation, recovery and safety management. Indeed, as complexity and performance increase, so does the occurrence of faults.
This is a consequence of the fact that, as typically occurs in mechatronic systems, complex systems such as AMS contain, alongside reliable mechanical elements, an increasing number of electronic devices, which are by their own nature more vulnerable. The diagnosis and fault isolation problem for a generic dynamical system consists in the design of an elaboration unit that, by appropriately processing the inputs and outputs of the system, is capable of detecting incipient faults in the plant devices and of reconfiguring the control system so as to guarantee satisfactory performance. The designer should be able to formally verify the product, certifying that, in its final implementation, it will perform its required function with the desired level of reliability and safety; the next step is to prevent faults and, eventually, to reconfigure the control system so that faults are tolerated. On this topic, important improvements in the formal verification of logic control, fault diagnosis and fault-tolerant control derive from Discrete Event Systems theory. The aim of this work is to define a design pattern and a control architecture that help the designer of control logic in industrial automated systems. The work starts with a brief discussion of the main characteristics of industrial automated systems in Chapter 1. Chapter 2 surveys the state of the software engineering paradigms applied to industrial automation. Chapter 3 presents an architecture for industrial automated systems based on the new concept of the Generalized Actuator and shows its benefits, while in Chapter 4 this architecture is refined with a novel entity, the Generalized Device, to obtain better reusability and modularity of the control logic. Chapter 5 presents a new approach, based on Discrete Event Systems, to the problem of software formal verification, together with an active fault-tolerant control architecture using online diagnostics. Finally, concluding remarks and some ideas on new directions to explore are given. Appendix A briefly reports some concepts and results about Discrete Event Systems that should help the reader understand some crucial points in Chapter 5, while Appendix B gives an overview of the experimental testbed of the Laboratory of Automation of the University of Bologna, used to validate the approaches presented in Chapters 3, 4 and 5. Appendix C reports some component models used in Chapter 5 for formal verification.
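As a rough, self-contained illustration of the Discrete Event Systems machinery that Chapter 5 and Appendix A build on (not the thesis's actual models), the sketch below tracks the set of plant states consistent with a sequence of observed events and reports when an unobservable fault event must have occurred; the automaton, its events and the fault label are all hypothetical.

```python
# Minimal sketch of an event-driven fault diagnoser over a finite automaton.
# The plant model, its events and the fault label are hypothetical examples.

# Transition structure: state -> {event: next_state}
PLANT = {
    "idle":     {"start": "running"},
    "running":  {"stop": "idle", "fault": "degraded"},   # "fault" is unobservable
    "degraded": {"stop": "idle_f"},
    "idle_f":   {"start": "degraded"},
}
UNOBSERVABLE = {"fault"}
FAULTY_STATES = {"degraded", "idle_f"}


def unobservable_reach(states):
    """Add every state reachable through unobservable (fault) events."""
    reach = set(states)
    frontier = list(states)
    while frontier:
        s = frontier.pop()
        for ev, nxt in PLANT.get(s, {}).items():
            if ev in UNOBSERVABLE and nxt not in reach:
                reach.add(nxt)
                frontier.append(nxt)
    return reach


def diagnose(observed_events, initial_state="idle"):
    """Track the set of states consistent with the observed event sequence."""
    estimate = unobservable_reach({initial_state})
    for ev in observed_events:
        estimate = unobservable_reach(
            {PLANT[s][ev] for s in estimate if ev in PLANT.get(s, {})}
        )
        fault_certain = bool(estimate) and estimate <= FAULTY_STATES
        print(f"after '{ev}': possible states {sorted(estimate)}, fault certain: {fault_certain}")


if __name__ == "__main__":
    # With this hypothetical model, the sequence below leaves the fault ambiguous.
    diagnose(["start", "stop", "start"])
```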
Abstract:
For crime scene investigation in cases of homicide, the pattern of bloodstains at the incident site is of critical importance. The morphology of the bloodstain pattern serves to determine the approximate blood source locations, the minimum number of blows and the positioning of the victim. In the present work, the benefits of three-dimensional bloodstain pattern analysis, including the ballistic approximation of the trajectories of the blood drops, are demonstrated using two illustrative cases. The crime scenes were documented in 3D using the non-contact methods of digital photogrammetry, tachymetry and laser scanning. Accurate, true-to-scale 3D models of the crime scenes, including the bloodstain patterns and the traces, were created. For the determination of the areas of origin of the bloodstain patterns, the trajectories of up to 200 well-defined bloodstains were analysed in CAD and photogrammetry software. The ballistic determination of the trajectories was performed using ballistics software. The advantages of this method are the short preparation time on site, the non-contact measurement of the bloodstains and the high accuracy of the bloodstain analysis. This method can be expected to deliver accurate results regarding the number and position of the areas of origin of bloodstains; in particular, the vertical component is determined more precisely than with conventional methods. In both cases, relevant forensic conclusions regarding the course of events were enabled by the ballistic bloodstain pattern analysis.
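For orientation, the conventional (straight-line) estimate that the ballistic approach improves upon can be sketched as follows, using the standard impact-angle relation alpha = arcsin(stain width / stain length); the stain measurements are made-up, and the study's drag/gravity-corrected ballistic computation is not reproduced here.

```python
# Illustrative straight-line ("stringing") estimate of a blood-source height
# from an elliptical bloodstain. Measurements are hypothetical; the ballistic
# approach described in the abstract is not shown.
import math


def impact_angle_deg(width_mm, length_mm):
    """Impact angle of the blood drop, from the stain's ellipse dimensions."""
    return math.degrees(math.asin(width_mm / length_mm))


def origin_height(distance_to_convergence_m, width_mm, length_mm, stain_height_m=0.0):
    """Straight-line estimate of the blood-source height above the floor,
    given the horizontal distance from the stain to the area of convergence."""
    alpha = math.radians(impact_angle_deg(width_mm, length_mm))
    return stain_height_m + distance_to_convergence_m * math.tan(alpha)


if __name__ == "__main__":
    # Hypothetical stain: 4.0 mm wide, 9.5 mm long, 1.2 m from the convergence point.
    print(f"impact angle: {impact_angle_deg(4.0, 9.5):.1f} deg")
    print(f"estimated source height: {origin_height(1.2, 4.0, 9.5):.2f} m")
```

Because the straight-line model ignores the curvature of the drop trajectory, it tends to overestimate the vertical component, which is the quantity the ballistic 3D method determines more precisely.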
Abstract:
This paper describes informatics for cross-sample analysis with comprehensive two-dimensional gas chromatography (GCxGC) and high-resolution mass spectrometry (HRMS). GCxGC-HRMS analysis produces large data sets that are rich with information, but highly complex. The size of the data and volume of information requires automated processing for comprehensive cross-sample analysis, but the complexity poses a challenge for developing robust methods. The approach developed here analyzes GCxGC-HRMS data from multiple samples to extract a feature template that comprehensively captures the pattern of peaks detected in the retention-times plane. Then, for each sample chromatogram, the template is geometrically transformed to align with the detected peak pattern and generate a set of feature measurements for cross-sample analyses such as sample classification and biomarker discovery. The approach avoids the intractable problem of comprehensive peak matching by using a few reliable peaks for alignment and peak-based retention-plane windows to define comprehensive features that can be reliably matched for cross-sample analysis. The informatics are demonstrated with a set of 18 samples from breast-cancer tumors, each from different individuals, six each for Grades 1-3. The features allow classification that matches grading by a cancer pathologist with 78% success in leave-one-out cross-validation experiments. The HRMS signatures of the features of interest can be examined for determining elemental compositions and identifying compounds.
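A minimal sketch of the template-alignment idea (not the authors' implementation) is given below: a few reliable anchor peaks fit a 2-D affine transform between the template and each chromatogram, peak-based retention-plane windows are mapped through that transform, and the windowed responses become per-sample features for leave-one-out classification. The arrays, window sizes and classifier choice are assumptions made purely for illustration.

```python
# Illustrative sketch of template-based feature alignment for GCxGC data.
# Anchor peaks, window sizes and the classifier are hypothetical choices.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score


def fit_affine(template_xy, sample_xy):
    """Least-squares 2-D affine transform mapping template anchor peaks
    onto the matching peaks detected in one sample chromatogram."""
    A = np.hstack([template_xy, np.ones((len(template_xy), 1))])   # (n_anchors, 3)
    coeffs, *_ = np.linalg.lstsq(A, sample_xy, rcond=None)          # (3, 2)
    return coeffs


def window_features(chromatogram, template_windows, coeffs):
    """Sum the detector response inside each template window after mapping
    its centre into the sample's retention-time plane."""
    feats = []
    for (cx, cy, wx, wy) in template_windows:
        sx, sy = np.array([cx, cy, 1.0]) @ coeffs
        x0, x1 = int(sx - wx), int(sx + wx)
        y0, y1 = int(sy - wy), int(sy + wy)
        feats.append(chromatogram[max(x0, 0):x1, max(y0, 0):y1].sum())
    return np.array(feats)


# Hypothetical data: 18 chromatograms (2-D response grids), anchors and grades.
rng = np.random.default_rng(0)
chromatograms = [rng.random((300, 200)) for _ in range(18)]
template_anchors = rng.uniform(10, 190, size=(6, 2))                 # reliable peaks
template_windows = [(x, y, 5, 5) for x, y in rng.uniform(20, 180, size=(12, 2))]
grades = np.repeat([1, 2, 3], 6)                                     # six samples per grade

X = []
for chrom in chromatograms:
    # Simulated detected anchor positions, slightly shifted from the template.
    sample_anchors = template_anchors + rng.normal(0, 1.0, template_anchors.shape)
    coeffs = fit_affine(template_anchors, sample_anchors)
    X.append(window_features(chrom, template_windows, coeffs))
X = np.array(X)

scores = cross_val_score(LinearDiscriminantAnalysis(), X, grades, cv=LeaveOneOut())
print(f"leave-one-out accuracy on synthetic data: {scores.mean():.2f}")
```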
Abstract:
It has long been known that trypanosomes regulate mitochondrial biogenesis during the life cycle of the parasite; however, the mitochondrial protein inventory (MitoCarta) and its regulation remain unknown. We present a novel computational method for genome-wide prediction of mitochondrial proteins using a support vector machine-based classifier with approximately 90% prediction accuracy. Using this method, we predicted the mitochondrial localization of 468 proteins with high confidence and have experimentally verified the localization of a subset of these proteins. We then applied a recently developed parallel sequencing technology to determine the expression profiles and splicing patterns of a total of 1065 predicted MitoCarta transcripts during the development of the parasite, and showed that 435 of the transcripts significantly changed their expression while 630 remained unchanged in any of the three life stages analyzed. Furthermore, we identified 298 alternative splicing events, a small subset of which could lead to dual localization of the corresponding proteins.
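As a minimal sketch of the kind of classifier described (the study's actual features and training data are not reproduced), the example below assumes amino-acid composition features computed from protein sequences and a standard support vector machine evaluated by cross-validation; the sequences and labels are synthetic.

```python
# Sketch of an SVM-based mitochondrial-protein classifier.
# Sequences, labels and the feature choice (amino-acid composition) are
# illustrative assumptions, not the study's data or descriptors.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
rng = np.random.default_rng(1)


def composition(seq):
    """Fraction of each of the 20 standard amino acids in a sequence."""
    seq = seq.upper()
    return np.array([seq.count(aa) / max(len(seq), 1) for aa in AMINO_ACIDS])


def random_seq(bias_aa, n=200):
    """Toy sequence generator with a slight compositional bias."""
    probs = np.full(20, 1 / 20)
    probs[AMINO_ACIDS.index(bias_aa)] += 0.05
    probs /= probs.sum()
    return "".join(rng.choice(list(AMINO_ACIDS), size=n, p=probs))


# Toy training set: 50 "mitochondrial" and 50 "non-mitochondrial" sequences.
sequences = [random_seq("R") for _ in range(50)] + [random_seq("D") for _ in range(50)]
labels = np.array([1] * 50 + [0] * 50)          # 1 = predicted mitochondrial

X = np.array([composition(s) for s in sequences])
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, labels, cv=5)
print(f"5-fold cross-validated accuracy (toy data): {scores.mean():.2f}")
```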
Abstract:
Full axon counting of optic nerve cross-sections represents the most accurate method to quantify axonal damage, but such analysis is very labour intensive. Recently, a new method has been developed, termed targeted sampling, which combines the salient features of a grading scheme with axon counting. Preliminary findings revealed the method compared favourably with random sampling. The aim of the current study was to advance our understanding of the effect of sampling patterns on axon counts by comparing estimated axon counts from targeted sampling with those obtained from fixed-pattern sampling in a large collection of optic nerves with different severities of axonal injury.
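For readers unfamiliar with sampling-based estimation, the sketch below shows how a total axon count is extrapolated from a fixed-pattern (systematic) set of counting frames on a synthetic density map; the targeted-sampling scheme itself is not reproduced, and all grid sizes and densities are illustrative assumptions.

```python
# Illustrative comparison of full counting vs fixed-pattern (systematic)
# sampling on a synthetic optic-nerve axon density map. Numbers are
# hypothetical, not the study's protocol.
import numpy as np

rng = np.random.default_rng(2)

# Synthetic nerve cross-section: axon counts per sampling field, with a
# damaged (low-density) sector to mimic non-uniform injury.
fields = rng.poisson(lam=120, size=(40, 40)).astype(float)
fields[:15, :15] = rng.poisson(lam=30, size=(15, 15))   # injured sector

true_total = fields.sum()

# Fixed-pattern sampling: count every 4th field in both directions,
# then scale by the inverse of the sampling fraction.
step = 4
sampled = fields[::step, ::step]
sampling_fraction = sampled.size / fields.size
estimate = sampled.sum() / sampling_fraction

print(f"true total axon count:  {true_total:,.0f}")
print(f"fixed-pattern estimate: {estimate:,.0f}")
print(f"relative error:         {abs(estimate - true_total) / true_total:.1%}")
```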
Abstract:
Functional magnetic resonance imaging (fMRI) is presently performed either using blood oxygenation level-dependent (BOLD) contrast or using cerebral blood flow (CBF), measured with the arterial spin labeling (ASL) technique. The present fMRI study aimed to provide practical hints on when to favour one method over the other. It involved three different acquisition methods during visual checkerboard stimulation in nine healthy subjects: 1) CBF contrast obtained from ASL, 2) BOLD contrast extracted from ASL and 3) BOLD contrast from echo planar imaging. Previous findings were replicated: i) no differences between the three measurements were found in the location of the activated region; ii) differences were found in the temporal characteristics of the signals; and iii) BOLD has significantly higher sensitivity than ASL perfusion. ASL fMRI is favoured when the investigation demands both perfusion and task-related signal changes; BOLD fMRI is more suitable in conjunction with fast event-related designs.
Abstract:
Much controversy exists over whether the course of schizophrenia, as defined by the lengths of repeated community tenures, is progressively ameliorating or deteriorating. This article employs a new statistical method proposed by Wang and Chen (2000) to analyze the Denmark registry data in Eaton et al. (1992). The new statistical method correctly handles the bias caused by induced informative censoring, which arises from the interaction between the heterogeneity of schizophrenia patients and long-term follow-up. The analysis shows a pattern of progressive deterioration in community tenures for the full registry cohort, rather than the progressive amelioration pattern reported for a selected sub-cohort in Eaton et al. (1992). When adjusted for the long-term chronicity of calendar time, no significant progressive pattern was found for the full cohort.
Abstract:
OBJECT: In this study, 1H magnetic resonance (MR) spectroscopy was prospectively tested as a reliable method for presurgical grading of neuroepithelial brain tumors. METHODS: Using a database of tumor spectra obtained in patients with histologically confirmed diagnoses, 94 consecutive untreated patients were studied using single-voxel 1H spectroscopy (point-resolved spectroscopy; TE 135 msec, TR 1500 msec). A total of 90 tumor spectra obtained in patients with diagnostic 1H MR spectroscopy examinations were analyzed using commercially available software (MRUI/VARPRO) and classified using linear discriminant analysis as World Health Organization (WHO) Grade I/II, WHO Grade III, or WHO Grade IV lesions. In all cases, the classification results were matched with histopathological diagnoses that were made according to the WHO classification criteria after serial stereotactic biopsy procedures or open surgery. Histopathological studies revealed 30 Grade I/II tumors, 29 Grade III tumors, and 31 Grade IV tumors. The reliability of the histological diagnoses was validated considering a minimum postsurgical follow-up period of 12 months (range 12-37 months). Classifications based on spectroscopic data yielded 31 tumors in Grade I/II, 32 in Grade III, and 27 in Grade IV. Incorrect classifications included two Grade II tumors, one of which was identified as Grade III and one as Grade IV; two Grade III tumors identified as Grade II; two Grade III lesions identified as Grade IV; and six Grade IV tumors identified as Grade III. Furthermore, one glioblastoma (WHO Grade IV) was classified as WHO Grade I/II. This represents an overall success rate of 86%, and a 95% success rate in differentiating low-grade from high-grade tumors. CONCLUSIONS: The authors conclude that in vivo 1H MR spectroscopy is a reliable technique for grading neuroepithelial brain tumors.
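A minimal sketch of the classification step (linear discriminant analysis over spectral features, with a confusion matrix against histopathological grades) is given below; the features, labels and cross-validation scheme are synthetic placeholders, not the study's MRUI/VARPRO outputs or its exact evaluation protocol.

```python
# Sketch of LDA-based tumour grading from 1H MR spectroscopy features.
# Feature values and grade labels are synthetic placeholders.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(3)

# Hypothetical metabolite-ratio features for 90 spectra:
# 30 grade I/II, 29 grade III, 31 grade IV.
n_per_grade = [30, 29, 31]
grades = np.repeat(["I/II", "III", "IV"], n_per_grade)
centres = {"I/II": [1.2, 1.8, 0.2], "III": [1.8, 1.2, 0.5], "IV": [2.5, 0.8, 1.2]}
X = np.vstack([rng.normal(centres[g], 0.35) for g in grades])

lda = LinearDiscriminantAnalysis()
pred = cross_val_predict(lda, X, grades, cv=10)

print(confusion_matrix(grades, pred, labels=["I/II", "III", "IV"]))
print(f"overall accuracy on synthetic data: {(pred == grades).mean():.0%}")
```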
Abstract:
This book will serve as a foundation for a variety of useful applications of graph theory to computer vision, pattern recognition, and related areas. It covers a representative set of novel graph-theoretic methods for complex computer vision and pattern recognition tasks. The first part of the book presents the application of graph theory to low-level processing of digital images such as a new method for partitioning a given image into a hierarchy of homogeneous areas using graph pyramids, or a study of the relationship between graph theory and digital topology. Part II presents graph-theoretic learning algorithms for high-level computer vision and pattern recognition applications, including a survey of graph based methodologies for pattern recognition and computer vision, a presentation of a series of computationally efficient algorithms for testing graph isomorphism and related graph matching tasks in pattern recognition and a new graph distance measure to be used for solving graph matching problems. Finally, Part III provides detailed descriptions of several applications of graph-based methods to real-world pattern recognition tasks. It includes a critical review of the main graph-based and structural methods for fingerprint classification, a new method to visualize time series of graphs, and potential applications in computer network monitoring and abnormal event detection.
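As a small illustration of the graph-matching primitives such methods rely on (not code from the book), the sketch below builds two attributed toy graphs with networkx, tests exact isomorphism and computes a graph edit distance; the graphs themselves are arbitrary examples.

```python
# Toy illustration of graph matching primitives used in structural pattern
# recognition; the graphs are arbitrary examples, not taken from the book.
import networkx as nx

# Two small "pattern" graphs, e.g. skeletal features of two shapes.
g1 = nx.Graph([("a", "b"), ("b", "c"), ("c", "a"), ("c", "d")])
g2 = nx.Graph([(1, 2), (2, 3), (3, 1), (3, 4)])       # same structure, relabelled
g3 = nx.path_graph(4)                                  # a different structure

# Exact matching: are two graphs structurally identical?
print("g1 ~ g2 isomorphic:", nx.is_isomorphic(g1, g2))   # True
print("g1 ~ g3 isomorphic:", nx.is_isomorphic(g1, g3))   # False

# Inexact matching: how many edit operations (node/edge insertions,
# deletions, substitutions) turn one graph into the other?
print("edit distance g1 -> g3:", nx.graph_edit_distance(g1, g3))
```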
Abstract:
Objective Interruptions are known to have a negative impact on activity performance. Understanding how an interruption contributes to human error is limited because there is not a standard method for analyzing and classifying interruptions. Qualitative data are typically analyzed by either a deductive or an inductive method. Both methods have limitations. In this paper a hybrid method was developed that integrates deductive and inductive methods for the categorization of activities and interruptions recorded during an ethnographic study of physicians and registered nurses in a Level One Trauma Center. Understanding the effects of interruptions is important for designing and evaluating informatics tools in particular and for improving healthcare quality and patient safety in general. Method The hybrid method was developed using a deductive a priori classification framework with the provision of adding new categories discovered inductively in the data. The inductive process utilized line-by-line coding and constant comparison as stated in Grounded Theory. Results The categories of activities and interruptions were organized into a three-tiered hierarchy of activity. Validity and reliability of the categories were tested by categorizing a medical error case external to the study. No new categories of interruptions were identified during analysis of the medical error case. Conclusions Findings from this study provide evidence that the hybrid model of categorization is more complete than either a deductive or an inductive method alone. The hybrid method developed in this study provides the methodical support for understanding, analyzing, and managing interruptions and workflow.
Abstract:
Arterial waves are seen as possible independent mediators of cardiovascular risk, and wave intensity analysis (WIA) has therefore been proposed as a method for patient selection for ventricular assist device (VAD) implantation. Interpreting measured wave intensity (WI) is challenging, and the complexity is increased by the implantation of a VAD: the waves generated by the VAD interact with the waves generated by the native heart, and this interaction varies with changing VAD settings. Eight sheep were implanted with a pulsatile VAD (PVAD) through ventriculo-aortic cannulation. The start of PVAD ejection was synchronized to the native R-wave and delayed by 0% to 90% of the cardiac cycle in 10% steps, or phase shifts (PS). Pressure and velocity signals were recorded using a combined Doppler and pressure wire positioned in the abdominal aorta and used to calculate the WI. Depending on the PS, different wave interference phenomena occurred: maximum unloading of the left ventricle (LV) coincided with constructive interference and maximum blood flow pulsatility, while maximum loading of the LV coincided with destructive interference and minimum blood flow pulsatility. We believe that non-invasive WIA could potentially be used clinically to assess the mechanical load of the LV and to monitor peripheral hemodynamics such as blood flow pulsatility and the risk of intestinal bleeding.
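Wave intensity is commonly computed as the product of the time derivatives of pressure and velocity; the sketch below illustrates this time-normalized formulation on crude synthetic aortic signals. The sampling rate, waveforms and amplitudes are assumptions, and this is not the study's analysis pipeline.

```python
# Sketch of wave intensity analysis (WIA) on synthetic pressure/velocity signals.
# The common time-normalized formulation dI = (dP/dt)*(dU/dt) is used; the
# waveforms, sampling rate and amplitudes are illustrative assumptions.
import numpy as np

fs = 1000                                   # sampling rate [Hz], assumed
t = np.arange(0, 1.0, 1 / fs)               # one cardiac cycle of 1 s

# Crude synthetic aortic pressure [mmHg] and blood velocity [m/s] waveforms.
pressure = 80 + 40 * np.clip(np.sin(2 * np.pi * 1.25 * t), 0, None)
velocity = 0.1 + 0.9 * np.clip(np.sin(2 * np.pi * 1.25 * t - 0.2), 0, None)

dP_dt = np.gradient(pressure, 1 / fs)       # [mmHg/s]
dU_dt = np.gradient(velocity, 1 / fs)       # [m/s^2]
wave_intensity = dP_dt * dU_dt

# Positive peaks indicate dominant forward-travelling waves (e.g. ventricular
# or VAD ejection); negative values indicate net backward-travelling waves.
i_peak = np.argmax(wave_intensity)
print(f"peak forward wave intensity at t = {t[i_peak] * 1000:.0f} ms: {wave_intensity[i_peak]:.1f}")
print(f"most negative wave intensity: {wave_intensity.min():.1f}")
```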
Abstract:
OBJECTIVES: Proteomics approaches to cardiovascular biology and disease hold the promise of identifying specific proteins and peptides, or modifications thereof, to assist in the identification of novel biomarkers. METHOD: Using surface-enhanced laser desorption and ionization time-of-flight mass spectrometry (SELDI-TOF-MS), serum peptide and protein patterns were detected that enable discrimination between postmenopausal women with and without hormone replacement therapy (HRT). RESULTS: Serum from 13 HRT and 27 control subjects was analyzed, and 42 peptides and proteins could be tentatively identified based on their molecular weight and binding characteristics on the chip surface. Using decision tree-based Biomarker Patterns Software classification and regression analysis, a discriminatory function was developed that correctly distinguished between HRT women and controls, yielding a sensitivity of 100% and a specificity of 100%. The results show that peptide and protein patterns have the potential to deliver novel biomarkers as well as to pinpoint targets for improved treatment. The biomarkers obtained represent a promising tool to discriminate between HRT users and non-users. CONCLUSION: According to a tentative identification of the markers by their molecular weight and binding characteristics, most of them appear to be part of the inflammation-induced acute-phase response.
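The classification step described (a decision-tree based classification and regression analysis over peak intensities) can be sketched as follows; the peak matrix, labels and tree settings are illustrative assumptions, not the study's SELDI data or the commercial software's algorithm.

```python
# Sketch of decision-tree classification of serum peptide/protein peak
# intensities (HRT vs control). Peak intensities and labels are synthetic.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(4)

n_hrt, n_ctrl, n_peaks = 13, 27, 42
X_hrt = rng.lognormal(mean=1.0, sigma=0.4, size=(n_hrt, n_peaks))
X_ctrl = rng.lognormal(mean=1.0, sigma=0.4, size=(n_ctrl, n_peaks))
X_hrt[:, 5] *= 1.8                      # pretend one peak is HRT-associated
X = np.vstack([X_hrt, X_ctrl])
y = np.array(["HRT"] * n_hrt + ["control"] * n_ctrl)

tree = DecisionTreeClassifier(max_depth=2, random_state=0)
scores = cross_val_score(tree, X, y, cv=5)
tree.fit(X, y)

print(export_text(tree, feature_names=[f"peak_{i}" for i in range(n_peaks)]))
print(f"5-fold cross-validated accuracy on synthetic data: {scores.mean():.2f}")
```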
Abstract:
OBJECTIVE The aim of this study was to evaluate whether the distribution pattern of early ischemic changes in the initial MRI allows a practical method for estimating leptomeningeal collateralization in acute ischemic stroke (AIS). METHODS Seventy-four patients with AIS underwent MRI followed by conventional angiography and mechanical thrombectomy. Diffusion restriction on diffusion-weighted imaging (DWI) and correlated T2 hyperintensity of the infarct were retrospectively analyzed and subdivided in accordance with the Alberta Stroke Program Early CT Score (ASPECTS). Patients were graded angiographically into collateralization groups according to the method of Higashida and dichotomized into two groups: 29 subjects with collateralization grade 3 or 4 (well-collateralized group) and 45 subjects with grade 1 or 2 (poorly collateralized group). Individual ASPECTS areas were compared between the groups. RESULTS Mean overall DWI-ASPECTS was 6.34 vs. 4.51 (well vs. poorly collateralized group, respectively), and mean T2-ASPECTS was 9.34 vs. 8.96. A significant difference between groups was found for DWI-ASPECTS (p<0.001), but not for T2-ASPECTS (p = 0.088). Regarding the individual areas, only the insula, M1-M4 and M6 showed significantly fewer infarctions in the well-collateralized group (p-values <0.001 to 0.015). 89% of patients in the well-collateralized group showed 0-2 infarctions in these six areas (44.8% with no infarctions), while 59.9% of patients in the poorly collateralized group showed 3-6 infarctions. CONCLUSION Patients with poor leptomeningeal collateralization show more infarcts on the initial MRI, particularly in the ASPECTS areas M1-M4, M6 and the insula. Therefore, DWI abnormalities in these areas may be a surrogate marker for poor leptomeningeal collaterals and may be useful for estimating collateral status in routine clinical evaluation.
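The abstract does not state which statistical test produced the reported p-values; purely as an illustration of how such a group comparison of ASPECTS values might be run, the sketch below applies a nonparametric Mann-Whitney U test to synthetic scores with group sizes matching the study.

```python
# Illustrative group comparison of DWI-ASPECTS between collateralization groups.
# The test choice (Mann-Whitney U) and the scores are assumptions, not the
# study's actual analysis or data.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(5)

# Synthetic ASPECTS values (0-10) for 29 well- and 45 poorly-collateralized patients.
well = np.clip(np.round(rng.normal(6.3, 1.5, size=29)), 0, 10)
poor = np.clip(np.round(rng.normal(4.5, 1.8, size=45)), 0, 10)

stat, p = mannwhitneyu(well, poor, alternative="two-sided")
print(f"well-collateralized mean DWI-ASPECTS:   {well.mean():.2f}")
print(f"poorly collateralized mean DWI-ASPECTS: {poor.mean():.2f}")
print(f"Mann-Whitney U = {stat:.1f}, p = {p:.4f}")
```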
Abstract:
Five test runs were performed to assess possible bias when performing the loss-on-ignition (LOI) method to estimate the organic matter and carbonate content of lake sediments. An accurate and stable weight loss was achieved after 2 h of burning pure CaCO3 at 950 °C, whereas the LOI of pure graphite at 530 °C showed a direct relation to sample size and exposure time, with only 40-70% of the possible weight loss reached after 2 h of exposure and smaller samples losing weight faster than larger ones. Experiments with a standardised lake sediment revealed a strong initial weight loss at 550 °C, but samples continued to lose weight at a slow rate at exposures of up to 64 h, which was likely the effect of loss of volatile salts, structural water of clay minerals or metal oxides, or of inorganic carbon after the initial burning of organic matter. A further test run revealed that at 550 °C samples in the centre of the furnace lost more weight than marginal samples. At 950 °C this pattern was still apparent, but the differences became negligible. Again, LOI was dependent on sample size. An analytical LOI quality-control experiment involving ten different laboratories was carried out, using each laboratory's own LOI procedure as well as a standardised LOI procedure to analyse three different sediments. The range of LOI values between laboratories measured at 550 °C was generally larger when each laboratory used its own method than when using the standard method. This was similar for 950 °C, although the range of values tended to be smaller. The within-laboratory range of LOI measurements for a given sediment was generally small. Comparison of the results of the individual and the standardised methods suggests that there is a laboratory-specific pattern in the results, probably due to differences in laboratory equipment and/or handling that could not be eliminated by standardising the LOI procedure. Factors such as sample size, exposure time, position of the samples in the furnace and the laboratory performing the measurement affected the LOI results, with LOI at 550 °C being more susceptible to these factors than LOI at 950 °C. We therefore recommend that analysts be consistent in the LOI method used with respect to ignition temperatures, exposure times and sample size, and that they include information on these three parameters when referring to the method.
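For reference, a minimal sketch of the usual LOI calculations (organic matter from the 550 °C loss, carbonate from the 950 °C loss) is given below; the sample weights are made-up examples, and the 1.36 carbonate conversion factor (molar mass ratio CO3/CO2 = 60/44) is the commonly used value, stated here as an assumption rather than a prescription of this study.

```python
# Sketch of loss-on-ignition (LOI) calculations for organic matter and
# carbonate content. Sample weights are made-up; the 1.36 factor is the
# commonly used CO3/CO2 molar-mass ratio (60/44).
def loi_550(dw_105, dw_550):
    """Percent weight loss at 550 C, taken as a proxy for organic matter."""
    return 100.0 * (dw_105 - dw_550) / dw_105


def loi_950(dw_105, dw_550, dw_950):
    """Percent weight loss between 550 C and 950 C (CO2 driven off carbonate)."""
    return 100.0 * (dw_550 - dw_950) / dw_105


if __name__ == "__main__":
    dw_105, dw_550, dw_950 = 1.000, 0.820, 0.760   # grams, hypothetical sediment
    om = loi_550(dw_105, dw_550)
    co2 = loi_950(dw_105, dw_550, dw_950)
    print(f"organic matter (LOI 550):  {om:.1f} %")
    print(f"CO2 loss (LOI 950):        {co2:.1f} %")
    print(f"approx. carbonate content: {co2 * 1.36:.1f} %")
```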