1000 results for Decision tables
Abstract:
To detect errors in decision tables one needs to decide whether a given set of constraints is feasible or not. This paper describes an algorithm to do so when the constraints are linear in variables that take only integer values. Decision tables with such constraints occur frequently in business data processing and in nonnumeric applications. The aim of the algorithm is to exploit the abundance of very simple constraints that occur in typical decision table contexts. Essentially, the algorithm is a backtrack procedure in which the solution space is pruned using the set of simple constraints. After some simplifications, the simple constraints are captured in an acyclic directed graph with weighted edges. Further, only those partial vectors are considered for extension which can be extended to assignments that satisfy at least the simple constraints; this is how pruning of the solution space is achieved. For every partial assignment considered, the graph representation of the simple constraints provides a lower bound for each variable that has not yet been assigned a value. These lower bounds play a vital role in the algorithm, and they are obtained efficiently by updating older lower bounds. The present algorithm also incorporates an idea by which it can be checked whether or not an (m - 2)-ary vector can be extended to a solution vector of m components, thereby reducing backtracking by one component.
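As a rough illustration of the pruning idea only (not the paper's exact algorithm), the sketch below backtracks over integer assignments while simple difference constraints of the form x_j ≥ x_i + w, the kind the weighted acyclic digraph encodes, are relaxed into per-variable lower bounds that cut off hopeless partial vectors. All function names, the constraint encoding and the test instance are assumptions for illustration.

```python
# Hedged sketch of the pruning idea, not the paper's algorithm.
# Simple constraints are difference edges x_j >= x_i + w stored as
# (i, j, w); the constraint graph is assumed acyclic, as in the paper.

def lower_bounds(n, simple, base):
    """Relax difference edges to tighten per-variable lower bounds."""
    lb = list(base)
    for _ in range(n):                  # enough passes for an acyclic graph
        for i, j, w in simple:          # edge encodes x_j >= x_i + w
            if lb[i] + w > lb[j]:
                lb[j] = lb[i] + w
    return lb

def feasible(n, domains, simple, general):
    """Backtrack over integer assignments in variable order 0..n-1,
    pruning any partial vector the simple constraints already doom."""
    def extend(partial):
        k = len(partial)
        if k == n:                      # full vector: test the hard constraints
            return all(g(partial) for g in general)
        base = [partial[i] if i < k else domains[i][0] for i in range(n)]
        lb = lower_bounds(n, simple, base)
        if any(lb[i] > partial[i] for i in range(k)):
            return False                # a fixed value violates a simple constraint
        for v in range(max(lb[k], domains[k][0]), domains[k][1] + 1):
            if extend(partial + [v]):
                return True
        return False
    return extend([])
```

Here each unassigned variable starts at its implied lower bound rather than the bottom of its domain, which is the effect the abstract attributes to the graph representation.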
Abstract:
Process control rules may be specified using decision tables. Such a specification is superior when the logical decisions to be taken in control dominate. In this paper we give a method of detecting redundancies, incompleteness, and contradictions in such specifications. Using such a technique thus ensures the validity of the specifications.
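A minimal, generic way to see what such checks involve (this is an illustrative sketch, not the paper's method): expand each rule of a limited-entry decision table over its don't-care entries and compare the concrete condition vectors that result.

```python
from itertools import product

# Illustrative check of a limited-entry decision table: conditions are
# 'Y'/'N' with '-' for don't-care; each rule maps conditions to an action.
# Incompleteness: some condition vector matches no rule.
# Contradiction: one vector matches rules with different actions.
# Redundancy (simplified here): one vector matched twice, same action.

def expand(rule):
    """All concrete condition vectors a rule with don't-cares covers."""
    opts = [('Y', 'N') if c == '-' else (c,) for c in rule]
    return set(product(*opts))

def check(table):
    """table: list of (conditions, action). Returns the three error sets."""
    n = len(table[0][0])
    covered = {}                          # concrete vector -> set of actions
    overlaps = set()
    for conds, action in table:
        for vec in expand(conds):
            if vec in covered:
                overlaps.add(vec)         # same vector matched twice
            covered.setdefault(vec, set()).add(action)
    missing = set(product('YN', repeat=n)) - set(covered)
    contradictions = {v for v, acts in covered.items() if len(acts) > 1}
    redundant = overlaps - contradictions # overlap with identical action
    return missing, contradictions, redundant
```

Exhaustive expansion is exponential in the number of conditions, so this caricature only scales to small tables; the point is what each error class means.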
Abstract:
Originally presented as the author's thesis, University of Illinois at Urbana-Champaign.
Abstract:
Bibliography: p. 52-53.
Abstract:
In recent years, issues concerning the computation of reducts of decision tables in the rough set approach have attracted the attention of many researchers. In this paper, we present the time complexity of an algorithm computing reducts of decision tables by a relational database approach. Let DS = (U, C ∪ {d}) be a consistent decision table; we say that A ⊆ C is a relative reduct of DS if A contains a reduct of DS. Let s =
Abstract:
With the phenomenal growth of electronic data and information, there are many demands for the development of efficient and effective systems (tools) to perform data mining tasks on multidimensional databases. Association rules describe associations between items in the same transaction (intra) or in different transactions (inter). Association mining attempts to find interesting or useful association rules in databases: this is the crucial issue for the application of data mining in the real world. Association mining can be used in many application areas, such as the discovery of associations between customers’ locations and shopping behaviours in market basket analysis. Association mining includes two phases. The first phase, called pattern mining, is the discovery of frequent patterns. The second phase, called rule generation, is the discovery of interesting and useful association rules in the discovered patterns. The first phase, however, often takes a long time to find all frequent patterns, which also include much noise. The second phase is also a time-consuming activity that can generate many redundant rules. To improve the quality of association mining in databases, this thesis provides an alternative technique, granule-based association mining, for knowledge discovery in databases, where a granule refers to a predicate that describes common features of a group of transactions. The new technique first transfers transaction databases into basic decision tables, then uses multi-tier structures to integrate pattern mining and rule generation in one phase for both intra- and inter-transaction association rule mining. To evaluate the proposed new technique, this research defines the concept of meaningless rules by considering the correlations between data dimensions for intra-transaction association rule mining. It also uses precision to evaluate the effectiveness of inter-transaction association rules.
The experimental results show that the proposed technique is promising.
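For readers unfamiliar with the two phases named above, here is a deliberately naive, generic sketch on plain transactions. It brute-forces candidate itemsets and has none of the thesis's granule or multi-tier machinery; the data and thresholds are invented for illustration.

```python
from itertools import combinations

# Phase 1 (pattern mining): find itemsets whose support -- the fraction
# of transactions containing them -- meets a threshold.
# Phase 2 (rule generation): split each frequent itemset into A -> B and
# keep rules whose confidence support(A U B) / support(A) is high enough.

def frequent_itemsets(transactions, min_support):
    """Brute-force all itemsets over the transactions' items."""
    items = sorted({i for t in transactions for i in t})
    n = len(transactions)
    freq = {}
    for size in range(1, len(items) + 1):
        for cand in combinations(items, size):
            s = sum(set(cand) <= t for t in transactions) / n
            if s >= min_support:
                freq[cand] = s
    return freq

def rules(freq, min_conf):
    """Generate association rules (lhs, rhs, confidence) from frequent sets."""
    out = []
    for itemset, s in freq.items():
        for k in range(1, len(itemset)):
            for lhs in combinations(itemset, k):
                if lhs in freq and s / freq[lhs] >= min_conf:
                    rhs = tuple(i for i in itemset if i not in lhs)
                    out.append((lhs, rhs, s / freq[lhs]))
    return out
```

The exponential candidate enumeration in phase 1 is exactly the cost the abstract complains about, which is what the granule-based integration of the two phases aims to avoid.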
Abstract:
Objective. To investigate mortality in which paracoccidioidomycosis appears on any line or part of the death certificate. Method. Mortality data for 1985-2005 were obtained from the multiple cause-of-death database maintained by the São Paulo State Data Analysis System (SEADE). Standardized mortality coefficients were calculated for paracoccidioidomycosis as the underlying cause-of-death and as an associated cause-of-death, as well as for the total number of times paracoccidioidomycosis was mentioned on the death certificates. Results. During this 21-year period, there were 1950 deaths related to paracoccidioidomycosis; the disease was the underlying cause-of-death in 1164 cases (59.69%) and an associated cause-of-death in 786 (40.31%). Between 1985 and 2005, records show a 59.8% decline in the mortality coefficient due to paracoccidioidomycosis as the underlying cause and a 53.0% decline in the mortality as associated cause. The largest number of deaths occurred among men, in the older age groups, and among rural workers, with an upward trend in winter months. The main causes associated with paracoccidioidomycosis as the underlying cause-of-death were pulmonary fibrosis, chronic lower respiratory tract diseases, and pneumonias. Malignant neoplasms and AIDS were the main underlying causes when paracoccidioidomycosis was an associated cause-of-death. The decision tables had to be adapted for the automated processing of causes of death in death certificates where paracoccidioidomycosis was mentioned. Conclusions. Using the multiple cause-of-death method together with the traditional underlying cause-of-death approach provides a new angle on research aimed at broadening our understanding of the natural history of paracoccidioidomycosis.
Abstract:
Factors influencing the location decisions of offices include traffic, accessibility, employment conditions, economic prospects and land-use policies. Hence tools for supporting real-estate managers and urban planners in such multidimensional decisions may be useful. Accordingly, the objective of this study is to develop a GIS-based tool to support firms who seek office accommodation within a given regional or national study area. The tool relies on a matching approach, in which a firm's characteristics (demand) on the one hand, and environmental conditions and available office spaces (supply) on the other, are analyzed separately in a first step, after which a match is sought. That is, a suitability score is obtained for every firm and for every available office space by applying value judgments (satisfaction, utility etc.). These judgments draw on location aspects and on expert knowledge about the office-accommodation decisions of firms/organizations, acquired from a group of real-estate advisers; this knowledge is stored in decision tables, which constitute the core of the model. Apart from the delineation of choice sets for any firm seeking a location, the tool supports two additional types of queries. Firstly, it supports the more generic problem of optimally allocating firms to a set of vacant locations. Secondly, the tool allows users to find firms which meet the characteristics of any given location. Moreover, as a GIS-based tool, its results can be visualized using GIS features which, in turn, facilitate several types of analyses.
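The matching step described above can be caricatured in a few lines: score each (firm, office) pair from weighted attribute judgments, then allocate each firm its best-scoring free office. The attribute names, the [0, 1] scaling, the weights and the greedy allocation are all invented for illustration and are not taken from the study.

```python
# Hypothetical matching sketch. Firms and offices are described by the
# same attributes scaled to [0, 1]; closeness on each attribute stands in
# for the study's satisfaction/utility judgments.

def suitability(firm, office, weights):
    """Weighted sum of per-attribute closeness; higher is a better match."""
    return sum(w * (1.0 - abs(firm[a] - office[a])) for a, w in weights.items())

def match(firms, offices, weights):
    """Greedy allocation: each firm in turn takes its best free office."""
    free = dict(offices)
    result = {}
    for name, firm in firms.items():
        if not free:
            break
        best = max(free, key=lambda o: suitability(firm, free[o], weights))
        result[name] = best
        del free[best]
    return result
```

A real allocator for the "optimally allocating firms to vacant locations" query would need a global optimization (e.g. an assignment solver) rather than this greedy pass.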
Abstract:
Inertial sensors (accelerometers and gyroscopes) have been gradually embedded in the devices that people use in their daily lives thanks to their miniaturization. Nowadays all smartphones have at least one embedded magnetometer and accelerometer, with the most up-to-date ones also containing gyroscopes and barometers. This, together with the fact that the penetration of smartphones is growing steadily, has made possible the design of systems that rely on the information gathered by wearable sensors (in the future contained in smart textiles) or inertial sensors embedded in a smartphone. The role of these sensors has become key to the development of context-aware and ambient intelligence applications. Some examples are the performance of rehabilitation exercises, the provision of information related to the place that the user is visiting, or the interaction with objects by gesture recognition.
The work of this thesis contributes to exploring the extent to which this kind of sensor can be useful to support activity recognition and pedestrian tracking, which have been proven to be essential for these applications. Regarding the recognition of the activity that a user performs, the use of sensors embedded in a smartphone (proximity and light sensors, gyroscopes, magnetometers and accelerometers) has been explored. The activities that are detected belong to the group known as ‘atomic’ activities (e.g. walking at different paces, running, standing), that is, activities or movements that are part of more complex activities such as doing the dishes or commuting. Simple, well-known classifiers that can run embedded in a smartphone have been tested, such as Naïve Bayes, Decision Tables and Trees. In addition, another aim is to estimate the on-body position in which the user is carrying the mobile phone. The objective is not only to choose a classifier that has been trained with the corresponding data in order to enhance the classification, but also to trigger actions. Finally, the performance of the different classifiers is analysed, taking into consideration different features and numbers of sensors. The computational and memory load of the classifiers is also measured. On the other hand, an algorithm based on step counting has been proposed. The acceleration information is provided by an accelerometer placed on the foot. The aim is to detect the activity that the user is performing together with an estimation of the distance covered. The step-counting strategy is based on detecting minima and their corresponding maxima. Although the counting strategy is not innovative (it includes time windows and amplitude thresholds to prevent under- or overestimation), no user-specific information is required. The field of indoor pedestrian tracking is of particular interest due to the lack of a localization standard for such environments.
A loosely coupled, centralized Extended Kalman Filter has been proposed to perform the fusion of inertial and position measurements. Zero-velocity updates have been applied whenever the foot is detected to be placed on the ground. The results have been obtained using a triangulation algorithm based on RSS measurements in indoor environments and GPS outdoors. Finally, some applications have been designed to test the usefulness of the work. The first one is called the ‘Activity Monitor’, whose aim is to prevent sedentary behaviours and to modify habits to achieve desired activity-level objectives. Two different versions of the application have been implemented. The first one uses the activity estimation based on the step-counting algorithm, which has been integrated in an OSGi mobile framework acquiring the data from a Bluetooth accelerometer placed on the foot of the individual. The second one uses activity classifiers embedded in an Android smartphone. On the other hand, the design of a ‘Travel Logbook’ has been planned. The input of this application is the information provided by the activity and localization modules, external databases (e.g. pictures, points of interest, weather) and mobile embedded and virtual sensors (agenda, camera, etc.). The aim is to detect important events in the journey and gather the information necessary to store it as a journal page.
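The step-counting idea summarized above (peak detection with amplitude thresholds and time windows, no user-specific calibration) can be sketched in a few lines. The threshold and minimum-gap values here are invented placeholders, not the thesis's parameters, and real signals would first be low-pass filtered.

```python
# Illustrative step counter on acceleration-magnitude samples: count a
# local maximum as a step only if it exceeds an amplitude threshold and
# falls at least min_gap samples after the previous counted step, which
# suppresses double-counting (overestimation) from signal jitter.

def count_steps(accel, threshold=1.5, min_gap=3):
    """accel: list of acceleration magnitudes; returns detected step count."""
    steps, last = 0, -min_gap
    for i in range(1, len(accel) - 1):
        is_peak = accel[i - 1] < accel[i] >= accel[i + 1]
        if is_peak and accel[i] > threshold and i - last >= min_gap:
            steps += 1
            last = i
    return steps
```

Multiplying the count by an assumed stride length would give the rough distance estimate the abstract mentions.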
Abstract:
Performing activity recognition using the information provided by the different sensors embedded in a smartphone faces limitations due to the capabilities of those devices when the computations are carried out in the terminal. In this work a fuzzy inference module is implemented in order to decide which classifier is the most appropriate to use at a specific moment, given the application requirements and the device context characterized by its battery level, available memory and CPU load. The set of classifiers considered is composed of Decision Tables and Trees that have been trained using different numbers of sensors and features. In addition, some classifiers perform activity recognition regardless of the on-body device position, while others rely on the previous recognition of that position in order to use a classifier trained with measurements gathered with the mobile placed in that specific position. The modules implemented show that an evaluation of the classifiers allows sorting them so that the fuzzy inference module can periodically choose the one that best suits the device context and application requirements.
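To make the selection problem concrete, here is a crude stand-in for the module described above: a plain weighted score that trades classifier accuracy against CPU cost depending on battery level, instead of a genuine fuzzy inference system. The classifier entries and all numbers are made up.

```python
# Hypothetical context-aware classifier selection (NOT a fuzzy system):
# with a full battery, pick the most accurate classifier; as the battery
# drains, computational cost is penalized more heavily.

def pick_classifier(classifiers, battery):
    """classifiers: dicts with 'name', 'accuracy', 'cpu_cost' in [0, 1];
    battery in [0, 1]. Returns the name of the best-scoring classifier."""
    def score(c):
        return battery * c['accuracy'] - (1.0 - battery) * c['cpu_cost']
    return max(classifiers, key=score)['name']
```

A fuzzy module as in the paper would instead map battery, memory and CPU load through membership functions and rules, but the periodic re-evaluation pattern is the same.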
Abstract:
Four variations on the Two Envelope Paradox are stated and compared. The variations are employed to provide a diagnosis and an explanation of what has gone awry in the paradoxical modeling of the decision problem that the paradox poses. The canonical formulation of the paradox underdescribes the ways in which one envelope can come to have twice the amount that is in the other. Some ways one envelope can have twice the amount that is in the other make it rational to prefer the envelope that was originally rejected; some do not, and it is a mistake to treat them alike. The nature of the mistake is diagnosed by the different roles that rigid designators and definite descriptions play in unproblematic and in untoward formulations of the decision tables that are employed in setting out the decision problem that gives rise to the paradox. The decision maker’s knowledge or ignorance of how one envelope came to have twice the amount that is in the other determines which of the different ways of modeling his decision problem is correct. Under this diagnosis, the paradoxical modeling of the Two Envelope problem is incoherent.
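To make the contrast numeric (an illustration, not the paper's formalism): when the pair of amounts is fixed and only which envelope you hold is random, switching gains nothing on average, whereas the paradoxical computation treats "the amount in my envelope" as one rigid value y and concludes the other envelope is worth 1.25y.

```python
# Two Envelope Paradox, numerically. The envelopes contain x and 2x.

def honest_expected_gain(x):
    """You hold x or 2x with probability 1/2 each; expected gain from
    switching is 0.5*(2x - x) + 0.5*(x - 2x) = 0."""
    return 0.5 * (2 * x - x) + 0.5 * (x - 2 * x)

def paradoxical_expected_value(y):
    """The flawed computation: the other envelope 'is' 2y or y/2 with
    probability 1/2 each, giving 1.25y and an apparent reason to switch."""
    return 0.5 * (2 * y) + 0.5 * (y / 2)
```

The two functions correspond to the two decision tables the paper distinguishes: in the first, y names different amounts in the two equiprobable cases; in the second, it is wrongly held rigid across them.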
Abstract:
In this paper, we present a formal hardware verification framework linking ASM with MDG. ASM (Abstract State Machine) is a state-based language for describing transition systems. MDG (Multiway Decision Graphs) provides a symbolic representation of transition systems with support for abstract sorts and functions. We implemented a transformation tool that automatically generates MDG models from ASM specifications, so that formal verification techniques provided by the MDG tool, such as model checking or equivalence checking, can be applied to the generated models. We support this work with a case study of an Island Tunnel Controller, whose behavior and structure were specified in ASM and then, using our ASM-MDG tool, successfully verified within the MDG tool.
Abstract:
Objective. To evaluate the influence of examiner's clinical experience on detection and treatment decision of caries lesions in primary molars. Design. Three experienced dentists (Group A) and three undergraduate students (Group B) used the International Caries Detection and Assessment System (ICDAS) criteria and bitewing radiographs (BW) to perform examinations twice in 77 primary molars that presented a sound or carious occlusal surface. For the treatment decision (TD), the examiners attributed scores, analyzing the teeth in conjunction with the radiographs. The presence and the depth of lesion were validated histologically, and reproducibility was evaluated. The sensitivity, specificity, accuracy, and area under the ROC curve values were calculated for ICDAS and BW. The associations between ICDAS, BW, and TD were analyzed by means of contingency tables. Results. Interexaminer agreement for ICDAS, BW, and TD was excellent for Group B and moderate for Group A. The two groups presented similar and satisfactory performance for caries lesion detection using ICDAS and BW. In the treatment decision, Group A was shown to have a less invasive approach than Group B. Conclusion. The examiner's experience was not determinant for the clinical and radiographic detection of occlusal lesions in primary teeth but influenced the treatment decision of initial lesions.
Abstract:
BACKGROUND Pretreatment tables for the prediction of pathologic stage have been published and validated for localized prostate cancer (PCa). No such tables are available for locally advanced (cT3a) PCa. OBJECTIVE To construct tables predicting pathologic outcome after radical prostatectomy (RP) for patients with cT3a PCa with the aim to help guide treatment decisions in clinical practice. DESIGN, SETTING, AND PARTICIPANTS This was a multicenter retrospective cohort study including 759 consecutive patients with cT3a PCa treated with RP between 1987 and 2010. INTERVENTION Retropubic RP and pelvic lymphadenectomy. OUTCOME MEASUREMENTS AND STATISTICAL ANALYSIS Patients were divided into pretreatment prostate-specific antigen (PSA) and biopsy Gleason score (GS) subgroups. These parameters were used to construct tables predicting pathologic outcome and the presence of positive lymph nodes (LNs) after RP for cT3a PCa using ordinal logistic regression. RESULTS AND LIMITATIONS In the model predicting pathologic outcome, the main effects of biopsy GS and pretreatment PSA were significant. A higher GS and/or higher PSA level was associated with a more unfavorable pathologic outcome. The validation procedure, using a repeated split-sample method, showed good predictive ability. Regression analysis also showed an increasing probability of positive LNs with increasing PSA levels and/or higher GS. Limitations of the study are the retrospective design and the long study period. CONCLUSIONS These novel tables predict pathologic stage after RP for patients with cT3a PCa based on pretreatment PSA level and biopsy GS. They can be used to guide decision making in men with locally advanced PCa. PATIENT SUMMARY Our study might provide physicians with a useful tool to predict pathologic stage in locally advanced prostate cancer that might help select patients who may need multimodal treatment.