996 results for Code set
Abstract:
Report on applying agreed-upon procedures to the Villisca Municipal Power Plant's accounting procedures, cash and investment balances, and compliance with Code of Iowa requirements for the period February 1, 2007 through December 31, 2010.
Abstract:
This paper presents an ITK implementation for exporting the contours of automated segmentation results to the DICOM-RT Structure Set format. The "radiotherapy structure set" (RTSTRUCT) object of the DICOM standard is used to transfer patient structures and related data between the devices found within and outside the radiotherapy department. It mainly contains information on regions of interest (ROIs) and points of interest (e.g., dose reference points). In many cases, rather than manually drawing these ROIs on the CT images, one can benefit from the automated segmentation algorithms already implemented in ITK. At present, however, it is not possible to export the ROIs obtained from ITK to the RTSTRUCT format. To bridge this gap, we have developed a framework for exporting contour data to RTSTRUCT. We provide here the complete implementation of the RTSTRUCT exporter and present the details of the pipeline used. Results on a 3-D CT image of the head and neck (H&N) region are presented.
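To illustrate the mapping the abstract describes, the sketch below shows how per-ROI contours land in the nested RTSTRUCT information model (StructureSetROISequence, ROIContourSequence, ContourSequence, with ContourData as flattened x, y, z triplets in mm). It uses plain Python dictionaries rather than the paper's ITK exporter; the attribute names follow the DICOM standard, but the helper function and the "Parotid" ROI are purely illustrative.

```python
def build_rtstruct_skeleton(rois):
    """Schematic RTSTRUCT layout (plain dicts, not a real DICOM object).

    rois: list of (roi_name, contours); each contour is a list of (x, y, z)
    points in mm, one closed planar contour per image slice.
    """
    struct_set, roi_contours = [], []
    for number, (name, contours) in enumerate(rois, start=1):
        # Each ROI gets an identifying entry in StructureSetROISequence...
        struct_set.append({"ROINumber": number, "ROIName": name})
        contour_seq = []
        for points in contours:
            # ...and its geometry goes into ContourSequence items whose
            # ContourData flattens the (x, y, z) triplets into one list.
            flat = [coord for point in points for coord in point]
            contour_seq.append({
                "ContourGeometricType": "CLOSED_PLANAR",
                "NumberOfContourPoints": len(points),
                "ContourData": flat,  # x1, y1, z1, x2, y2, z2, ...
            })
        roi_contours.append({"ReferencedROINumber": number,
                             "ContourSequence": contour_seq})
    return {"StructureSetROISequence": struct_set,
            "ROIContourSequence": roi_contours}

# One triangular contour on a single slice for a hypothetical "Parotid" ROI.
rt = build_rtstruct_skeleton([
    ("Parotid", [[(0.0, 0.0, 10.0), (10.0, 0.0, 10.0), (5.0, 8.0, 10.0)]]),
])
print(rt["ROIContourSequence"][0]["ContourSequence"][0]["NumberOfContourPoints"])  # 3
```

A real exporter would additionally carry SOP class/instance UIDs and references to the source CT series, which this sketch omits.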
Abstract:
The relativistic distorted-wave Born approximation is used to calculate differential and total cross sections for inner-shell ionization of neutral atoms by electron and positron impact. The target atom is described within the independent-electron approximation using the self-consistent Dirac-Fock-Slater potential. The distorting potential for the projectile is also set equal to the Dirac-Fock-Slater potential. For electrons, this guarantees orthogonality of all the orbitals involved and simplifies the calculation of exchange T-matrix elements. The interaction between the projectile and the target electrons is assumed to reduce to the instantaneous Coulomb interaction. The adopted numerical algorithm allows the calculation of differential and total cross sections for projectiles with kinetic energies ranging from the ionization threshold up to about ten times this value. Algorithm accuracy and stability are demonstrated by comparing differential cross sections, calculated by our code with the distorting potential set to zero, against equivalent results generated by a more robust code that uses the conventional plane-wave Born approximation. Sample calculation results are presented for ionization of the K and L shells of various elements and compared with the available experimental data.
Abstract:
Digital information generates the possibility of a high degree of redundancy in the data available for fitting predictive models used in Digital Soil Mapping (DSM). Among these models, the Decision Tree (DT) technique has been increasingly applied due to its capacity to deal with large datasets. The purpose of this study was to evaluate the impact of the data volume used to generate the DT models on the quality of soil maps. An area of 889.33 km² was chosen in the northern region of the State of Rio Grande do Sul. The soil-landscape relationship was obtained from reambulation of the studied area and the alignment of the units in the 1:50,000-scale topographic mapping. Six predictive covariates linked to the soil-forming factors relief and organisms, together with data sets of 1, 3, 5, 10, 15, 20 and 25 % of the total data volume, were used to generate the predictive DT models in the data mining program Waikato Environment for Knowledge Analysis (WEKA). Sample densities below 5 % resulted in models with a lower capacity to capture the complexity of the spatial distribution of soils in the study area. The best compromise between the data volume to be handled and the predictive capacity of the models was found for samples between 5 and 15 %. For the models based on these sample densities, the collected field data indicated a predictive mapping accuracy close to 70 %.
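The sampling scheme the study describes, training subsets drawn at seven densities from 1 to 25 % of the full data volume, can be sketched as follows. The grid size and seed below are hypothetical placeholders (the abstract gives only the percentages, not the cell counts), and the draw uses plain random subsampling rather than whatever stratification the authors may have applied.

```python
import random

def subsample(indices, density_pct, rng):
    """Draw density_pct % of the given cell indices without replacement."""
    k = round(len(indices) * density_pct / 100)
    return rng.sample(indices, k)

total_cells = list(range(100_000))   # hypothetical raster cell ids
rng = random.Random(42)              # fixed seed for reproducibility
densities = [1, 3, 5, 10, 15, 20, 25]  # the seven densities tested
subsets = {d: subsample(total_cells, d, rng) for d in densities}
for d in densities:
    print(d, len(subsets[d]))
```

Each subset would then be exported as a training table for the DT learner in WEKA; the point of the experiment is how map quality varies with `density_pct`.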
Abstract:
Caspase-cleaved amyloid precursor protein (APPcc) and SET are increased and mislocalized in the neuronal cytoplasm in Alzheimer disease (AD) brains. SET translocated to the cytoplasm can induce tau hyperphosphorylation. To elucidate the putative relationships between mislocalized APPcc and SET, we studied their levels and distribution in the hippocampus of 5 controls, 3 Down syndrome patients, and 10 Alzheimer patients. In the Down syndrome and Alzheimer patients, APPcc and SET levels were increased in CA1, and the frequency of both proteins localizing to the neuronal cytoplasm was high in CA1 and low in CA4. As the increase in APPcc is already present at early stages of AD, we overexpressed APPcc in CA1 and dentate gyrus neurons of adult mice with a lentiviral construct. APPcc overexpression in CA1, but not in the dentate gyrus, induced endogenous SET translocation and tau hyperphosphorylation. These data suggest that an increase in APPcc in CA1 neurons could be an early event leading to the translocation of SET and the progression of AD through tau hyperphosphorylation.
Abstract:
A common way to model multiclass classification problems is by means of Error-Correcting Output Codes (ECOC). Given a multiclass problem, the ECOC technique designs a code word for each class, where each position of the code identifies the membership of the class in a given binary problem. A classification decision is obtained by assigning the label of the class with the closest code word. One of the main requirements of the ECOC design is that the base classifier be capable of splitting the subgroups of classes defined by each binary problem. However, a linear classifier cannot be guaranteed to model convex regions, and nonlinear classifiers also fail to handle some types of surfaces. In this paper, we present a novel strategy for modeling multiclass classification problems using subclass information in the ECOC framework. Complex problems are solved by splitting the original set of classes into subclasses and embedding the binary problems in a problem-dependent ECOC design. Experimental results show that the proposed splitting procedure yields better performance when the class overlap or the distribution of the training objects conceals the decision boundaries from the base classifier. The results are even more significant when the training set is sufficiently large.
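The decoding step the abstract describes, assigning the label of the class whose code word is closest to the base classifiers' outputs, is sketched below. The one-vs-all code matrix is an illustrative choice, not the paper's problem-dependent design.

```python
def hamming(a, b):
    """Number of positions at which two equal-length bit tuples differ."""
    return sum(x != y for x, y in zip(a, b))

def ecoc_decode(bit_predictions, code_words):
    """Assign the label whose code word is nearest (Hamming) to the predicted bits."""
    return min(code_words, key=lambda label: hamming(bit_predictions, code_words[label]))

code_words = {            # illustrative 3-class one-vs-all code matrix:
    "A": (1, 0, 0),       # each column is one binary problem,
    "B": (0, 1, 0),       # each row the code word of one class
    "C": (0, 0, 1),
}
# The three base classifiers voted (0, 1, 0): an exact match for class B.
print(ecoc_decode((0, 1, 0), code_words))  # B
```

With longer, well-separated code words the same decoder tolerates individual base-classifier errors, which is where the "error-correcting" in ECOC comes from.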
Abstract:
This paper provides an axiomatic framework to compare the D-core (the set of undominated imputations) and the core of a cooperative game with transferable utility. Theorem 1 states that the D-core is the only solution satisfying projection consistency, reasonableness (from above), (*)-antimonotonicity, and modularity. Theorem 2 characterizes the core, replacing (*)-antimonotonicity by antimonotonicity. Moreover, these axioms also characterize the core on the domain of convex games, totally balanced games, balanced games, and superadditive games.
Abstract:
A comparative static study of set solutions for cooperative TU games is carried out. The analysis focuses on the compatibility between two classical and reasonable properties introduced by Young (1985) in the context of single-valued solutions, namely core selection and coalitional monotonicity. As the main result, it is shown that coalitional monotonicity is incompatible not only with the core-selection property but also with the bargaining-selection property. This new impossibility result reinforces the trade-off between these kinds of interesting and intuitive economic properties. Positive compatibility results between desirable economic properties are obtained by replacing the core-selection requirement with the core-extension property.
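The core-selection property discussed above requires every proposed allocation to lie in the core: efficient, and not blocked by any coalition. A minimal membership check, on an illustrative 3-player game that is not taken from the paper, can be sketched as:

```python
from itertools import combinations

def in_core(allocation, v, players, eps=1e-9):
    """Check core membership for a TU game.

    allocation: dict player -> payoff; v: dict frozenset(coalition) -> worth.
    """
    grand = frozenset(players)
    # Efficiency: the allocation must distribute exactly v(N).
    if abs(sum(allocation[p] for p in players) - v[grand]) > eps:
        return False
    # No proper coalition may get less than its stand-alone worth.
    for r in range(1, len(players)):
        for coalition in combinations(players, r):
            s = frozenset(coalition)
            if sum(allocation[p] for p in s) < v.get(s, 0) - eps:
                return False  # coalition s blocks the allocation
    return True

players = (1, 2, 3)
v = {frozenset(c): w for c, w in [        # illustrative symmetric game
    ((1,), 0), ((2,), 0), ((3,), 0),
    ((1, 2), 4), ((1, 3), 4), ((2, 3), 4),
    ((1, 2, 3), 6),
]}
print(in_core({1: 2, 2: 2, 3: 2}, v, players))  # True: equal split is stable
print(in_core({1: 4, 2: 1, 3: 1}, v, players))  # False: coalition {2, 3} blocks
```

The impossibility result quoted above says no solution can pass this kind of check for every game while also responding monotonically to increases in coalitional worth.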