950 results for NiPAT, code pattern analysis, object-oriented programming languages
Abstract:
Texture has good discriminating potential that complements that of radiometric parameters in the image classification process. The multiband Compact Texture Unit (CTU) index, recently developed by Safia and He (2014), extracts texture from several bands at once, thereby exploiting additional information ignored until now in traditional textural analyses: the interdependence between bands. However, this new tool has not yet been tested on multisource images, a use that may prove highly valuable when one considers, for example, the textural richness that radar can add to optical data when the two are combined. This study therefore extends the validation initiated by Safia (2014) by applying the CTU to an optical-radar image pair. Textural analysis of this dataset produced a "colour texture" image. The texture bands thus created were then combined with the original optical bands before being fed into a land-cover classification process in eCognition. The same classification procedure (but without CTU) was applied to the optical data alone, the radar data alone, and the optical-radar combination. In addition, the CTU computed on the optical data alone (monosource) was compared with the one derived from the optical-radar pair (multisource). Analysis of the separating power of these different bands using histograms, together with confusion matrices, made it possible to compare the performance of the different configurations and parameters. These comparisons show the CTU, and especially the multisource CTU, to be the most discriminating criterion; its presence adds variability to the image, allowing a sharper segmentation and a classification that is both more detailed and more accurate.
Indeed, accuracy rose from 0.50 with the optical image to 0.74 with the CTU image, while confusion fell from 0.30 (optical) to 0.02 (CTU).
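The multiband CTU itself is not specified in this abstract, but the classic single-band texture unit it builds on (He and Wang, 1990) is easy to sketch: each of the 8 neighbours of a pixel is compared with the centre and coded in base 3. The following is an illustration of that underlying idea only, not of Safia and He's CTU algorithm.

```python
def texture_unit_number(window):
    """window: 3x3 list of lists of grey levels.
    Each of the 8 neighbours is compared to the centre pixel and coded
    0 (smaller), 1 (equal) or 2 (greater); the codes are read as a
    base-3 number, giving one of 3**8 = 6561 possible texture units."""
    centre = window[1][1]
    # clockwise neighbour order starting at the top-left corner
    coords = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    ntu = 0
    for k, (i, j) in enumerate(coords):
        v = window[i][j]
        code = 0 if v < centre else (1 if v == centre else 2)
        ntu += code * 3 ** k
    return ntu

# A perfectly flat window codes every neighbour as 1, i.e. sum of 3**k:
flat = [[5, 5, 5], [5, 5, 5], [5, 5, 5]]
print(texture_unit_number(flat))  # (3**8 - 1) / 2 = 3280
```

Computed over every pixel of a band, these numbers form the texture channel that is then fed into the classifier alongside the radiometric bands.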
Abstract:
The concept of physical activity is understood in different ways. Several factors affect, directly and indirectly, the perception that individuals build around it, giving rise to different definitions of physical activity from various perspectives and dimensions, among which a purely biological notion predominates. This study analyses how physical activity is conceived, in concept and in practice, across social classes, considering the social-determinants and social-determination models of health. To understand how authors in the scientific literature conceive of physical activity and its relationship with social class, from the theoretical perspective of the social determinants of health and the theory of social determination, a documentary review and content analysis was carried out of the concepts and practices of physical activity considered over the last 10 years. The PubMed and BVS (Virtual Health Library) databases were selected for their worldwide emphasis on health publications. The review shows that physical activity is conceived predominantly from a biological perspective that imposes a reductionist view. The relationships between physical activity and social class are clearly established; however, these relationships can diverge depending on the concept of social class, the context and orientation of the authors, and the populations under study. The studies documented, reviewed and analysed show a clear tendency toward the determinants model; nevertheless, some studies orient their analyses toward the social-determination model. As for the concept of social class, the authors consider a combination of cultural and economic factors without committing to a specific definition.
Abstract:
Every construction process (whether for buildings, machines, software, etc.) first requires making a model of the artifact that is going to be built. This model should be based on a paradigm, or meta-model, which defines the basic modeling elements: which real-world concepts can be represented, which relationships can be established among them, and so on. There should also be a language to represent, manipulate and think about that model. Usually the model must be refined at various levels of abstraction, so both the paradigm and the language must have abstraction capacity. In this paper I characterize the relationships that exist between these concepts: model, language and abstraction. I also analyze some historical models, such as the relational model for databases, the imperative programming model and the object-oriented model. Finally, I stress the need to teach this model-driven approach to students, and even to go further, toward higher-level models such as component models or business models.
Abstract:
The Slot and van Emde Boas Invariance Thesis states that a time (respectively, space) cost model is reasonable for a computational model C if there are mutual simulations between Turing machines and C such that the overhead is polynomial in time (respectively, linear in space). The rationale is that, under the Invariance Thesis, complexity classes such as LOGSPACE, P and PSPACE become robust, i.e. machine-independent. In this dissertation, we ask whether it is possible to define a reasonable space cost model for the lambda-calculus, the paradigmatic model of functional programming languages. We start by considering an unusual evaluation mechanism for the lambda-calculus, based on Girard's Geometry of Interaction, which was conjectured to be the key ingredient for obtaining a space-reasonable cost model. Through a fine-grained complexity analysis of this scheme, based on new variants of non-idempotent intersection types, we disprove this conjecture. We then change the target of our analysis to a variant of Krivine's abstract machine, a standard evaluation mechanism for the call-by-name lambda-calculus, optimized for space complexity and implemented without any pointers. A fine-grained analysis of the execution of (a refined version of) the encoding of Turing machines into the lambda-calculus allows us to conclude that the space consumed by this machine is indeed a reasonable space cost model. In particular, for the first time we are able to measure sub-linear space complexities as well. Moreover, we transfer this result to the call-by-value case. Finally, we also provide an intersection type system that compositionally characterizes this new reasonable space measure, through a minimal yet non-trivial modification of de Carvalho's original type system.
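For readers unfamiliar with Krivine's abstract machine, a minimal unoptimized version is short enough to sketch. It evaluates call-by-name lambda-terms (here in de Bruijn notation) to weak head normal form using closures and a stack; the dissertation studies a space-optimized, pointer-free variant, which this toy version does not implement.

```python
# Terms: ('var', n) | ('lam', body) | ('app', fun, arg), with de Bruijn indices.

def krivine(term):
    """Reduce a closed term to weak head normal form.
    A state is (term, environment, stack); environment entries and
    stack entries are closures, i.e. (term, environment) pairs."""
    env, stack = [], []
    while True:
        tag = term[0]
        if tag == 'app':                 # push the argument as a closure
            stack.append((term[2], env))
            term = term[1]
        elif tag == 'lam' and stack:     # pop a closure into the environment
            term = term[1]
            env = [stack.pop()] + env
        elif tag == 'var':               # look up and enter the closure
            term, env = env[term[1]]
        else:                            # a lambda with an empty stack: done
            return term, env

# (\x. x) (\y. y) reduces to \y. y
identity = ('lam', ('var', 0))
whnf, _ = krivine(('app', identity, identity))
print(whnf)  # ('lam', ('var', 0))
```

The machine never copies terms; all sharing goes through the environment, which is what makes its space usage interesting to measure in the first place.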
Abstract:
A global Italian pharmaceutical company has to provide two work environments that serve different needs. The environments will allow solutions to be developed in a controlled, secure and at the same time independent manner on a state-of-the-art enterprise cloud platform. The need for two different environments is dictated by the needs of the working units. The first environment is designed to facilitate the creation of applications related to genomics, and is therefore aimed at data scientists. It is capable of consuming, producing, retrieving and incorporating data, and will support the programming languages most used for genomic applications (e.g., Python, R). The proposal was to provide a pool of ready-to-go virtual machines with different architectures, offering the best performance for the job at hand. The second environment is more traditional: it obtains, via an ETL (Extract-Transform-Load) process, a global data model resembling a classical relational structure. It will support the major BI operations (e.g., analytics, performance measurement, reports) that can be leveraged both for application analysis and for internal use. Since both architectures will hold large amounts of data concerning not only pharmaceutical information but also internal company information, the data can be digested by reporting/analytics tools, and data-mining and machine-learning technologies can be applied to exploit the information they contain. The thesis introduces the proposals, implementations, descriptions of the technologies/platforms used, and future work on the environments discussed above.
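The ETL step mentioned above can be sketched in a few lines. This is a generic illustration using Python's stdlib sqlite3 as a stand-in for the enterprise relational target; the table, field names and sample records are invented for the example and do not come from the thesis.

```python
import sqlite3

def extract():
    # stand-in for pulling records from heterogeneous company sources
    return [
        {"drug": "compound-a", "site": "Milan", "units_sold": "120"},
        {"drug": "compound-b", "site": "Rome",  "units_sold": "75"},
    ]

def transform(records):
    # normalize types and field names into the global data model
    return [(r["drug"].upper(), r["site"], int(r["units_sold"])) for r in records]

def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS sales (drug TEXT, site TEXT, units INTEGER)")
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT SUM(units) FROM sales").fetchone()[0]
print(total)  # 195
```

Once the data sits in a relational schema like this, the BI operations the abstract lists (analytics, performance measurement, reports) reduce to queries over it.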
Abstract:
To subjectively and objectively compare an accessible interactive electronic library using Moodle with lectures for the urology teaching of medical students. Forty consecutive fourth-year medical students and one urology teacher were exposed to two teaching methods (4 weeks each) in the form of problem-based learning: lectures, and student-centered group discussion based on Moodle (modular object-oriented dynamic learning environment), delivered fully online (24/7) with surgery videos, electronic urology cases and additional basic principles of the disease process. All 40 students completed the study. While 30% were moderately dissatisfied with their current knowledge base, online course delivery using Moodle was considered superior to the lectures by 86% of the students. The study found the following: (1) learning grades ranged from 7.0 to 9.7 for students in the online Moodle course, compared with 4.0 to 9.6 for didactic lectures; (2) self-reported student involvement in the online course was characterized as large by over 60%; (3) teacher-student interaction was described as very frequent (50%) or moderately frequent (50%); and (4) more inquiries and requests by students, as well as peer assistance, were observed among the students using the Moodle platform. The Moodle platform is feasible and effective, motivating medical students to learn, improving immersion in the urology clinical rotation and encouraging spontaneous peer-assisted learning. Future studies should expand objective evaluations of knowledge acquisition and retention.
Abstract:
The effects of chromium or nickel oxide additions on the composition of Portland clinker were investigated by X-ray powder diffraction associated with pattern analysis by the Rietveld method. The co-processing of industrial waste in Portland cement plants is an alternative solution to the problem of final disposal of hazardous waste. Industrial waste containing chromium or nickel is hazardous and is difficult to dispose of. It was observed that in concentrations up to 1% in mass, the chromium or nickel oxide additions do not cause significant alterations in Portland clinker composition. (C) 2008 International Centre for Diffraction Data.
Abstract:
This paper presents a framework for building medical training applications using virtual reality, together with a tool that helps instantiate the framework's classes. The main purpose is to make it easier to build virtual reality applications in the medical training area, considering systems that simulate biopsy exams, and to make available deformation, collision detection, and stereoscopy functionalities. Class instantiation allows quick implementation of tools for this purpose, reducing errors and keeping costs low through the use of open-source tools. With the instantiation tool, the process of building applications is fast and easy, so computer programmers can obtain an initial application and adapt it to their needs. The tool allows the user to include, delete, and edit parameters in the chosen functionalities, as well as store these parameters for future use. Some case studies are presented to verify the efficiency of the framework.
Abstract:
Previous work on generating state machines for the purpose of class testing has not been formally based. There has also been work on deriving state machines from formal specifications for testing non-object-oriented software. We build on this work by presenting a method for deriving a state machine for testing purposes from a formal specification of the class under test. We also show how the resulting state machine can be used as the basis for a test suite developed and executed using an existing framework for class testing. To derive the state machine, we identify the states and possible interactions of the operations of the class under test. The Test Template Framework is used to formally derive the states from the Object-Z specification of the class under test. The transitions of the finite state machine are calculated from the derived states and the class's operations. The formally derived finite state machine is transformed to a ClassBench testgraph, which is used as input to the ClassBench framework to test a C++ implementation of the class. The method is illustrated using a simple bounded queue example.
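The bounded queue mentioned at the end of the abstract makes the idea concrete. Below is a hand-written illustration of state-machine-based class testing: abstract states of a bounded queue (empty, partially full, full) and operations (enqueue, dequeue) drive the test sequence, in the spirit of a ClassBench-style testgraph traversal. The paper derives the machine formally from an Object-Z specification and tests a C++ implementation; this Python sketch only shows the shape of the resulting test.

```python
class BoundedQueue:
    def __init__(self, capacity):
        self.capacity, self.items = capacity, []
    def enqueue(self, x):
        assert len(self.items) < self.capacity, "enqueue on full queue"
        self.items.append(x)
    def dequeue(self):
        assert self.items, "dequeue on empty queue"
        return self.items.pop(0)
    def state(self):
        # abstraction function: concrete contents -> abstract FSM state
        if not self.items:
            return "EMPTY"
        if len(self.items) == self.capacity:
            return "FULL"
        return "PARTIAL"

# Testgraph: a path through the abstract states; after each transition the
# implementation's abstract state is checked against the machine's target state.
q = BoundedQueue(2)
path = [("enqueue", 1, "PARTIAL"), ("enqueue", 2, "FULL"),
        ("dequeue", None, "PARTIAL"), ("dequeue", None, "EMPTY")]
for op, arg, expected in path:
    q.enqueue(arg) if op == "enqueue" else q.dequeue()
    assert q.state() == expected
print("testgraph traversed:", len(path), "transitions")
```

The value of deriving the machine formally, rather than writing `path` by hand as above, is that the states and transitions are guaranteed to cover the specification's operation interactions.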
Abstract:
This paper presents the multi-threading and internet message communication capabilities of Qu-Prolog. Message addresses are symbolic, and the communications package provides high-level support that completely hides details of IP addresses and port numbers as well as the underlying TCP/IP transport layer. The combination of multi-threading and high-level inter-thread message communication provides simple, powerful support for implementing distributed intelligent internet applications.
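The core idea, symbolic addresses that hide the transport layer, can be illustrated with threads and named mailboxes. This is plain Python, not Qu-Prolog's API: senders address a mailbox by name only, and the registry (here a dict; over a network it would map names to sockets) hides how messages actually travel.

```python
import queue
import threading

mailboxes = {}  # symbolic name -> mailbox; no IP addresses or ports exposed

def register(name):
    mailboxes[name] = queue.Queue()

def send(name, message):
    # the sender knows only the symbolic name, never the transport details
    mailboxes[name].put(message)

def worker(name, results):
    msg = mailboxes[name].get()  # block until a message arrives
    results[name] = f"{name} got: {msg}"

register("solver")
results = {}
t = threading.Thread(target=worker, args=("solver", results))
t.start()
send("solver", "ping")           # addressed purely by symbolic name
t.join()
print(results["solver"])  # solver got: ping
```

Swapping the dict for a name server that resolves names to remote endpoints would extend the same programming model across machines, which is essentially the convenience the abstract describes.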
Abstract:
The magnitude of genotype-by-management (G x M) interactions for grain yield and grain protein concentration was examined in a multi-environment trial (MET) involving a diverse set of 272 advanced breeding lines from the Queensland wheat breeding program. The MET was structured as a series of management regimes imposed at 3 sites for 2 years. The management regimes were generated at each site-year as separate trials in which planting time, N fertiliser application rate, cropping history, and irrigation were manipulated. Irrigation was used to simulate different rainfall regimes. From the combined analysis of variance, the G x M interaction variance components were found to be the largest source of G x E interaction variation for both grain yield (0.117 +/- 0.005 t(2) ha(-2); 49% of total G x E 0.238 +/- 0.028 t(2) ha(-2)) and grain protein concentration (0.445 +/- 0.020%(2); 82% of total G x E 0.546 +/- 0.057%(2)), and in both cases this source of variation was larger than the genotypic variance component (grain yield 0.068 +/- 0.014 t(2) ha(-2) and grain protein 0.203 +/- 0.026%(2)). The genotypic correlation between the traits varied considerably with management regime, ranging from -0.98 to -0.31, with an estimate of 0.0 for one trial. Pattern analysis identified advanced breeding lines with improved grain yield and grain protein concentration relative to the cultivars Hartog, Sunco and Meteor. It is likely that a large component of the previously documented G x E interactions for grain yield of wheat in the northern grains region is in part a result of G x M interactions. The implications of the strong influence of G x M interactions for the conduct of wheat breeding METs in the northern region are discussed. (C) 2001 Elsevier Science B.V. All rights reserved.
Abstract:
Diseases and insect pests are major causes of low yields of common bean (Phaseolus vulgaris L.) in Latin America and Africa. Anthracnose, angular leaf spot and common bacterial blight are widespread foliar diseases of common bean that also infect pods and seeds. One thousand and eighty-two accessions from a common bean core collection from the primary centres of origin were investigated for reaction to these three diseases. Angular leaf spot and common bacterial blight were evaluated in the field at Santander de Quilichao, Colombia, and anthracnose was evaluated in a screenhouse in Popayan, Colombia. Using the 15-group level from a hierarchical clustering procedure, it was found that 7 groups were formed mainly of Andean common bean accessions (Andean gene pool), 7 groups mainly of Middle American accessions (Middle American gene pool), while 1 group contained mixed accessions. Consistent with the theory of co-evolution, it was generally observed that accessions from the Andean gene pool were resistant to Middle American pathogen isolates causing anthracnose, while the Middle American accessions were resistant to pathogen isolates from the Andes. Different combinations of resistance patterns were found, and breeders can use this information to select a specific group of accessions on the basis of their needs.
Abstract:
Localization of signaling complexes to specific microdomains coordinates signal transduction at the plasma membrane. Using immunogold electron microscopy of plasma membrane sheets coupled with spatial point pattern analysis, we have visualized morphologically featureless microdomains, including lipid rafts, in situ and at high resolution. We find that an inner-plasma-membrane lipid raft marker displays cholesterol-dependent clustering in microdomains with a mean diameter of 44 nm that occupy 35% of the cell surface. Cross-linking an outer-leaflet raft protein results in the redistribution of inner-leaflet rafts, but they retain their modular structure. Analysis of Ras microlocalization shows that inactive H-ras is distributed between lipid rafts and a cholesterol-independent microdomain. Conversely, activated H-ras and K-ras reside predominantly in nonoverlapping, cholesterol-independent microdomains. Galectin-1 stabilizes the association of activated H-ras with these nonraft microdomains, whereas K-ras clustering is supported by farnesylation, but not geranylgeranylation. These results illustrate that the inner plasma membrane comprises a complex mosaic of discrete microdomains. Differential spatial localization within this framework can likely account for the distinct signal outputs from the highly homologous Ras proteins.
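To give a flavour of the spatial point pattern analysis used above, here is one of the simplest clustering statistics: the Clark-Evans ratio, which compares the observed mean nearest-neighbour distance with the value expected under complete spatial randomness, 1 / (2 * sqrt(density)); a ratio below 1 indicates clustering. The paper applies more elaborate spatial statistics to immunogold point patterns; the points below are synthetic.

```python
import math
import random

def clark_evans(points, area):
    """points: list of (x, y) coordinates; area: area of the study region.
    Returns observed mean nearest-neighbour distance / CSR expectation."""
    nn = []
    for i, p in enumerate(points):
        d = min(math.dist(p, q) for j, q in enumerate(points) if j != i)
        nn.append(d)
    observed = sum(nn) / len(nn)
    expected = 1.0 / (2.0 * math.sqrt(len(points) / area))
    return observed / expected

random.seed(0)
# a clustered pattern: two tight clumps of gold-particle positions
# scattered in a 100 x 100 region
clumps = [(10, 10), (80, 80)]
pts = [(cx + random.uniform(-2, 2), cy + random.uniform(-2, 2))
       for cx, cy in clumps for _ in range(20)]
print(clark_evans(pts, 100 * 100) < 1)  # True: the pattern is clustered
```

In practice edge corrections and Monte Carlo significance tests are needed before declaring clustering, which is part of why dedicated point-pattern statistics are used in studies like this one.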
Abstract:
Forest cover of the Maringá municipality, located in northern Paraná State, was mapped in this study. Mapping was carried out using high-resolution HRC sensor imagery and medium-resolution CCD sensor imagery from the CBERS satellite. Images were georeferenced, and forest vegetation patches (TOFs - trees outside forests) were classified using two methods of digital classification: pixel-based (on the reflectance or digital number of each pixel) and object-oriented. The area of each polygon was calculated, which allowed the polygons to be segregated into size classes. Thematic maps were built from the resulting polygon size classes, and summary statistics were generated for each size class in each area. It was found that most forest fragments in Maringá were smaller than 500 m². There was also a difference of 58.44% in the amount of vegetation between the high-resolution and medium-resolution imagery, due to the distinct spatial resolutions of the sensors. It was concluded that high-resolution geotechnology is essential to provide reliable information on urban green areas and forest cover in highly human-perturbed landscapes.