56 results for Calculated based on Forel-Ule scale, FU21
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
The paper presents a competence-based instructional design system and a way to personalize navigation through the course content. The navigation aid tool builds on the competence graph and the student model, which includes elements of uncertainty in the assessment of students. An individualized navigation graph is constructed for each student, suggesting the competences the student is best prepared to study. We use fuzzy set theory to deal with this uncertainty. Assessment test marks are transformed into linguistic terms and used to assign values to linguistic variables. For each competence, the level of difficulty and the level of mastery of its prerequisites are calculated from the assessment marks. Using these linguistic variables and approximate reasoning (fuzzy IF-THEN rules), a crisp category is assigned to each competence indicating its level of recommendation.
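The abstract does not give the rule base, but the mechanics of mapping two linguistic variables through fuzzy IF-THEN rules to a crisp recommendation category can be sketched as below; the triangular membership functions, the linguistic terms, and the rules themselves are illustrative assumptions, not the system's actual design.

```python
# Illustrative sketch of fuzzy IF-THEN classification of a competence
# into a crisp recommendation category. Membership functions, terms,
# and the rule base are invented for illustration; the paper's actual
# design may differ.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Linguistic terms for both inputs, on a normalized [0, 1] scale.
TERMS = {
    "low":    lambda x: tri(x, -0.01, 0.0, 0.5),
    "medium": lambda x: tri(x, 0.0, 0.5, 1.0),
    "high":   lambda x: tri(x, 0.5, 1.0, 1.01),
}

# Rule base: (prerequisite-knowledge term, difficulty term) -> category.
RULES = [
    ("high",   "low",    "recommended"),
    ("high",   "medium", "recommended"),
    ("medium", "low",    "recommended"),
    ("medium", "medium", "neutral"),
    ("high",   "high",   "neutral"),
    ("low",    "low",    "neutral"),
    ("low",    "medium", "not_recommended"),
    ("low",    "high",   "not_recommended"),
    ("medium", "high",   "not_recommended"),
]

def recommend(prereq_level, difficulty):
    """Fire all rules (min for AND), aggregate per category (max),
    and return the crisp category with the highest activation."""
    strength = {}
    for p_term, d_term, category in RULES:
        w = min(TERMS[p_term](prereq_level), TERMS[d_term](difficulty))
        strength[category] = max(strength.get(category, 0.0), w)
    return max(strength, key=strength.get)

print(recommend(prereq_level=0.8, difficulty=0.3))  # -> "recommended"
```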
Abstract:
Modeling ecological niches of species is a promising approach for predicting the geographic potential of invasive species in new environments. Argentine ants (Linepithema humile) rank among the most successful invasive species: native to South America, they have invaded broad areas worldwide. Despite their widespread success, little is known about what makes an area susceptible - or not - to invasion. Here, we use a genetic algorithm approach to ecological niche modeling based on high-resolution remote-sensing data to examine the roles of niche similarity and difference in predicting invasions by this species. Our comparisons support a picture of general conservatism of the species' ecological characteristics, in spite of distinct geographic and community contexts.
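The abstract does not detail the genetic algorithm; a GARP-style formulation, in which a population of environmental-envelope rules is evolved against presence and background points, is a common reading. The encoding, fitness function, and operators below are illustrative assumptions only (two or more environmental variables, normalized to [0, 1], are assumed).

```python
import random

# Rough sketch of a GARP-style genetic algorithm for niche modeling.
# A "rule" is an environmental envelope: one (lo, hi) bound per variable.
# Fitness is accuracy over presence (label 1) and background (label 0)
# points. Encoding, operators, and parameters are illustrative only.

def matches(rule, env):
    return all(lo <= v <= hi for (lo, hi), v in zip(rule, env))

def fitness(rule, points):
    # points: list of (env_vector, label) pairs
    hits = sum(matches(rule, env) == bool(label) for env, label in points)
    return hits / len(points)

def mutate(rule, scale=0.1):
    # Perturb the bounds of one randomly chosen variable.
    out = list(rule)
    i = random.randrange(len(out))
    lo, hi = out[i]
    lo += random.uniform(-scale, scale)
    hi += random.uniform(-scale, scale)
    out[i] = (min(lo, hi), max(lo, hi))
    return out

def crossover(a, b):
    # One-point crossover over the variable positions.
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def evolve(points, n_vars, pop_size=50, generations=100):
    pop = [[tuple(sorted((random.random(), random.random())))
            for _ in range(n_vars)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda r: fitness(r, points), reverse=True)
        elite = pop[: pop_size // 2]
        pop = elite + [mutate(crossover(random.choice(elite),
                                        random.choice(elite)))
                       for _ in range(pop_size - len(elite))]
    return max(pop, key=lambda r: fitness(r, points))
```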
Abstract:
Shape complexity has recently received attention from different fields, such as computer vision and psychology. In this paper, integral geometry and information theory tools are applied to quantify shape complexity from two different perspectives: from the inside of the object, we evaluate its degree of structure or correlation between its surfaces (inner complexity), and from the outside, we compute its degree of interaction with the circumscribing sphere (outer complexity). Our shape complexity measures are based on the following two facts: uniformly distributed global lines crossing an object define a continuous information channel, and the continuous mutual information of this channel is independent of the object discretisation and invariant to translations, rotations, and changes of scale. The measures introduced in this paper can potentially be used as shape descriptors for object recognition, image retrieval, object localisation, tumour analysis, and protein docking, among others.
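The channel view can be made concrete on a discretised channel: tabulate a joint probability matrix for the quantities the global lines connect and compute its mutual information. The snippet below is a generic discrete MI computation, not the authors' continuous, discretisation-independent estimator; the example counts are invented.

```python
import numpy as np

# Generic discrete approximation of the mutual information
#   I(X; Y) = sum_{x,y} p(x, y) * log2( p(x, y) / (p(x) * p(y)) )
# of an information channel given its joint probability matrix.
# In the paper's setting, X and Y would index, e.g., the pairs of
# object surface patches connected by uniformly distributed global
# lines; the joint matrix here is a stand-in.

def mutual_information(joint):
    p = np.asarray(joint, dtype=float)
    p /= p.sum()                       # normalize counts to probabilities
    px = p.sum(axis=1, keepdims=True)  # marginal of X
    py = p.sum(axis=0, keepdims=True)  # marginal of Y
    mask = p > 0                       # zero cells contribute nothing
    return float((p[mask] * np.log2(p[mask] / (px @ py)[mask])).sum())

# Example: line crossings tallied between three surface patches.
counts = [[30, 5, 1],
          [5, 25, 4],
          [1, 4, 25]]
print(f"I(X;Y) = {mutual_information(counts):.3f} bits")
```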
Abstract:
The system described herein represents the first example of a recommender system in digital ecosystems where agents negotiate services on behalf of small companies. The small companies compete not only on price or quality, but also through wider service-by-service composition, subcontracting with other companies. The final result of these offerings depends on negotiations at the scale of millions of small companies. This scale requires new platforms for supporting digital business ecosystems, as well as related services like open-id, trust management, monitors and recommenders. This is done in the Open Negotiation Environment (ONE), an open-source platform that allows agents, on behalf of small companies, to negotiate and use the ecosystem services, and enables the development of new agent technologies. The methods and tools of cyber engineering are necessary to build Open Negotiation Environments that are stable, a basic condition for predictable business and reliable business environments. Aiming to build stable digital business ecosystems by means of improved collective intelligence, we introduce a model of negotiation style dynamics from the point of view of computational ecology. This model inspires an ecosystem monitor as well as a novel negotiation style recommender. The ecosystem monitor provides hints to the negotiation style recommender to achieve greater stability of an open negotiation environment in a digital business ecosystem. The greater stability provides the small companies with higher predictability, and therefore better business results. The negotiation style recommender is implemented with a simulated annealing algorithm at a constant temperature, and its impact is shown by applying it to a real case of an open negotiation environment populated by Italian companies.
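Simulated annealing at a constant temperature reduces to a Metropolis accept/reject loop with fixed T. A minimal sketch follows, assuming a generic instability cost over per-agent style assignments; the style labels and cost structure are illustrative stand-ins, not the ONE recommender's actual implementation.

```python
import math
import random

# Sketch of simulated annealing at constant temperature: a Metropolis
# loop with fixed T. The state space (per-agent negotiation styles)
# and cost function are hypothetical stand-ins.

STYLES = ["conceder", "matcher", "boulware"]  # hypothetical style labels

def cost(assignment, instability):
    """Hypothetical ecosystem-instability cost of a style assignment;
    instability maps agent -> style -> contribution."""
    return sum(instability[agent][style] for agent, style in assignment.items())

def recommend_styles(agents, instability, temperature=1.0, steps=10_000):
    state = {a: random.choice(STYLES) for a in agents}
    current = cost(state, instability)
    for _ in range(steps):
        agent = random.choice(agents)      # propose: flip one agent's style
        old = state[agent]
        state[agent] = random.choice(STYLES)
        new = cost(state, instability)
        # Metropolis rule at fixed temperature: always accept improvements,
        # accept worsenings with probability exp(-delta / T).
        if new <= current or random.random() < math.exp((current - new) / temperature):
            current = new
        else:
            state[agent] = old             # reject: revert the move
    return state
```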
Abstract:
Conventional methods of gene prediction rely on the recognition of DNA-sequence signals, the coding potential or the comparison of a genomic sequence with a cDNA, EST, or protein database. Reasons for limited accuracy in many circumstances are species-specific training and the incompleteness of reference databases. Lately, comparative genome analysis has attracted increasing attention. Several analysis tools that are based on human/mouse comparisons are already available. Here, we present a program for the prediction of protein-coding genes, termed SGP-1 (Syntenic Gene Prediction), which is based on the similarity of homologous genomic sequences. In contrast to most existing tools, the accuracy of SGP-1 depends little on species-specific properties such as codon usage or the nucleotide distribution. SGP-1 may therefore be applied to nonstandard model organisms in vertebrates as well as in plants, without the need for extensive parameter training. In addition to predicting genes in large-scale genomic sequences, the program may be useful to validate gene structure annotations from databases. To this end, SGP-1 output also contains comparisons between predicted and annotated gene structures in HTML format. The program can be accessed via a Web server at http://soft.ice.mpg.de/sgp-1. The source code, written in ANSI C, is available on request from the authors.
Abstract:
A new parameter is introduced: the lightning potential index (LPI), a measure of the potential for charge generation and separation that leads to lightning flashes in convective thunderstorms. The LPI is calculated within the charge separation region of clouds, between the 0°C and −20°C isotherms, where the noninductive mechanism involving collisions of ice and graupel particles in the presence of supercooled water is most effective. As shown in several case studies using the Weather Research and Forecasting (WRF) model with explicit microphysics, the LPI is highly correlated with observed lightning. It is suggested that the LPI may be a useful parameter for predicting lightning as well as a tool for improving weather forecasting of convective storms and heavy rainfall.
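One published formulation of the LPI averages the updraft kinetic energy, scaled by a factor ε that peaks where supercooled liquid water and ice-phase hydrometeors coexist in comparable amounts, over the 0°C to −20°C layer. The sketch below follows that general form; the exact ε definition and the field names are assumptions rather than the paper's verbatim formulation.

```python
import numpy as np

# Hedged sketch of a lightning potential index of the form
#   LPI = (1/V) * sum_over_cells( eps * w^2 * dV ),
# integrated only over grid cells between the 0 C and -20 C isotherms,
# where eps in [0, 1] peaks when supercooled liquid water and ice-phase
# hydrometeors coexist. Field names and the eps definition are
# assumptions, not the paper's verbatim formulation.

def lpi(w, temp_c, q_liquid, q_ice, cell_volume):
    """All gridded inputs are 3-D arrays on the model grid (SI units).
    w: vertical velocity; temp_c: temperature in Celsius;
    q_liquid / q_ice: liquid and ice hydrometeor mixing ratios;
    cell_volume: scalar volume of one grid cell."""
    layer = (temp_c <= 0.0) & (temp_c >= -20.0)   # charge-separation region
    denom = q_liquid + q_ice
    eps = np.zeros_like(denom, dtype=float)       # 0 where no hydrometeors
    np.divide(2.0 * np.sqrt(q_liquid * q_ice), denom, out=eps, where=denom > 0)
    integrand = np.where(layer, eps * w**2, 0.0)
    volume = cell_volume * layer.sum()            # volume of the layer region
    return float(integrand.sum() * cell_volume / volume) if volume > 0 else 0.0
```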
Abstract:
Interaction models of atomic Al with Si4H9, Si4H7, and Si6H9 clusters have been studied to simulate Al chemisorption on the Si(111) surface in the atop, fourfold atop, and open sites. Calculations were carried out using nonempirical pseudopotentials in the framework of the ab initio Hartree-Fock procedure. Equilibrium bond distances, binding energies for adsorption, and vibrational frequencies of the adatoms are calculated. Several basis sets were used in order to show the importance of polarization effects, especially in the binding energies. Final results show the importance of considering adatom-induced relaxation effects to specify the order of energy stabilities for the three different sites, the fourfold atop site being the preferred one, in agreement with experimental findings.
Abstract:
Aim of study: To identify the species of wood samples based on common names and anatomical analyses of their transversal surfaces (without microscopic preparations). Area of study: Spain and South America. Material and methods: The test was carried out on a batch of 15 lumber samples deposited in the Royal Botanical Garden in Madrid, from the expedition by Ruiz and Pavon (1777-1811). The first stage of the methodology is to search for and critically analyse the databases that list common nomenclature alongside scientific nomenclature. A geographic filter was then applied to the resulting information for the samples with a more restricted distribution. Finally, an anatomical verification was carried out with a pocket microscope at x40 magnification, equipped with a 50-micrometre resolution scale. Main results: Identification of the wood based exclusively on the common name is not useful due to the high number of alternative possibilities (14 for “naranjo”, 10 for “ébano”, etc.). The common name of one of the samples (“huachapelí mulato”) enabled the geographic origin of the samples to be accurately located to the shipyard area in Guayaquil (Ecuador). Given that Ruiz and Pavon did not travel to Ecuador, the specimens must have been obtained by Tafalla. It was possible to determine 67% of the lumber samples from the batch correctly. In 17% of the cases the methodology did not provide a reliable identification. Research highlights: It was possible to determine 67% of the lumber samples from the batch correctly, along with their geographic provenance. Identification of the wood based exclusively on the common name is not useful.
Abstract:
In this paper the authors propose a new closed contour descriptor that can be seen as a Feature Extractor for closed contours based on the Discrete Hartley Transform (DHT). Its main characteristic is that it uses only half of the coefficients required by Elliptical Fourier Descriptors (EFD) to obtain a contour approximation with a similar error measure. The proposed closed contour descriptor provides an excellent capability of information compression, useful for a great number of AI applications. Moreover, it can provide scale, position and rotation invariance, and it has the advantage that both the parameterization and the reconstructed shape from the compressed set can be computed very efficiently by the fast Discrete Hartley Transform algorithm. This Feature Extractor can be useful when the application calls for reversible features and when the user needs an easy measure of the quality for a given level of compression, scalable from low to very high quality.
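For a real sequence, the DHT can be computed from the FFT as H = Re(F) - Im(F), and the transform is its own inverse up to a factor of 1/N, which is what makes such a descriptor cheaply reversible. A minimal sketch of truncating the DHT coefficients of a closed contour and reconstructing it follows; encoding each coordinate separately and the particular truncation scheme are illustrative choices, not the paper's exact descriptor.

```python
import numpy as np

# Sketch of a DHT-based closed-contour compressor/reconstructor.
# The discrete Hartley transform of a real sequence x is
#   H(k) = sum_n x(n) * cas(2*pi*k*n/N),  cas(t) = cos(t) + sin(t),
# computable from the FFT as H = Re(F) - Im(F).

def dht(x):
    f = np.fft.fft(x)
    return f.real - f.imag

def idht(h):
    return dht(h) / len(h)     # the DHT is its own inverse up to 1/N

def compress_contour(xs, ys, n_keep):
    """Keep only the n_keep lowest-frequency DHT coefficients of each
    coordinate sequence of a closed contour; zero out the rest."""
    out = []
    for coord in (xs, ys):
        h = dht(np.asarray(coord, dtype=float))
        kept = np.zeros_like(h)
        idx = np.argsort(np.abs(np.fft.fftfreq(len(h))))[:n_keep]
        kept[idx] = h[idx]
        out.append(kept)
    return out

def reconstruct(hx, hy):
    return idht(hx), idht(hy)

# Example: a noisy circle approximated with a handful of coefficients.
t = np.linspace(0, 2 * np.pi, 128, endpoint=False)
xs = np.cos(t) + 0.02 * np.random.randn(128)
ys = np.sin(t) + 0.02 * np.random.randn(128)
hx, hy = compress_contour(xs, ys, n_keep=9)
rx, ry = reconstruct(hx, hy)
print("max reconstruction error:", float(np.max(np.hypot(rx - xs, ry - ys))))
```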
Abstract:
Background: Assessing the costs of treating disease is necessary to demonstrate cost-effectiveness and to estimate the budget impact of new interventions and therapeutic innovations. However, there are few comprehensive studies on resource use and costs associated with lung cancer patients in clinical practice in Spain or internationally. The aim of this paper was to assess the hospital costs associated with lung cancer diagnosis and treatment by histology, type of cost and stage at diagnosis in the Spanish National Health Service. Methods: A retrospective, descriptive analysis of resource use and a direct medical cost analysis were performed. Resource utilisation data were collected from patient files at nine teaching hospitals. From a hospital budget impact perspective, the aggregate and mean costs per patient were calculated over the first three years following diagnosis or up to death. Both aggregate and mean costs per patient were analysed by histology, stage at diagnosis and cost type. Results: A total of 232 cases of lung cancer were analysed, of which 74.1% corresponded to non-small cell lung cancer (NSCLC) and 11.2% to small cell lung cancer (SCLC); 14.7% had no cytohistologic confirmation. The mean cost per patient in NSCLC ranged from 13,218 Euros in Stage III to 16,120 Euros in Stage II. The main cost components were chemotherapy (29.5%) and surgery (22.8%). Advanced disease stages were associated with a decrease in the relative weight of surgical and inpatient care costs but an increase in chemotherapy costs. In SCLC patients, the mean cost per patient was 15,418 Euros for limited disease and 12,482 Euros for extensive disease. The main cost components were chemotherapy (36.1%) and other inpatient costs (28.7%). In both groups, the Kruskal-Wallis test did not show statistically significant differences in mean cost per patient between stages. Conclusions: This study provides the costs of lung cancer treatment based on patient file reviews, with chemotherapy and surgery accounting for the major components of cost. This cost analysis is a baseline study that will provide a useful source of information for future studies on cost-effectiveness and on the budget impact of different therapeutic innovations in Spain.
Abstract:
This paper presents a theory of network-entrepreneurs, or a "spin-off system", for the creation of firms based on a community's social governance. It is argued that a firm's capacity for accumulation depends on the presence of employees belonging to the same social or ethnic group, with expectations of "inheriting" the firm and becoming entrepreneurs once they have been selected for their merits and loyalty towards their patrons. Such accumulation is possible because of the credibility of the patrons' promises to support newcomers, owing to the high social cohesion and specific social norms prevailing in the community. This theory is exemplified through the case of the Barcelonnettes, a group of immigrants from the Alps in the south of France (Provence) who came to Mexico in the nineteenth century.
Abstract:
The public perception of the EU in Spain varies greatly. The most positive aspects of Spanish membership are associated with the consolidation of democracy, economic growth, the introduction of the euro, the growth in employment and structural and cohesion funds, the increase in the female participation rate, and the equal opportunities policies. Analysts favour common objectives in employment policy and multi-level government. The less positive aspects of the EU are the risks of losing social protection and of job losses in some sectors due to mergers of multinationals and the delocalization of companies towards Eastern Europe. The continuous demands for reform of the welfare state, the toughening of the conditions of access to social benefits and the reform of the labour market are also seen as problematic issues, as are the risks of competitive cuts and social dumping.
Abstract:
The paper presents a foundation model for Marxian theories of the breakdown of capitalism based on a new falling rate of profit mechanism. All of these theories are based on one or more of "the historical tendencies": a rising capital-wage bill ratio, a rising capitalist share and a falling rate of profit. The model is a foundation in the sense that it generates these tendencies in the context of a model with a constant subsistence wage. The newly discovered generating mechanism is based on neo-classical reasoning for a model with land. It is non-Ricardian in that land augmenting technical progress can be unboundedly rapid. Finally, since the model has no steady state, it is necessary to use a new technique, Chaplygin's method, to prove the result.
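A standard profit-rate decomposition (illustrative only; the paper's own mechanism works through land and land-augmenting technical progress) shows how the first two tendencies relate to the third:

```latex
% Illustrative decomposition, not the paper's land-based model.
% With surplus S, capital stock C, and wage bill V, the profit rate is
\[
  r \;=\; \frac{S}{C+V} \;=\; \frac{S/V}{\,C/V + 1\,},
\]
% so the rate of profit falls whenever the capital-wage bill ratio C/V
% grows faster than S/V, even while the capitalist share S/(S+V) rises.
```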