Abstract:
This paper develops a Capability Matrix for analyzing the capabilities of developing-country firms that participate in global and national value chains. This generic framework captures firm-level knowledge accumulation in the context of global and local industrial constellations by integrating key elements of the global value chain (GVC) and technological capabilities (TC) approaches. The framework can visually portray characteristics of firms’ capabilities and highlight a factor relatively overlooked in the GVC approach: local firms’ endogenous learning efforts in a variety of relationships with lead firms.
Abstract:
Cultural content on the Web is available in various domains (cultural objects, datasets, geospatial data, moving images, scholarly texts and visual resources), concerns various topics, is written in different languages, is targeted at both laymen and experts, and is provided by different communities (libraries, archives, museums and the information industry) and individuals (Figure 1). The integration of information technologies and cultural heritage content on the Web is expected to have an impact on everyday life from the point of view of institutions, communities and individuals. In particular, collaborative environments can recreate 3D navigable worlds that can offer new insights into our cultural heritage (Chan 2007). However, the main barrier is finding and relating cultural heritage information, both for end-users of cultural contents and for the organisations and communities managing and producing them. In this paper, we explore several visualisation techniques for supporting cultural interfaces, where the role of metadata is essential for supporting search and communication among end-users (Figure 2). A conceptual framework was developed to integrate the data, purpose, technology, impact, and form components of a collaborative environment. Our preliminary results show that collaborative environments can help with cultural heritage information sharing and communication tasks because of the way in which they provide a visual context to end-users. They can be regarded as distributed virtual reality systems that offer graphically realised, potentially infinite, digital information landscapes. Moreover, collaborative environments also provide a new way of interaction between an end-user and a cultural heritage dataset. Finally, the visualisation of a dataset's metadata plays an important role in helping end-users in their search for heritage contents on the Web.
Abstract:
Nowadays, a wide range of mobile augmented reality (mAR) applications is available on the market, and the user base of mobile AR-capable devices (smartphones) is rapidly increasing. Nevertheless, as in other mobile segments, business models to monetise mAR are not yet clearly defined. In this paper, we focus on sketching the big picture of the commercial offer of mAR applications, in order to inspire a subsequent analysis of business models that may successfully support the evolution of mAR. We have gathered more than 400 mAR applications from the Android Market and analyzed the offer as a whole, taking into account technology aspects, pricing schemes and user adoption factors. Results show, for example, that application providers are not expecting to generate revenues from direct downloads, although they are producing high-quality applications that are well rated by users.
Abstract:
We present a novel approach using both sustained vowels and connected speech to detect obstructive sleep apnea (OSA) cases within a homogeneous group of speakers. The proposed scheme is based on state-of-the-art GMM-based classifiers, and specifically acknowledges the way in which acoustic models are trained on standard databases, as well as the complexity of the resulting models and their adaptation to specific data. Our experimental database contains a suitable number of utterances of connected and sustained speech from healthy (i.e., control) and OSA Spanish speakers. Finally, a 25.1% relative reduction in classification error is achieved when fusing the continuous- and sustained-speech classifiers. Index Terms: obstructive sleep apnea (OSA), Gaussian mixture models (GMMs), background model (BM), classifier fusion.
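The fusion step described above can be sketched in a few lines: per-class acoustic models score each speech stream with a log-likelihood ratio, and the two streams are combined by a weighted sum. This is a minimal illustration rather than the authors' system: a single diagonal-covariance Gaussian stands in for the full GMM/background-model setup, and the equal fusion weight is an arbitrary assumption.

```python
import numpy as np

def fit_gaussian(X):
    # Diagonal-covariance Gaussian as a one-component stand-in for a GMM
    return X.mean(axis=0), X.var(axis=0) + 1e-6

def log_lik(X, mean, var):
    # Total Gaussian log-density over all frames and feature dimensions
    return (-0.5 * (np.log(2 * np.pi * var) + (X - mean) ** 2 / var)).sum()

def score(X, model_osa, model_ctrl):
    # Log-likelihood ratio: positive values favour the OSA hypothesis
    return log_lik(X, *model_osa) - log_lik(X, *model_ctrl)

def fuse(score_connected, score_sustained, w=0.5):
    # Late fusion of the two classifiers; the weight w is illustrative
    return w * score_connected + (1 - w) * score_sustained
```

In a real system each stream would have its own GMM pair (speaker class vs. background model) and the fusion weight would be tuned on held-out data.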
Abstract:
This paper describes the participation of DAEDALUS at the LogCLEF lab in CLEF 2011. This year, the objectives of our participation are twofold. The first is to analyze whether there is any measurable effect on the success of search queries when the native language and the interface language chosen by the user differ. The idea is to determine whether this difference may condition the way in which the user interacts with the search application. The second is to analyze the user context and his/her interaction with the system in the case of successful queries, to discover any relation among the user's native language, the language of the resource involved and the interaction strategy adopted by the user to find that resource. Only 6.89% of queries are successful out of the 628,607 queries in the 320,001 sessions with at least one search query in the log. The main conclusion that can be drawn is that, in general for all languages, whether or not the native language matches the interface language does not seem to affect the success rate of search queries. On the other hand, the analysis of the strategy adopted by users when looking for a particular resource shows that people tend to use the simple search tool, frequently first running short queries built up of just one specific term and then browsing through the results to locate the expected resource.
Abstract:
Traditional logic programming languages, such as Prolog, use a fixed left-to-right atom scheduling rule. Recent logic programming languages, however, usually provide more flexible scheduling in which computation generally proceeds left-to-right but in which some calls are dynamically "delayed" until their arguments are sufficiently instantiated to allow the call to run efficiently. Such dynamic scheduling has a significant cost. We give a framework for the global analysis of logic programming languages with dynamic scheduling and show that program analysis based on this framework supports optimizations which remove much of the overhead of dynamic scheduling.
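The delay behaviour whose overhead is being analysed can be illustrated with a toy interpreter: a goal whose required variables are not yet bound is set aside, and is re-tried once another goal binds them. Everything here (the goal representation, the wake-up discipline) is an illustrative assumption, not the paper's analysis framework.

```python
# Toy model of dynamic goal scheduling: each goal declares which variables
# it needs; goals run left-to-right, but a goal whose inputs are unbound is
# delayed and re-checked after any goal succeeds in binding something.
def run(goals, bindings):
    delayed = []
    queue = list(goals)
    while queue:
        needs, action = queue.pop(0)
        if all(v in bindings for v in needs):
            action(bindings)                  # sufficiently instantiated: run now
            queue, delayed = delayed + queue, []   # wake delayed goals for re-checking
        else:
            delayed.append((needs, action))   # delay until its inputs are bound
    return bindings, delayed                  # non-empty `delayed` means goals floundered
```

The repeated instantiation checks and wake-ups in this loop are exactly the kind of runtime overhead that a global analysis can often prove unnecessary and compile away.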
Abstract:
Software testing is a key aspect of software reliability and quality assurance in a context where software development constantly has to overcome mammoth challenges in a continuously changing environment. One of the characteristics of software testing is that it has a large intellectual capital component and can thus benefit from the experience gained in past projects. Software testing can, then, potentially benefit from solutions provided by the knowledge management discipline. There are in fact a number of proposals concerning effective knowledge management related to several software engineering processes. Objective: We defend the use of a lessons learned system for software testing. The reason is that such a system is an effective knowledge management resource that enables testers and managers to take advantage of the experience locked away in the brains of the testers. To do this, the experience has to be gathered, disseminated and reused. Method: After analyzing the proposals for managing software testing experience, significant weaknesses were detected in current systems of this type. The architectural model proposed here for lessons learned systems is designed to avoid these weaknesses. This model (i) defines the structure of the software testing lessons learned; (ii) sets up procedures for lessons learned management; and (iii) supports the design of software tools to manage the lessons learned. Results: A different approach, based on the management of the lessons learned that software testing engineers gather from everyday experience, with two basic goals: usefulness and applicability. Conclusion: The architectural model proposed here lays the groundwork for overcoming the obstacles to sharing and reusing the experience gained in software testing and test management. As such, it provides guidance for developing software testing lessons learned systems.
Abstract:
Many diseases have a genetic origin, and a great effort is being made to detect the genes that are responsible for their onset. One of the most promising techniques is the analysis of genetic information through the use of complex network theory. Yet, a practical problem of this approach is its computational cost, which scales as the square of the number of features included in the initial dataset. In this paper, we propose the use of an iterative feature selection strategy to identify reduced subsets of relevant features, and show an application to the analysis of congenital obstructive nephropathy. Results demonstrate that, besides achieving a drastic reduction in computational cost, the topologies of the obtained networks still hold all the relevant information and are thus able to fully characterize the severity of the disease.
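A generic version of such an iterative selection loop might look like the sketch below: one feature is kept per round, and the quadratic network-construction cost is then paid only on the reduced subset. The absolute-correlation score is an illustrative placeholder for whatever relevance measure the network analysis actually uses.

```python
import numpy as np

def iterative_select(X, y, k):
    """Iteratively keep the k features most associated with the target,
    one per round.  The |correlation| score is a stand-in criterion; the
    point is that downstream O(k^2) network analysis replaces O(n^2)."""
    selected = []
    remaining = list(range(X.shape[1]))
    for _ in range(k):
        scores = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in remaining]
        best = remaining[int(np.argmax(scores))]
        selected.append(best)
        remaining.remove(best)
    return selected
```

With n features reduced to k selected ones, the pairwise-network cost drops from O(n²) to O(k²), which is the drastic reduction the abstract refers to.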
Abstract:
Background: There are 600,000 new malaria cases daily worldwide. The gold standard for estimating the parasite burden and the corresponding severity of the disease consists of manually counting the number of parasites in blood smears through a microscope, a process that can take more than 20 minutes of an expert microscopist’s time. Objective: This research tests the feasibility of a crowdsourced approach to malaria image analysis. In particular, we investigated whether anonymous volunteers with no prior experience would be able to count malaria parasites in digitized images of thick blood smears by playing a Web-based game. Methods: The experimental system consisted of a Web-based game in which online volunteers were tasked with detecting parasites in digitized blood sample images, coupled with a decision algorithm that combined the analyses from several players to produce an improved collective detection outcome. Data were collected through the MalariaSpot website. Random images of thick blood films containing Plasmodium falciparum at medium to low parasitemias, acquired by conventional optical microscopy, were presented to players. In the game, players had to find and tag as many parasites as possible in 1 minute. In the event that players found all the parasites present in the image, they were presented with a new image. In order to combine the choices of different players into a single crowd decision, we implemented an image processing pipeline and a quorum algorithm that judged a parasite tagged when a group of players agreed on its position. Results: Over 1 month, anonymous players from 95 countries played more than 12,000 games and generated a database of more than 270,000 clicks on the test images. Results revealed that combining 22 games from nonexpert players achieved a parasite counting accuracy higher than 99%. This performance could also be obtained by combining 13 games from players trained for 1 minute. Exhaustive computations measured the parasite counting accuracy for all players as a function of the number of games considered and the experience of the players. In addition, we propose a mathematical equation that accurately models the collective parasite counting performance. Conclusions: This research validates the online gaming approach for crowdsourced counting of malaria parasites in images of thick blood films. The findings support the conclusion that nonexperts are able to rapidly learn how to identify the typical features of malaria parasites in digitized thick blood samples and that combining the analyses of several users provides parasite counting accuracy rates similar to those of expert microscopists. This experiment illustrates the potential of the crowdsourced gaming approach for performing routine malaria parasite quantification, and more generally for solving biomedical image analysis problems, with future potential for telediagnosis related to global health challenges.
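The quorum idea, accepting a parasite only when clicks from enough distinct players fall near the same position, can be sketched as follows. The greedy grouping, the distance radius and the quorum size are illustrative assumptions, not the MalariaSpot image-processing pipeline.

```python
def quorum_detections(clicks_per_player, radius=10.0, quorum=3):
    """clicks_per_player: one list of (x, y) click positions per player.
    A candidate parasite is accepted when clicks from at least `quorum`
    distinct players lie within `radius` pixels of a seed click; the
    detection is reported at the mean position of the agreeing clicks."""
    tagged = [(p, c) for p, clicks in enumerate(clicks_per_player) for c in clicks]
    detections, used = [], set()
    for i, (_, (xi, yi)) in enumerate(tagged):
        if i in used:
            continue
        # Greedily gather all unused clicks near this seed click
        group = [i] + [j for j, (_, (xj, yj)) in enumerate(tagged)
                       if j != i and j not in used
                       and (xi - xj) ** 2 + (yi - yj) ** 2 <= radius ** 2]
        players = {tagged[g][0] for g in group}
        if len(players) >= quorum:          # enough distinct players agree
            used.update(group)
            xs = [tagged[g][1][0] for g in group]
            ys = [tagged[g][1][1] for g in group]
            detections.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return detections
```

Raising the quorum trades recall for precision: isolated spurious clicks never reach agreement, which is why combining around 22 games suppresses individual players' errors.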
Abstract:
Purpose – Reducing energy consumption in walking robots is an issue of great importance in field applications such as humanitarian demining, so as to increase mission time for a given power supply. The purpose of this paper is to address the problem of improving energy efficiency in statically stable walking machines by comparing two leg configurations, insect and mammal, on the hexapod robotic platform SILO6. Design/methodology/approach – Dynamic simulation of this hexapod is used to develop a set of rules that optimize energy expenditure in both configurations. Later, through a theoretical analysis of energy consumption and experimental measurements on the real platform SILO6, a configuration is chosen. Findings – It is widely accepted that the mammal configuration in statically stable walking machines is better for supporting high loads, while the insect configuration is considered better for improving mobility. However, taking into account the leg dynamics and not only the body weight, different results are obtained. In a mammal configuration, supporting the body weight accounts for 5 per cent of power consumption while leg dynamics accounts for 31 per cent. Originality/value – As this paper demonstrates, the energy expended when the robot walks along a straight and horizontal line is the same for both insect and mammal configurations, while power consumption during crab walking in an insect configuration exceeds that in the mammal configuration.
Abstract:
Storm evolution is fundamental for analysing the damage progression of the different failure modes and for establishing suitable protocols for maintaining and optimally sizing structures. However, this aspect has hardly been studied, and practically all of the studies dealing with the subject adopt the equivalent triangle storm. In contrast to this approach, two new ones are proposed. The first is the Equivalent Triangle Magnitude Storm (ETMS) model, whose base, the triangular storm duration, D, is established such that its magnitude (the area describing the storm history above the reference threshold level which sets the storm condition), HT, equals the real storm magnitude. The other is the Equivalent Triangle Number of Waves Storm (ETNWS), where the base is expressed in terms of the real storm's number of waves, Nz. Three approaches are used for estimating the mean period, Tm, associated with each of the sea states defining the storm evolution, which is necessary to determine the full energy flux withstood by the structure in the course of the extreme event. Two are based on the representativity of the JONSWAP spectrum and the other uses the bivariate Gumbel copula (Hs, Tm) resulting from fitting the storm peaks. The representativity of the approaches proposed here and of those defined in the specialised literature is analysed by comparing the main armour layer's progressive loss of hydraulic stability caused by real storms with that relating to theoretical ones. An empirical maximum energy flux model is used for this purpose. The agreement between the empirical and theoretical results demonstrates that the representativity of the different approaches depends on the storm characteristics, and points towards a need to investigate other geometrical shapes to characterise the storm evolution associated with sea states heavily influenced by swell wave components.
Abstract:
As part of the Mediterranean area, the Guadiana basin in Spain is particularly exposed to increasing water stress due to climate change. A future warmer and drier climate will have negative implications for the sustainability of water resources and irrigation agriculture, the main socio-economic sector in the region. This paper illustrates a systematic analysis of climate change impacts and adaptation in the Guadiana basin based on a two-stage modeling approach. First, an integrated hydro-economic modeling framework was used to simulate the potential effects of regional climate change scenarios for the period 2000-2069. Second, a participatory multi-criteria technique, namely the Analytic Hierarchy Process (AHP), was applied to rank potential adaptation measures based on agreed criteria. Results show that, in the medium to long run and under severe climate change, reduced water availability, lower crop yields and increased irrigation demands might lead to water shortages, crop failure, and income losses of up to ten percent for irrigators. AHP results show that private farming adaptation measures, including improving irrigation efficiency and adjusting crop varieties, are preferred to public adaptation measures, such as building new dams. The integrated quantitative and qualitative methodology used in this research can be considered a valuable socially-based tool to support adaptation decision-making.
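The AHP ranking step can be sketched with the standard row geometric-mean approximation to the principal-eigenvector priorities of a reciprocal pairwise-comparison matrix. The comparison values in the test are invented for illustration, not the study's elicited judgements.

```python
import numpy as np

def ahp_weights(pairwise):
    """Approximate AHP priority weights from a reciprocal pairwise-
    comparison matrix using the row geometric-mean method, a common
    approximation to the principal right eigenvector."""
    A = np.asarray(pairwise, dtype=float)
    gm = A.prod(axis=1) ** (1.0 / A.shape[1])  # geometric mean of each row
    return gm / gm.sum()                       # normalise to sum to 1
```

In a participatory setting, each stakeholder group would supply its own comparison matrix (e.g. irrigation efficiency vs. new dams against each agreed criterion), and the resulting weights rank the adaptation measures.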
Abstract:
1. Introduction: setting and problem definition
2. The Adaptation Pathway
– 2.1 Stage 1: appraising risks and opportunities
• Step 1: Impact analysis
• Step 2: Policy analysis
• Step 3: Socio-institutional analysis
– 2.2 Stage 2: appraising and choosing adaptation options
• Step 4: identifying and prioritizing adaptation options
3. Conclusions