604 results for "Workflows semânticos"
Abstract:
Monitoring the environment with acoustic sensors is an effective method for understanding changes in ecosystems. Through extensive monitoring, large-scale, ecologically relevant datasets can be produced that can inform environmental policy. The collection of acoustic sensor data is a solved problem; the current challenge is the management and analysis of raw audio data to produce useful datasets for ecologists. This paper presents the applied research we use to analyze big acoustic datasets. Its core contribution is the presentation of practical large-scale acoustic data analysis methodologies. We describe details of the data workflows we use to give both citizen scientists and researchers practical access to large volumes of ecoacoustic data. Finally, we propose a work-in-progress large-scale architecture for analysis driven by a hybrid cloud-and-local production-grade website.
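A minimal sketch of the chunk-then-summarise shape such ecoacoustic workflows typically take: a long recording is split into fixed-length segments and a summary index is computed per segment. The segment length, the RMS index and all names here are illustrative assumptions, not the specific pipeline the paper describes.

```python
import math

def segment_indices(samples, sample_rate, segment_seconds=60):
    """Split a long recording into fixed-length segments and compute
    a toy acoustic index (RMS energy) per segment. Real ecoacoustic
    pipelines use richer indices; this only illustrates the
    chunk-then-summarise workflow shape."""
    seg_len = sample_rate * segment_seconds
    out = []
    for start in range(0, len(samples), seg_len):
        seg = samples[start:start + seg_len]
        # RMS energy of this segment
        rms = math.sqrt(sum(s * s for s in seg) / len(seg))
        out.append(rms)
    return out
```

The per-segment summaries, not the raw audio, are what a browsing website or citizen-science interface would then index and serve.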
Abstract:
Digital technology offers enormous benefits (economic, quality of design and efficiency in use) if adopted to implement integrated ways of representing the physical world in digital form. When applied across the full extent of the built and natural world, it is referred to as the Digital Built Environment (DBE) and encompasses a wide range of approaches and technology initiatives, all aimed at the same end goal: the development of a virtual world that sufficiently mirrors the real world to form the basis for the smart cities of the present and future, enable efficient infrastructure design and programmed maintenance, and create a new foundation for economic growth and social well-being through evidence-based analysis. The creation of a National Data Policy for the DBE will facilitate the creation of additional high technology industries in Australia; provide Governments, industries and citizens with greater knowledge of the environments they occupy and plan; and offer citizen-driven innovations for the future. Australia has slipped behind other nations in the adoption and execution of Building Information Modelling (BIM), and the principal concern is that the gap is widening. Data-driven innovation added $67 billion to the Australian economy in 2013. Strong open data policy equates to $16 billion in new value. Australian Government initiatives such as the Digital Earth inspired “National Map” offer a platform and pathway to embrace the concept of a “BIM Globe”, while also leveraging unprecedented growth in open source / open data collaboration. Australia must address the challenges by learning from international experiences, most notably the UK and NZ, and mandate the use of BIM across Government, extending the Framework for Spatial Data Foundation to include the Built Environment as a theme and engaging collaboration through a “BIM globe” metaphor. This proposed DBE strategy will modernise Australian urban planning and the construction industry.
It will change the way we develop our cities by fundamentally altering the dynamics and behaviours of supply chains and unlocking new and more efficient ways of collaborating at all stages of the project life-cycle. There are currently two major modelling approaches that contribute to the challenge of delivering the DBE. Though these collectively encompass many (often competing) approaches or proprietary software systems, all can be categorised as either a spatial modelling approach, where the focus is generally on representing the elements that make up the world within their geographic context, or a construction modelling approach, where the focus is on models that support the life-cycle management of the built environment. These two approaches have tended to evolve independently, addressing two broad industry sectors: one concerned with understanding and managing global and regional aspects of the world that we inhabit, including disciplines concerned with climate, earth sciences, land ownership, urban and regional planning and infrastructure management; the other concerned with the planning, design, construction and operation of built facilities, including architectural and engineering design, product manufacturing, construction, facility management and related disciplines (a process/technology commonly known as Building Information Modelling, BIM). The spatial industries have a strong voice in the development of public policy in Australia, while the construction sector, which in 2014 accounted for around 8.5% of Australia’s GDP, has no single voice and, because of its diversity, is struggling to adapt to and take advantage of the opportunity presented by these digital technologies.
The experience in the UK over the past few years has demonstrated that government leadership is very effective in stimulating industry adoption of digital technologies: on the one hand, mandating the use of BIM on public procurement projects while, at the same time, providing comparatively modest funding to address the common issues that confront the industry in adopting that way of working across the supply chain. The reported result has been savings of £840m in construction costs in 2013/14, according to UK Cabinet Office figures. There is worldwide recognition of the value of bringing these two modelling technologies together. Australia has the expertise to exercise leadership in this work, but it requires a commitment by government to recognise the importance of BIM as a companion methodology to the spatial technologies, so that these two disciplinary domains can cooperate in the development of data policies and information exchange standards to smooth out common workflows. buildingSMART Australasia, SIBA and their academic partners have initiated this dialogue in Australia and wish to work collaboratively, with government support and leadership, to explore the opportunities open to us as we develop an Australasian Digital Built Environment. As part of that programme, we must develop and implement a strategy to accelerate the adoption of BIM processes across the Australian construction sector while at the same time developing an integrated approach, in concert with the spatial sector, that will position Australia at the forefront of international best practice in this area. Australia and New Zealand cannot afford to be on the back foot as we face the challenges of rapid urbanisation and change in the global environment. Although we can identify some exemplary initiatives in this area, particularly in New Zealand in response to the need for more resilient urban development in the face of earthquake threats, there is still much that needs to be done.
We are well situated in the Asian region to take a lead in this challenge, but we are at imminent risk of losing the initiative if we do not take action now. Strategic collaboration between Governments, Industry and Academia will create new jobs and wealth, with the potential, for example, to save around 20% on the delivery costs of new built assets, based on recent UK estimates.
Abstract:
Objective: The aim of this study was to develop a model capable of predicting variability in the mental workload experienced by frontline operators under routine and nonroutine conditions. Background: Excess workload is a risk that needs to be managed in safety-critical industries. Predictive models are needed to manage this risk effectively yet are difficult to develop. Much of the difficulty stems from the fact that workload prediction is a multilevel problem. Method: A multilevel workload model was developed in Study 1 with data collected from an en route air traffic management center. Dynamic density metrics were used to predict variability in workload within and between work units while controlling for variability among raters. The model was cross-validated in Studies 2 and 3 with the use of a high-fidelity simulator. Results: Reported workload generally remained within the bounds of the 90% prediction interval in Studies 2 and 3. Workload crossed the upper bound of the prediction interval only under nonroutine conditions. Qualitative analyses suggest that nonroutine events caused workload to cross the upper bound of the prediction interval because the controllers could not manage their workload strategically. Conclusion: The model performed well under both routine and nonroutine conditions and over different patterns of workload variation. Application: Workload prediction models can be used to support both strategic and tactical workload management. Strategic uses include the analysis of historical and projected workflows and the assessment of staffing needs. Tactical uses include the dynamic reallocation of resources to meet changes in demand.
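The abstract's tactic of flagging workload reports that cross the upper bound of a 90% prediction interval can be sketched with an ordinary single-level linear fit of workload on one dynamic-density metric. The paper's actual model is multilevel; this stand-in, including the function names and the normal-quantile approximation, is our simplifying assumption.

```python
import math

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    return my - b * mx, b

def workload_interval(xs, ys, x_new, z=1.645):
    """Approximate 90% prediction interval at x_new. z is the
    two-sided 90% normal quantile; a t quantile would be more
    exact for small samples."""
    a, b = fit_line(xs, ys)
    n = len(xs)
    resid = [y - (a + b * x) for x, y in zip(xs, ys)]
    s = math.sqrt(sum(r * r for r in resid) / (n - 2))
    mx = sum(xs) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    half = z * s * math.sqrt(1 + 1 / n + (x_new - mx) ** 2 / sxx)
    yhat = a + b * x_new
    return yhat - half, yhat + half

def flag_nonroutine(xs, ys, x_new, reported):
    """Flag a workload report that crosses the upper bound, the
    signature of nonroutine conditions in the study above."""
    lo, hi = workload_interval(xs, ys, x_new)
    return reported > hi
```

Strategic use would run `workload_interval` over projected traffic; tactical use would call `flag_nonroutine` on live reports to trigger resource reallocation.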
Abstract:
Introduction Electronic medication administration record (eMAR) systems are promoted as a potential intervention to enhance medication safety in residential aged care facilities (RACFs). The purpose of this study was to conduct an in-practice evaluation of an eMAR being piloted in one Australian RACF before its roll-out, and to provide recommendations for system improvements. Methods A multidisciplinary team conducted direct observations of workflow (n=34 hours) at the RACF site and the community pharmacy. Semi-structured interviews (n=5) with RACF staff and the community pharmacist were conducted to investigate their views of the eMAR system. Data were analysed using a grounded theory approach to identify challenges associated with the design of the eMAR system. Results The current eMAR system does not offer an end-to-end solution for medication management. Many steps, including prescribing by doctors and communication with the community pharmacist, are still performed manually using paper charts and fax machines. Five major challenges associated with the design of the eMAR system were identified: limited interactivity; inadequate flexibility; problems related to information layout and semantics; the lack of relevant decision support; and system maintenance issues. We suggest recommendations to improve the design of the eMAR system and to optimise existing workflows. Discussion Immediate value can be achieved by improving system interactivity, reducing inconsistencies in data entry design and offering dedicated organisational support to minimise connectivity issues. Longer-term benefits can be achieved by adding decision support features and establishing system interoperability requirements with stakeholder groups (e.g. community pharmacies) prior to system roll-out. In-practice evaluations of technologies like the eMAR system have great value in identifying design weaknesses that inhibit optimal system use.
Abstract:
Health challenges present arguably the most significant barrier to sustainable global development. The introduction of ICT in healthcare, especially the application of mobile communications, has created the potential to transform healthcare delivery by making it more accessible, affordable and effective across the developing world. However, research assessing mHealth from the perspective of developing countries, particularly with community health workers (CHWs) as primary users, remains limited. The aim of this study is to analyse the contribution of mHealth to enhancing the performance of health workers and its alignment with existing workflows in order to guide its utilisation. The proposed research takes this consideration into account and aims to examine the task-technology alignment of mHealth for CHWs, drawing upon task-technology fit as the theoretical foundation.
De "de": A historical-comparative study of the uses and semantics of the preposition "de" in Spanish
Abstract:
The present study is an attempt to describe and analyse the use of the preposition "de" on the basis of a diachronic corpus, with emphasis on the different semantic relations it establishes. Starting from a total of more than 16,000 instances of "de", we established 48 categories of use, corresponding to four types of syntactic construction, namely the use of "de" as a complement of nouns (CN), verbs (CV) and adjectives (CA), and, finally, its use as the head of independent adverbial expressions (CI). The study consists of three main parts. Part I introduces Cognitive Linguistics, which constitutes the essential theoretical basis of the work. More precisely, it introduces concepts such as prototype theory, the theory of conceptual metaphors and cognitive grammar, especially the notions of "reference point" and "intrinsic relationship" (Langacker 1995, 1999). Part II comprises the analysis of the 48 categories, presenting and discussing nearly 2,000 examples of the contextual use of "de" drawn from the diachronic corpus. The most important results of the analysis can be summarised in the following points: the use of "de" remains essentially the same today as it was 800 years ago, in the sense that all 48 categories are attested in every period of the corpus; the use of "de" as a nominal complement increases over time, while its use as a verbal complement declines; in the nominal context it is above all the more abstract possessive relations that become more frequent, whereas in the verbal context the relations that become less frequent are those of separation/distancing, cause, agent and indefinite partitive.
The eighteenth century stands out as a period of transition between an earlier state of affairs and a later one, especially as regards the increasingly abstract character of the possessive relations and the decline of the adverbal categories of cause, agent and partitive. Despite variation in the immediate context of use, the semantic core of "de" remains unchanged. Part III takes the results of the analysis in Part II as its starting point, seeking to separate the semantic contribution of the preposition "de" to its context of use from the value of the relation as a whole. Thus, applying the methodology for determining the basic meaning of a preposition and the methodology for determining what constitute its distinct meanings (Tyler and Evans 2003a, 2003b), we arrive at the hypothesis that "de" has four basic meanings, namely 'point of departure', 'topic/matter', 'part/whole' and 'possession'. This hypothesis, based on the methodologies of Tyler and Evans and on the results of the corpus analysis, is then tested empirically by means of two questionnaires designed to establish to what extent the semantic distinctions arrived at theoretically are recognised by native speakers of the language (cf. Raukko 2003). The combined result of the two approaches both reinforces and refines the hypothesis. The questionnaire data appear to support the idea that the semantic core of "de" is complex, consisting of the four values mentioned. However, each of these basic values constitutes a local prototype around which a complex of semantic nuances derived from the prototype is built. The final idea is that speakers are aware of the four postulated basic values, but that they also distinguish more detailed nuances, such as the notions of 'cause', 'agent', 'instrument', 'purpose', 'quality', etc.
That is, "de" is a complex polysemous element whose semantic structure can be described as a family resemblance centred on four basic values, around which lies a series of more specific nuances that also constitute values of the preposition in their own right. We believe, moreover, that this semantic characterisation is valid for all periods in the history of Spanish, with small modifications in the relative weight of the various nuances, which is related to the observed diachronic variation in the use of "de".
Abstract:
Scalable stream processing and continuous dataflow systems are gaining traction with the rise of big data, owing to the need to process high-velocity data in near real time. Unlike batch processing systems such as MapReduce and workflow engines, continuous dataflows are poorly served by static scheduling strategies because of variations in input data rates and the need for sustained throughput. The elastic resource provisioning of cloud infrastructure is valuable for meeting the changing resource needs of such continuous applications. However, multi-tenant cloud resources introduce yet another dimension of performance variability that impacts the application's throughput. In this paper we propose PLAStiCC, an adaptive scheduling algorithm that balances resource cost and application throughput using a prediction-based lookahead approach. It addresses not only variations in the input data rates but also those in the underlying cloud infrastructure. In addition, we propose several simpler static scheduling heuristics that operate in the absence of an accurate performance prediction model. These static and adaptive heuristics are evaluated through extensive simulations using performance traces obtained from the Amazon AWS IaaS public cloud. Our results show an improvement of up to 20% in overall profit compared to the reactive adaptation algorithm.
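A toy illustration of the prediction-based lookahead idea (not PLAStiCC itself): given predicted input rates for the next few intervals, choose the VM count that minimises provisioning cost plus a penalty for unprocessed data. All parameter names and values here are hypothetical assumptions for the sketch.

```python
def lookahead_provision(predicted_rates, per_vm_rate, vm_cost,
                        penalty_per_unit, max_vms=20):
    """Pick a VM count for the coming intervals that minimises
    provisioning cost plus a backlog penalty, given a lookahead
    window of predicted input rates (messages per interval).

    A reactive scheduler would size for the current rate only;
    scanning the predicted window is what makes this 'lookahead'.
    """
    best_n, best_cost = 0, float("inf")
    for n in range(1, max_vms + 1):
        cost = 0.0
        for rate in predicted_rates:
            # Data the n VMs cannot absorb in this interval
            backlog = max(0.0, rate - n * per_vm_rate)
            cost += n * vm_cost + backlog * penalty_per_unit
        if cost < best_cost:
            best_n, best_cost = n, cost
    return best_n, best_cost
```

With a perfect predictor this exhaustive scan is optimal for a fixed VM count; the static heuristics the abstract mentions would correspond to skipping the prediction and fixing `n` in advance.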
Abstract:
Contents: Language policy in the Spanish Renaissance / Lidio Nieto Jiménez – Textual subject and object in “En la masmédula” by Oliverio Girondo / María Amelia Arancet – Violence, ethics and mythical resemanticisation in canto XII of the Inferno / Daniel Capano – Between voice and memory: traditional-type lyric, word and music / Teresa H. de Tresca – Mirror and interlocution in “Nubosidad variable” by Carmen Martín Gaite / Teresa Iris Giovacchini – The arboreal poetry of Enrique Banchs / Javier Roberto González – The spatial ideology of “El ingenioso hidalgo Don Quijote de la Mancha” / Silvia Cristina Lastra Paz – The symbolism of acoustic and luminous reflection in Virgil's “Eclogues” (part two) / Aquilino Suárez Pallasá – Structural and semantic approaches to two Argentine poets / Thorpe Running – A children's theatrical adaptation of a novel by Emilio Salgari / Nora Lía Sormani – Book reviews
Abstract:
The article examines the concept of person, in its aporetic sources and in the thought of Saint Thomas Aquinas, in order finally to analyse some of its properties with a view to grounding a theory of imputation. The first part begins with a consideration of the semantic, historical and theological origins of the terms person and hypostasis. It then deals with the theoretical problem that this concept poses for modern thought, taking into account the impoverishment of its metaphysics as a consequence of nominalism, the principle of immanence and a deficient conception of experience. Finally, it considers some consequences that this impoverishment has had for contemporary thought. The second part is devoted to presenting the Thomist doctrine which, in a theoretically definitive way and with Aristotelian ontological elements, solved the problems raised in the Patristic era. It analyses the concept of individual substance (or suppositum), which in the definition of person operates analogically as the proximate genus, and that of individuated spiritual nature, which operates as the specific difference. Finally, texts of Aquinas are presented concerning the definition of person and the conceptual and ontological differences between nature and person. The third part sets out some properties, dividing them into four groups of theses: (1) the person as subject of attribution and master of his or her life projects; (2) the person as a subject ontologically open to the world, to fellow human beings and to God; (3) the person as a conscious, free and autonomous subject; (4) the ethical character of the person, as subject of imputation and responsibility.
Abstract:
Presents a proposal for continuous software process improvement. It is grounded in the recommendations of quality standards and models, and in the conviction that any process, regardless of its maturity, needs to evolve in an organised way in line with the goals set by senior management and with suggestions for improvement and innovation. It includes the roles, the workflow, the activities to be carried out, the tools employed, and the indicators to be collected and evaluated.
Abstract:
210 p. : graf.
Abstract:
A look at best practice, good practice and administrative workflows
Abstract:
329 p.
Abstract:
The aim of this research is to analyse the linguistic markers that make up the relations of meaning underlying the humour of Aldir Blanc's crônica (newspaper chronicle). Starting from the structure of the Portuguese language, it examines which operations in the textual production of the chronicle (morphological, grammatical, lexical, syntactic, phonetic, stylistic or semantic), when allied with the everyday praxis (context) offered by the newspapers, give rise to the author's humorous discourse. The theoretical assumptions supporting these linguistic demarcations will emphasise semantic aspects, drawing on Stephen Ullman, Pierre Guiraud and Edward Lopes. The characteristic features of the chronicle, humour and irony as discursive resources, lexical selection, inferences, the playful game of word spelling and textual forms will be inferred dialogically, largely through the stylistic studies of Marcel Cressot, Pierre Guiraud, Mattoso Câmara, Nilce Sant'Anna Martins and José Lemos Monteiro. We thus point towards a study of language from a semantic-discursive perspective, taking its stylistic expressiveness into account. In this way, to the appreciation of linguistic craft, whether in writing or in reading, is added the critical dimension of language through the discourse present in humour, bearing in mind the textual genre of the chronicle and its broad social reach through widely circulated newspapers.
Abstract:
[EN] Teaching vocabulary in semantically related sets is common practice among EFL teachers. The present study tests the effectiveness of this method by comparing it with the alternative technique: presenting vocabulary in an unrelated way. In the study, two intact classes of Spanish high-school learners of English were presented with a set of unrelated words and a set of related words, and were then asked to complete a post-test to measure the impact of both techniques on learning. The results indicate that, while both techniques successfully help learners acquire new words, presenting words in unrelated sets seems to be more effective.