911 results for pacs: knowledge engineering tools
Abstract:
Several modern cooling applications require mini/micro-channel shear-driven flow condensers, and several design challenges must be overcome to meet their requirements. The difficulty of developing effective design tools for shear-driven flow condensers is exacerbated by the lack of a bridge between physics-based modeling of condensing flows and the current, popular approach based on semi-empirical heat transfer correlations. A primary contributor to this disconnect is that typical heat transfer correlations eliminate the dependence of the heat transfer coefficient on the method of cooling employed at the condenser surface, even though the coefficient may well depend on it. This is in direct contrast to direct physics-based modeling approaches, in which the thermal boundary conditions have a direct and substantial impact on the heat transfer coefficient. Typical heat transfer correlations instead introduce vapor quality as one of the variables on which the heat transfer coefficient depends. This study shows how, under certain conditions, a heat transfer correlation from direct physics-based modeling can be equivalent to typical engineering heat transfer correlations without making the same a priori assumptions. Another factor that raises doubts about the validity of heat transfer correlations is the opacity associated with the application of flow regime maps for internal condensing flows. It is well known that flow regimes strongly influence heat transfer rates. Nevertheless, several heat transfer correlations ignore flow regimes entirely and present a single correlation for all regimes. This is believed to be inaccurate, since one would expect significant differences between the correlations for different flow regimes. Other studies present a heat transfer correlation for a particular flow regime but ignore the method by which the extent of that regime is established. This thesis provides a definitive answer (in the context of stratified/annular flows) to two questions: (i) whether a heat transfer correlation can always be independent of the thermal boundary condition and represented as a function of vapor quality, and (ii) whether a heat transfer correlation can be obtained independently for a flow regime without knowing the flow regime boundary (even if that boundary is represented through a separate and independent correlation). To obtain the results needed to answer these questions, this study uses two numerical simulation tools: the approximate but highly efficient Quasi-1D simulation tool and the exact but more expensive 2D Steady Simulation tool. Using these tools and approximate values of flow regime transitions, a deeper understanding of the current state of knowledge in flow regime maps and heat transfer correlations for shear-driven internal condensing flows is obtained. The ideas presented here can be extended to other flow regimes of shear-driven flows, and analogous correlations can be obtained for internal condensers in gravity-driven and mixed-driven configurations.
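For orientation, one widely cited example of the "typical engineering heat transfer correlation" genre discussed above is the Shah-type form, in which the two-phase coefficient depends on vapor quality but not on the thermal boundary condition (quoted from the general condensation literature as an illustration; it is not the correlation developed in this thesis):

    % Shah-type condensation correlation (illustrative of the genre).
    % h_L: single-phase liquid-only heat transfer coefficient,
    % x: vapor quality, p_r: reduced pressure.
    h(x) = h_L \left[ (1 - x)^{0.8} + \frac{3.8\, x^{0.76} (1 - x)^{0.04}}{p_r^{0.38}} \right]

Note that the thermal boundary condition (e.g., uniform wall temperature versus a prescribed cooling method) appears nowhere in this form, which is precisely the assumption the thesis interrogates.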
Abstract:
My dissertation emphasizes a cognitive account of multimodality that explicitly integrates experiential knowledge work into the rhetorical pedagogy that informs so many composition and technical communication programs. In these disciplines, multimodality is widely conceived in terms of what Gunther Kress calls “social-semiotic” modes of communication shaped primarily by culture. In the cognitive and neurolinguistic theories of Vittorio Gallese and George Lakoff, however, multimodality is described as a key characteristic of our bodies’ sensory-motor systems, which link perception to action and action to meaning, grounding all communicative acts in knowledge shaped through body-engaged experience. I argue that this “situated” account of cognition – which closely approximates Maurice Merleau-Ponty’s phenomenology of perception, a major framework for my study – has a pedagogical precedent in the mimetic pedagogy that informed ancient Sophistic rhetorical training, and I reveal that training’s multimodal dimensions through a phenomenological exegesis of the concept of mimesis. Plato’s denigration of the mimetic tradition and his elevation of conceptual contemplation through reason, out of which developed the classic Cartesian separation of mind from body, resulted in a general degradation of experiential knowledge in Western education. But with the recent introduction of digital technologies and multimedia communication tools into college classrooms, renewed emphasis is being placed on the “hands-on” nature of inventive and productive praxis, necessitating a revision of methods of instruction and assessment that have traditionally privileged the acquisition of conceptual over experiential knowledge. The model of multimodality I construct from Merleau-Ponty’s phenomenology, ancient Sophistic rhetorical pedagogy, and current neuroscientific accounts of situated cognition insists on recognizing the significant role that knowledges we acquire experientially play in our reading and writing, speaking and listening, and discerning and designing practices.
Abstract:
A knowledge management tool developed by the GIS Center to support project reporting, project publications, and a project data portal for materials related to the WAWASH Program.
Abstract:
We use an augmented version of the UK Innovation Surveys 4–7 to explore firm-level and local-area openness externalities on firms’ innovation performance. We find strong evidence of the value of external knowledge acquisition both through interactive collaboration and through non-interactive contacts such as demonstration effects, copying or reverse engineering. Levels of knowledge search activity remain well below the private optimum, however, perhaps due to informational market failures. We also find strong positive externalities of openness resulting from the intensity of local interactive knowledge search—a knowledge diffusion effect. However, there are strong negative externalities resulting from the intensity of local non-interactive knowledge search—a competition effect. Our results provide support for local initiatives that promote innovation partnering and counter illegal copying or counterfeiting. We find no significant relationship between innovative outputs and either local labour quality or employment composition.
Abstract:
Softeam has over 20 years of experience providing UML-based modelling solutions, such as its Modelio modelling tool, and its Constellation enterprise model management and collaboration environment. Due to the increasing number and size of the models used by Softeam’s clients, Softeam joined the MONDO FP7 EU research project, which worked on solutions for these scalability challenges and produced the Hawk model indexer among other results. This paper presents the technical details and several case studies on the integration of Hawk into Softeam’s toolset. The first case study measured the performance of Hawk’s Modelio support using varying amounts of memory for the Neo4j backend. In another case study, Hawk was integrated into Constellation to provide scalable global querying of model repositories. Finally, the combination of Hawk and the Epsilon Generation Language was compared against Modelio for document generation: for the largest model, Hawk was two orders of magnitude faster.
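As an illustration of the benchmarking setup in the first case study (a hypothetical harness sketched in Python; Hawk's actual API is Java-based, and none of its real calls are reproduced here):

    import statistics
    import time

    # Hypothetical stand-in for issuing a query against a model index; in the
    # case study this would be a Hawk query over the Neo4j backend.
    def run_query(backend, query):
        time.sleep(0.001)  # placeholder for real query execution
        return []

    # Time the same query repeatedly per configuration, as in the experiments
    # with varying amounts of memory described above.
    def benchmark(backend, query, repeats=10):
        samples = []
        for _ in range(repeats):
            start = time.perf_counter()
            run_query(backend, query)
            samples.append(time.perf_counter() - start)
        return statistics.median(samples)

    for heap_mb in (256, 512, 1024):
        print(heap_mb, benchmark(f"neo4j-{heap_mb}mb", "count all model elements"))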
Abstract:
Introduction: Knowledge provides the foundation for values, attitudes and behavior. Knowledge about sexual and reproductive health (SRH) and positive attitudes are essential for implementing protective behaviors. Objectives: The aim of this study was to evaluate SRH knowledge and attitudes in college students and their association with sexual and reproductive behaviors. Material and methods: A cross-sectional study was conducted on a sample of 1946 college students. The data were collected using a self-report questionnaire on the sociodemographic characteristics of the sample, an inventory of SRH knowledge and an attitude scale, and were analyzed with descriptive and inferential statistics (ANOVA and Pearson’s correlation). Results: The sample was 64% female and 36% male, with a mean age of 21 years. The majority were sexually active and used contraception. SRH knowledge was moderate (22.27 ± 5.79; maximum score = 44), while the average SRH attitude score was more favorable (118.29 ± 13.92; maximum score = 140). Female and younger students studying life and health sciences had higher (P < .05) SRH knowledge and attitude scores. Consistent condom use and health care surveillance were highly dependent on the students’ SRH knowledge and attitudes, and engagement in sexual risk behaviors was associated with lower scores on these variables. Conclusions: Strategies to increase SRH knowledge and improve attitudes are important tools for promoting protective behaviors, especially with respect to contraception, health care surveillance and exposure to sexual risk. Older male students in fields other than the life and health sciences should be a priority target for interventions due to their higher sexual risk.
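As a sketch of the inferential step reported above (illustrative only: synthetic data standing in for the questionnaire scores, with hypothetical group splits):

    import numpy as np
    from scipy import stats

    # Synthetic stand-ins for the questionnaire scores (the study reports
    # means of 22.27 +/- 5.79 for knowledge and 118.29 +/- 13.92 for attitude).
    rng = np.random.default_rng(0)
    knowledge = rng.normal(22.27, 5.79, size=300)
    attitude = rng.normal(118.29, 13.92, size=300)

    # Pearson correlation between knowledge and attitude scores.
    r, p = stats.pearsonr(knowledge, attitude)
    print(f"r = {r:.2f}, p = {p:.3f}")

    # One-way ANOVA comparing knowledge across hypothetical study areas.
    life_sci, health_sci, other = np.array_split(knowledge, 3)
    f, p_anova = stats.f_oneway(life_sci, health_sci, other)
    print(f"F = {f:.2f}, p = {p_anova:.3f}")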
Abstract:
Technology has an important role in children's lives and education. Based on several projects developed with ICT, both in Early Childhood Education (3-6 years old) and Primary Education (6-10 years old), since 1997, the authors argue that research and educational practices need to "go outside", addressing ways to connect technology with outdoor education. The experience with these projects and initiatives supported a conceptual framework, developed and discussed with several partners over the years and theoretically informed. Three main principles, or axes, have emerged: strengthening Children's Participation, promoting Critical Citizenship, and establishing strong Connections to Pedagogy and Curriculum. In this paper, those axes are presented and discussed in relation to the challenge posed by Outdoor Education to the way ICT in Early Childhood and Primary Education is understood, promoted and researched. The paper is exploratory, attempting to connect theoretical and conceptual contributions from Early Childhood Pedagogy with contributions from ICT in Education. The research-based knowledge available is still scarce, mostly based on studies developed with other purposes. The paper therefore focuses on the connections and interpellations between concepts established through the theoretical framework and draws on almost 20 years of experience with large and small scale action-research projects of ICT in schools. The most recent of these is already testing the conceptual framework by supporting children in non-formal contexts as they explore vineyards and the cycle of wine production with several ICT tools. Approaching Outdoor Education as an arena where pedagogical and cultural dimensions influence decisions and practices, the paper argues that the three axes are relevant in supporting a stronger connection between technology and the outdoors.
Abstract:
Maintenance of transport infrastructure assets is widely advocated as the key to minimizing current and future costs of the transportation network. While effective maintenance decisions are often a result of engineering skills and practical knowledge, efficient decisions must also account for the net result over an asset's life-cycle. One essential aspect of the long-term perspective on transport infrastructure maintenance is to proactively estimate maintenance needs. In dealing with immediate maintenance actions, support tools that can prioritize potential maintenance candidates are important for an efficient maintenance strategy. This dissertation consists of five individual research papers presenting a microdata analysis approach to transport infrastructure maintenance. Microdata analysis is a multidisciplinary field in which large quantities of data are collected, analyzed, and interpreted to improve decision-making. Increased access to transport infrastructure data enables a deeper understanding of causal effects and makes it possible to predict future outcomes. The microdata analysis approach covers the complete process from data collection to actual decisions and is therefore well suited to the task of improving efficiency in transport infrastructure maintenance. Statistical modeling was the selected analysis method in this dissertation and provided solutions to the different problems presented in each of the five papers. In Paper I, a time-to-event model was used to estimate remaining road pavement lifetimes in Sweden. In Paper II, an extension of the model in Paper I assessed the impact of latent variables on road lifetimes, identifying the sections in a road network that are weaker due to, e.g., subsoil conditions or undetected heavy traffic. The study in Paper III incorporated a probabilistic parametric distribution of road lifetimes into an equation for the marginal cost of road wear. Differentiated road-wear marginal costs for heavy and light vehicles are an important information basis for decisions regarding vehicle miles traveled (VMT) taxation policies. In Paper IV, a distribution-based clustering method was used to distinguish between road segments that are deteriorating and road segments with a stationary road condition. Within railway networks, temporary speed restrictions are often imposed because of maintenance and must be addressed in order to maintain punctuality. The study in Paper V evaluated the empirical effect of speed restrictions on running time on a Norwegian railway line using a generalized linear mixed model.
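As a rough illustration of the time-to-event approach used in Paper I (a sketch with synthetic data; the variable names are hypothetical, and the paper's actual covariates and estimates are not reproduced here):

    import numpy as np
    from lifelines import WeibullFitter

    # Synthetic pavement lifetimes in years; censored sections are those
    # still in service at the time of observation.
    rng = np.random.default_rng(1)
    lifetimes = rng.weibull(2.0, size=500) * 15.0   # observed/censored durations
    observed = rng.random(500) < 0.6                # 1 = resurfaced, 0 = censored

    # Fit a parametric Weibull survival model to estimate remaining lifetimes.
    wf = WeibullFitter()
    wf.fit(lifetimes, event_observed=observed)
    print(wf.lambda_, wf.rho_)                      # scale and shape estimates
    print(wf.median_survival_time_)                 # estimated median lifetime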
Abstract:
In a globalized economy, the use of natural resources is determined by the demand of modern production and consumption systems, and by infrastructure development. Sustainable natural resource use will require good governance and management based on sound scientific information, data and indicators. There is a rich literature on natural resource management, yet the national and global scales and macro-economic policy making have been underrepresented. We provide an overview of the scholarly literature on multi-scale governance of natural resources, focusing on the information required by relevant actors from the local to the global scale. Global natural resource use is largely determined by national, regional, and local policies. We observe that in recent decades the development of public policies on natural resource use has been fostered by an “inspiration cycle” between the research, policy and statistics communities, fostering social learning. Effective natural resource policies require adequate monitoring tools, in particular indicators for the use of materials, energy, land, and water as well as the waste and GHG emissions of national economies. We summarize the state of the art in the application of accounting methods and data sources for national material flow accounts and indicators, including territorial and product-life-cycle-based approaches. We show how accounts of natural resource use can inform the Sustainable Development Goals (SDGs) and argue that information on natural resource use, and in particular footprint indicators, will be indispensable for a consistent implementation of the SDGs. We recognize that improving the knowledge base for global natural resource use will require further institutional development, including at national and international levels, for which we outline options.
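As background on the indicators mentioned, the core identities of economy-wide material flow accounting are simple mass balances (standard Eurostat-style definitions, stated here for orientation rather than taken from this paper):

    % Domestic Material Consumption (territorial indicator):
    % DE = domestic extraction, IM = imports, EX = exports (tonnes/year).
    \mathrm{DMC} = \mathrm{DE} + \mathrm{IM} - \mathrm{EX}

    % Footprint-type (consumption-based) indicator: imports and exports are
    % converted into raw material equivalents (RME) before balancing.
    \mathrm{RMC} = \mathrm{DE} + \mathrm{RME}_{\mathrm{IM}} - \mathrm{RME}_{\mathrm{EX}}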
Abstract:
In knowledge technology work, as expressed by the scope of this conference, there are a number of communities, each uncovering new methods, theories, and practices. The Library and Information Science (LIS) community is one such community. This community, through tradition and innovation, theory and practice, organizes knowledge and develops knowledge technologies formed by iterative research hewn to the values of equal access and discovery for all. The Information Modeling community is another contributor to knowledge technologies. It concerns itself with the construction of symbolic models that capture the meaning of information and organize it in ways that are computer-based but human-understandable. A recent paper that examines certain assumptions in information modeling builds a bridge between these two communities, offering a forum for a discussion on common aims from a common perspective. In a June 2000 article, Parsons and Wand separate classes from instances in information modeling in order to free instances from what they call the “tyranny” of classes. They attribute a number of problems in information modeling to inherent classification – the disregard for the fact that instances can be conceptualized independent of any class assignment. By faceting instances from classes, Parsons and Wand strike a sonorous chord with classification theory as understood in LIS. In the practice community and in the publications of LIS, faceted classification has shifted the paradigm of knowledge organization theory in the twentieth century. Here, with the proposal of inherent classification and the resulting layered information modeling, a clear line joins the LIS classification theory community and the information modeling community. Both communities have their eyes turned toward networked resource discovery, and with this conceptual conjunction a new paradigmatic conversation can take place. Parsons and Wand propose that the layered information model can facilitate schema integration, schema evolution, and interoperability. These three spheres in information modeling have their own connotations, but are not distant from the aims of classification research in LIS. In this new conceptual conjunction, established by Parsons and Wand, information modeling through the layered information model can expand the horizons of classification theory beyond LIS, promoting a cross-fertilization of ideas on the interoperability of subject access tools like classification schemes, thesauri, taxonomies, and ontologies. This paper examines the common ground between the layered information model and faceted classification, establishing a vocabulary and outlining some common principles. It then turns to the issue of schemas, the horizons of conventional classification, and the differences between Information Modeling and Library and Information Science. Finally, a framework is proposed that deploys an interpretation of the layered information modeling approach in a knowledge technologies context. In order to design subject access systems that will integrate, evolve and interoperate in a networked environment, knowledge organization specialists must consider the kind of semantic class independence that Parsons and Wand propose for information modeling.
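To make the class/instance separation concrete, here is a toy sketch (my illustration in Python, not Parsons and Wand's formalism): instances exist independently as property bundles, and classes form a separate, revisable layer of predicates over them.

    # Instances exist independently, as bundles of observed properties.
    instances = [
        {"id": 1, "title": "Dublin Core primer", "medium": "web page"},
        {"id": 2, "title": "Colon Classification", "medium": "book"},
    ]

    # Classes are a separate layer: named predicates over instance properties,
    # so an instance can be reclassified without being redefined.
    classes = {
        "DigitalResource": lambda i: i.get("medium") == "web page",
        "Monograph": lambda i: i.get("medium") == "book",
    }

    def members(class_name):
        """Return the instances currently satisfying a class definition."""
        test = classes[class_name]
        return [i for i in instances if test(i)]

    print([i["title"] for i in members("DigitalResource")])

Because class definitions can be added, revised, or removed without touching the instances, this layering is one way to picture how schema evolution and interoperability become tractable.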
Abstract:
The work of knowledge organization requires a particular set of tools. For instance, we need standards of content description like the Anglo-American Cataloguing Rules, Second Edition, Resource Description and Access (RDA), Cataloging Cultural Objects, and Describing Archives: A Content Standard. When we intellectualize the process of knowledge organization – that is, when we do basic theoretical research in knowledge organization – we need another set of tools: constructs. Constructs are ideas with many conceptual elements, largely considered subjective. They allow us to be inventive and to see a particular point of view in knowledge organization. For example, Patrick Wilson’s ideas of exploitative control and descriptive control, or S. R. Ranganathan’s fundamental categories, are constructs. They allow us to identify functional requirements, or operationalizations of functional requirements, or at least come close to them, for our systems and schemes. They also allow us to carry out meaningful evaluation. What is even more interesting, from a research point of view, is that constructs, once offered to the community, can be contested and reinterpreted, and this has an effect on how we view knowledge organization systems and processes. Fundamental categories are again a good example, in that some members of the Classification Research Group (CRG) argued against Ranganathan’s point of view. The CRG posited more fundamental categories than Ranganathan’s five, Personality, Matter, Energy, Space, and Time (Ranganathan, 1967); they needed significantly more fundamental categories for their work. And these are just two voices in this space; we can also consider the fundamental categories of Johannes Kaiser (1911), Shera and Egan, Barbara Kyle (Vickery, 1960), and Eric de Grolier (1962), as well as contemporary work that continues the comparison and analysis of fundamental categories (e.g., Dousa, 2011). In all these cases we are discussing a construct. A fundamental category is not discovered; it is constructed by a classificationist because it is useful in the act of classification. And while we are accustomed to using constructs or debating their merits in one knowledge organization activity or another, we have not analyzed their structure, nor have we created a typology. In an effort to probe the epistemological dimension of knowledge organization, we think it would be a fruitful exercise to do so, because we might benefit from clarity not only about our terminology, but about the manner in which we talk about our terminology. We are all creative workers examining what is available to us through particular lenses (constructs), identifying particular constructs. Knowing these constructs and being able to refer to them is, we would argue, a core competency for knowledge organization researchers.
Abstract:
This paper proposes a preliminary classification of knowledge organization research, divided among epistemology, theory, and methodology plus three spheres of research: design, study, and critique. This work is situated in a metatheoretical framework, drawn from sociological thought. Example works are presented along with preliminary classification. The classification is then briefly described as a comparison tool which can be used to demonstrate overlap and divergence in cognate discourses of knowledge organization (such as ontology engineering).
Abstract:
Management theories have been based, almost without exception, on the foundations and models of classical science (particularly the models of Newtonian physics). Organizations today, however, face a globalized world that is saturated with information (and not necessarily knowledge), hyperconnected, dynamic, and laden with uncertainty, so many of these theories may prove limited for organizations; and perhaps not because of their structure, logic, or scope, but because of the lack of criteria justifying their application. In many cases, organizations continue to rely on intuition, assumptions, and half-truths in decision-making. This picture highlights two facts: on the one hand, the need for a method that makes it possible to understand the situation of each organization in order to support decision-making; on the other, the need to strengthen intuition with non-traditional models and techniques (usually originating in, or inspired by, engineering). This work seeks to anticipate the pillars of a possible method to support decision-making through the simulation of computational models, drawing on the possible interactions between model-based management, computational organization science, and emergent engineering.
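A minimal sketch of what simulation-based decision support in this spirit could look like (entirely illustrative; the abstract does not specify the method at this level of detail): a toy computational model of routine adoption in an organization, run under candidate policies before committing to one.

    import random

    # Toy computational model of an organization: agents choose between two
    # routines, imitating better-performing peers; we compare incentive policies.
    def simulate(adoption_incentive, n_agents=100, steps=500, seed=42):
        rng = random.Random(seed)
        agents = [rng.random() < 0.1 for _ in range(n_agents)]  # True = new routine
        for _ in range(steps):
            i, j = rng.randrange(n_agents), rng.randrange(n_agents)
            payoff_i = 1.0 + adoption_incentive if agents[i] else 1.0
            payoff_j = 1.0 + adoption_incentive if agents[j] else 1.0
            if payoff_j > payoff_i:
                agents[i] = agents[j]  # imitate the better-performing peer
        return sum(agents) / n_agents

    # Decision support: compare candidate incentive levels before committing.
    for incentive in (0.0, 0.1, 0.3):
        print(incentive, simulate(incentive))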
Abstract:
This work falls within one of the major fields of organization studies: strategy. The classical perspective in this field promoted the idea that projecting oneself into the future means designing a plan (a series of deliberate actions). Later advances showed that strategy could be understood in other ways. The evolution of the field nevertheless privileged the classical view to some extent, establishing, for example, multiple models for 'formulating' a strategy while relegating the ways in which strategy can 'emerge' to second place. The purpose of this research is therefore to add to the current level of understanding of emergent strategies in organizations. To do so, it considers a concept that is opposed, though complementary, to 'planning' and indeed very close in nature to this type of strategy: improvisation. Since this concept has been enriched by valuable contributions from the world of music, the study draws on the knowledge of that domain, using 'metaphor' as a theoretical device to understand it and achieve the stated objective. The results show that 1) deliberate and emergent strategies coexist and complement each other, 2) improvisation is always present in the organizational context, 3) improvisation is more intense in the 'how' of strategy than in the 'what', and, contrary to the conventional view, 4) a certain amount of preparation is required to improvise well.
Abstract:
Biomarkers are nowadays essential tools for staying one step ahead in fighting disease, enabling an enhanced focus on disease prevention and on the probability of its occurrence. Multidisciplinary research has been an important step towards the continued discovery of new biomarkers. Biomarkers are defined as measurable biochemical indicators of the presence of disease or as indicators for monitoring disease progression. Currently, biomarkers are used in several domains such as oncology, neurology, cardiovascular, inflammatory and respiratory disease, and several endocrinopathies. Bridging biomarkers in a One Health perspective has proven useful in almost all of these domains. In oncology, humans and animals are subject to the same environmental and genetic predisposing factors: examples include mutations in the BRCA1 gene predisposing to breast cancer in both humans and dogs, with increased prevalence in certain dog breeds and human ethnic groups. Also, breastfeeding frequency and duration have been related to a decreased risk of breast cancer in women and bitches. When it comes to infectious diseases, this parallelism is likely to be even more important, as up to 75% of all emerging diseases are believed to be zoonotic. Examples of the successful use of biomarkers are found in several zoonotic diseases such as Ebola, dengue, leptospirosis and West Nile virus infections. Acute Phase Proteins (APPs) have been used for quite some time as biomarkers of inflammatory conditions, in human health but also in the veterinary field, for example in mastitis evaluation and in the diagnosis of PRRS (porcine reproductive and respiratory syndrome). One advantage is that these biomarkers can be much easier to assess than conventional diagnostic approaches (for example, they can be measured in easily collected saliva samples). Another domain in which biomarkers have been essential is food safety: the possibility of measuring exposure to chemical contaminants or other biohazards present in the food chain, which sometimes pose analytical challenges due to their low bioavailability in body fluids, is a major breakthrough. Finally, biomarkers are considered the key to more personalized therapies, with more efficient outcomes and fewer side effects. This approach is expected to be the correct path to follow in veterinary medicine as well, in the near future.