984 results for knowledge modeling
Abstract:
The inverse controller is traditionally assumed to be a deterministic function. This paper presents a pedagogical methodology for estimating the stochastic model of the inverse controller. The proposed method is based on Bayes' theorem. Using Bayes' rule to obtain the stochastic model of the inverse controller allows the use of knowledge of uncertainty from both the inverse and the forward model in estimating the optimal control signal. The paper presents the methodology for general nonlinear systems and is demonstrated on nonlinear single-input-single-output (SISO) and multiple-input-multiple-output (MIMO) examples. © 2006 IEEE.
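A worked equation may clarify the core construction. In the notation below (ours, not necessarily the paper's), $u$ is the control signal, $y_d$ the desired output, and $p(y_d \mid u)$ the forward model's predictive density:

```latex
% Bayes' rule for the stochastic inverse controller (notation ours):
p(u \mid y_d) \;=\; \frac{p(y_d \mid u)\, p(u)}{p(y_d)}
\;\propto\; p(y_d \mid u)\, p(u),
\qquad
u^{\star} \;=\; \arg\max_{u}\; p(u \mid y_d)
```

Because the forward model's uncertainty enters through $p(y_d \mid u)$ and the inverse model's through the prior $p(u)$, both sources of uncertainty shape the estimated optimal control, which is the point the abstract makes.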
Abstract:
The inverse controller is traditionally assumed to be a deterministic function. This paper presents a pedagogical methodology for estimating the stochastic model of the inverse controller. The proposed method is based on Bayes' theorem. Using Bayes' rule to obtain the stochastic model of the inverse controller allows the use of knowledge of uncertainty from both the inverse and the forward model in estimating the optimal control signal. The paper presents the methodology for general nonlinear systems. For illustration purposes, the proposed methodology is applied to linear Gaussian systems. © 2004 IEEE.
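For the linear Gaussian illustration mentioned above, the posterior over the control is available in closed form. The following is a minimal sketch under assumed scalar dynamics with illustrative numbers; it is not the paper's implementation:

```python
import numpy as np

# Hypothetical linear Gaussian forward model: y = a*u + eps, eps ~ N(0, sigma_y^2).
# Prior over the control: u ~ N(mu_u, sigma_u^2). All values are illustrative.
a, sigma_y = 2.0, 0.5          # forward-model gain and output noise std
mu_u, sigma_u = 0.0, 1.0       # prior mean and std of the control signal
y_d = 3.0                      # desired (target) output

# Conjugate Gaussian update for p(u | y_d):
#   precision_post = 1/sigma_u^2 + a^2/sigma_y^2
#   mean_post      = var_post * (mu_u/sigma_u^2 + a*y_d/sigma_y^2)
var_post = 1.0 / (1.0 / sigma_u**2 + a**2 / sigma_y**2)
mean_post = var_post * (mu_u / sigma_u**2 + a * y_d / sigma_y**2)

print(f"posterior control: u* = {mean_post:.3f} +/- {np.sqrt(var_post):.3f}")
```

Note how the posterior mean is shrunk toward the prior, so the "optimal" control is a compromise between tracking the target output and respecting prior uncertainty about the control.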
Abstract:
In an attempt to meet the need for wider accessibility and popularization of the treasury of Bulgarian folklore, a team from the Institute of Mathematics and Informatics at the Bulgarian Academy of Sciences has planned to develop the Bulgarian folklore artery within the national project "Knowledge Technologies for Creation of Digital Presentation and Significant Repositories of Folklore Heritage". This paper presents the process of business modeling of the application architecture of the Bulgarian folklore artery, which aids requirements analysis, application design and its software implementation. The folklore domain process model is made in the context of the target social applications: e-learning, virtual expositions of folklore artifacts, research, news, cultural/ethno-tourism, etc. The basic processes are analyzed and modeled, and some inferences are made for the use cases and requirements specification of the Bulgarian folklore artery application. In conclusion, the application architecture of the Bulgarian folklore artery is presented.
Abstract:
Within the framework of heritage preservation, 3D scanning and modeling for heritage documentation has increased significantly in recent years, mainly due to the evolution of laser and image-based techniques, modeling software, powerful computers and virtual reality. 3D laser acquisition constitutes a real development opportunity for 3D modeling previously based on theoretical data. The representation of the object information relies on knowledge of its historic and theoretical frame to reconstitute its previous states a posteriori. This project proposes an approach that combines data extraction based on architectural knowledge with laser survey measurements, leading to 3D reconstruction. The Khmer objects used in the experiments are exhibited at the Guimet Museum in Paris. This digital modeling meets the need for exploitable models for simulation projects, prototyping, exhibitions and the promotion of cultural tourism, and particularly for archiving against any likely disaster and as an aid in formulating the virtual museum concept.
Abstract:
Specification of the non-functional requirements of applications and determination of the resources required for their execution are activities that demand a great deal of technical knowledge, frequently resulting in inefficient use of resources. Cloud computing is an alternative for the provisioning of resources, which can be done using either the provider's own infrastructure or the infrastructure of one or more public clouds, or a combination of both. It enables more flexible/elastic use of resources, but does not solve the specification problem. In this paper we present an approach that uses models at runtime to facilitate the specification of non-functional requirements and resources, aiming to provide dynamic support for application execution in cloud computing environments with shared resources. © 2013 IEEE.
Abstract:
Every space launch increases the overall amount of space debris. Satellites have limited awareness of nearby objects that might pose a collision hazard. Astrometric, radiometric, and thermal models for the study of space debris in low-Earth orbit have been developed. This modeling approach proposes analysis methods that provide increased Local Area Awareness for satellites in low-Earth and geostationary orbit. Local Area Awareness is defined as the ability to detect, characterize, and extract useful information regarding resident space objects as they move through the space environment surrounding a spacecraft. The study of space debris is of critical importance to all space-faring nations. Characterization efforts are proposed using long-wave infrared sensors for space-based observations of debris objects in low-Earth orbit. Long-wave infrared sensors are commercially available and do not require the target to be solar-illuminated, as the received signal is temperature dependent. The characterization of debris objects by means of passive imaging techniques allows further studies into the origination, specifications, and future trajectory of debris objects. Conclusions are made regarding the aforementioned thermal analysis as a function of debris orbit, geometry, orientation with respect to time, and material properties. Development of a thermal model permits the characterization of debris objects based upon their received long-wave infrared signals. Information regarding the material type, size, and tumble-rate of the observed debris objects is extracted. This investigation proposes the utilization of long-wave infrared radiometric models of typical debris to develop techniques for the detection and characterization of debris objects via signal analysis of unresolved imagery. Knowledge regarding the orbital type and semi-major axis of the observed debris object is extracted via astrometric analysis. This knowledge may aid in constraining the admissible region for the initial orbit determination process. The resultant orbital information is then fused with the radiometric characterization analysis, enabling further characterization of the observed debris object. This fused analysis, yielding orbital, material, and thermal properties, significantly increases a satellite's Local Area Awareness via an intimate understanding of the debris environment surrounding the spacecraft.
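The claim that LWIR observation needs no solar illumination follows from the temperature dependence of thermal emission. A minimal sketch of Planck radiance integrated over an assumed 8-14 µm band (physical constants are standard; the band and temperatures are illustrative, not from the dissertation):

```python
import numpy as np

H = 6.62607015e-34   # Planck constant [J s]
C = 2.99792458e8     # speed of light [m/s]
KB = 1.380649e-23    # Boltzmann constant [J/K]

def planck_radiance(wavelength_m, temp_k):
    """Blackbody spectral radiance [W / (m^2 sr m)]."""
    return (2.0 * H * C**2 / wavelength_m**5 /
            (np.exp(H * C / (wavelength_m * KB * temp_k)) - 1.0))

# Band-integrated radiance over a typical LWIR band (8-14 um, an assumption).
wavelengths = np.linspace(8e-6, 14e-6, 400)
dw = wavelengths[1] - wavelengths[0]
for temp in (250.0, 300.0, 350.0):  # plausible debris temperatures [K]
    band = planck_radiance(wavelengths, temp).sum() * dw
    print(f"T = {temp:5.1f} K -> LWIR band radiance = {band:7.2f} W/(m^2 sr)")
```

The strong growth of band radiance with temperature is what makes the received LWIR signal usable for inferring material and thermal properties of unresolved debris.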
Abstract:
Bayesian nonparametric models, such as the Gaussian process and the Dirichlet process, have been extensively applied to target kinematics modeling in various applications, including environmental monitoring, traffic planning, endangered species tracking, dynamic scene analysis, autonomous robot navigation, and human motion modeling. As shown by these successful applications, Bayesian nonparametric models are able to adjust their complexity adaptively from data as necessary, and are resistant to overfitting or underfitting. However, most existing works assume that the sensor measurements used to learn the Bayesian nonparametric target kinematics models are obtained a priori, or that the target kinematics can be measured by the sensor at any given time throughout the task. Little work has been done on controlling a sensor with a bounded field of view to obtain measurements of mobile targets that are most informative for reducing the uncertainty of the Bayesian nonparametric models. To present a systematic sensor planning approach to learning Bayesian nonparametric models, the Gaussian process target kinematics model is introduced first, which is capable of describing time-invariant spatial phenomena, such as ocean currents, temperature distributions and wind velocity fields. The Dirichlet process-Gaussian process target kinematics model is subsequently discussed for modeling mixtures of mobile targets, such as pedestrian motion patterns.
Novel information theoretic functions are developed for these Bayesian nonparametric target kinematics models to represent the expected utility of measurements as a function of sensor control inputs and random environmental variables. A Gaussian process expected Kullback-Leibler (KL) divergence is developed as the expectation of the KL divergence between the current (prior) and posterior Gaussian process target kinematics models with respect to the future measurements. This approach is then extended to develop a new information value function that can be used to estimate target kinematics described by a Dirichlet process-Gaussian process mixture model. A theorem is proposed showing that the novel information theoretic functions are bounded. Based on this theorem, efficient estimators of the new information theoretic functions are designed, which are proved to be unbiased, with the variance of the resultant approximation error decreasing linearly as the number of samples increases. The computational complexity of optimizing the novel information theoretic functions under sensor dynamics constraints is studied and proved to be NP-hard. A cumulative lower bound is then proposed to reduce the computational complexity to polynomial time.
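The building block of the expected-KL criterion described above is the KL divergence between two Gaussians, which is closed-form when the prior and posterior GPs are evaluated at a common finite set of test points. A minimal sketch (variable names ours):

```python
import numpy as np

def gaussian_kl(mu0, cov0, mu1, cov1):
    """KL( N(mu0, cov0) || N(mu1, cov1) ) for multivariate Gaussians."""
    k = mu0.shape[0]
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    _, logdet0 = np.linalg.slogdet(cov0)
    _, logdet1 = np.linalg.slogdet(cov1)
    return 0.5 * (np.trace(cov1_inv @ cov0)
                  + diff @ cov1_inv @ diff
                  - k
                  + logdet1 - logdet0)

# Example: a "posterior" that has shifted mean and shrunk covariance.
mu_prior, cov_prior = np.zeros(2), np.eye(2)
mu_post, cov_post = np.array([0.5, 0.0]), 0.5 * np.eye(2)
print(f"KL(post || prior) = {gaussian_kl(mu_post, cov_post, mu_prior, cov_prior):.4f}")
```

The expected utility of a candidate control can then be approximated by sampling hypothetical future measurements, computing the resulting posterior for each, and averaging this KL divergence; that is the flavor of the unbiased sample estimators the abstract describes.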
Three sensor planning algorithms are developed according to the assumptions on the target kinematics and the sensor dynamics. For problems where the control space of the sensor is discrete, a greedy algorithm is proposed. The efficiency of the greedy algorithm is demonstrated by a numerical experiment with ocean current data obtained from moored buoys. A sweep line algorithm is developed for applications where the sensor control space is continuous and unconstrained. Synthetic simulations as well as physical experiments with ground robots and a surveillance camera are conducted to evaluate the performance of the sweep line algorithm. Moreover, a lexicographic algorithm is designed based on the cumulative lower bound of the novel information theoretic functions for the scenario where the sensor dynamics are constrained. Numerical experiments with real data collected from indoor pedestrians by a commercial pan-tilt camera are performed to examine the lexicographic algorithm. Results from both the numerical simulations and the physical experiments show that the three sensor planning algorithms proposed in this dissertation based on the novel information theoretic functions are superior at learning the target kinematics with little or no prior knowledge.
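For the discrete control space case, the greedy planner has a simple shape: at each decision step, score every admissible control by its estimated information value and execute the argmax. A schematic sketch, with a placeholder scoring function standing in for the sampled expected-KL estimate (names and numbers are ours):

```python
import numpy as np

rng = np.random.default_rng(0)

def estimated_information_value(control, n_samples=200):
    # Placeholder for the unbiased sample estimate of the expected KL
    # divergence under this candidate control (purely illustrative here).
    return rng.normal(loc=control["expected_gain"], scale=0.05,
                      size=n_samples).mean()

# A discrete set of admissible sensor controls (e.g., pan/tilt settings).
controls = [{"name": f"c{i}", "expected_gain": g}
            for i, g in enumerate([0.2, 0.9, 0.5, 0.7])]

# Greedy selection: take the single most informative control at this step.
best = max(controls, key=estimated_information_value)
print("greedy choice:", best["name"])
```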
Abstract:
Current research shows a relationship between healthcare architecture and patient-related outcomes. The planning and design of new healthcare environments is a complex process; the needs of the various end-users of the environment must be considered, including the patients, the patients' significant others, and the staff. The aim of this study was to explore the experiences of healthcare professionals participating in group modeling utilizing system dynamics in the pre-design phase of new healthcare environments. We engaged healthcare professionals in a series of workshops using system dynamics to discuss the planning of healthcare environments at the beginning of a construction project, and then interviewed them about their experience. An explorative and qualitative design was used to describe participants' experiences of participating in the group modeling projects. Participants (n=20) were recruited from a larger intervention study using group modeling and system dynamics in planning and design projects. The interviews were analysed by qualitative content analysis. Two themes were formed, representing the experiences in the group modeling process: 'Partaking in the G-M created knowledge and empowerment' and 'Partaking in the G-M was different from what was expected and required time and skills'. The method can support participants in design teams to focus more on their healthcare organization, their care activities and their aims rather than focusing on detailed layout solutions. This clarification is important when decisions about the design are discussed and prepared, and will most likely lead to greater readiness for the future building process.
Abstract:
Thesis (Master's)--University of Washington, 2016-08
Abstract:
One of the biggest challenges that contaminant hydrogeology is facing is how to adequately address the uncertainty associated with model predictions. Uncertainty arises from multiple sources, such as interpretive error, calibration accuracy, parameter sensitivity and variability. This critical issue needs to be properly addressed in order to support environmental decision-making processes. In this study, we perform Global Sensitivity Analysis (GSA) on a contaminant transport model for the assessment of hydrocarbon concentration in groundwater. We provide a quantification of the environmental impact and, given the incomplete knowledge of hydrogeological parameters, we evaluate which are the most influential and thus require greater accuracy in the calibration process. Parameters are treated as random variables and a variance-based GSA is performed in an optimized numerical Monte Carlo framework. The Sobol indices are adopted as sensitivity measures, and they are computed by employing meta-models to characterize the migration process while reducing the computational cost of the analysis. The proposed methodology allows us to extend the number of Monte Carlo iterations, identify the influence of uncertain parameters, and achieve considerable savings in computational time while maintaining acceptable accuracy.
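As a concrete illustration of the variance-based GSA described above, first-order Sobol indices can be estimated with the standard Saltelli-style Monte Carlo scheme. The toy response function below stands in for the transport meta-model; everything in it is illustrative, not from the study:

```python
import numpy as np

rng = np.random.default_rng(42)

def model(x):
    # Toy response surface with three uncertain "hydrogeological" parameters.
    return x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.5 * x[:, 0] * x[:, 2]

n, d = 50_000, 3
A = rng.uniform(0.0, 1.0, size=(n, d))   # two independent input sample blocks
B = rng.uniform(0.0, 1.0, size=(n, d))
fA, fB = model(A), model(B)
var_y = np.var(np.concatenate([fA, fB]))

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                   # A with column i taken from B
    # Saltelli (2010) estimator of V_i = Var(E[Y | X_i]); S_i = V_i / Var(Y).
    Vi = np.mean(fB * (model(ABi) - fA))
    print(f"S_{i+1} ~= {Vi / var_y:.3f}")
```

Meta-models enter exactly where `model` is called: replacing an expensive transport simulation with a cheap surrogate is what makes the large number of Monte Carlo iterations affordable.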
Abstract:
Hydroxyl radical (OH) is the primary oxidant in the troposphere, initiating the removal of numerous atmospheric species, including greenhouse gases, pollutants that are detrimental to human health, and ozone-depleting substances. Because of the complexity of OH chemistry, models vary widely in their OH chemistry schemes and resulting methane (CH4) lifetimes. The current state of knowledge concerning global OH abundances is often contradictory. This body of work encompasses three projects that investigate tropospheric OH from a modeling perspective, with the goal of improving the atmospheric chemistry community's knowledge of the atmospheric lifetime of CH4. First, measurements taken during the airborne CONvective TRansport of Active Species in the Tropics (CONTRAST) field campaign are used to evaluate OH in global models. A box model constrained to measured variables is utilized to infer concentrations of OH along the flight track. Results are used to evaluate global model performance, argue against the existence of a proposed “OH Hole” in the tropical Western Pacific, and investigate the implications of high-O3/low-H2O filaments for chemical transport to the stratosphere. While methyl chloroform-based estimates of global mean OH suggest that models are overestimating OH, we report evidence that these models are actually underestimating OH in the tropical Western Pacific. The second project examines OH within global models to diagnose differences in CH4 lifetime. I developed an approach to quantify the roles of differences in OH precursor fields (O3, H2O, CO, NOx, etc.) using a neural network method. This technique enables us to approximate the change in CH4 lifetime resulting from variations in individual precursor fields. The dominant factors driving CH4 lifetime differences between models are O3, CO, and J(O3-O1D). My third project evaluates the effect of climate change on global fields of OH using an empirical model. Observations of H2O and O3 from satellite instruments are combined with a simulation of tropical expansion to derive changes in global mean OH over the past 25 years. We find that increasing H2O and the increasing width of the tropics tend to increase global mean OH, countering the increasing CH4 sink and resulting in well-buffered global tropospheric OH concentrations.
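The quantity all three projects target, the CH4 lifetime against tropospheric OH, follows from simple kinetics: tau = 1/(k(T)[OH]), with k the OH + CH4 rate constant. A back-of-envelope sketch using the standard JPL Arrhenius expression for that rate constant; the temperature and global mean OH value below are illustrative choices, not results from this work:

```python
import numpy as np

def ch4_lifetime_years(oh_conc, temp_k=272.0):
    """CH4 lifetime against tropospheric OH, tau = 1 / (k(T) * [OH])."""
    # JPL Arrhenius form for OH + CH4 [cm^3 molecule^-1 s^-1].
    k = 2.45e-12 * np.exp(-1775.0 / temp_k)
    tau_seconds = 1.0 / (k * oh_conc)
    return tau_seconds / (365.25 * 24 * 3600)

# Commonly quoted global mean OH of ~1e6 molecules cm^-3 (an assumption):
print(f"tau_CH4 ~ {ch4_lifetime_years(1.0e6):.1f} years")
```

This yields roughly 9 years, consistent with widely cited estimates, and makes plain why inter-model spread in simulated OH translates directly into spread in CH4 lifetime.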
Abstract:
Enterprise architecture (EA) is a tool that aligns an organization's business processes with applications and information technology (IT) through EA models. Such models allow the organization to cut unnecessary IT expenses, determine current and future IT requirements, and boost organizational performance. Enterprise architecture may be employed in any firm or organization that requires alignment between information technology and business functions. This research investigates the role of enterprise architecture in healthcare organizations and suggests a suitable EA framework for EA modeling of a knowledge-based medical diagnostic system by comparing the two most widely used EA frameworks. The results of the comparison identified which framework is better suited to a knowledge-based medical diagnostic system.
Abstract:
We present a new technical simulator for the eLISA mission, based on state space modeling techniques and developed in MATLAB. This simulator computes the coordinates and velocities over time of each body involved in the constellation, i.e. the spacecraft and their test masses (TMs), taking into account the different disturbances and actuations. This allows studying the contribution of instrumental noises and system imperfections to the residual acceleration applied to the TMs, the latter reflecting the performance of the achieved free fall along the sensitive axis. A preliminary version of the results is presented.
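The state space machinery such a simulator is built on reduces, in its simplest discrete-time form, to propagating x_{k+1} = A x_k + B u_k + w_k. A generic one-body sketch in Python (the actual simulator is in MATLAB and models the full constellation; the gains and noise levels below are purely illustrative):

```python
import numpy as np

dt = 0.1                                  # time step [s]
A = np.array([[1.0, dt],
              [0.0, 1.0]])                # free drift of [position, velocity]
B = np.array([[0.5 * dt**2],
              [dt]])                      # actuation enters as an acceleration
rng = np.random.default_rng(1)

x = np.array([1e-6, 0.0])                 # initial state: 1 um offset, at rest
for k in range(10_000):
    u = np.array([-0.05 * x[0] - 0.2 * x[1]])  # illustrative PD-style actuation
    w = rng.normal(0.0, 1e-12, size=2)         # instrumental disturbance noise
    x = A @ x + B @ u + w                      # x_{k+1} = A x_k + B u_k + w_k

print(f"final position = {x[0]:.3e} m, velocity = {x[1]:.3e} m/s")
```

Injecting each modeled noise or imperfection through w (or through the entries of A and B) and examining the residual acceleration of the state is the kind of study the abstract describes.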
Abstract:
The main purpose of the current study was to examine the role of vocabulary knowledge (VK) and syntactic knowledge (SK) in L2 listening comprehension, as well as their relative significance. Unlike previous studies, the current project employed assessment tasks to measure aural and proceduralized VK and SK. In terms of VK, to avoid under-representing the construct, measures of both breadth (VB) and depth (VD) were included. Additionally, the current study examined the role of VK and SK while accounting for individual differences in two important cognitive factors in L2 listening: metacognitive knowledge (MK) and working memory (WM). To explore the role of VK and SK more fully, the current study also accounted for the negative impact of anxiety on WM and L2 listening. The study was carried out in an English as a Foreign Language (EFL) context, and participants were 263 Iranian learners with a wide range of English proficiency, from lower-intermediate to advanced. Participants took a battery of ten linguistic, cognitive and affective measures. The collected data were first subjected to several preliminary analyses; structural equation modeling (SEM) was then used as the primary analysis method to answer the research questions. Results of the preliminary analyses revealed that MK and WM were significant predictors of L2 listening ability; thus, they were kept in the main SEM analyses. The significant role of WM was only observed when the negative effect of anxiety on WM was accounted for. Preliminary analyses also showed that VB and VD were not distinct measures of VK. However, the results showed that if VB and VD were considered separately, VD was a better predictor of L2 listening success. The main analyses revealed a significant role for both VK and SK in explaining success in L2 listening comprehension, which differs from findings of previous empirical studies. However, the SEM analysis did not reveal a statistically significant difference in the predictive power of the two linguistic factors. Descriptive results of the SEM analysis, along with results from regression analysis, indicated a more significant role for VK.
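A minimal sketch of what the SEM structure suggested by the abstract could look like, using the semopy library with lavaan-style syntax. The indicator names and the synthetic data are ours, purely so the example runs; this is not the study's actual measurement model:

```python
import numpy as np
import pandas as pd
from semopy import Model  # SEM library (assumed available)

rng = np.random.default_rng(0)
n = 263  # matches the study's sample size; data itself is synthetic noise
vk, sk = rng.normal(size=n), rng.normal(size=n)
df = pd.DataFrame({
    "VB": vk + rng.normal(scale=0.5, size=n),   # vocabulary breadth indicator
    "VD": vk + rng.normal(scale=0.5, size=n),   # vocabulary depth indicator
    "S1": sk + rng.normal(scale=0.5, size=n),   # syntactic knowledge indicators
    "S2": sk + rng.normal(scale=0.5, size=n),
    "MK": rng.normal(size=n),                   # metacognitive knowledge
    "WM": rng.normal(size=n),                   # working memory
    "LISTEN": 0.5 * vk + 0.4 * sk + rng.normal(scale=0.5, size=n),
})

desc = """
VK =~ VB + VD
SK =~ S1 + S2
LISTEN ~ VK + SK + MK + WM
"""
model = Model(desc)
model.fit(df)
print(model.inspect())  # loadings and structural path estimates
```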
Abstract:
Knowledge is one of the most important assets for surviving in the modern business environment. The effective management of that asset mandates continuous adaptation by organizations, and requires employees to strive to improve the company's work processes. Organizations attempt to coordinate their unique knowledge with traditional means as well as in new and distinct ways, and to transform it into innovative resources better than those of their competitors. As a result, how to manage the knowledge asset has become a critical issue for modern organizations, and knowledge management is considered the most feasible solution. Knowledge management is a multidimensional process that identifies, acquires, develops, distributes, utilizes, and stores knowledge. However, many related studies focus only on fragmented or limited knowledge-management perspectives. In order to make knowledge management more effective, it is important to identify the qualitative and quantitative issues that are the foundation of the challenge of effective knowledge management in organizations. The main purpose of this study was to integrate the fragmented knowledge management perspectives into a holistic framework, which includes knowledge infrastructure capability (technology, structure, and culture) and knowledge process capability (acquisition, conversion, application, and protection), based on Gold's (2001) study. Additionally, because the effect of incentives, widely acknowledged as a prime motivator in facilitating the knowledge management process, was missing from the original framework, this study included incentives in the knowledge management framework. This study also examined the relationship with organizational performance from the standpoint of the Balanced Scorecard, which includes the customer-related, internal business process, learning & growth, and perceptual financial aspects of organizational performance, in the Korean business context. Moreover, this study examined the relationship with objective financial performance by calculating Tobin's q ratio. Lastly, this study compared group differences between larger and smaller organizations, and between manufacturing and non-manufacturing firms. Since this study was conducted in Korea, the original instrument was translated into Korean through the back-translation technique. A confirmatory factor analysis (CFA) was used to examine the validity and reliability of the instrument. To identify the relationship between knowledge management capabilities and organizational performance, structural equation modeling (SEM) and multiple regression analysis were conducted. A Student's t test was conducted to examine the mean differences. The results of this study indicated that there is a positive relationship between effective knowledge management and organizational performance. However, no empirical evidence was found to suggest that knowledge management capabilities are linked to objective financial performance, which remains a topic for future review. Additionally, findings showed that knowledge management is affected by an organization's size, but not by the type of organization. The results of this study are valuable in establishing a valid and reliable survey instrument, in providing strong evidence that knowledge management capabilities are essential to improving organizational performance, and in making important recommendations for future research.
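Tobin's q, used above as the objective financial performance measure, is commonly approximated as the market value of the firm over the replacement cost (book value) of its assets. A small illustration; the approximation and the figures are ours, not from the study:

```python
def tobins_q(market_cap, total_liabilities, total_assets):
    """Common approximation: q = (equity market value + liabilities) / assets."""
    return (market_cap + total_liabilities) / total_assets

# Example: a firm with 80B KRW market cap, 40B liabilities, 100B assets.
print(f"q = {tobins_q(80e9, 40e9, 100e9):.2f}")  # q > 1 suggests intangible value
```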