948 results for "Model driven developments"
Abstract:
Microturbines are among the most successfully commercialized distributed energy resources, especially when they are used for combined heat and power generation. However, the interrelated thermal and electrical system dynamic behaviors have not been fully investigated. This is technically challenging due to the complex thermo-fluid-mechanical energy conversion processes, which introduce multiple time-scale dynamics and strong nonlinearity into the analysis. To tackle this problem, this paper proposes a simplified model which can predict the coupled thermal and electric output dynamics of microturbines. Considering the time-scale difference of the various dynamic processes occurring within microturbines, the electromechanical subsystem is treated as a fast quasi-linear process, while the thermo-mechanical subsystem is treated as a slow process with high nonlinearity. A three-stage subspace identification method is utilized to capture the dominant dynamics and predict the electric power output. For the thermo-mechanical process, a radial basis function model trained by the particle swarm optimization method is employed to handle the strong nonlinear characteristics. Experimental tests on a Capstone C30 microturbine show that the proposed modeling method captures the system dynamics well and produces good predictions of the coupled thermal and electric outputs in various operating modes.
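The slow thermo-mechanical branch described above pairs a radial basis function (RBF) network with particle swarm optimization (PSO). The sketch below shows that combination on a toy nonlinear response; the network size, PSO constants, and synthetic target are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_predict(params, X, centers):
    # params = shared Gaussian width followed by one linear weight per center
    width, weights = params[0], params[1:]
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)  # sample-to-center distances
    phi = np.exp(-d2 / (2.0 * width ** 2 + 1e-9))
    return phi @ weights

def pso_train(X, y, centers, n_particles=30, iters=200):
    # classic global-best PSO over the RBF parameters
    dim = 1 + len(centers)
    pos = rng.normal(size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_err = np.array([np.mean((rbf_predict(p, X, centers) - y) ** 2) for p in pos])
    gbest = pbest[pbest_err.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = pos + vel
        err = np.array([np.mean((rbf_predict(p, X, centers) - y) ** 2) for p in pos])
        improved = err < pbest_err
        pbest[improved], pbest_err[improved] = pos[improved], err[improved]
        gbest = pbest[pbest_err.argmin()].copy()
    return gbest

# toy nonlinear "thermal" response standing in for real microturbine data
X = rng.random((80, 1))
y = np.sin(3 * X[:, 0])
centers = np.linspace(0, 1, 8)[:, None]
params = pso_train(X, y, centers)
```

PSO searches jointly over the basis width and the output weights; the paper's three-stage subspace identification for the fast electrical subsystem is not sketched here.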
Abstract:
OSAN, R.; TORT, A. B. L.; AMARAL, O. B. A mismatch-based model for memory reconsolidation and extinction in attractor networks. PLoS One, v. 6, p. e23113, 2011.
Abstract:
Thesis (Master's)--University of Washington, 2016-08
Abstract:
Research in ubiquitous and pervasive technologies has made it possible to recognise activities of daily living through non-intrusive sensors. The data captured from these sensors must be classified using machine learning or knowledge-driven techniques to infer and recognise activities. Discovering the activities and activity-object patterns from sensors tagged to objects as they are used is critical to recognising the activities. In this paper, we propose a topic model process for discovering activities and activity-object patterns from the interactions of low-level state-change sensors. We also develop a recognition and segmentation algorithm to recognise activities and their boundaries. The experimental results we present validate our framework and show that it is comparable to existing approaches.
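The topic-model step can be pictured with a small latent Dirichlet allocation (LDA) sampler over "documents" of sensor-event IDs, where discovered topics play the role of activities. The collapsed Gibbs sampler and the two-room toy corpus below are a generic illustration, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

def lda_gibbs(docs, n_topics, n_words, alpha=0.1, beta=0.01, iters=50):
    # collapsed Gibbs sampling for LDA over sensor-event "documents"
    ndk = np.zeros((len(docs), n_topics))  # per-document topic counts
    nkw = np.zeros((n_topics, n_words))    # per-topic word (event ID) counts
    nk = np.zeros(n_topics)                # per-topic totals
    z = []
    for d, doc in enumerate(docs):
        zd = rng.integers(n_topics, size=len(doc))
        z.append(zd)
        for w, k in zip(doc, zd):
            ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]
                ndk[d, k] -= 1; nkw[k, w] -= 1; nk[k] -= 1
                # conditional topic distribution for this token
                p = (ndk[d] + alpha) * (nkw[:, w] + beta) / (nk + n_words * beta)
                k = rng.choice(n_topics, p=p / p.sum())
                z[d][i] = k
                ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    return nkw

# toy corpus: event IDs 0-3 co-occur (e.g. kitchen sensors), 4-7 co-occur (bathroom)
docs = [rng.integers(0, 4, 20).tolist() for _ in range(10)] + \
       [rng.integers(4, 8, 20).tolist() for _ in range(10)]
topic_word = lda_gibbs(docs, n_topics=2, n_words=8)
```

The learned topic-word counts expose which sensor events co-occur, which is the activity-object pattern the abstract refers to.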
Abstract:
There are hundreds of millions of songs available to the public, necessitating the use of music recommendation systems to discover new music. Currently, such systems account for only the quantitative musical elements of songs, failing to consider aspects of human perception of music and alienating the listener’s individual preferences from recommendations. Our research investigated the relationships between perceptual elements of music, represented by the MUSIC model, with computational musical features generated through The Echo Nest, to determine how a psychological representation of music preference can be incorporated into recommendation systems to embody an individual’s music preferences. Our resultant model facilitates computation of MUSIC factors using The Echo Nest features, and can potentially be integrated into recommendation systems for improved performance.
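The mapping described above could be sketched as a multivariate linear regression from audio descriptors to the five MUSIC factors (Mellow, Unpretentious, Sophisticated, Intense, Contemporary). Everything below is illustrative: the features are synthetic stand-ins for Echo Nest descriptors such as tempo or energy, not the authors' actual model.

```python
import numpy as np

rng = np.random.default_rng(2)

# synthetic "audio features" and synthetic MUSIC factor scores (illustrative)
n_songs, n_features, n_factors = 200, 6, 5
X = rng.normal(size=(n_songs, n_features))
true_W = rng.normal(size=(n_features, n_factors))
Y = X @ true_W + 0.1 * rng.normal(size=(n_songs, n_factors))

# multivariate least squares: one linear map (plus intercept) to all five factors
W, *_ = np.linalg.lstsq(np.hstack([X, np.ones((n_songs, 1))]), Y, rcond=None)

def music_scores(features, W):
    # predict the five MUSIC factor scores for one song
    return np.append(features, 1.0) @ W

scores = music_scores(X[0], W)
```

A recommender could then match songs in MUSIC-factor space rather than raw feature space, bringing the psychological representation into the ranking.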
Abstract:
The future bloom and risk of blossom frosts for Malus domestica were projected using regional climate realizations and phenological (= impact) models. As climate impact projections are susceptible to uncertainties of climate and impact models and model concatenation, the significant horizon of the climate impact signal was analyzed by applying 7 impact models, including two new developments, on 13 climate realizations of the IPCC emission scenario A1B. Advancement of phenophases and a decrease in blossom frost risk for Lower Saxony (Germany) for early and late ripeners was determined by six out of seven phenological models. Single model/single grid point time series of bloom showed significant trends by 2021-2050 compared to 1971-2000, whereas the joint signal of all climate and impact models did not stabilize until 2043. Regarding blossom frost risk, joint projection variability exceeded the projected signal. Thus, blossom frost risk cannot be stated to be lower by the end of the 21st century despite a negative trend. As a consequence it is however unlikely to increase. Uncertainty of temperature, blooming date and blossom frost risk projection reached a minimum at 2078-2087. The projected phenophases advanced by 5.5 d K-1, showing partial compensation of delayed fulfillment of the winter chill requirement and faster completion of the following forcing phase in spring. Finally, phenological model performance was improved by considering the length of day.
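The chill-then-forcing mechanism mentioned above (delayed fulfillment of the winter chill requirement, then faster completion of the forcing phase) can be sketched as a minimal sequential phenology model. The thresholds below are illustrative, not the study's calibrated parameters.

```python
# Minimal sequential chill-forcing phenology sketch: chill units accumulate on
# days below t_chill; once the chill requirement is met, growing degree-days
# above t_base accumulate until the forcing requirement triggers bloom.
def bloom_day(daily_temp, t_chill=7.2, chill_req=50, t_base=4.0, force_req=150):
    chill = force = 0.0
    for day, temp in enumerate(daily_temp):
        if chill < chill_req:
            if temp < t_chill:
                chill += 1                        # one chill unit per cold day
        else:
            force += max(0.0, temp - t_base)      # growing degree-days
            if force >= force_req:
                return day
    return None

# toy season: 60 cold days, then steady spring warming
season = [2.0] * 60 + [4.0 + 0.25 * d for d in range(120)]
```

Warming shortens the forcing phase but can also delay chill fulfillment, which is the partial compensation the abstract describes.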
Abstract:
Observations of the Caspian Sea during August-September 1995 are used to develop a three-dimensional numerical model for calculating temperature and currents. This period was chosen because of its extensive set of observational data, including surface temperature observations. Data from the meteorological buoy network on the Caspian Sea are combined with routine observations at first-order synoptic stations around the lake to obtain hourly values of the wind stress and pressure fields. The initial temperature distribution as a function of depth and horizontal coordinates is derived from ship cruises. The model has variable grid resolution and horizontal smoothing which filters out small-scale vertical motion. The hydrodynamic model of the Caspian Sea has 6 vertical levels and a uniform horizontal grid size of 50 km. The model is driven with surface fluxes of heat and momentum derived from observed meteorological data. The model was able to reproduce all of the basic features of the thermal structure in the Caspian Sea: large-scale circulation patterns tend to be cyclonic, with cyclonic circulation within each sub-basin. The results agree with observations.
Abstract:
New developments in higher education and research are having repercussions in daily licencing practice. Examples are: demands for perpetual access; usage of licensed content in course packs or virtual research environments; text mining; open access to publications. At the Knowledge Exchange workshop on Licencing Practice, twenty experts discussed how these new developments could be incorporated in licencing. The workshop consisted of four presentations on current developments in licencing, followed by three parallel breakout sessions on the topics of open access, new developments, and data and text mining. This led to a lively exchange of ideas. Especially the aspect of data and text mining provided valuable insights into how this could be incorporated in licencing. The Knowledge Exchange Licensing expert group will work on how to implement the model provisions discussed. Input from the workshop was collected for a workshop with publishers to take place in March 2012, which will include these provisions in their licences. The various suggestions will also be shared with other international organisations working in this field.
Abstract:
Symbolic execution is a powerful program analysis technique, but it is very challenging to apply to programs built using event-driven frameworks, such as Android. The main reason is that the framework code itself is too complex to symbolically execute. The standard solution is to manually create a framework model that is simpler and more amenable to symbolic execution. However, developing and maintaining such a model by hand is difficult and error-prone. We claim that we can leverage program synthesis to introduce a high degree of automation to the process of framework modeling. To support this thesis, we present three pieces of work. First, we introduced SymDroid, a symbolic executor for Android. While Android apps are written in Java, they are compiled to Dalvik bytecode format. Instead of analyzing an app's Java source, which may not be available, or decompiling from Dalvik back to Java, which requires significant engineering effort and introduces yet another source of potential bugs in an analysis, SymDroid works directly on Dalvik bytecode. Second, we introduced Pasket, a new system that takes a first step toward automatically generating Java framework models to support symbolic execution. Pasket takes as input the framework API and tutorial programs that exercise the framework. From these artifacts and Pasket's internal knowledge of design patterns, Pasket synthesizes an executable framework model by instantiating design patterns, such that the behavior of a synthesized model on the tutorial programs matches that of the original framework. Lastly, in order to scale program synthesis to framework models, we devised adaptive concretization, a novel program synthesis algorithm that combines the best of the two major synthesis strategies: symbolic search, i.e., using SAT or SMT solvers, and explicit search, e.g., stochastic enumeration of possible solutions.
Adaptive concretization parallelizes multiple sub-synthesis problems by partially concretizing highly influential unknowns in the original synthesis problem. Thanks to adaptive concretization, Pasket can generate a large-scale model, e.g., thousands of lines of code. In addition, we have used an Android model synthesized by Pasket and found that the model is sufficient to allow SymDroid to execute a range of apps.
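The core move of adaptive concretization, fixing a highly influential unknown to a concrete value and searching the much smaller remaining space explicitly, can be shown on a toy synthesis problem. This is a deliberately simplified illustration, not Pasket's actual algorithm; the spec and the notion of influence here are invented.

```python
import itertools
import random

random.seed(5)

def spec(bits):
    # the "specification" the synthesized unknown must satisfy; bit 0 is the
    # most influential unknown because it gates the whole conjunction
    return bits[0] == 1 and bits[1] == 0 and bits[2] == 1 and bits[3] == 1

def sub_search(concrete_bit0):
    # explicit search over the remaining unknowns, with bit 0 concretized
    for rest in itertools.product([0, 1], repeat=3):
        cand = (concrete_bit0,) + rest
        if spec(cand):
            return cand
    return None

# outer loop: try concretizations of the influential unknown; each sub_search
# is independent, so real implementations run them in parallel
solution = None
for b0 in (random.randint(0, 1), 0, 1):
    solution = sub_search(b0)
    if solution is not None:
        break
```

Each sub-problem is exponentially smaller than the original, which is what lets the hybrid scale past pure symbolic or pure explicit search on large models.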
Abstract:
Modern software application testing, such as the testing of software driven by graphical user interfaces (GUIs) or leveraging event-driven architectures in general, requires paying careful attention to context. Model-based testing (MBT) approaches first acquire a model of an application, then use the model to construct test cases covering relevant contexts. A major shortcoming of state-of-the-art automated model-based testing is that many test cases proposed by the model are not actually executable. These infeasible test cases threaten the integrity of the entire model-based suite, and any coverage of contexts the suite aims to provide. In this research, I develop and evaluate a novel approach for classifying the feasibility of test cases. I identify a set of pertinent features for the classifier, and develop novel methods for extracting these features from the outputs of MBT tools. I use a supervised logistic regression approach to obtain a model of test case feasibility from a randomly selected training suite of test cases. I evaluate this approach with a set of experiments. The outcomes of this investigation are as follows: I confirm that infeasibility is prevalent in MBT, even for test suites designed to cover a relatively small number of unique contexts. I confirm that the frequency of infeasibility varies widely across applications. I develop and train a binary classifier for feasibility with average overall error, false positive, and false negative rates under 5%. I find that unique event IDs are key features of the feasibility classifier, while model-specific event types are not. I construct three types of features from the event IDs associated with test cases, and evaluate the relative effectiveness of each within the classifier.
To support this study, I also develop a number of tools and infrastructure components for scalable execution of automated jobs, which use state-of-the-art container and continuous integration technologies to enable parallel test execution and the persistence of all experimental artifacts.
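The classifier setup above, logistic regression over features built from the event IDs of a test case, can be sketched on toy data. The bag-of-event-IDs encoding and the invented infeasibility rule (one event always breaks a test case) are illustrative assumptions, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(3)

def event_id_features(test_case, n_events):
    # bag-of-event-IDs: one indicator per unique event ID in the test case
    x = np.zeros(n_events)
    for e in test_case:
        x[e] = 1.0
    return x

def train_logreg(X, y, lr=0.5, iters=500):
    # plain batch gradient descent on the logistic loss
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        grad = p - y
        w -= lr * (X.T @ grad) / len(y)
        b -= lr * grad.mean()
    return w, b

# toy data: event 0 makes a test case infeasible (label 1), all others feasible
n_events = 10
cases = [rng.integers(0, n_events, 5).tolist() for _ in range(200)]
y = np.array([1.0 if 0 in c else 0.0 for c in cases])
X = np.array([event_id_features(c, n_events) for c in cases])
w, b = train_logreg(X, y)
```

Inspecting the learned weights shows which event IDs drive infeasibility, mirroring the finding that unique event IDs are the key features.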
Abstract:
Recent developments in automation, robotics and artificial intelligence have given a push to a wider usage of these technologies in recent years, and nowadays, driverless transport systems are already state-of-the-art on certain legs of transportation. This has given a push for the maritime industry to join the advancement. The case organisation, the AAWA initiative, is a joint industry-academia research consortium with the objective of developing readiness for the first commercial autonomous solutions, exploiting state-of-the-art autonomous and remote technology. The initiative develops both autonomous and remote operation technology for navigation, machinery, and all on-board operating systems. The aim of this study is to develop a model with which to estimate and forecast the operational costs, and thus enable comparisons between manned and autonomous cargo vessels. The building process of the model is also described and discussed. Furthermore, the model's aim is to track and identify the critical success factors of the chosen ship design, and to enable monitoring and tracking of the incurred operational costs as the life cycle of the vessel progresses. The study adopts the constructive research approach, as the aim is to develop a construct to meet the needs of a case organisation. Data has been collected through discussions and meetings with consortium members and researchers, as well as through written and internal communications material. The model itself is built using activity-based life cycle costing, which enables both realistic cost estimation and forecasting, as well as the identification of critical success factors due to the process-orientation adopted from activity-based costing and the statistical nature of Monte Carlo simulation techniques.
As the model was able to meet the multiple aims set for it, and the case organisation was satisfied with it, it could be argued that activity-based life cycle costing is the method with which to conduct cost estimation and forecasting in the case of autonomous cargo vessels. The model was able to perform the cost analysis and forecasting, as well as to trace the critical success factors. Later on, it also enabled, albeit hypothetically, monitoring and tracking of the incurred costs. By collecting costs this way, it was argued that the activity-based LCC model is able to facilitate learning from and continuous improvement of the autonomous vessel. As with the building process of the model, an individual approach was chosen, while still using the implementation and model building steps presented in existing literature. This was due to two factors: the nature of the model and, perhaps even more importantly, the nature of the case organisation. Furthermore, the loosely organised network structure means that knowing the case organisation and its aims is of great importance when conducting constructive research.
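The combination described above, activity-based costing plus Monte Carlo simulation, can be sketched as follows. The activities and all cost figures are invented for illustration, not AAWA data; each activity gets a triangular (low / mode / high) annual cost distribution and repeated sampling yields a distribution of total life cycle cost.

```python
import random
import statistics

random.seed(4)

# illustrative operational activities with triangular annual cost
# distributions (low, mode, high), in EUR per year
ACTIVITIES = {
    "remote monitoring centre": (400_000, 500_000, 650_000),
    "maintenance & spares":     (300_000, 450_000, 700_000),
    "port & fairway fees":      (200_000, 250_000, 320_000),
    "insurance":                (150_000, 200_000, 300_000),
}

def simulate_lcc(activities, years=25, runs=2_000):
    # Monte Carlo over the vessel's life cycle: sample every activity's cost
    # for every year, sum to one total, repeat for `runs` trials
    totals = []
    for _ in range(runs):
        total = 0.0
        for low, mode, high in activities.values():
            total += sum(random.triangular(low, high, mode) for _ in range(years))
        totals.append(total)
    return totals

totals = simulate_lcc(ACTIVITIES)
mean_cost = statistics.mean(totals)
```

Because costs are tracked per activity, the same simulation exposes which activities dominate the spread of outcomes, i.e. the critical success factors.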
Abstract:
Aim: The spread of non-indigenous species in marine ecosystems world-wide is one of today's most serious environmental concerns. Using mechanistic modelling, we investigated how global change relates to the invasion of European coasts by a non-native marine invertebrate, the Pacific oyster Crassostrea gigas. Location: Bourgneuf Bay on the French Atlantic coast was considered as the northern boundary of C. gigas expansion at the time of its introduction to Europe in the 1970s. From this latitudinal reference, variations in the spatial distribution of the C. gigas reproductive niche were analysed along the north-western European coast from Gibraltar to Norway. Methods: The effects of environmental variations on C. gigas physiology and phenology were studied using a bioenergetics model based on Dynamic Energy Budget theory. The model was forced with environmental time series including in situ phytoplankton data, and satellite data of sea surface temperature and suspended particulate matter concentration. Results: Simulation outputs were successfully validated against in situ oyster growth data. In Bourgneuf Bay, the rise in seawater temperature and phytoplankton concentration has increased C. gigas reproductive effort and led to precocious spawning periods since the 1960s. At the European scale, seawater temperature increase caused a drastic northward shift (1400 km within 30 years) in the C. gigas reproductive niche and optimal thermal conditions for early life stage development. Main conclusions: We demonstrated that the poleward expansion of the invasive species C. gigas is related to global warming and increase in phytoplankton abundance. The combination of mechanistic bioenergetics modelling with in situ and satellite environmental data is a valuable framework for ecosystem studies. It offers a generic approach to analyse historical geographical shifts and to predict the biogeographical changes expected to occur in a climate-changing world.
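A very reduced caricature of the reproductive-niche test described above (the study itself uses a full Dynamic Energy Budget model) is a cumulative degree-day criterion: spawning is possible only where enough thermal energy accumulates above a base temperature within a year. The thresholds and toy sea-surface-temperature curves below are illustrative assumptions.

```python
import math

def spawning_day(sst_series, t_base=10.5, dd_req=600.0):
    # accumulate degree-days above t_base; spawning is possible once the
    # requirement is met, otherwise the site lies outside the niche
    dd = 0.0
    for day, t in enumerate(sst_series):
        dd += max(0.0, t - t_base)
        if dd >= dd_req:
            return day
    return None

# warm southern site vs cold northern site (toy annual SST curves, deg C)
south = [14 + 6 * math.sin(2 * math.pi * (d - 120) / 365) for d in range(365)]
north = [8 + 4 * math.sin(2 * math.pi * (d - 120) / 365) for d in range(365)]
```

Warming the northern curve eventually pushes it over the requirement, which is the mechanism behind the simulated poleward shift of the niche boundary.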
Abstract:
Cellular models are important tools in various research areas related to colorectal biology and associated diseases. Herein, we review the most widely used cell lines and the different techniques to grow them, either as cell monolayers, polarized two-dimensional epithelia on membrane filters, or as three-dimensional spheres in scaffold-free or matrix-supported culture conditions. Moreover, recent developments, such as gut-on-chip devices or the ex vivo growth of biopsy-derived organoids, are also discussed. We provide an overview of the potential applications but also of the limitations of each of these techniques, while evaluating their contribution to providing more reliable cellular models for research, diagnostic testing, or pharmacological validation related to colon physiology and pathophysiology.