973 results for Model Driven Engineering


Relevance:

80.00%

Publisher:

Abstract:

The central research topic of this Thesis is the study of the dynamic behaviour of a structure by means of models that describe the distribution of energy among its components, and the application of these models to the detection of incipient damage.

Dynamic tests are a way of extracting information about the properties of a structure. If we have a model of the structure, it can be updated so that, within a certain degree of accuracy, it reproduces the response of the real system under test. After damage occurs in the structure, the response to the same test will change to some extent; by updating the model to the new conditions we can detect changes in the configuration of the structural model that lead to the conclusion that damage has occurred.

In this way, the detection of incipient damage is possible if we are able to distinguish a small variation in the parameters that define the model. A very suitable regime for this kind of detection is the high-frequency range, since the response is highly dependent on small geometric details: the characteristic length in the structure associated with the response is directly proportional to the propagation speed of acoustic waves in the solid, which is fixed for a given structure, and inversely proportional to the excitation frequency. At the same time, this feature of the high-frequency response makes a Finite Element model impractical, owing to its high computational cost.

A model widely used in engineering to compute the high-frequency response of structures is SEA (Statistical Energy Analysis). SEA applies an energy balance to each structural component, relating the vibrational energy of the components to the power dissipated by each of them and the power transmitted between them, whose sum must equal the power injected into each structural component. This relationship is linear and is characterized by the loss factors. The quantities involved in the response are considered as averaged over geometry, frequency and time.

Updating the SEA model to test data therefore amounts to computing the loss factors that reproduce the measured response. If done directly, this updating requires solving an inverse problem that is ill-conditioned. In this Thesis it is proposed to update the SEA model not in terms of the loss factors but in terms of structural parameters that have a physical meaning for the high-frequency response, namely the dissipation factor of each component, their modal densities and the characteristic stiffnesses of the coupling elements; the loss factors are then computed as functions of these parameters. This formulation is developed originally in this Thesis and rests mainly on the hypothesis of high modal density, that is, that a large number of modes of each structural component participate in the response.

The general theory of the SEA method states that the model is valid only under very restrictive hypotheses on the nature of the external excitations, such as requiring them to be of local white-noise type. This kind of loading is difficult to reproduce under test conditions.
In this Thesis we show, through practical cases, that this restriction can be relaxed; in particular, the results are good enough when the structure is subjected to a harmonic step load. Under these approximations, a stepwise optimization algorithm is developed that updates an SEA model to a transient test when the load is a harmonic step. This algorithm updates the model not only for one particular frequency band but for several frequency bands simultaneously, with the aim of posing a better-conditioned problem. Finally, a damage index is defined that measures the change in the loss-factor matrix when structural damage occurs at a specific location in a component. The response of a structure made of beams is simulated numerically, with damage introduced in the cross-section of one of them; since this is a high-frequency computation, the simulation is carried out with the Spectral Element Method, which required developing within the Thesis a spectral beam element damaged at a given section. The results obtained make it possible to identify the structural component in which the damage has occurred and the section where it lies, with a certain degree of confidence.

Abstract

The main subject under research in this Thesis is the study of the dynamic behaviour of a structure using models that describe the energy distribution between the components of the structure and the applicability of these models to incipient damage detection. Dynamic tests are a way to extract information about the properties of a structure. If we have a model of the structure, it can be updated in order to reproduce the same response as in experimental tests, within a certain degree of accuracy. After damage occurs, the response will change to some extent; model updating to the new test conditions can help to detect changes in the structural model, leading to the conclusion that damage has occurred. In this way, incipient damage detection is possible if we are able to detect small variations in the model parameters. It turns out that the high-frequency regime is highly relevant for incipient damage detection, because the response is very sensitive to small structural geometric details. The characteristic length associated with the response is proportional to the propagation speed of acoustic waves inside the solid, but inversely proportional to the excitation frequency. At the same time, this fact makes the application of a Finite Element Method impractical due to the high computational cost. A widely used model in engineering when dealing with the high-frequency response is SEA (Statistical Energy Analysis). SEA applies the energy balance to each structural component, relating their vibrational energy with the dissipated power and the transmitted power between the different components; their sum must be equal to the input power to each of them. This relationship is linear and characterized by loss factors. The magnitudes considered in the response are averaged over geometry, frequency and time. SEA model updating to test data is equivalent to calculating the loss factors that best fit the experimental response. This is formulated as an ill-conditioned inverse problem.
In this Thesis a new updating algorithm is proposed for the study of the high-frequency response regime in terms of parameters with physical meaning, such as the internal dissipation factors, the modal densities and the characteristic coupling stiffnesses. The loss factors are then calculated from these parameters. The approach is developed entirely in this Thesis and is mainly based on a high modal density assumption, that is to say, a large number of modes contributes to the response. General SEA theory establishes the validity of the model under very restrictive assumptions about the external excitations, which should behave as local white noise. This kind of excitation is difficult to reproduce in an experimental environment. In this Thesis we show that in practical cases this assumption can be relaxed; in particular, results are good enough when the structure is excited with a harmonic step function. Under these assumptions, an optimization algorithm is developed for SEA model updating to a transient test when the external loads are harmonic step functions. This algorithm considers the response not only in a single frequency band, but also in several of them simultaneously. A damage index is defined that measures the change in the loss-factor matrix when damage has occurred at a certain location in the structure. The structures considered in this study are built with damaged beam elements; as we are dealing with the high-frequency response, the numerical simulation is implemented with a Spectral Element Method. It has therefore been necessary to develop a damaged spectral beam element as well. The reported results show that damage detection is possible with this algorithm; moreover, damage location is also possible within a certain degree of accuracy.
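For reference, the power balance that SEA writes for each subsystem i, described in words above, takes the following standard textbook form (the notation is the conventional one and is not taken from the thesis itself), where ω is the band-centre angular frequency, E_i the subsystem energies, η_i the internal (dissipation) loss factors, η_ij the coupling loss factors and n_i the modal densities:

```latex
P_i^{\mathrm{in}} \;=\; \omega\,\eta_i E_i \;+\; \omega \sum_{j \neq i} \bigl( \eta_{ij} E_i - \eta_{ji} E_j \bigr),
\qquad
\eta_{ij}\, n_i \;=\; \eta_{ji}\, n_j .
```

The reciprocity relation on the right is what allows the coupling loss factors to be re-expressed through modal densities and coupling stiffnesses, which is the parameterization the thesis updates instead of the loss factors themselves.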

Relevance:

80.00%

Publisher:

Abstract:

Service-Oriented Architectures (SOA), and Web Services (WS), the technology generally used to implement them, achieve the integration of heterogeneous technologies, providing interoperability and enabling the reuse of pre-existing systems. Model-driven development methodologies provide inherent benefits such as increased productivity, greater reuse and better maintainability, to name a few. Efforts to achieve model-driven development of SOAs already exist, but there is currently no standard solution that also addresses the non-functional aspects of these services. This paper presents an approach to integrating these non-functional aspects into the development of web services, with an emphasis on security.
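As a purely illustrative sketch of the idea (the paper's actual metamodel, profiles and transformations are not reproduced here, and every name below is an assumption), the approach can be pictured as annotating service model elements with non-functional requirements and deriving security configuration artifacts from them automatically:

```python
# Illustrative sketch only: annotating a service model with a non-functional
# (security) requirement and deriving a configuration artifact from it.
# All names and the output shape are assumptions, not the paper's metamodel.

from dataclasses import dataclass, field

@dataclass
class Operation:
    name: str
    # Hypothetical non-functional annotations, e.g. {"confidentiality": "encrypt"}
    nfr: dict = field(default_factory=dict)

@dataclass
class ServiceModel:
    name: str
    operations: list

def to_security_config(model: ServiceModel) -> dict:
    """Model-to-text style transformation: emit a (made-up) security descriptor."""
    return {
        "service": model.name,
        "policies": [
            {"operation": op.name, **op.nfr} for op in model.operations if op.nfr
        ],
    }

if __name__ == "__main__":
    model = ServiceModel("PaymentService",
                         [Operation("authorize", {"confidentiality": "encrypt",
                                                  "authentication": "required"}),
                          Operation("ping")])
    print(to_security_config(model))
```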

Relevance:

80.00%

Publisher:

Abstract:

How to create or integrate large Smart Spaces (considered as mash-ups of sensors and actuators) into the paradigm of the 'Web of Things' has been the motivation of many recent works. A cutting-edge approach deals with developing and deploying web-enabled embedded devices with two major objectives: 1) to integrate sensor and actuator technologies into everyday objects, and 2) to allow a diversity of devices to plug into the Internet. Currently, developers who want to use this Internet-oriented approach need to have a solid understanding of sensor platforms and semantic technologies. In this paper we propose a Resource-Oriented and Ontology-Driven Development (ROOD) methodology, based on Model Driven Architecture (MDA), to make the development and deployment of Smart Spaces easier for any developer. Early evaluations of the ROOD methodology have been successfully accomplished through a partial deployment of a Smart Hotel.
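To make the "web-enabled embedded device" idea concrete, the following minimal sketch (not taken from the paper; the endpoint path and JSON shape are assumptions) exposes a sensor reading as a REST resource of the kind a resource-oriented Smart Space could compose:

```python
# Minimal, illustrative sketch: a web-enabled device exposing a sensor reading
# as a REST resource. The driver is faked and all names are assumptions.

import json
import random
from http.server import BaseHTTPRequestHandler, HTTPServer

def read_temperature() -> float:
    """Stand-in for a real sensor driver; returns a fake reading in Celsius."""
    return round(20 + random.random() * 5, 2)

class SensorResource(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/sensors/temperature":
            body = json.dumps({"unit": "celsius", "value": read_temperature()}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), SensorResource).serve_forever()
```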

Relevance:

80.00%

Publisher:

Abstract:

Usability plays an important role in satisfying users' needs. There are many recommendations in the HCI literature on how to improve software usability. Our research focuses on those recommendations that affect the system architecture rather than just the interface. However, improving software usability in aspects that affect the architecture increases the analyst's workload and the development complexity. This paper proposes a solution based on model-driven development. We propose representing functional usability mechanisms abstractly by means of conceptual primitives. The analyst uses these primitives to incorporate functional usability features at the early stages of the development process. Following the model-driven development paradigm, these features are then automatically transformed in the subsequent steps of development, a process that is hidden from the analyst.
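As a rough illustration of the workflow only (the paper's conceptual primitives are modeling constructs, and the primitive names and generated elements below are assumptions), the analyst declares a functional usability feature abstractly and a later transformation expands it into design-level elements:

```python
# Toy sketch: a usability feature declared as an abstract primitive and expanded
# by a "transformation" into concrete design elements. Names are assumptions.

from dataclasses import dataclass

@dataclass
class UsabilityPrimitive:
    kind: str          # e.g. "ProgressFeedback", "Undo"
    target: str        # model element the feature is attached to

def expand(primitive: UsabilityPrimitive) -> list[str]:
    """Toy model-to-model transformation: primitive -> design-level elements."""
    if primitive.kind == "ProgressFeedback":
        return [f"ProgressMonitor({primitive.target})",
                f"StatusView({primitive.target})"]
    if primitive.kind == "Undo":
        return [f"CommandHistory({primitive.target})",
                f"UndoController({primitive.target})"]
    return []

print(expand(UsabilityPrimitive("Undo", "OrderForm")))
```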

Relevance:

80.00%

Publisher:

Abstract:

Context: Today’s project managers have a myriad of methods to choose from for the development of software applications. However, they lack empirical data about the character of these methods in terms of usefulness, ease of use or compatibility, all of these being relevant variables to assess the developer’s intention to use them. Objective: To compare three methods, each following a different paradigm (Model-Driven, Model-Based and Code-Centric) with respect to their adoption potential by junior software developers engaged in the development of the business layer of a Web 2.0 application. Method: We have conducted a quasi-experiment with 26 graduate students of the University of Alicante. The application developed was a Social Network, which was organized around a fixed set of modules. Three of them, similar in complexity, were used for the experiment. Subjects were asked to use a different method for each module, and then to answer a questionnaire that gathered their perceptions during such use. Results: The results show that the Model-Driven method is regarded as the most useful, although it is also considered the least compatible with previous developers’ experiences. They also show that junior software developers feel comfortable with the use of models, and that they are likely to use them if the models are accompanied by a Model-Driven development environment. Conclusions: Despite their relatively low level of compatibility, Model-Driven development methods seem to show a great potential for adoption. That said, however, further experimentation is needed to make it possible to generalize the results to a different population, different methods, other languages and tools, different domains or different application sizes.

Relevance:

80.00%

Publisher:

Abstract:

The ingress of water and Kokubo simulated body fluid (SBF) into poly(2-hydroxyethyl methacrylate) (PHEMA), and its co-polymers with tetrahydrofurfuryl methacrylate (THFMA), loaded with either of two model drugs, vitamin B12 or aspirin, was studied by mass uptake over the temperature range 298-318 K. The polymers were studied as cylinders and were loaded with either 5 wt% or 10 wt% of the drugs. From DSC studies it was observed that vitamin B12 behaved as a physical cross-linker restricting chain segmental mobility, and so had a small anti-plasticisation effect on PHEMA and the co-polymers rich in HEMA, but almost no effect on the Tg of co-polymers rich in THFMA. On the other hand, aspirin exhibited a plasticising effect on PHEMA and the co-polymers. All of the polymers were found to absorb water and SBF according to a Fickian diffusion mechanism. The polymers were all found to swell to a greater extent in SBF than in water, which was attributed to the presence of Tris buffer in the SBF. The sorption of the two penetrants followed Fickian kinetics in all cases, and the diffusion coefficients at 310 K for SBF were found to be smaller than those for water, except for the polymers containing aspirin, where the diffusion coefficients were higher than for the other systems. For example, for sorption into PHEMA the diffusion coefficient was 1.41 × 10⁻¹¹ m²/s for water and 0.79 × 10⁻¹¹ m²/s for SBF, but in the presence of 5 wt% aspirin the corresponding values were 1.27 × 10⁻¹¹ m²/s and 1.25 × 10⁻¹¹ m²/s, respectively. The corresponding values for PHEMA loaded with 5 wt% vitamin B12 were 1.25 × 10⁻¹¹ m²/s and 0.74 × 10⁻¹¹ m²/s, respectively.
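For context on the Fickian analysis (the exact fitting procedure used in the study is not given in this summary), mass-uptake data for a cylinder of radius a are commonly compared with the short-time form of the solution for radial diffusion, from which the diffusion coefficient D is estimated; a linear dependence of M_t/M_∞ on the square root of time is the usual signature of Fickian sorption:

```latex
\frac{M_t}{M_\infty} \;\approx\; \frac{4}{\sqrt{\pi}} \sqrt{\frac{D t}{a^{2}}}
\qquad (\text{short times, radial diffusion into a cylinder of radius } a).
```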

Relevance:

80.00%

Publisher:

Abstract:

The aim of this study was to explore clinician reactions to (i) the introduction of routine outcome measures and (ii) the utility of outcomes data in clinical practice. Focus group discussions (n = 34) were conducted with mental health staff (n = 324) at approximately 8 months post implementation of routine outcome measures. A semi-structured interview schedule was used to collect data on two key issues: reactions to the introduction of outcome measures, and factors influencing the utility of outcomes data in clinical practice. Data from the discussion groups were analysed using content analysis to isolate emerging themes. While the majority of participants endorsed the collection and utilization of outcomes data, many raised questions about the merits of the initiative. Ambivalence, competing work demands, lack of support from senior medical staff, questionable evidence to support the use of outcome measures, and fear of how outcomes data might be used emerged as key issues. At 8 months post implementation a significant number of clinical staff remained ambivalent about the benefits of outcome measurement and had not engaged in the process. The shift to a service model driven by outcomes and case-mix data will take time and resources to achieve. Implications for nursing staff are discussed.

Relevance:

80.00%

Publisher:

Abstract:

The Meta-Object Facility (MOF) provides a standardized framework for object-oriented models. An instance of a MOF model contains objects and links whose interfaces are entirely derived from that model. Information contained in these objects can be accessed directly; however, in order to realize the Model-Driven Architecture™ (MDA), we must have a mechanism for representing and evaluating structured queries on these instances. The MOF Query Language (MQL) is a language that extends the UML's Object Constraint Language (OCL) to provide more expressive power, such as higher-order queries, parametric polymorphism and argument polymorphism. Not only do these features allow more powerful queries, but they also encourage a greater degree of modularization and re-use, resulting in faster prototyping and facilitating automated integrity analysis. This paper presents an overview of the motivations for developing MQL and also discusses its abstract syntax, presented as a MOF model, and its semantics.
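MQL's concrete syntax is not shown in this summary; as a loose illustration of what "higher-order" and reusable queries buy, the following Python sketch (all names are assumptions, and this is not MQL or OCL) parameterizes a selection query by an arbitrary predicate and re-applies it to different collections of model elements:

```python
# Not MQL: only a generic illustration of a higher-order, reusable query over
# model objects - a query that takes another query (predicate) as a parameter.

from typing import Callable, Iterable, List, TypeVar

T = TypeVar("T")

def select(elements: Iterable[T], predicate: Callable[[T], bool]) -> List[T]:
    """Higher-order query: parameterized by an arbitrary predicate."""
    return [e for e in elements if predicate(e)]

# Hypothetical model instance: classes of a MOF-like model, flattened to dicts.
classes = [
    {"name": "Order", "abstract": False, "attributes": 7},
    {"name": "Item", "abstract": False, "attributes": 3},
    {"name": "Element", "abstract": True, "attributes": 0},
]

concrete = select(classes, lambda c: not c["abstract"])
large = select(concrete, lambda c: c["attributes"] > 5)
print([c["name"] for c in large])   # ['Order']
```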

Relevance:

80.00%

Publisher:

Abstract:

This paper considers the problem of low-dimensional visualisation of very high-dimensional information sources for the purpose of situation awareness in the maritime environment. In response to the requirement for human decision-support aids that reduce information overload (specifically, for data amenable to inter-point relative similarity measures) in the below-water maritime domain, we are investigating a preliminary prototype topographic visualisation model. The focus of the current paper is on the mathematical problem of exploiting a relative dissimilarity representation of signals in a visual informatics mapping model, driven by real-world sonar systems. A realistic noise model is explored and incorporated into non-linear and topographic visualisation algorithms, building on the approach of [9]. Concepts are illustrated using a real-world dataset of 32 hydrophones monitoring a shallow-water environment in which targets are present and dynamic.
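As an illustrative sketch only (this uses plain metric MDS from scikit-learn rather than the paper's noise-aware algorithm built on [9], and the data are synthetic stand-ins), a precomputed dissimilarity matrix between signals can be embedded in two dimensions for visual inspection:

```python
# Illustrative sketch: embed a relative dissimilarity matrix between signals into
# 2-D for visual situation awareness. Synthetic data stand in for hydrophone signals.

import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(0)

# Stand-in for pairwise dissimilarities between, e.g., 32 hydrophone signals.
n = 32
features = rng.normal(size=(n, 128))                    # fake high-dimensional signals
D = np.linalg.norm(features[:, None, :] - features[None, :, :], axis=-1)

embedding = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = embedding.fit_transform(D)                     # (32, 2) map for plotting
print(coords.shape)
```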

Relevance:

80.00%

Publisher:

Abstract:

The convergence of data, audio and video on IP networks is changing the way individuals, groups and organizations communicate. This diversity of communication media presents opportunities for creating synergistic collaborative communications. This form of collaborative communication is, however, not without its challenges. The increasing number of communication service providers, coupled with a combinatorial mix of offered services, varying Quality-of-Service and oscillating pricing of services, increases the complexity for the user of managing and maintaining 'always best' priced or best-performing services. Consumers have to manually manage and adapt their communication in line with differences in services across devices, networks and media, while ensuring that the usage remains consistent with their intended goals. This dissertation proposes a novel user-centric approach to address this problem. The proposed approach aims to reduce the aforementioned complexity for the user by (1) providing high-level abstractions and a policy-based methodology for automated selection of communication services guided by high-level user policies, and (2) providing services through the seamless integration of multiple communication service providers, together with an extensible framework to support the integration of additional providers. The approach was implemented in the Communication Virtual Machine (CVM), a model-driven technology for realizing communication applications. The CVM includes the Network Communication Broker (NCB), the layer responsible for providing a network-independent API to the upper layers of the CVM. The initial prototype of the NCB supported only a single communication framework, which limited the number, quality and types of services available. Experimental evaluation of the approach shows that its additional overhead is minimal compared to the individual communication service frameworks. Additionally, the proposed automated approach outperformed the individual communication service frameworks for cross-framework switching.
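The following hypothetical sketch (provider names, policy fields and the scoring rule are assumptions, not CVM/NCB internals) illustrates the kind of policy-guided automated selection described above:

```python
# Hypothetical sketch: pick a communication service provider automatically from a
# high-level user policy instead of asking the user to compare offers manually.

from dataclasses import dataclass

@dataclass
class Offer:
    provider: str
    price_per_min: float   # in some currency unit
    quality: float         # 0..1, higher is better

def select_provider(offers, policy):
    """Return the offer that best satisfies the user's policy, or None."""
    feasible = [o for o in offers
                if o.price_per_min <= policy["max_price"]
                and o.quality >= policy["min_quality"]]
    if not feasible:
        return None
    # Weighted trade-off between quality and price, weight taken from the policy.
    w = policy.get("quality_weight", 0.5)
    return max(feasible, key=lambda o: w * o.quality - (1 - w) * o.price_per_min)

offers = [Offer("ProviderA", 0.10, 0.80), Offer("ProviderB", 0.05, 0.70)]
print(select_provider(offers, {"max_price": 0.08, "min_quality": 0.6}))
```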

Relevance:

80.00%

Publisher:

Abstract:

The increasing use of model-driven software development has renewed emphasis on using domain-specific models during application development. More specifically, there has been emphasis on using domain-specific modeling languages (DSMLs) to capture user-specified requirements when creating applications. The current approach to realizing these applications is to translate DSML models into source code using several model-to-model and model-to-code transformations. This approach is still dependent on the underlying source code representation and only raises the level of abstraction during development. Experience has shown that developers will many times be required to manually modify the generated source code, which can be error-prone and time consuming. An alternative to the aforementioned approach involves using an interpreted domain-specific modeling language (i-DSML) whose models can be directly executed using a Domain-Specific Virtual Machine (DSVM). Direct execution of i-DSML models requires a semantically rich platform that reduces the gap between the application models and the underlying services required to realize the application. One layer in this platform is the domain-specific middleware that is responsible for the management and delivery of services in the specific domain. In this dissertation, we investigated the problem of designing the domain-specific middleware of the DSVM to facilitate the bifurcation of the semantics of the domain and the model of execution (MoE) while supporting runtime adaptation and validation. We approached our investigation by seeking solutions to the following sub-problems: (1) How can the domain-specific knowledge (DSK) semantics be separated from the MoE for a given domain? (2) How do we define a generic model of execution (GMoE) of the middleware so that it is adaptable and realizes DSK operations to support delivery of services? (3) How do we validate the realization of DSK operations at runtime? Our research into the domain-specific middleware was done using an i-DSML for the user-centric communication domain, Communication Modeling Language (CML), and for the microgrid energy management domain, Microgrid Modeling Language (MGridML). We have successfully developed a methodology to separate the DSK and GMoE of the middleware of a DSVM that supports specialization for a given domain, and is able to perform adaptation and validation at runtime.
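As a conceptual sketch only (class names and operations are assumptions, not the CML or MGridML middleware), the separation investigated here can be pictured as a fixed, generic execution loop that is configured with pluggable domain-specific operations and validates them at runtime:

```python
# Conceptual sketch: separating domain-specific knowledge (DSK) from a generic
# model of execution (GMoE) so one engine can be specialized per domain.

class DomainKnowledge:
    """DSK: domain operations the middleware can invoke, supplied per domain."""
    operations: dict

class CommunicationDSK(DomainKnowledge):
    operations = {"setup_session": lambda ctx: ctx.append("session established"),
                  "send_media":    lambda ctx: ctx.append("media delivered")}

class GenericMoE:
    """GMoE: fixed control loop that realizes whatever DSK it is configured with."""
    def __init__(self, dsk: DomainKnowledge):
        self.dsk = dsk

    def execute(self, script):
        trace = []
        for op in script:
            action = self.dsk.operations.get(op)
            if action is None:                 # runtime validation hook
                raise ValueError(f"operation {op!r} not defined for this domain")
            action(trace)
        return trace

engine = GenericMoE(CommunicationDSK())
print(engine.execute(["setup_session", "send_media"]))
```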

Relevance:

80.00%

Publisher:

Abstract:

Long-span bridges are flexible and therefore sensitive to wind-induced effects. One way to improve the stability of long-span bridges against flutter is to use cross-sections that involve twin side-by-side decks. However, this can amplify responses due to vortex-induced oscillations. Wind tunnel testing is a well-established practice to evaluate the stability of bridges against wind loads. In order to study the response of the prototype in the laboratory, dynamic similarity requirements should be satisfied. One of the parameters that is normally violated in wind tunnel testing is the Reynolds number. In this dissertation, the effects of Reynolds number on the aerodynamics of a double-deck bridge were evaluated by measuring fluctuating forces on a motionless sectional model of a bridge at different wind speeds representing different Reynolds regimes. Also, the efficacy of vortex mitigation devices was evaluated at different Reynolds number regimes. Another parameter that is frequently ignored in wind tunnel studies is the correct simulation of turbulence characteristics. Due to the difficulties of simulating flow with a large turbulence length scale on a sectional model, wind tunnel tests are often performed in smooth flow as a conservative approach. The validity of the simplifying assumptions in the calculation of buffeting loads, as the direct impact of turbulence, needs to be verified for twin-deck bridges. The effects of turbulence characteristics were investigated by testing sectional models of a twin-deck bridge under two different turbulent flow conditions. Not only do the flow properties play an important role in the aerodynamic response of the bridge, but the geometry of the cross-section shape is also expected to have significant effects. In this dissertation, the effects of deck details, such as the width of the gap between the twin decks and the traffic barriers, on the aerodynamic characteristics of a twin-deck bridge were investigated, particularly on the vortex shedding forces, with the aim of clarifying how these shape details can alter the wind-induced responses. Finally, a summary of the issues involved in designing a dynamic test rig for high Reynolds number tests is given, using the studied cross-section as an example.
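For reference, the Reynolds number mismatch mentioned above follows directly from its definition (symbols are the conventional ones, not taken from the dissertation): with U the mean wind speed, B a characteristic length of the deck (for example its width) and ν the kinematic viscosity of air,

```latex
Re \;=\; \frac{U\,B}{\nu},
```

so reducing B by the geometric scale while ν stays fixed, and without being able to raise U proportionally, puts sectional-model tests at Reynolds numbers well below those of the full-scale bridge.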

Relevance:

80.00%

Publisher:

Abstract:

Technological evolution has been making Distance Education accessible to a greater number of people, anytime and anywhere. The growing availability of mobile devices integrated into mobile learning environments allows information to move beyond the physical environment, creating opportunities for students and teachers to build geographically distributed learning scenarios. However, many applications developed for these environments remain isolated from one another and are not sufficiently integrated into virtual learning environments (AVA). This dissertation presents a webservice-based interoperability model between mobile devices and distinct AVA. Requirements engineering and software architecture techniques were used in the conception of this model. To demonstrate the model's viability, a mobile application focused on surveys was developed, and the main functionalities related to interoperability were tested.
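As an illustrative sketch of the interoperability idea only (the adapter and method names are assumptions, not the dissertation's webservice contract), a common interface can hide each AVA's native webservice behind one API that the mobile survey client consumes:

```python
# Illustrative sketch: one uniform interface for a mobile survey client, with one
# adapter per virtual learning environment (AVA). Adapters here return canned data.

from abc import ABC, abstractmethod

class LMSAdapter(ABC):
    """One adapter per AVA hides that platform's native webservice behind a common API."""
    @abstractmethod
    def list_surveys(self, course_id: str) -> list[dict]:
        ...

class FakeMoodleAdapter(LMSAdapter):
    def list_surveys(self, course_id):
        # A real adapter would call the platform's own webservice here.
        return [{"id": "s1", "title": "Week 1 feedback", "course": course_id}]

class FakeSakaiAdapter(LMSAdapter):
    def list_surveys(self, course_id):
        return [{"id": "42", "title": "Course evaluation", "course": course_id}]

def surveys_for(adapter: LMSAdapter, course_id: str):
    """The mobile client only ever sees this uniform call."""
    return adapter.list_surveys(course_id)

print(surveys_for(FakeMoodleAdapter(), "ENG101"))
print(surveys_for(FakeSakaiAdapter(), "ENG101"))
```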