974 results for IDEAL Reference Model


Relevance: 80.00%

Abstract:

Existing masonry structures are usually associated with high seismic vulnerability, mainly due to the properties of the materials, weak connections between floors and load-bearing walls, the high mass of the masonry walls, and the flexibility of the floors. For these reasons, the seismic performance of existing masonry structures has received much attention in recent decades. This study presents a parametric analysis that takes into account deviations in the features of gaioleiro buildings, a Portuguese building typology. The main objective of the parametric analysis is to compare the seismic performance of the structure as a function of variations in its properties with respect to the response of a reference model. The parametric analysis was carried out for two types of structural analysis: non-linear dynamic analysis with time integration, and pushover analysis with a force distribution proportional to the inertial forces of the structure. The Young's modulus of the masonry walls, the Young's modulus of the timber floors, and the compressive and tensile non-linear properties (strength and fracture energy) were considered in both types of analysis. Additionally, in the dynamic analysis, the influence of viscous damping and of the vertical component of the earthquake was evaluated. A pushover analysis proportional to the modal displacement of the first mode in each direction was also carried out. The results show that the Young's modulus of the masonry walls, the Young's modulus of the timber floors, and the compressive non-linear properties are the parameters that most influence the seismic performance of this type of tall and weak existing masonry structure. Furthermore, it is concluded that the stiffness of the floors significantly influences the strength capacity and the collapse mechanism of the numerical model. Thus, a study on the strengthening of the floors was also carried out. Increasing the thickness of the timber floors was the strengthening technique with the best seismic performance, notably reducing the out-of-plane displacements of the masonry walls.
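A mass-proportional pushover, as used in this abstract, distributes the applied base shear over the structure in proportion to the mass (and hence the inertial force) at each level. The sketch below illustrates only that distribution rule; the storey masses and base shear are hypothetical values, not taken from the study.

```python
import numpy as np

def mass_proportional_forces(masses, base_shear):
    """Distribute a base shear over storeys in proportion to their mass,
    i.e. F_i = V_b * m_i / sum(m)."""
    masses = np.asarray(masses, dtype=float)
    return base_shear * masses / masses.sum()

# Hypothetical 4-storey wall, heavier lower storeys (tonnes), 100 kN base shear.
forces = mass_proportional_forces([40.0, 35.0, 30.0, 25.0], 100.0)
print(forces)  # storey forces in kN; they sum to the base shear
```

In an actual pushover analysis this force pattern is scaled up monotonically until the capacity curve (base shear vs. control-node displacement) is traced out.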

Relevance: 80.00%

Abstract:

Considering that vernacular architecture may bear important lessons on hazard mitigation, and that well-constructed examples showing traditional seismic-resistant features can present far less vulnerability than expected, this study aims at understanding the resisting mechanisms and seismic behavior of vernacular buildings through detailed finite element modeling and nonlinear static (pushover) analysis. This paper focuses specifically on a type of vernacular rammed earth construction found in the Portuguese region of Alentejo. Several rammed earth constructions in the region were selected and studied in terms of dimensions, architectural layout, structural solutions, construction materials and detailing; as a result, a reference model was built, intended to be a simplified representative example of these constructions that gathers their most common characteristics. Different parameters that may affect the seismic response of this type of construction were identified, and a numerical parametric study was defined to evaluate and quantify their influence on the seismic behavior of these vernacular buildings. This paper is part of ongoing research that includes the development of a simplified methodology for assessing the seismic vulnerability of vernacular buildings, based on vulnerability index evaluation methods.

Relevance: 80.00%

Abstract:

Integrated master's dissertation in Biomedical Engineering

Relevance: 80.00%

Abstract:

Objectives: The study objective was to derive reference pharmacokinetic curves of antiretroviral drugs (ART) based on available population pharmacokinetic (Pop-PK) studies, which can be used to optimize therapeutic drug monitoring (TDM)-guided dosage adjustment.

Methods: A systematic search of Pop-PK studies of 8 ART drugs in adults was performed in PubMed. To simulate reference PK curves, a summary of the PK parameters was obtained for each drug using a meta-analysis approach. Most studies used a one-compartment model, which was therefore chosen as the reference model. Models using bi-exponential disposition were simplified to one compartment, since the first distribution phase was rapid and not determinant for the description of the terminal elimination phase, which is most relevant for this project. Different absorption models were standardized to first-order absorption processes. Apparent clearance (CL), the apparent volume of distribution of the terminal phase (Vz), the absorption rate constant (ka), and inter-individual variability were pooled into summary mean values weighted by the number of plasma levels; intra-individual variability was weighted by the number of individuals in each study. Simulations based on the summary PK parameters served to construct concentration percentile curves (NONMEM®). Concordance between individual and summary parameters was assessed graphically using forest plots. To test robustness, the difference between simulated curves based on published and summary parameters was calculated using efavirenz as a probe drug.

Results: CL was readily accessible from all studies. For one-compartment studies, Vz was the central volume of distribution; for two-compartment studies, Vz was CL/λz. ka was used directly or derived from the mean absorption time (MAT) for more complicated absorption models, assuming MAT = 1/ka. The value of CL for each drug was in excellent agreement across all Pop-PK models, suggesting that the minimal concentration derived from the summary models was adequately characterized. The comparison of the concentration vs. time profiles for efavirenz between published and summary PK parameters revealed no more than a 20% difference. Although our approach appears adequate for estimating the elimination phase, the simplification of the absorption phase might introduce a small bias shortly after drug intake.

Conclusions: Simulated reference percentile curves based on such an approach represent a useful tool for interpreting drug concentrations. This Pop-PK meta-analysis approach should be further validated and could be extended into a more sophisticated computerized tool for Bayesian TDM of ART.
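The reference model described above, a one-compartment model with first-order absorption and elimination, has the standard closed-form concentration profile C(t) = F·D·ka / (Vz·(ka − ke)) · (e^(−ke·t) − e^(−ka·t)) with ke = CL/Vz. The sketch below evaluates that formula; the parameter values are illustrative placeholders, not the summary estimates from the reviewed studies.

```python
import numpy as np

def concentration(t, dose, ka, cl, vz, f=1.0):
    """One-compartment model, first-order absorption and elimination."""
    ke = cl / vz  # elimination rate constant derived from CL and Vz
    return f * dose * ka / (vz * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

# Illustrative oral dose over one 24 h dosing interval (units: mg, 1/h, L/h, L).
t = np.linspace(0.0, 24.0, 97)
c = concentration(t, dose=600.0, ka=0.6, cl=9.0, vz=250.0)
print(c.max(), t[c.argmax()])  # peak concentration and time to peak
```

Percentile curves as in the study would be obtained by repeating this simulation with CL, Vz, and ka drawn from their pooled inter-individual variability distributions and taking concentration quantiles at each time point.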

Relevance: 80.00%

Abstract:

Today, information technology is strategically important to the goals and aspirations of business enterprises, government, and higher-education institutions such as universities. Universities face new challenges in the emerging global economy, characterized by the importance of providing faster communication services and improving the productivity and effectiveness of individuals, such as providing an information network that supports the demands and diversification of university activities. A network architecture, which is a set of design principles for building a network, is one of the pillars of this effort. It is the cornerstone that enables a university's faculty, researchers, students, administrators, and staff to discover, learn, reach out, and serve society. This thesis focuses on network architecture definitions and fundamental components. The three most important characteristics of a high-quality architecture are that it is an open network architecture, that it is service-oriented, and that it is an IP network based on packets. The architecture comprises four important components: Services and Network Management, Network Control, Core Switching, and Edge Access. The theoretical contribution of this study is a reference model for a university campus network architecture that can be followed or adapted to build a robust yet flexible network that responds to next-generation requirements. The results provide an important and complete reference guide to the process of building a campus network, which nowadays plays a very important role. The research gives university networks a structured, modular model that is reliable, robust, and can easily grow.

Relevance: 80.00%

Abstract:

The use of virtual learning environments is increasingly frequent at all levels of education. However, this growing use also implies that the different stages now involved in the teaching-learning process need to be considered. Students in a virtual learning environment face not only the problems related to acquiring the knowledge of their course, but also technological problems such as information overload and getting used to web browsing, computer use, etc. One way to minimize the impact of the heterogeneity existing in virtual learning environments is to adapt several aspects to the specific characteristics of the user and their context. From this point of view, this work presents an integral user model that has been used to generate a virtual course that can interoperate between e-learning platforms. This course was created using the SCORM reference model and the IMS-LD specification.

Relevance: 80.00%

Abstract:

The competitiveness of businesses is increasingly dependent on their electronic networks with customers, suppliers, and partners. While the strategic and operational impact of external integration and IOS adoption has been extensively studied, much less attention has been paid to the organizational and technical design of electronic relationships. The objective of our longitudinal research project is the development of a framework for understanding and explaining B2B integration. Drawing on existing literature and empirical cases, we present a reference model (a classification scheme for B2B integration). The reference model comprises technical, organizational, and institutional levels to reflect the multiple facets of B2B integration. In this paper we investigate the current state of electronic collaboration in global supply chains, focusing on the technical view. Using an in-depth case analysis we identify five integration scenarios. In the subsequent confirmatory phase of the research we analyze 112 real-world company cases to validate these five integration scenarios. Our research advances and deepens existing studies by developing a B2B reference model that reflects the current state of practice and is independent of specific implementation technologies. In the next stage of the research, the emerging reference model will be extended to create an assessment model for analyzing the maturity level of a given company in a specific supply chain.

Relevance: 80.00%

Abstract:

Considering the normal maps that geographers produce, it is possible to discuss strange maps in confrontation with the status of real maps. Using a precise epistemological model inspired by S. Lupasco, we examine four relevant examples of fictitious maps. We refer explicitly to a ternary epistemological model to confront extravagant situations furnished by literature with our normal maps. Using an explicit ternary language, related here to the structure of the map, we draw on the two fundamental concepts of scale and legend to show how these four fictions are emblematic and help us to understand real maps. The ternary model scale/legend/mapmaking is the reference model used to illustrate how the Captain's map of L. Carroll, the Utopia island of T. More, the map of China of J. L. Borges, and the Tender Map of M. de Scudéry are situated in our model with regard to the real map, tertium datur, the mediator in the interaction of scale and legend.

Relevance: 80.00%

Abstract:

This study addresses the development and establishment of the product development and innovation process in the target companies A and B. First, a literature review establishes a general theoretical framework and a baseline model for a reference model of the product development and innovation process. This phase covers the various elements and stages needed to describe and improve the development process. Decision gates play a central role in the process model; their evaluation criteria and techniques are discussed as part of the different options for portfolio management. In the second part of the work, the developed theoretical model is implemented in the target companies A and B. The implementation covers both the process description and the associated decision criteria. The study results in a presentation, produced for the companies, of the innovation process and its sub-areas at a more detailed level, as well as a portfolio tool, built within the scope of the study, with which development projects can be managed through their different phases.

Relevance: 80.00%

Abstract:

Concerning process control of batch cooling crystallization, the present work focused on the cooling profile and seeding technique. Secondly, the influence of additives on a batch-wise precipitation process was investigated. Moreover, a Computational Fluid Dynamics (CFD) model for the simulation of controlled batch cooling crystallization was developed. A novel cooling model to control the supersaturation level during batch-wise cooling crystallization was introduced. The crystallization kinetics, together with the operating conditions, i.e. seed loading, cooling rate and batch time, were taken into account in the model. In particular, supersaturation- and suspension-density-dependent secondary nucleation was included in the model. The interaction between the operating conditions and their influence on the control target, i.e. a constant level of supersaturation, was studied with the aid of a numerical solution of the cooling model. Further, batch cooling crystallization was simulated with an ideal mixing model and with the CFD model. The moment transformation of the population balance, together with the mass and heat balances, was solved numerically in the simulation. In order to clarify the relationship between the operating conditions and product sizes, a system chart was developed for the ideal mixing condition. The use of the system chart to determine the appropriate operating conditions to meet a required product size was introduced. With CFD simulation, batch crystallization operated under a specified cooling mode was studied in crystallizers of different geometries and scales. The introduced cooling model and simulation results were verified experimentally for potassium dihydrogen phosphate (KDP), and the novelty of the proposed control policies was demonstrated for potassium sulfate by comparison with published results in the literature. The study on batch-wise precipitation showed that immiscible additives could promote the agglomeration of a derivative of benzoic acid, which improved the filterability of the crystal product.
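The thesis derives its cooling profile from the full kinetic model, which is not reproduced here. As a point of reference, a classical programmed-cooling result of the same flavor is the cubic profile T(t) = T0 − (T0 − Tf)·(t/tb)³, which cools slowly at first, when crystal surface area is small, to limit early supersaturation. The sketch below compares it with linear cooling under purely illustrative temperatures.

```python
import numpy as np

def cubic_cooling(t, t_batch, T0, Tf):
    """Classical programmed cooling profile T(t) = T0 - (T0 - Tf)*(t/tb)**3."""
    return T0 - (T0 - Tf) * (t / t_batch) ** 3

t = np.linspace(0.0, 120.0, 121)            # batch time in minutes (illustrative)
T_cubic = cubic_cooling(t, 120.0, 50.0, 20.0)
T_linear = 50.0 - (50.0 - 20.0) * t / 120.0  # same endpoints, constant rate

# Early in the batch the programmed profile stays warmer than linear cooling,
# so supersaturation builds up more gently while the seed crystals grow.
print(T_cubic[30] - T_linear[30])
```

Both profiles start at 50 °C and end at 20 °C; the difference lies entirely in how the cooling rate is distributed over the batch time.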

Relevance: 80.00%

Abstract:

New services are the most important thing customers expect from new technology. They are the main reason customers are willing to pay for and use a new technology. The new service architecture brought by a new network is therefore important for the success of the whole project. This document focuses on the service architecture of third-generation mobile networks, for which a reference model is described. The services of the network are introduced and described, and implementation issues are explained. The WIN concept required in the US market is described, along with its implementation. Finally, the handling of billing data for prepaid subscribers in the WIN concept in a recovery situation is described.

Relevance: 80.00%

Abstract:

A large amount of data for inconspicuous taxa is stored in natural history collections; however, this information is often neglected in studies of biodiversity patterns. Here, we evaluate the performance of direct interpolation of museum collection data, equivalent to the traditional approach used in bryophyte conservation planning, and stacked species distribution models (S-SDMs) for producing reliable reconstructions of species richness patterns, given that the differences between these methods have been insufficiently evaluated for inconspicuous taxa. Our objective was to test whether species distribution models produce better inferences of species richness than simply selecting the areas with the highest species numbers. As model species, we selected Iberian species of the genus Grimmia (Bryophyta), and we used four well-collected areas to compare and validate the following models: 1) four Maxent richness models, each generated without the data from one of the four areas, and a reference model created using all of the data; and 2) four richness models obtained through direct spatial interpolation, each generated without the data from one area, and a reference model created with all of the data. The correlations between the partial and reference Maxent models were higher in all cases (0.45 to 0.99), whereas the correlations between the spatial interpolation models were negative and weak (-0.3 to -0.06). Our results demonstrate for the first time that S-SDMs offer a useful tool for identifying detailed richness patterns for inconspicuous taxa such as bryophytes and for improving incomplete distributions by assessing the potential richness of under-surveyed areas, filling major gaps in the available data. In addition, the proposed strategy would enhance the value of the vast number of specimens housed in biological collections.
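The stacking step behind an S-SDM is simple: each species' habitat-suitability map (e.g. Maxent output) is thresholded into a presence/absence layer, and the layers are summed cell by cell to give predicted richness. The sketch below uses random numbers as a stand-in for real suitability surfaces; the species counts, grid size, and thresholds are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n_species, n_cells = 15, 1000
suitability = rng.random((n_species, n_cells))   # stand-in for per-species Maxent maps
thresholds = rng.uniform(0.3, 0.7, n_species)    # one presence cutoff per species

presence = suitability >= thresholds[:, None]    # binarise each species' map
richness = presence.sum(axis=0)                  # stacked richness per grid cell
print(richness.min(), richness.max())            # predicted richness range
```

Validation as in the study would then correlate the richness surface from a partial model (one area withheld) against the reference surface built from all data.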

Relevance: 80.00%

Abstract:

This work was carried out as part of the MASTO research project, whose purpose is to develop an adaptive reference model for software testing. The work was conducted as a statistical study using the survey method. The study interviewed 31 organizational units from around Finland that develop medium-criticality applications. The hypotheses of the study concerned the dependence of quality on the software development method, customer participation, compliance with a standard, the customer relationship, business orientation, criticality, trust, and the level of testing. Correlation with quality was sought for the hypotheses through correlation and regression analysis. In addition, the study surveyed which software development practices, methods, and tools were used in the organizational units; problems and improvement suggestions related to software testing; the most significant ways for the customer to influence software quality; and the greatest benefits and drawbacks of outsourcing software development or testing. The study found that quality correlated positively and statistically significantly with the level of testing, compliance with a standard, customer participation in the planning phase, customer participation in steering, trust, and one sub-question related to the customer relationship. Based on the regression analysis, a regression equation was formed in which quality was found to depend positively on compliance with a standard, customer participation in the planning phase, and trust.

Relevance: 80.00%

Abstract:

Technology scaling has proceeded into dimensions in which the reliability of manufactured devices is becoming endangered. The decrease in reliability is a consequence of physical limitations, the relative increase of variations, and decreasing noise margins, among other factors. A promising solution for bringing the reliability of circuits back to a desired level is the use of design methods that introduce tolerance against possible faults in an integrated circuit. This thesis studies and presents fault tolerance methods for networks-on-chip (NoC), a design paradigm targeted at very large systems-on-chip. In a NoC, resources such as processors and memories are connected to a communication network, comparable to the Internet. Fault tolerance in such a system can be achieved at many abstraction levels. The thesis studies the origin of faults in modern technologies and explains their classification into transient, intermittent, and permanent faults. A survey of fault tolerance methods is presented to demonstrate the diversity of available methods. Networks-on-chip are approached by exploring their main design choices: the selection of a topology, routing protocol, and flow control method. Fault tolerance methods for NoCs are studied at different layers of the OSI reference model. The data link layer provides a reliable communication link over a physical channel. Error control coding is an efficient fault tolerance method at this abstraction level, especially against transient faults. Error control coding methods suitable for on-chip communication are studied and their implementations presented. Error control coding loses its effectiveness in the presence of intermittent and permanent faults; therefore, other solutions against them are presented. The introduction of spare wires and split transmissions is shown to provide good tolerance against intermittent and permanent errors, and their combination with error control coding is illustrated. At the network layer, positioned above the data link layer, fault tolerance can be achieved through the design of fault-tolerant network topologies and routing algorithms. Both of these approaches are presented in the thesis, together with realizations in both categories. The thesis concludes that an optimal fault tolerance solution contains carefully co-designed elements from different abstraction levels.
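As a concrete illustration of link-layer error control coding of the kind discussed above, the sketch below applies a Hamming(7,4) code, a standard single-error-correcting code often considered for on-chip links, to 4 data bits and recovers them after a single transient bit-flip. This is a generic textbook code, not the specific scheme developed in the thesis.

```python
import numpy as np

# Hamming(7,4) in systematic form: G = [I | P], H = [P^T | I].
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(data4):
    """Encode 4 data bits into a 7-bit codeword."""
    return (np.array(data4) @ G) % 2

def correct(word7):
    """Correct up to one bit error and return the 4 data bits."""
    syndrome = (H @ word7) % 2
    if syndrome.any():  # non-zero syndrome: its value identifies the flipped bit
        bad = np.where((H.T == syndrome).all(axis=1))[0][0]
        word7 = word7.copy()
        word7[bad] ^= 1
    return word7[:4]    # systematic code: data occupies the first 4 bits

code = encode([1, 0, 1, 1])
code[2] ^= 1                 # inject a single transient bit-flip on the link
print(correct(code))         # the original data bits are recovered
```

Against permanent faults this protection degrades, which is why the thesis pairs coding with structural measures such as spare wires and split transmissions.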