877 results for MODELING SYSTEM


Relevance: 30.00%

Abstract:

We present a large data set of high-cadence dMe flare light curves obtained with custom continuum filters on the triple-beam, high-speed camera system ULTRACAM. The measurements provide constraints for models of the near-ultraviolet (NUV) and optical continuum spectral evolution on timescales of ≈1 s. We provide a robust interpretation of the flare emission in the ULTRACAM filters using simultaneously obtained low-resolution spectra during two moderate-sized flares in the dM4.5e star YZ CMi. By avoiding the spectral complexity within the broadband Johnson filters, the ULTRACAM filters are shown to characterize bona fide continuum emission in the NUV, blue, and red wavelength regimes. The NUV/blue flux ratio in flares is equivalent to a Balmer jump ratio, and the blue/red flux ratio provides an estimate for the color temperature of the optical continuum emission. We present a new “color-color” relationship for these continuum flux ratios at the peaks of the flares. Using the RADYN and RH codes, we interpret the ULTRACAM filter emission using the dominant emission processes from a radiative-hydrodynamic flare model with a high nonthermal electron beam flux, which explains a hot, T ≈ 10^4 K, color temperature at blue-to-red optical wavelengths and a small Balmer jump ratio as observed in moderate-sized and large flares alike. We also discuss the high time resolution, high signal-to-noise continuum color variations observed in YZ CMi during a giant flare, which increased the NUV flux from this star by over a factor of 100.
Based on observations obtained with the Apache Point Observatory 3.5 m telescope, which is owned and operated by the Astrophysical Research Consortium; on observations made with the William Herschel Telescope, operated on the island of La Palma by the Isaac Newton Group in the Spanish Observatorio del Roque de los Muchachos of the Instituto de Astrofísica de Canarias; and on observations made with the ESO Telescopes at the La Silla Paranal Observatory under programme ID 085.D-0501(A).
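The blue/red flux ratio's mapping to a color temperature amounts to inverting a ratio of Planck functions at the two filter wavelengths. A minimal sketch of that inversion, assuming hypothetical filter pivot wavelengths and a made-up peak flux ratio (not the ULTRACAM values):

```python
import math

# Physical constants (SI)
H_PLANCK = 6.626e-34   # J s
C_LIGHT = 2.998e8      # m/s
K_B = 1.381e-23        # J/K

def planck(wl_m, temp_k):
    """Planck spectral radiance B_lambda (per unit wavelength)."""
    x = H_PLANCK * C_LIGHT / (wl_m * K_B * temp_k)
    return (2.0 * H_PLANCK * C_LIGHT**2 / wl_m**5) / math.expm1(x)

def color_temperature(ratio_blue_red, wl_blue, wl_red, lo=3e3, hi=5e4):
    """Invert planck(wl_blue, T) / planck(wl_red, T) = ratio for T.

    For wl_blue < wl_red the ratio grows monotonically with T, so
    bisection over [lo, hi] converges."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if planck(wl_blue, mid) / planck(wl_red, mid) < ratio_blue_red:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical pivot wavelengths (metres) and a made-up measured ratio.
WL_BLUE, WL_RED = 4170e-10, 6010e-10
T = color_temperature(1.5, WL_BLUE, WL_RED)  # blackbody temperature matching the ratio
```

Because the Planck ratio between a bluer and a redder wavelength grows monotonically with temperature, simple bisection suffices for the inversion.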

Relevance: 30.00%

Abstract:

The contemporary world is crowded with large, interdisciplinary, complex systems made of other systems, personnel, hardware, software, information, processes, and facilities. The Systems Engineering (SE) field proposes an integrated, holistic approach to tackling these socio-technical systems, one that is crucial for taking proper account of their multifaceted nature and numerous interrelationships and that provides the means to enable their successful realization. Model-Based Systems Engineering (MBSE) is an emerging paradigm in the SE field and can be described as the formalized application of modelling principles, methods, languages, and tools to the entire lifecycle of those systems, enhancing communication and knowledge capture, shared understanding, design precision and integrity, and development traceability, while reducing development risks. This thesis is devoted to the application of the MBSE paradigm to the Urban Traffic & Environment domain. The proposed system, GUILTE (Guiding Urban Intelligent Traffic & Environment), deals with a challenging present-day problem on the agenda of world leaders, national governments, local authorities, research agencies, academia, and the general public. The main purposes of the system are to provide an integrated development framework for municipalities and to support the (short-term and real-time) operation of urban traffic through Intelligent Transportation Systems, highlighting two fundamental aspects: the evaluation of the related environmental impacts (in particular, air pollution and noise), and the dissemination of information to citizens, endorsing their involvement and participation. These objectives are related to the high-level, complex challenge of developing sustainable urban transportation networks.
The development process of the GUILTE system is supported by a new methodology, LITHE (Agile Systems Modelling Engineering), which aims to lighten the complexity and burden of existing methodologies by emphasizing agile principles such as continuous communication, feedback, stakeholder involvement, short iterations, and rapid response. These principles are realized through a universal and intuitive SE process, the SIMILAR process model (redefined in the light of modern international standards), a lean MBSE method, and a coherent System Model developed with the benchmark graphical modeling languages SysML and OPDs/OPL. The main contributions of the work are, in essence, models, and can be summarized as: a revised process model for the SE field, an agile methodology for MBSE development environments, a graphical tool to support the proposed methodology, and a System Model for the GUILTE system. The comprehensive literature reviews provided for the main scientific field of this research (SE/MBSE) and for the application domain (Traffic & Environment) can also be seen as a relevant contribution.

Relevance: 30.00%

Abstract:

Signal integrity in high-speed interconnected digital systems, when assessed through the simulation of physical (transistor-level) models, is computationally expensive (e.g., in CPU execution time and memory storage) and requires the disclosure of physical details of the device's internal structure. This scenario increases the interest in the behavioral modeling alternative, which describes the operating characteristics of the device from the observation of the input/output (I/O) electrical signals. The I/O interfaces of memory chips, which contribute the most to the computational load, perform complex functions and therefore include a large number of pins. In particular, output buffers inevitably distort the signals due to their dynamic and nonlinear behavior. They are therefore the critical point in integrated circuits (ICs) for guaranteeing reliable transmission in high-speed digital communications. In this doctoral work, the previously neglected nonlinear dynamic effects of the output buffer are studied and modeled efficiently in order to reduce the complexity of parametric black-box modeling, thereby improving the standard IBIS model. This is achieved by following a semi-physical approach that combines the formulation characteristics of black-box models, the analysis of the electrical signals observed at the I/O, and properties of the buffer's physical structure under practical operating conditions. This approach leads to a physically inspired behavioral model construction process that overcomes the problems of previous approaches, optimizing the resources used at the different stages of model generation (i.e., characterization, formulation, extraction, and implementation) to simulate the nonlinear dynamic behavior of the buffer.
Consequently, the most significant contribution of this thesis is the development of a new two-port analog behavioral model suitable for overclocking simulation, which is of particular interest for the most recent uses of I/O interfaces for high-data-rate memories. The effectiveness and accuracy of the behavioral models developed and implemented are qualitatively and quantitatively evaluated by comparing the numerical results of the extraction of their functions, and of transient simulation, with the corresponding state-of-the-art reference model, IBIS.
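The kind of two-piece behavioral surrogate improved upon here (weighted static pull-up/pull-down I-V curves driving the pad node, as in IBIS-style models) can be sketched as follows; the I-V curves, switching weights, supply voltage, and load capacitance are invented placeholders, not data extracted from any device:

```python
import math

VDD = 1.8  # hypothetical supply voltage (V)

def i_up(v):
    """Static pull-up I-V curve: current sourced into the pad at voltage v.
    A saturating nonlinearity stands in for a measured/extracted curve."""
    return 0.02 * math.tanh((VDD - v) / 0.4)

def i_down(v):
    """Static pull-down I-V curve: current sunk from the pad (negative)."""
    return -0.02 * math.tanh(v / 0.4)

def simulate(weights_up, c_load=5e-12, dt=1e-11):
    """Euler integration of the pad node:
    C dv/dt = w(t) * i_up(v) + (1 - w(t)) * i_down(v)."""
    v, trace = 0.0, []
    for w in weights_up:
        i_pad = w * i_up(v) + (1.0 - w) * i_down(v)
        v += dt * i_pad / c_load
        trace.append(v)
    return trace

# Low-to-high switching event: the pull-up weight ramps 0 -> 1 over 1 ns,
# then holds; the pad voltage settles toward VDD.
n_ramp, n_hold = 100, 400
weights = [k / n_ramp for k in range(n_ramp)] + [1.0] * n_hold
trace = simulate(weights)
```

In the real modeling flow, the weight signals and curves are identified from observed I/O waveforms rather than postulated; the sketch only shows the structure of the two-piece formulation.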

Relevance: 30.00%

Abstract:

This thesis describes a framework, based on the multi-layer paradigm, for analyzing, modeling, designing, and optimizing communication systems. It explores a new perspective on the physical layer that arises from the relationships between information theory, estimation, probabilistic methods, communication theory, and coding. This framework leads to design methods for the next generation of high-rate communication systems. In addition, the thesis explores several access-layer techniques, based on the relationship between delay and throughput, for the design of delay-tolerant wireless networks. Fundamental results on the interplay between information theory and estimation theory lead to proposals for an alternative paradigm for the analysis, design, and optimization of communication systems. Building on studies of the relationship between mutual information and MMSE, the approach described in the thesis makes it possible to overcome, in an innovative way, the difficulties inherent in optimizing reliable information transmission rates in communication systems, and enables the exploration of optimal power allocation and optimal precoding structures for different channel models: wired, wireless, and optical. The thesis also addresses the problem of delay, in an attempt to answer questions raised by the enormous demand for high throughput in communication systems. This is done by proposing new models for systems with network coding at layers above the physical layer. In particular, it addresses the use of network coding systems for time-varying, delay-sensitive channels. This was demonstrated through the proposal of a new model and adaptive scheme, whose algorithms were applied to wireless systems with complex fading, of which satellite communication systems are an example.
The thesis also addresses the use of network coding systems in demanding handover scenarios. This is done by proposing new IEEE 802.11 WiFi MAC transmission models, which are compared with network coding and shown to enable seamless handover. It can thus be said that this thesis, through analytical work and proposals supported by simulations, argues that the design of communication systems should consider transmission and coding strategies that are not only close to channel capacity but also delay tolerant, and that such strategies must be designed with the channel characteristics and the physical layer in mind.
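The network-coding idea invoked above can be illustrated with a minimal random linear network coding sketch over GF(2); the thesis's actual adaptive schemes are not reproduced here, and the field, packet values, and redundancy below are illustrative choices:

```python
import random

def rlnc_encode(packets, n_coded, rng):
    """Random linear network coding over GF(2): each coded packet is the
    XOR of a random subset of the source packets; the coefficient vector
    travels with the payload."""
    coded = []
    for _ in range(n_coded):
        coeffs = [rng.randint(0, 1) for _ in packets]
        if not any(coeffs):                    # avoid the useless zero combination
            coeffs[rng.randrange(len(packets))] = 1
        payload = 0
        for c, p in zip(coeffs, packets):
            if c:
                payload ^= p
        coded.append((coeffs, payload))
    return coded

def rlnc_decode(coded, n_source):
    """Gaussian elimination over GF(2). Returns the source packets, or
    None if the received combinations do not span the source space."""
    basis = [None] * n_source        # basis[i]: row whose leading 1 is at column i
    for coeffs, payload in coded:
        coeffs = coeffs[:]
        for i in range(n_source):
            if coeffs[i]:
                if basis[i] is None:
                    basis[i] = (coeffs, payload)
                    break
                bc, bp = basis[i]
                coeffs = [a ^ b for a, b in zip(coeffs, bc)]
                payload ^= bp
    if any(b is None for b in basis):
        return None
    for i in reversed(range(n_source)):        # back-substitution
        coeffs, payload = basis[i]
        for j in range(i + 1, n_source):
            if coeffs[j]:
                bc, bp = basis[j]
                coeffs = [a ^ b for a, b in zip(coeffs, bc)]
                payload ^= bp
        basis[i] = (coeffs, payload)
    return [p for _, p in basis]

# A batch of 4 source packets coded into 6 combinations tolerates losses:
source = [0x5, 0x9, 0xC, 0x3]
coded = rlnc_encode(source, 6, random.Random(42))
decoded = rlnc_decode(coded, len(source))  # equals source when the draw has full rank
```

Any subset of coded packets whose coefficient vectors have full rank suffices to decode, which is what makes the approach attractive on lossy, time-varying channels.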

Relevance: 30.00%

Abstract:

The purpose of this work is to carry out a comprehensive study of the Western Iberian Margin (WIM) circulation by means of numerical modeling, and to postulate what this circulation will be in the future. The adopted approach was the development of a high-resolution regional ocean model configuration capable of reproducing the large- and small-scale dynamics of the coastal transition zone. Four numerical experiments were carried out according to these objectives: (1) a climatological run, to study the system's seasonal behavior and mean state; (2) a run forced with real winds and fluxes for the period 2001-2011, to study the interannual variability of the system; (3) a run forced with mean fields from Global Climate Models (GCMs) for the present, to validate GCMs as adequate forcing for regional ocean modeling; and (4) a run similar to (3) for the period 2071-2100, to assess the possible consequences of a future climate scenario for the hydrography and dynamics of the WIM. Furthermore, two Lagrangian particle studies were carried out: one to trace the origin of the upwelled waters along the WIM, the other to portray the patterns of larval dispersal, accumulation, and connectivity. The numerical configuration proved adequate for reproducing the system's mean state, its seasonal characterization, and its interannual variability. There is a prevalence of poleward flow at the slope, which coexists with the upwelling jet during summer (although there is evidence of its shifting offshore) and which is associated with the Mediterranean Water flow at deeper levels, suggesting a barotropic character. From the future climate scenario experiment, the following conclusions were drawn: there is general warming and freshening of upper-level waters; the poleward tendency persists; and despite the strengthening of upwelling-favorable winds in summer, the respective coastal band becomes more restricted in width and depth.
Regarding larval connectivity and dispersal along the WIM, diel vertical migration was observed to increase recruitment throughout the domain, and while smooth coastlines are better suppliers, accumulation is higher where the topography is rougher.
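In its simplest form, a Lagrangian particle study like the ones described reduces to integrating particle positions through a velocity field with a random-walk term for unresolved diffusion. A toy sketch (the velocity field, time step, and diffusivity are placeholders, not the configuration used in the study):

```python
import math
import random

def advect(particles, velocity, dt, steps, rng, kh=1.0):
    """Toy Lagrangian particle tracker: forward-Euler advection through a
    steady velocity field plus a Gaussian random walk representing
    horizontal diffusion with diffusivity kh (m^2/s)."""
    out = []
    sigma = math.sqrt(2.0 * kh * dt)
    for x, y in particles:
        for _ in range(steps):
            u, v = velocity(x, y)
            x += u * dt + rng.gauss(0.0, sigma)
            y += v * dt + rng.gauss(0.0, sigma)
        out.append((x, y))
    return out

# Placeholder sheared eastward jet; 48 half-hour steps ~ one day of drift.
rng = random.Random(3)
seeds = [(0.0, 1000.0 * j) for j in range(5)]
final = advect(seeds, lambda x, y: (1e-4 * y, 0.0),
               dt=1800.0, steps=48, rng=rng, kh=5.0)
```

A production study would use model velocity fields interpolated in space and time, plus behaviors such as diel vertical migration; the sketch only shows the integration skeleton.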

Relevance: 30.00%

Abstract:

This paper describes some of the needs and problems associated with the assessment of coastal and estuarine problems (sediment transport and eutrophication). The development of an integrated system including EO data, local measurements and, with special emphasis, modeling tools is presented as a solution for studying, and for supporting decision making on, the subject. Two pilot sites for the implementation, and the present development status of the integrated system, are described. This framework was already presented in a recent AO specific to Portugal, which is still under evaluation.

Relevance: 30.00%

Abstract:

Induced pluripotent stem cells (iPSc) have great potential for applications in regenerative medicine, disease modeling, and basic research, and several methods have been developed for their derivation. The original method of Takahashi and Yamanaka used retroviral vectors, which result in insertional mutagenesis, the presence of potential oncogenes in the genome, and effects of residual transgene expression on the differentiation bias of each particular iPSc line. Other methods have been developed using different viral vectors (adenovirus and Sendai virus), transient plasmid transfection, mRNA transduction, protein transduction, and small molecules. However, these methods suffer from low efficiencies, can be extremely labor intensive, or both. An additional method makes use of the piggyBac transposon, which has the advantage of inserting its payload into the host genome and being perfectly excised upon re-expression of the transposon transposase. Briefly, a polycistronic cassette expressing Oct4, Sox2, Klf4, and c-Myc, flanked by piggyBac terminal repeats, is delivered to the cells along with a plasmid transiently expressing the piggyBac transposase. Once reprogramming occurs, the cells are re-transfected with transposase and subclones free of transposon integrations are screened for. The procedure is therefore very labor intensive, requiring multiple manipulations and successive rounds of cloning and screening. The original method for reprogramming with the piggyBac transposon was created by Woltjen et al. in 2009 and describes a process with which it is possible to obtain insert-free iPSc. Insert-free iPSc enable the establishment of better cellular models and add a new level of safety to the use of these cells in regenerative medicine. Because the method is based on several low-efficiency steps, its overall efficiency is very low (<1%).
Moreover, the stochastic transfection, integration, and excision, together with the absence of an active means of selection, leave this method in need of extensive characterization and screening of the final clones. In this work we aim to develop a non-integrative iPSc derivation system in which integration and excision of the transgenes can be controlled by simple media manipulations, avoiding labor-intensive and potentially mutagenic procedures. To reach our goal we developed a two-vector system that is delivered simultaneously to the original population of fibroblasts. The first vector, Remo I, carries the reprogramming cassette and GFP under the regulation of a constitutive promoter (CAG). The second vector, Eneas, carries the piggyBac transposase fused to an estrogen receptor fragment (ERT2), regulated in a TET-OFF fashion, and its equivalent reverse trans-activator associated with a positive-negative selection cassette under a constitutive promoter. We tested its functionality in HEK 293T cells. The protocol is divided into the following steps: (1) obtaining acceptable transfection efficiency into human fibroblasts; (2) testing the functionality of the construct; (3) determining the ideal concentration of DOX for repressing mPB-ERT2 expression; (4) determining the ideal concentration of TM for transposition into the genome; (5) determining the ideal window of the no-DOX/TM pulse for transposition into the genome; (6) repeating (3), (4), and (5) for transposition out of the genome; and (7) determining the ideal concentration of GCV for negative selection. We successfully demonstrated that Eneas behaved as expected in terms of DOX regulation of mPB-ERT2 expression. We also demonstrated that, by delivering the plasmid into HEK 293T cells and manipulating the levels of DOX and TM in the medium, we could obtain puromycin-resistant lines.
The number of puromycin-resistant colonies obtained was significantly higher when DOX was absent, suggesting that the colonies resulted from transposition events. The presence of TM added an extra layer of regulation, albeit a weaker one. Our PCR analysis, while not as clean as would be desired, suggested that transposition was indeed occurring, although a background level of random integration could not be ruled out. Finally, our attempt to determine whether we could use GCV to select clones that had successfully mobilized PB out of the genome was unsuccessful. Unexpectedly, HEK 293T cells that had been transfected with Eneas and selected for puromycin resistance were insensitive to GCV.

Relevance: 30.00%

Abstract:

Thesis (Master's)--University of Washington, 2016-03

Relevance: 30.00%

Abstract:

Researchers want to analyse health care data, which may require large pools of compute and data resources. To obtain these, they need access to Distributed Computing Infrastructures (DCIs), and using DCIs requires expertise that researchers may not have. Workflows can hide the infrastructures, but there are many workflow systems and they are not interoperable. Learning a workflow system and creating workflows in it may require significant effort; given this effort, it is not reasonable to expect that researchers will learn new workflow systems just to run the workflows of other systems. As a result, the lack of interoperability prevents workflow sharing, and a vast amount of research effort is wasted. The FP7 Sharing Interoperable Workflows for Large-Scale Scientific Simulations on Available DCIs (SHIWA) project developed Coarse-Grained Interoperability (CGI) to enable workflow sharing, and created the SHIWA Simulation Platform (SSP) to support CGI as a production-level service. The paper describes how the CGI approach can be used for analysis and simulation in health care.

Relevance: 30.00%

Abstract:

Rapid developments in display technologies, digital printing, imaging sensors, image processing and image transmission are providing new possibilities for creating and conveying visual content. In an age in which images and video are ubiquitous and where mobile, satellite, and three-dimensional (3-D) imaging have become ordinary experiences, quantification of the performance of modern imaging systems requires appropriate approaches. At the end of the imaging chain, a human observer must decide whether images and video are of a satisfactory visual quality. Hence the measurement and modeling of perceived image quality is of crucial importance, not only in visual arts and commercial applications but also in scientific and entertainment environments. Advances in our understanding of the human visual system offer new possibilities for creating visually superior imaging systems and promise more accurate modeling of image quality. As a result, there is a profusion of new research on imaging performance and perceived quality.

Relevance: 30.00%

Abstract:

The paper proposes a methodology to increase the probability of delivering power to any load point by identifying new investments in distribution energy systems. The proposed methodology is based on statistical failure and repair data of distribution components and uses fuzzy-probabilistic modeling of the components' outage parameters. The fuzzy membership functions of each component's outage parameters are based on statistical records. A mixed-integer nonlinear programming optimization model is developed to identify the adequate investments in distribution energy system components that increase the probability of delivering power to any customer in the distribution system at the minimum possible cost for the system operator. To illustrate the application of the proposed methodology, the paper includes a case study that considers a 180-bus distribution network.
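The fuzzy-probabilistic treatment of outage parameters can be sketched with triangular membership functions and alpha-cut interval arithmetic; the component data below are hypothetical, not from the paper's case study:

```python
def tri_alpha_cut(a, m, b, alpha):
    """Alpha-cut [lo, hi] of a triangular fuzzy number (a, m, b)."""
    return a + alpha * (m - a), b - alpha * (b - m)

def availability_cut(mttf_tfn, mttr_tfn, alpha):
    """Alpha-cut of the availability A = MTTF / (MTTF + MTTR).

    A increases with MTTF and decreases with MTTR, so the interval
    endpoints pair the extremes accordingly (monotone interval
    arithmetic, not naive endpoint combination)."""
    f_lo, f_hi = tri_alpha_cut(*mttf_tfn, alpha)
    r_lo, r_hi = tri_alpha_cut(*mttr_tfn, alpha)
    return f_lo / (f_lo + r_hi), f_hi / (f_hi + r_lo)

# Hypothetical component statistics (hours): triangular (min, mode, max)
# membership functions built from failure/repair records.
mttf = (3500.0, 4000.0, 4600.0)
mttr = (6.0, 8.0, 11.0)
core = availability_cut(mttf, mttr, 1.0)      # alpha = 1: the modal value
support = availability_cut(mttf, mttr, 0.0)   # alpha = 0: the widest interval
```

Sweeping alpha from 0 to 1 yields the full fuzzy membership function of the availability, which is the kind of quantity the optimization model would then constrain.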

Relevance: 30.00%

Abstract:

The large increase of distributed energy resources, including distributed generation, storage systems, and demand response, especially in distribution networks, makes the management of the available resources a more complex and crucial process. With wind-based generation gaining relevance in the generation mix, the fact that wind forecasting accuracy drops rapidly as the forecast anticipation time increases makes it necessary to undertake short-term and very short-term re-scheduling, so that the final implemented solution enables the lowest possible operation costs. This paper proposes a methodology for energy resource scheduling in smart grids that considers day-ahead, hour-ahead, and five-minutes-ahead scheduling. The short-term scheduling, undertaken five minutes ahead, takes advantage of the high accuracy of very short-term wind forecasting, providing the user with more efficient scheduling solutions. The proposed method uses a Genetic Algorithm based optimization approach that is able to cope with the hard execution-time constraint of short-term scheduling. Realistic power system simulation, based on PSCAD, is used to validate the obtained solutions. The paper includes a case study with a 33-bus distribution network with high penetration of distributed energy resources, implemented in PSCAD.
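A single-period toy version of the GA-based scheduling stage might look as follows; the resource data, penalty weight, and operator choices are illustrative assumptions, not the paper's implementation:

```python
import random

def ga_dispatch(costs, p_max, demand, rng, pop_size=40, gens=200):
    """Toy single-period dispatch GA: choose setpoints p_i in [0, p_max_i]
    minimizing total cost sum(c_i * p_i), with a large penalty on any
    supply/demand imbalance. Elitism keeps the best half each generation."""
    n = len(costs)

    def fitness(ind):
        cost = sum(c * p for c, p in zip(costs, ind))
        return cost + 1e3 * abs(sum(ind) - demand)   # imbalance penalty

    pop = [[rng.uniform(0.0, pm) for pm in p_max] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n)                # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(n)                     # Gaussian mutation, clipped
            child[i] = min(p_max[i],
                           max(0.0, child[i] + rng.gauss(0.0, 0.1 * p_max[i])))
            children.append(child)
        pop = elite + children
    best = min(pop, key=fitness)
    return best, fitness(best)

# Hypothetical resources: marginal costs (per kWh) and capacities (kW).
costs, p_max, demand = [0.04, 0.07, 0.11], [80.0, 60.0, 50.0], 120.0
best, best_fit = ga_dispatch(costs, p_max, demand, random.Random(1))
```

The fixed population size and generation count give a bounded, predictable run time, which is the property that makes a GA workable under the hard five-minute execution window.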

Relevance: 30.00%

Abstract:

This document is a survey of the research area of User Modeling (UM) for the specific field of Adaptive Learning. The aims of this document are: to define what a User Model is; to present existing and well-known User Models; to analyze the existing standards related to UM; and to compare existing systems. In the scientific area of UM, numerous research efforts and developed systems already seem to promise good results, but further experimentation and implementation are still necessary to draw conclusions about the utility of UM. That is, the experimentation and implementation of these systems are still too scarce to determine the utility of some of the referred applications. At present, student modeling research is moving toward making it possible to reuse a student model in different systems. Standards are increasingly relevant to this end, allowing systems to communicate and to share data, components, and structures at the syntactic and semantic levels, even if most of them still only allow syntactic integration.

Relevance: 30.00%

Abstract:

This paper presents the measurement, frequency-response modeling and identification, and the corresponding impulse time response of the human respiratory impedance and admittance. The investigated adult patient groups were healthy, diagnosed with chronic obstructive pulmonary disease, and diagnosed with kyphoscoliosis, respectively. The investigated children patient groups were healthy, diagnosed with asthma, and diagnosed with cystic fibrosis, respectively. Fractional-order (FO) models are identified from the measured impedance to quantify the respiratory mechanical properties. Two methods are presented for obtaining and simulating the time-domain impulse response from FO models of the respiratory admittance: (i) the classical pole-zero interpolation proposed by Oustaloup in the early 1990s, and (ii) the inverse discrete Fourier transform (DFT). The results of the identified FO models for the respiratory admittance are presented by means of their average values for each group of patients. The impulse time response calculated from the frequency response of the averaged FO models is then given by means of the two methods mentioned above. Our results indicate that both methods provide similar impulse response data. However, we suggest that the inverse DFT is a more suitable alternative to the high-order transfer functions obtained using the classical Oustaloup filter. Additionally, a power-law model is fitted to the impulse response data, emphasizing the intrinsic fractal dynamics of the respiratory system.
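Method (ii), obtaining the impulse response by inverse DFT of the sampled frequency response, can be sketched in a few lines. Here a first-order response with a known analytic impulse response stands in for the identified FO admittance, purely to check the machinery:

```python
import cmath
import math

def impulse_from_frequency_response(H, n, dt):
    """Impulse response h[0..n-1] (sample spacing dt) from a frequency
    response H(omega), via the inverse discrete Fourier transform.
    Frequencies are taken on the DFT grid, wrapped to +/- Nyquist."""
    spec = []
    for k in range(n):
        f = k if k <= n // 2 else k - n          # wrap to negative frequencies
        spec.append(H(2.0 * math.pi * f / (n * dt)))
    h = []
    for m in range(n):
        acc = sum(spec[k] * cmath.exp(2j * math.pi * k * m / n) for k in range(n))
        h.append((acc / n).real / dt)            # 1/dt: DFT bin -> time density
    return h

# Sanity target with a known answer: H(w) = 1/(1 + jw*tau) has the impulse
# response h(t) = exp(-t/tau)/tau.
TAU = 0.05
h = impulse_from_frequency_response(lambda w: 1.0 / (1.0 + 1j * w * TAU),
                                    n=512, dt=0.002)
```

An identified FO admittance model would be passed as `H` in exactly the same way; only the callable changes, which is what makes the inverse-DFT route attractive next to high-order rational approximations.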

Relevance: 30.00%

Abstract:

The self-similar branching arrangement of the airways makes the respiratory system an ideal candidate for the application of fractional calculus theory. Fractal geometry is typically characterized by a recurrent structure. This study investigates the identification of a model for the respiratory tree by means of its electrical equivalent, based on its intrinsic morphology. Measurements of respiratory impedance, in its complex representation, were obtained from seven volunteers for frequencies below 5 Hz. Parametric modeling is then applied to the complex-valued data points. Since the inertance is negligible in the low-frequency range, each airway branch is modeled using a gamma cell resistance and capacitance, the latter being a fractional-order constant phase element (CPE) identified from the measurements. In addition, the complex impedance is also approximated by a model consisting of a lumped series resistance and a lumped fractional-order capacitance. The results reveal that both models characterize the data well, with the averaged CPE values being supraunitary for the ladder network and subunitary for the lumped model.
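The lumped series-resistance plus fractional-order-capacitance model can be sketched directly; the parameter values below are illustrative, and the sketch shows the constant-phase property that gives the CPE its name:

```python
import cmath
import math

def lumped_cpe_impedance(R, C, alpha, w):
    """Z(jw) = R + 1 / (C * (jw)**alpha): a lumped series resistance plus a
    fractional-order capacitance (constant phase element, CPE).
    alpha = 1 recovers an ideal capacitor."""
    return R + 1.0 / (C * (1j * w) ** alpha)

# Illustrative parameters (not fitted to any subject's data).
R, C, ALPHA = 2.0, 0.05, 0.8

# The CPE term alone has a frequency-independent phase of -alpha*pi/2 ...
phases = [cmath.phase(lumped_cpe_impedance(0.0, C, ALPHA, w))
          for w in (0.5, 1.0, 2.0, 4.0)]

# ... and alpha can be read off the log-log slope of |Z| (with R = 0):
z1 = abs(lumped_cpe_impedance(0.0, C, ALPHA, 1.0))
z4 = abs(lumped_cpe_impedance(0.0, C, ALPHA, 4.0))
alpha_hat = -(math.log(z4) - math.log(z1)) / math.log(4.0)
```

In an identification setting, R, C, and alpha would be fitted to the measured complex impedance points below 5 Hz rather than fixed a priori; the constant-phase and log-log-slope properties are what make the fit well conditioned.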