929 results for Based structure model


Relevance:

40.00%

Publisher:

Abstract:

The variation of the crystallite structure of several coal chars during gasification in air and in carbon dioxide was studied by high-resolution transmission electron microscopy (HRTEM) and X-ray diffraction (XRD). The XRD analysis of the partially gasified coal chars, based on two approaches, Scherrer's equation and Alexander and Sommer's method, shows contradictory trends for the variation of crystallite height with carbon conversion, despite giving a similar trend for the change in crystallite width. The HRTEM fringe images of the partially gasified coal chars indicate that large, highly ordered crystallites still exist at conversion levels as high as 86%. It is also demonstrated that the crystalline structure of chars can be very different even when their pore structures are similar, suggesting that crystalline structure analysis should be combined with pore structure analysis in studies of carbon gasification.
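Scherrer's equation, one of the two XRD approaches named above, estimates a mean crystallite dimension from peak broadening. A minimal sketch of the calculation (the Cu Kα wavelength, the shape factor K = 0.9, and the peak values are illustrative assumptions, not data from the study):

```python
import math

def scherrer_size(two_theta_deg, fwhm_deg, wavelength_nm=0.15406, k=0.9):
    """Mean crystallite dimension L = K * lambda / (beta * cos(theta)).

    two_theta_deg: peak position (2-theta) in degrees
    fwhm_deg: instrument-corrected peak width (FWHM) in degrees
    """
    theta = math.radians(two_theta_deg / 2.0)
    beta = math.radians(fwhm_deg)  # FWHM must be in radians
    return k * wavelength_nm / (beta * math.cos(theta))

# Example: a (002) band of a turbostratic char near 2-theta = 25 degrees
print(f"Lc = {scherrer_size(25.0, 5.0):.2f} nm")
```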

Relevance:

40.00%

Publisher:

Abstract:

We have employed molecular dynamics simulations to study the behavior of virtual polymeric materials under an applied uniaxial tensile load. Through computer simulations, one can obtain experimentally inaccessible information about phenomena taking place at the molecular and microscopic levels. Not only can the global material response be monitored and characterized over time, but the response of macromolecular chains can also be followed independently if desired. The computer-generated materials were created by emulating step-wise polymerization, resulting in self-avoiding chains in 3D with a controlled degree of orientation along a certain axis. These materials represent a simplified model of the lamellar structure of semi-crystalline polymers, consisting of an amorphous region surrounded by two crystalline lamellar regions. For the simulations, a series of materials were created, varying (i) the lamella thickness, (ii) the amorphous region thickness, (iii) the preferential chain orientation, and (iv) the degree of packing of the amorphous region. Simulation results indicate that the lamella thickness has the strongest influence on the mechanical properties of the lamella-amorphous structure, in agreement with experimental data. The other morphological parameters also affect the mechanical response, but to a smaller degree. This research follows previous simulation work on crack formation and propagation phenomena, deformation mechanisms at the nanoscale, and the influence of loading conditions on the material response. Computer simulations can improve the fundamental understanding of the phenomena responsible for the behavior of polymeric materials, and will eventually lead to the design of knowledge-based materials with improved properties.
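A minimal sketch of the chain-generation step described above: growing self-avoiding chains on a cubic lattice with a tunable bias controlling the degree of orientation along one axis (the lattice representation and the bias rule are illustrative assumptions, not the authors' actual algorithm):

```python
import random

MOVES = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]

def grow_chain(n_monomers, z_bias=0.5, rng=random):
    """Grow one self-avoiding chain on a cubic lattice.

    z_bias >= 0 adds extra weight to the +/-z moves, controlling the
    preferential orientation of the chain along the z axis.
    Returns the list of monomer sites, or None if growth gets stuck.
    """
    pos = (0, 0, 0)
    chain = [pos]
    occupied = {pos}
    for _ in range(n_monomers - 1):
        candidates = [(dx, dy, dz) for (dx, dy, dz) in MOVES
                      if (pos[0] + dx, pos[1] + dy, pos[2] + dz) not in occupied]
        if not candidates:
            return None  # dead end: in practice, discard and regrow
        weights = [1.0 + (z_bias if dz != 0 else 0.0) for (_, _, dz) in candidates]
        dx, dy, dz = rng.choices(candidates, weights=weights, k=1)[0]
        pos = (pos[0] + dx, pos[1] + dy, pos[2] + dz)
        chain.append(pos)
        occupied.add(pos)
    return chain

chain = grow_chain(100, z_bias=2.0)  # strongly z-oriented chain (or None)
```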

Relevance:

40.00%

Publisher:

Abstract:

This study develops a theoretical model that explains the effectiveness of the balanced scorecard approach from a system dynamics and feedback-learning perspective. Presumably, the balanced scorecard leads to a better understanding of context, allowing managers to externalize and improve their mental models. We present a set of hypotheses about the influence of the balanced scorecard approach on mental models and performance, and test them in a simulation experiment that uses a system dynamics model. The experiment included three experimental conditions: financial indicators; balanced scorecard indicators; and balanced scorecard indicators with the aid of a strategy map review. Two of the three hypotheses were confirmed: a strategy map review positively influences mental model similarity, and mental model similarity positively influences performance.
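A minimal sketch of the kind of stock-and-flow computation a system dynamics experiment rests on, with a single balancing feedback loop (the stock, rates, and gain are illustrative assumptions, not the model used in the study):

```python
def simulate(steps=60, dt=1.0, target=100.0, gain=0.1):
    """One-stock system dynamics model with a balancing feedback loop:
    management adjusts the inflow in proportion to the gap between a
    performance target and the current state (e.g., customer base)."""
    stock, history = 20.0, []
    for _ in range(steps):
        inflow = gain * (target - stock)   # corrective action (feedback)
        outflow = 0.02 * stock             # natural attrition
        stock += dt * (inflow - outflow)   # Euler integration
        history.append(stock)
    return history

trajectory = simulate()  # stock converges toward the target over time
```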

Relevance:

40.00%

Publisher:

Abstract:

Novel alternating copolymers comprising biscalix[4]arene-p-phenylene ethynylene and m-phenylene ethynylene units (CALIX-m-PPE) were synthesized by Sonogashira-Hagihara cross-coupling polymerization. Good isolated yields (60-80%) were achieved for the polymers, which show M_n ranging from 1.4 × 10^4 to 5.1 × 10^4 g mol^-1 (gel permeation chromatography), depending on the specific polymerization conditions. The structural analysis of CALIX-m-PPE was performed by 1H, 13C, 13C-1H heteronuclear single quantum correlation (HSQC), 13C-1H heteronuclear multiple bond correlation (HMBC), correlation spectroscopy (COSY), and nuclear Overhauser effect spectroscopy (NOESY), in addition to Fourier-transform infrared spectroscopy and microanalysis, allowing its full characterization. Depending on the reaction setup, variable amounts (16-45%) of diyne units were found in the polymers, although their photophysical properties are essentially the same. It is demonstrated that CALIX-m-PPE does not form ground- or excited-state interchain interactions, owing to the highly crowded environment of the main chain imparted by the calix[4]arene side units, which behave as insulators inhibiting main-chain π-π stacking. It was also found that the luminescent properties of CALIX-m-PPE are markedly different from those of an all-p-linked phenylene ethynylene copolymer (CALIX-p-PPE) previously reported. The unexpected appearance of a low-energy emission band at 426 nm, in addition to the locally excited-state emission (365 nm), together with a quite low fluorescence quantum yield (Φ = 0.02) and double-exponential decay dynamics, led to the formulation of an intramolecular exciplex as the new emissive species.
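A minimal sketch of how a double-exponential decay like the one reported is typically fitted to time-resolved emission data (the lifetimes, amplitudes, and synthetic data are illustrative assumptions, not the measured values):

```python
import numpy as np
from scipy.optimize import curve_fit

def biexp(t, a1, tau1, a2, tau2):
    """I(t) = a1*exp(-t/tau1) + a2*exp(-t/tau2)."""
    return a1 * np.exp(-t / tau1) + a2 * np.exp(-t / tau2)

t = np.linspace(0.0, 20.0, 400)                       # time axis, ns
data = biexp(t, 0.7, 0.5, 0.3, 4.0)                   # synthetic decay
data += np.random.default_rng(0).normal(0, 0.005, t.size)  # noise

popt, _ = curve_fit(biexp, t, data, p0=(1.0, 1.0, 0.5, 5.0))
a1, tau1, a2, tau2 = popt
print(f"tau1 = {tau1:.2f} ns, tau2 = {tau2:.2f} ns")
```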

Relevance:

40.00%

Publisher:

Abstract:

This paper presents a realistic directional channel model that extends the COST 273 channel model. The model uses a strategy based on clusters of scatterers and the generation of visibility regions, with increased realism due to the introduction of terrain and clutter information. New approaches for path-loss prediction and line-of-sight modeling are considered, affecting the implementation of the cluster path gain model. The new model was implemented using terrain, clutter, street, and user mobility information for the city of Lisbon, Portugal. Some of the model's outputs are presented, mainly path loss and small/large-scale fading statistics.
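A minimal sketch of the log-distance path-loss form with log-normal shadowing that such large-scale models build on (the exponent, reference loss, and shadowing spread are illustrative assumptions, not the parameters of the COST 273 extension):

```python
import math
import random

def path_loss_db(d_m, d0_m=1.0, pl0_db=40.0, n=3.5, sigma_db=6.0, rng=random):
    """Log-distance path loss with log-normal shadowing:
    PL(d) = PL(d0) + 10*n*log10(d/d0) + X_sigma."""
    shadowing = rng.gauss(0.0, sigma_db)  # large-scale fading term, dB
    return pl0_db + 10.0 * n * math.log10(d_m / d0_m) + shadowing

print(f"PL(200 m) = {path_loss_db(200.0):.1f} dB")
```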

Relevance:

40.00%

Publisher:

Abstract:

We propose a new approach, based on a tool-assisted methodology, to create new products in the automobile industry from previously defined processes and experiences, inspired by a set of best practices or principles: it is based on high-level models or specifications; it is centered on a component-based architecture; and it is based on generative programming techniques. This approach follows in essence the MDA (Model Driven Architecture) philosophy, with some specific characteristics. We propose a repository that keeps related information together, such as models, applications, design information, generated artifacts, and even information concerning the development process itself (e.g., generation steps, tests, and integration milestones). Generically, this methodology receives the users' requirements for a new product (e.g., functional and non-functional requirements, product specification) as its main inputs and produces a set of artifacts (e.g., design parts, process validation output) as its main outputs, which will be integrated into the engineering design tool (e.g., a CAD system), facilitating the work.
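A minimal sketch of the kind of repository record the approach implies, tying a high-level model to its requirements, generated artifacts, and development-process history (all names and fields are hypothetical, for illustration only):

```python
from dataclasses import dataclass, field

@dataclass
class ProductRecord:
    """One repository entry linking a high-level model to everything
    derived from it during generation and integration."""
    model_id: str
    requirements: dict                                   # functional / non-functional / spec
    design_info: dict = field(default_factory=dict)
    generated_artifacts: list = field(default_factory=list)  # e.g., CAD parts
    process_log: list = field(default_factory=list)           # steps, tests, milestones

repo = {}
rec = ProductRecord("dashboard-v2", {"functional": ["display speed"]})
rec.process_log.append("generation step 1: component skeletons emitted")
repo[rec.model_id] = rec
```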

Relevance:

40.00%

Publisher:

Abstract:

This paper presents the improvement of MASCEM, a Multi-Agent Simulator for Electricity Markets, towards an enlarged model for Seller agent coalitions. The simulator has been improved both in its user interface and in its internal structure. The version of OOA used as the development platform was updated, and the multi-agent model was adjusted to implement and test several negotiations involving Seller agent coalitions. Seller coalitions are a very important subject, given the increased relevance of Distributed Generation in liberalised electricity markets.
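A minimal sketch of the coalition idea: several Seller agents pool their capacity and submit a joint offer (the bid structure and pricing rule are illustrative assumptions, not MASCEM's negotiation model):

```python
from dataclasses import dataclass

@dataclass
class Seller:
    name: str
    capacity_mw: float
    min_price: float  # EUR/MWh the seller will accept

def coalition_bid(members):
    """Aggregate a coalition's offer: total capacity at a price that
    covers every member (the simplest joint-bid rule)."""
    total = sum(s.capacity_mw for s in members)
    price = max(s.min_price for s in members)
    return total, price

dg_units = [Seller("wind-1", 5.0, 30.0), Seller("chp-2", 2.0, 42.0)]
print(coalition_bid(dg_units))  # (7.0, 42.0)
```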

Relevance:

40.00%

Publisher:

Abstract:

Involving groups in important management processes such as decision making has several advantages. By discussing and combining ideas, counter-ideas, critical opinions, identified constraints, and alternatives, a group of individuals can test potentially better solutions, sometimes in the form of new products, services, and plans. In the past few decades, operations research, AI, and computer science have had tremendous success creating software systems that can achieve optimal solutions, even for complex problems. The only drawback is that people don't always agree with these solutions. Sometimes this dissatisfaction is due to an incorrect parameterization of the problem. Nevertheless, the reasons people don't like a solution might not be quantifiable, because those reasons are often based on aspects such as emotion, mood, and personality. At the same time, monolithic individual decision-support systems centered on optimizing solutions are being replaced by collaborative systems and group decision-support systems (GDSSs) that focus more on establishing connections between people in organizations. These systems follow a kind of social paradigm. Combining both optimization-centered and social-centered approaches is a topic of current research. However, even if such a hybrid approach can be developed, it will still miss an essential point: the emotional nature of group participants in decision-making tasks. We've developed a context-aware emotion-based model to design intelligent agents for group decision-making processes. To evaluate this model, we've incorporated it in an agent-based simulator called ABS4GD (Agent-Based Simulation for Group Decision), which we developed. This multiagent simulator considers emotion- and argument-based factors while supporting group decision-making processes. Experiments show that agents endowed with emotional awareness achieve agreements more quickly than those without such awareness. Hence, participant agents that integrate emotional factors in their judgments can be more successful because, in exchanging arguments with other agents, they consider the emotional nature of group decision making.
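A minimal sketch of how an emotional state might weight an agent's evaluation of an exchanged argument (the mood scale and update rule are illustrative assumptions, not the ABS4GD model):

```python
from dataclasses import dataclass

@dataclass
class Participant:
    name: str
    mood: float = 0.0  # -1 (negative) .. +1 (positive)

    def evaluate(self, argument_strength):
        """Arguments land better on agents in a positive mood."""
        return argument_strength * (1.0 + 0.5 * self.mood)

    def receive(self, argument_strength, speaker_mood):
        """Emotional contagion: the speaker's mood nudges the listener's."""
        self.mood = max(-1.0, min(1.0, self.mood + 0.1 * speaker_mood))
        return self.evaluate(argument_strength)

alice, bob = Participant("alice", 0.4), Participant("bob", -0.2)
score = bob.receive(argument_strength=0.8, speaker_mood=alice.mood)
```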

Relevance:

40.00%

Publisher:

Abstract:

Solution enthalpies of 1,4-dioxane were obtained in 15 protic and aprotic solvents at 298.15 K. Breaking down the overall process using Solomonov's methodology, the cavity term was calculated and the interaction enthalpies (ΔH_int) were determined. The main factors involved in the interaction enthalpy were identified and quantified using a QSPR approach based on the TAKA model equation. The relevant descriptors were found to be π* and β, which showed exothermic and endothermic contributions, respectively. The magnitude of the π* coefficient points toward non-specific solute-solvent interactions playing a major role in the solution process. The positive value of the β coefficient reflects the endothermic character of the solvents' hydrogen-bond acceptor (HBA) basicity contribution, indicating that solvent molecules engaged in hydrogen bonding preferentially interact with each other rather than with 1,4-dioxane.
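A minimal sketch of the QSPR step: an ordinary least-squares fit of ΔH_int against the π* and β solvent descriptors (the descriptor and enthalpy values below are invented placeholders, not the study's data):

```python
import numpy as np

# One row per solvent -- placeholder values for illustration only.
pi_star = np.array([0.55, 0.88, 0.71, 0.60, 1.00])   # dipolarity/polarizability
beta    = np.array([0.45, 0.76, 0.31, 0.40, 0.37])   # HBA basicity
dH_int  = np.array([-20.1, -28.5, -22.9, -21.0, -26.3])  # kJ/mol

X = np.column_stack([np.ones_like(pi_star), pi_star, beta])
coef, *_ = np.linalg.lstsq(X, dH_int, rcond=None)
c0, c_pi, c_beta = coef
print(f"dH_int = {c0:.1f} + ({c_pi:.1f})*pi* + ({c_beta:.1f})*beta")
```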

Relevance:

40.00%

Publisher:

Abstract:

This thesis presents the Fuzzy Monte Carlo Model for Transmission Power Systems Reliability studies (FMC-TRel) methodology, which is based on statistical failure and repair data of the transmission power system components and uses fuzzy-probabilistic modeling for system component outage parameters. Using statistical records allows the fuzzy membership functions of the system component outage parameters to be developed. The proposed hybrid method, combining fuzzy sets and Monte Carlo simulation based on fuzzy-probabilistic models, captures both the randomness and the fuzziness of component outage parameters. Once the system states are obtained, a network contingency analysis is performed to identify any overloading or voltage violations in the network. This is followed by a remedial action algorithm, based on Optimal Power Flow, that reschedules generation to alleviate constraint violations and, at the same time, avoids any load curtailment if possible or otherwise minimizes the total load curtailment for the states identified by the contingency analysis. For the system states that cause load curtailment, an optimization approach is applied to reduce the probability of occurrence of these states while minimizing the costs of achieving that reduction. This methodology is of great importance for supporting the transmission system operator's decision making, namely in identifying critical components and in planning future investments in the transmission power system. A case study based on the IEEE 24-bus Reliability Test System (RTS-1996) is presented to illustrate in detail the application of the proposed methodology.
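A minimal sketch of the hybrid fuzzy/Monte Carlo idea: component outage rates are fuzzy numbers (here triangular, sampled through random α-cuts), and Monte Carlo sampling then draws component up/down states (the numbers are illustrative, not RTS-1996 data):

```python
import random

def sample_fuzzy_rate(low, mode, high, rng=random):
    """Draw a crisp outage probability from a triangular membership
    function via a random alpha-cut (one simple sampling scheme)."""
    alpha = rng.random()
    lo = low + alpha * (mode - low)    # left bound of the alpha-cut
    hi = high - alpha * (high - mode)  # right bound of the alpha-cut
    return rng.uniform(lo, hi)

def sample_system_state(components, rng=random):
    """One Monte Carlo system state: True = component in service."""
    return {name: rng.random() > sample_fuzzy_rate(*tri, rng=rng)
            for name, tri in components.items()}

components = {"line-1": (0.01, 0.02, 0.04), "gen-7": (0.05, 0.08, 0.12)}
state = sample_system_state(components)  # feed to contingency analysis
```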

Relevance:

40.00%

Publisher:

Abstract:

Many-core platforms based on Networks-on-Chip (NoC [Benini and De Micheli 2002]) are an emerging technology in the real-time embedded domain. Although the idea of grouping applications previously executed on separate single-core devices and accommodating them on an individual many-core chip offers various options for power savings and cost reductions, and contributes to overall system flexibility, its implementation is a non-trivial task. In this paper we address the issue of application mapping onto a NoC-based many-core platform, considering the fundamentals and trends of current many-core operating systems; specifically, we elaborate on a limited migrative application model encompassing a message-passing paradigm as a communication primitive. As the main contribution, we formulate the problem of real-time application mapping and propose a three-stage process to solve it efficiently. Through analysis, it is assured that the derived solutions guarantee the fulfilment of the imposed time constraints on worst-case communication latencies, while providing an environment for load balancing for, e.g., thermal, energy, fault-tolerance, or performance reasons. We also propose several constraints on the topological structure of the application mapping, as well as on the inter- and intra-application communication patterns, which efficiently resolve the issues of pessimism and/or intractability in the analysis.
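A minimal sketch of the mapping objective: placing communicating tasks on a 2D-mesh NoC so that traffic-weighted hop distance stays small (the greedy heuristic below is a toy illustration, not the paper's three-stage process):

```python
import itertools

def hops(a, b):
    """Manhattan distance between two mesh tiles (x, y)."""
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def greedy_map(tasks, traffic, mesh_w, mesh_h):
    """Place tasks one by one on the free tile that minimizes
    traffic-weighted hop distance to already-placed partners."""
    tiles = list(itertools.product(range(mesh_w), range(mesh_h)))
    placement = {}
    for t in tasks:
        def cost(tile):
            total = 0
            for (u, v), vol in traffic.items():
                other = v if u == t else u if v == t else None
                if other in placement:
                    total += vol * hops(tile, placement[other])
            return total
        free = [x for x in tiles if x not in placement.values()]
        placement[t] = min(free, key=cost)
    return placement

traffic = {("A", "B"): 10, ("B", "C"): 4}  # flow volumes per period
print(greedy_map(["A", "B", "C"], traffic, 2, 2))
```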

Relevance:

40.00%

Publisher:

Abstract:

Moving towards autonomous operation and management of increasingly complex open distributed real-time systems poses very significant challenges. This is particularly true when reaction to events must occur in a timely and predictable manner while guaranteeing Quality of Service (QoS) constraints imposed by users, the environment, or applications. In these scenarios, the system should be able to maintain a globally feasible QoS level while allowing individual nodes to autonomously adapt under different constraints of resource availability and input quality. This paper shows how decentralised coordination of a group of autonomous interdependent nodes can emerge with little communication, based on the robust self-organising principles of feedback. Positive feedback is used to reinforce the selection of the new desired global service solution, while negative feedback discourages nodes from acting in a greedy fashion, as this adversely impacts the service levels provided at neighbouring nodes. The proposed protocol is general enough to be used in a wide range of scenarios characterised by a high degree of openness and dynamism, where coordination tasks need to be time-dependent. As the reported results demonstrate, it requires fewer messages to be exchanged and reaches a globally acceptable near-optimal solution faster than other available approaches.
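A minimal sketch of the feedback principle: each node moves its local QoS level toward its target (positive feedback) while backing off when neighbouring nodes fall short of theirs (negative feedback); all rules and constants here are illustrative assumptions, not the paper's protocol:

```python
def step(levels, targets, neighbours, k_pos=0.3, k_neg=0.2):
    """One decentralised coordination round. levels/targets: node -> QoS."""
    new = {}
    for n, level in levels.items():
        pull = k_pos * (targets[n] - level)  # positive feedback toward target
        # negative feedback: back off if neighbours sit below their targets
        strain = sum(max(0.0, targets[m] - levels[m]) for m in neighbours[n])
        new[n] = level + pull - k_neg * strain / max(1, len(neighbours[n]))
    return new

levels = {"n1": 0.4, "n2": 0.9, "n3": 0.5}
targets = {"n1": 0.8, "n2": 0.8, "n3": 0.8}
neighbours = {"n1": ["n2"], "n2": ["n1", "n3"], "n3": ["n2"]}
for _ in range(20):
    levels = step(levels, targets, neighbours)  # converges near the targets
```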

Relevance:

40.00%

Publisher:

Abstract:

This paper discusses the increased need to support dynamic task-level parallelism in embedded real-time systems and proposes a Java framework that combines the Real-Time Specification for Java (RTSJ) with the Fork/Join (FJ) model, following a fixed-priority scheduling scheme. Our work intends to support parallel runtimes that will coexist with a wide range of other complex, independently developed applications, without any previous knowledge of their real execution requirements, the number of parallel sub-tasks, or when those sub-tasks will be generated.
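A minimal sketch of the scheduling idea, in Python rather than Java: dynamically forked sub-tasks inherit their parent's fixed priority and are dispatched from one priority queue (a toy sequential illustration, not the proposed RTSJ framework):

```python
import heapq

def run(initial_tasks):
    """Toy fixed-priority dispatcher. Each entry is (priority, name, work),
    where work(fork) may call fork(name, work) to spawn sub-tasks that
    inherit the parent's priority (lower number = higher priority)."""
    queue, counter = [], 0
    def push(prio, name, work):
        nonlocal counter
        heapq.heappush(queue, (prio, counter, name, work))
        counter += 1  # tie-breaker keeps FIFO order within a priority
    for prio, name, work in initial_tasks:
        push(prio, name, work)
    while queue:
        prio, _, name, work = heapq.heappop(queue)
        work(lambda n, w, p=prio: push(p, n, w))  # children inherit prio

def count_range(lo, hi):
    """Divide-and-conquer work item that forks itself into halves."""
    def work(fork):
        if hi - lo <= 2:
            print(f"leaf [{lo}, {hi})")
        else:
            mid = (lo + hi) // 2
            fork(f"[{lo},{mid})", count_range(lo, mid))
            fork(f"[{mid},{hi})", count_range(mid, hi))
    return work

run([(1, "root", count_range(0, 8))])
```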

Relevance:

40.00%

Publisher:

Abstract:

There is no single definition of a long-memory process. Such a process is generally defined as a series whose correlogram decays slowly, or whose spectrum is infinite at frequency zero. It is also said that a series with this property is characterized by long-range dependence and by non-periodic long cycles, that this characteristic describes the correlation structure of a series at long lags, or that it is conventionally expressed in terms of a power-law decline of the autocovariance function. The growing interest of international research in this topic is justified by the search for a better understanding of the dynamic nature of the time series of financial asset prices. First, the lack of consistency among results calls for new studies and the use of several complementary methodologies. Second, the confirmation of long-memory processes has relevant implications for (1) theoretical and econometric modeling (i.e., martingale price models and technical trading rules), (2) statistical tests of equilibrium and valuation models, (3) optimal consumption/saving and portfolio decisions, and (4) the measurement of efficiency and rationality. Third, empirical scientific questions remain about identifying the general theoretical market model most adequate for modeling the diffusion of the series. Fourth, regulators and risk managers need to know whether there are persistent, and therefore inefficient, markets that may consequently produce abnormal returns. The research objective of this dissertation is twofold. On the one hand, it aims to provide additional knowledge for the long-memory debate by examining the behavior of the daily return series of the main EURONEXT stock indices. On the other hand, it aims to contribute to the improvement of the capital asset pricing model (CAPM), considering an alternative risk measure capable of overcoming the constraints of the efficient market hypothesis (EMH) in the presence of financial series whose processes do not have independent and identically distributed (i.i.d.) increments. The empirical study indicates the possibility of alternatively using long-maturity treasury bonds (OT's) in the computation of market returns, given that their behavior in sovereign debt markets reflects investors' confidence in the financial conditions of the States and measures how investors assess the respective economies based on the performance of their assets in general. Although the price diffusion model defined by geometric Brownian motion (gBm) is claimed to provide a good fit to financial time series, its assumptions of normality, stationarity, and independence of the residual innovations are contradicted by the empirical data analyzed. Therefore, in the search for evidence of the long-memory property in the markets, rescaled-range analysis (R/S) and detrended fluctuation analysis (DFA) are used, under the fractional Brownian motion (fBm) framework, to estimate the Hurst exponent H for the complete data series and to compute the "local" Hurst exponent H_t in moving windows. In addition, statistical hypothesis tests are carried out using the rescaled-range test (R/S), the modified rescaled-range test (M-R/S), and the fractional differencing test (GPH).
In terms of a single conclusion from all methods about the nature of dependence for the stock market in general, the empirical results are inconclusive. This means that the degree of long memory, and hence any classification, depends on each particular market. Nevertheless, the mostly positive overall results support the presence of long memory, in the form of persistence, in the stock returns of Belgium, the Netherlands, and Portugal. This suggests that these markets are more subject to greater predictability (the "Joseph effect"), but also to trends that can be unexpectedly interrupted by discontinuities (the "Noah effect"), and therefore tend to be riskier to trade. Although the evidence of fractal dynamics has weak statistical support, in line with most international studies, it refutes the random-walk hypothesis with i.i.d. increments, which is the basis of the EMH in its weak form. Accordingly, contributions to improving the CAPM are proposed through a new fractal capital market line (FCML) and a new fractal security market line (FSML). The new proposal suggests that the risk element (for the market and for an asset) be given by the Hurst exponent H for long lags of stock returns. The exponent H measures the degree of long memory in stock indices, both when the return series follow an uncorrelated i.i.d. process, described by gBm (where H = 0.5, confirming the EMH and the adequacy of the CAPM), and when they follow a process with statistical dependence, described by fBm (where H differs from 0.5, rejecting the EMH and the adequacy of the CAPM). The advantage of the FCML and the FSML is that the long-memory measure, defined by H, is the adequate reference for expressing risk in models that can be applied to data series following i.i.d. processes as well as processes with nonlinear dependence. These formulations thus include the EMH as a possible particular case.
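A minimal sketch of the rescaled-range (R/S) estimator underlying the analysis: compute R/S over windows of increasing size n and take the slope of log(R/S) versus log(n) as the Hurst exponent H (the window sizes and the synthetic input are illustrative):

```python
import numpy as np

def hurst_rs(returns, window_sizes=(16, 32, 64, 128, 256)):
    """Estimate H from the slope of log(R/S) vs log(n)."""
    x = np.asarray(returns, dtype=float)
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_vals = []
        for start in range(0, len(x) - n + 1, n):
            w = x[start:start + n]
            dev = np.cumsum(w - w.mean())  # cumulative deviations from mean
            r = dev.max() - dev.min()      # range
            s = w.std(ddof=1)              # standard deviation
            if s > 0:
                rs_vals.append(r / s)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs_vals)))
    slope, _ = np.polyfit(log_n, log_rs, 1)
    return slope

rng = np.random.default_rng(0)
print(f"H ~ {hurst_rs(rng.normal(size=4096)):.2f}")  # ~0.5 for i.i.d. noise
```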

Relevance:

40.00%

Publisher:

Abstract:

This paper explores the management structure of the team-based organization. First, it provides a theoretical model of the structures and processes of work teams. The structure determines the team's responsibilities in terms of authority and expertise over specific regulation tasks. The responsiveness of teams to these responsibilities constitutes the processes of teamwork, in terms of three dimensions indicating to what extent teams actually use the space provided to them. The research question that this paper addresses is to what extent the position of responsibilities in the team-based organization affects team responsiveness. This is examined through two hypotheses. First, the effect of the so-called proximity of regulation tasks is tested. It is expected that responsibility for tasks positioned higher in the organization (i.e., further from the team) generally has a negative effect on team responsiveness, whereas tasks positioned lower in the organization (i.e., closer to the team) will have a positive effect on the way in which teams respond. Second, the relationship between the number of tasks for which the team is responsible and team responsiveness is tested. Theory suggests that teams responsible for a larger number of tasks perform better, i.e., show higher responsiveness. These hypotheses are tested in a study of 109 production teams in the automotive industry. The results show that, as the theory predicts, increasing numbers of responsibilities have positive effects on team responsiveness. However, the delegation of expertise to teams appears to be the most important predictor of responsiveness. Also, not all regulation tasks affect team responsiveness; most show no significant effect at all. A number of tasks affect team responsiveness positively when their responsibility is positioned lower in the organization, but a number of tasks also affect team responsiveness positively when located higher in the organization, i.e., further from the teams in production. The results indicate that more attention should be paid to the distribution of responsibilities, in particular expertise, to teams. Delegating more expertise does improve team responsiveness; however, some tasks might be better located at higher organizational levels, indicating that there are limits to the responsibilities teams can handle.