904 results for Thermo dynamic analysis


Relevance: 80.00%

Publisher:

Abstract:

The deployment of OECBs (opto-electrical circuit boards) is expected to make a significant impact in the telecom switch arena within the next five years, creating optical backplanes with high-speed point-to-point optical interconnects. The crucial aspect of the optical backplane manufacturing process is the successful coupling between the VCSEL (vertical cavity surface emitting laser) device and the embedded waveguide in the OECB. The results from a thermo-mechanical analysis are used in a purely optical model, which computes the optical energy and attenuation from the VCSEL aperture into, and then through, the waveguide. Results from the modelling are investigated using DOE (design of experiments) analysis to identify packaging parameters that minimise misalignment, achieved via a specialist optimisation software package. Results from the thermo-mechanical and optical models are discussed, as are experimental results from the DOE.

Relevance: 80.00%

Publisher:

Abstract:

N-gram analysis is an approach that investigates the structure of a program using bytes, characters, or text strings. A key issue with N-gram analysis is feature selection amidst the explosion of features that occurs as N is increased. The experiments in this paper represent programs as operational code (opcode) density histograms obtained through dynamic analysis. A support vector machine is used to create a reference model, against which two methods of feature reduction are evaluated: 'area of intersect' and 'subspace analysis using eigenvectors'. The findings show that the relationships between features are complex and that simple statistical filtering approaches are not viable. However, eigenvector subspace analysis produces a suitable filter.
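The feature representation described above can be illustrated with a minimal sketch. The opcode vocabulary and the sample trace below are purely hypothetical; the paper's actual features come from real x86 execution traces, and the eigenvector subspace filter would then operate on matrices of such histograms.

```python
from collections import Counter

# Hypothetical opcode vocabulary; the paper's real set comes from x86 traces.
VOCAB = ["mov", "push", "pop", "add", "cmp", "jmp", "call", "ret"]

def opcode_density_histogram(trace):
    """Represent a program run as the relative frequency of each opcode."""
    counts = Counter(trace)
    total = len(trace) or 1
    return [counts[op] / total for op in VOCAB]

# Illustrative trace captured during (hypothetical) dynamic analysis
trace = ["mov", "mov", "push", "call", "mov", "ret"]
hist = opcode_density_histogram(trace)
```

Each program run is thereby reduced to a fixed-length vector, which is what makes SVM training and eigenvector-based feature filtering applicable.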

Relevance: 80.00%

Publisher:

Abstract:

A postbuckling blade-stiffened composite panel was loaded in uniaxial compression until failure. During loading beyond initial buckling, the panel was observed to undergo a secondary instability characterised by a dynamic mode-shape change. Such abrupt changes cause considerable numerical difficulties for standard path-following quasi-static solution procedures in finite element analysis. Improved methods such as arc-length-related procedures do better at traversing certain critical points along an equilibrium path, but may also encounter difficulties in highly non-linear problems. This paper presents a robust, modified explicit dynamic analysis for the modelling of postbuckling structures. This method was shown to predict the mode switch with good accuracy and is more efficient than standard explicit dynamic analysis. (C) 2003 Elsevier Science Ltd. All rights reserved.

Relevance: 80.00%

Publisher:

Abstract:

The inherent difficulty of thread-based shared-memory programming has recently motivated research in high-level, task-parallel programming models. Recent advances in task-parallel models add implicit synchronization, where the system automatically detects and satisfies data dependencies among spawned tasks. However, dynamic dependence analysis incurs significant runtime overheads, because the runtime must track task resources and use this information to schedule tasks while avoiding conflicts and races.
We present SCOOP, a compiler that effectively integrates static and dynamic analysis in code generation. SCOOP combines context-sensitive points-to, control-flow, escape, and effect analyses to remove redundant dependence checks at runtime. Our static analysis can work in combination with existing dynamic analyses and task-parallel runtimes that use annotations to specify tasks and their memory footprints. We use our static dependence analysis to detect non-conflicting tasks and an existing dynamic analysis to handle the remaining dependencies. We evaluate the resulting hybrid dependence analysis on a set of task-parallel programs.
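The hybrid idea can be sketched in a few lines. This is not SCOOP's implementation: the `Runtime` class, its footprint sets, and the `statically_independent` flag are all illustrative stand-ins for the general scheme of a runtime dependence check that static analysis can prove unnecessary for some tasks.

```python
# Hypothetical sketch of hybrid dependence checking: each task declares a
# memory footprint (reads, writes); the runtime serialises conflicting tasks
# unless static analysis has already marked the task as conflict-free.

class Runtime:
    def __init__(self):
        self.live = []  # footprints of in-flight tasks: (reads, writes)

    def conflicts(self, reads, writes):
        for r, w in self.live:
            # A write/write or read/write overlap means a dependence.
            if writes & w or writes & r or reads & w:
                return True
        return False

    def spawn(self, reads, writes, statically_independent=False):
        """Return True if the task may run immediately."""
        if statically_independent:
            # The static analysis removes the dynamic check entirely.
            self.live.append((reads, writes))
            return True
        if self.conflicts(reads, writes):
            return False  # a real scheduler would defer, not reject
        self.live.append((reads, writes))
        return True

rt = Runtime()
a = rt.spawn(reads={"x"}, writes={"y"})  # no live tasks: runs at once
b = rt.spawn(reads={"y"}, writes={"z"})  # reads y, which a writes: deferred
c = rt.spawn(reads=set(), writes={"w"}, statically_independent=True)
```

The payoff in the paper's setting is that every task proven independent at compile time skips the bookkeeping in `conflicts` altogether.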

Relevance: 80.00%

Publisher:

Abstract:

This paper describes the results of non-linear elasto-plastic implicit dynamic finite element analyses that are used to predict the collapse behaviour of cold-formed steel portal frames at elevated temperatures. The collapse behaviour of a simple rigid-jointed beam idealisation and a more accurate semi-rigid jointed shell element idealisation are compared for two different fire scenarios. For the case of the shell element idealisation, the semi-rigidity of the cold-formed steel joints is explicitly taken into account through modelling of the bolt-hole elongation stiffness. In addition, the shell element idealisation is able to capture buckling of the cold-formed steel sections in the vicinity of the joints. The shell element idealisation is validated at ambient temperature against the results of full-scale tests reported in the literature. The behaviour at elevated temperatures is then considered for both the semi-rigid jointed shell and rigid-jointed beam idealisations. The inclusion of accurate joint rigidity and geometric non-linearity (second-order analysis) is shown to affect the collapse behaviour at elevated temperatures. For each fire scenario considered, the importance of base fixity in preventing an undesirable outwards collapse mechanism is demonstrated. The results demonstrate that joint rigidity and varying fire scenarios should be considered in order to achieve a conservative design.

Relevance: 80.00%

Publisher:

Abstract:

N-gram analysis is an approach that investigates the structure of a program using bytes, characters, or text strings. This research uses dynamic analysis to investigate malware detection using a classification approach based on N-gram analysis. A key issue with dynamic analysis is the length of time a program must be run to ensure a correct classification. The motivation for this research is to find the optimum subset of operational codes (opcodes) that make the best indicators of malware, and to determine how long a program has to be monitored to ensure an accurate support vector machine (SVM) classification of benign and malicious software. The experiments within this study represent programs as opcode density histograms obtained through dynamic analysis over different program run periods. An SVM is used as the program classifier to determine the ability of different program run lengths to correctly detect the presence of malicious software. The findings show that malware can be detected with different program run lengths using a small number of opcodes.
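The classification step can be sketched as follows. A simple nearest-centroid rule stands in here for the paper's SVM, and all the density vectors are synthetic; the point is only to show how histograms gathered at a given run length are separated into benign and malicious classes.

```python
# Stand-in for the paper's SVM: classify opcode-density histograms by
# distance to the benign and malware class centroids. All data synthetic.

def centroid(vectors):
    """Component-wise mean of a list of equal-length density vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def classify(hist, benign_centroid, malware_centroid):
    """Label a histogram by its nearest class centroid."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    if dist(hist, malware_centroid) < dist(hist, benign_centroid):
        return "malware"
    return "benign"

# Synthetic two-opcode densities collected at one (hypothetical) run length
benign_runs = [[0.8, 0.2], [0.7, 0.3]]
malware_runs = [[0.2, 0.8], [0.3, 0.7]]
cb, cm = centroid(benign_runs), centroid(malware_runs)
label = classify([0.25, 0.75], cb, cm)
```

Repeating this over histograms captured at several run lengths is what lets the study ask how short a monitoring period still classifies accurately.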

Relevance: 80.00%

Publisher:

Abstract:

N-gram analysis is an approach that investigates the structure of a program using bytes, characters, or text strings. This research uses dynamic analysis to investigate malware detection using a classification approach based on N-gram analysis. The motivation for this research is to find a subset of N-gram features that makes a robust indicator of malware. The experiments within this paper represent programs as N-gram density histograms obtained through dynamic analysis. A support vector machine (SVM) is used as the program classifier to determine the ability of N-grams to correctly detect the presence of malicious software. The preliminary findings show that N-gram sizes of N=3 and N=4 present the best avenues for further analysis.
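Building an N-gram density histogram is straightforward and can be sketched as below; the input string is illustrative, standing in for a byte or character stream captured during dynamic analysis.

```python
from collections import Counter

def ngram_density(data, n):
    """Relative frequency of each length-n substring (N-gram) in a trace."""
    grams = [data[i:i + n] for i in range(len(data) - n + 1)]
    counts = Counter(grams)
    total = len(grams)
    return {g: c / total for g, c in counts.items()}

# Toy trace; a real one would be program bytes or characters
hist3 = ngram_density("ababab", 3)
```

The feature-explosion problem mentioned above is visible here: the number of possible N-grams grows exponentially with N, which is why selecting a robust subset matters.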

Relevance: 80.00%

Publisher:

Abstract:

This paper presents a study of concrete fracture and the associated mesh sensitivity using the finite element (FE) method with a local concrete model in both tension (Mode I) and compression. To enable the incorporation of dynamic loading, the FE model is developed using the transient dynamic analysis code LS-DYNA Explicit. A series of investigations has been conducted on typical fracture scenarios to evaluate the model's performance and calibrate the relevant parameters. The K&C damage model was adopted because it is a comprehensive local concrete model that allows the user to change the crack band width, fracture energy, and rate dependency of the material. Compressive localisation in numerical modelling is also discussed in detail. Finally, an impact test specimen is modelled.

Relevance: 80.00%

Publisher:

Abstract:

The research presented investigates the optimal set of operational codes (opcodes) that creates a robust indicator of malicious software (malware), and also determines the program execution duration required for accurate classification of benign and malicious software. The features extracted from the dataset are opcode density histograms, extracted during program execution. The classifier used is a support vector machine, configured to select the features that produce the optimal classification of malware over different program run lengths. The findings demonstrate that malware can be detected using dynamic analysis with relatively few opcodes.

Relevance: 80.00%

Publisher:

Abstract:

Motivated by the long-term goal of building a complete text-to-speech system based on articulatory synthesis, we developed a linguistic model for European Portuguese (EP) on top of the TADA system (TAsk Dynamic Application), aimed at automatically deriving articulator trajectories from input text. Achieving this goal required a set of tasks, namely: 1) implementing and evaluating two systems for automatic syllabification and phonetic transcription, in order to transform the input text into a format suitable for TADA; 2) creating a gestural dictionary for the sounds of EP, so that each phone produced by the grapheme-to-phone converter could be matched to a set of articulatory gestures adapted for EP; 3) analysing the phenomenon of nasality in light of the dynamic principles of Articulatory Phonology (AP), based on an articulatory and perceptual study. The two automatic syllabification algorithms implemented and tested drew on phonological knowledge of syllable structure: the first was based on finite-state transducers, while the second was a faithful implementation of the proposals of Mateus & d'Andrade (2000). The performance of these algorithms, especially the second, proved similar to that of other systems with the same capabilities. For grapheme-to-phone conversion, we followed a methodology based on rewrite rules combined with a machine-learning technique. The evaluation results for this system motivated the subsequent exploration of other automatic methods, also assessing the impact of integrating syllabic information into the systems.
The dynamic description of the sounds of EP, anchored in the theoretical and methodological principles of AP, was essentially based on the analysis of magnetic resonance imaging data, from which all measurements were taken in order to obtain quantitative articulatory parameters. A first validation of the various proposed gestural configurations was attempted through a small perceptual test, which identified the main problems underlying the gestural proposal. This work enabled, for the first time for EP, the development of an articulatory-based text-to-speech system. The dynamic description of nasal vowels relied both on the magnetic resonance data, to characterise the oral gestures, and on data obtained through electromagnetic articulography (EMA), to study the dynamics of the velum and its relation to the other articulators. In addition, a perceptual test was carried out, using TADA and SAPWindows, to assess the sensitivity of Portuguese listeners to variations in velum height and changes in intergestural coordination. This study served as the basis for an abstract (gestural) interpretation of the nasal vowels of EP and also clarified crucial aspects of their production and perception.

Relevance: 80.00%

Publisher:

Abstract:

Observation-based slicing is a recently introduced, language-independent slicing technique based on the dependencies observable from program behaviour. Due to the well-known limits of dynamic analysis, we may only compute an under-approximation of the true observation-based slice. However, because the observation-based slice captures all dependences that can be observed, even such approximations can yield insight into the limitations of static slicing. For example, a static slice S that is strictly smaller than the corresponding observation-based slice is guaranteed to be unsafe. We present the results of three sets of experiments on 12 different programs, including benchmarks and larger programs, which investigate the relationship between static and observation-based slicing. We show that, in extreme cases, observation-based slices can find the true minimal static slice where static techniques cannot. For more typical cases, our results illustrate the potential for observation-based slicing to highlight unsafe static slices. Finally, we report on the sensitivity of observation-based slicing to test quality.
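The safety argument above reduces to a simple comparison, sketched below. The line sets are illustrative: since the observation-based slice under-approximates the statements a safe slice must keep, any static slice with fewer statements has necessarily dropped a real dependence.

```python
# Sketch of the unsafety check: a static slice strictly smaller than the
# observation-based slice (itself an under-approximation of the required
# statements) must be missing a dependence. Line numbers are illustrative.

def static_slice_is_unsafe(static_slice, observation_slice):
    """True when the static slice keeps fewer statements than were
    observed to matter, which guarantees it is unsafe."""
    return len(static_slice) < len(observation_slice)

obs = {1, 2, 4, 7}  # statements kept by (hypothetical) observation-based slicing
unsafe = static_slice_is_unsafe({1, 4}, obs)
safe_candidate = static_slice_is_unsafe({1, 2, 4, 7, 9}, obs)
```

Note the asymmetry: a smaller static slice is provably unsafe, but a larger one is not thereby proven safe, only not flagged by this check.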

Relevance: 80.00%

Publisher:

Abstract:

Damage assessment of structures with a mechanical non-linear model demands the representation of seismic action in terms of an accelerogram (dynamic analysis) or a response spectrum (pushover analysis). Stochastic ground motion simulation is widely used in regions where seismic strong-motion records are available in insufficient numbers. In this work we present a variation of the stochastic finite-fault method with dynamic corner frequency that includes geological site effects. The method was implemented in a computer program named SIMULSIS that generates time series (accelerograms) and response spectra. The program was tested against the MW 7.3 Landers earthquake (June 28, 1992) and managed to reproduce its effects. In the present work we use it to reproduce the effects of the 1980 Azores earthquake (January 1, 1980) on several islands, with different possible local site conditions. For those places, the response spectra are presented and compared with the observed building damage.

Relevance: 80.00%

Publisher:

Abstract:

Final project submitted in fulfilment of the requirements for the degree of Master in Mechanical Engineering.

Relevance: 80.00%

Publisher:

Abstract:

Dissertation submitted in fulfilment of the requirements for the degree of Master in Civil Engineering, in the specialisation area of Structures.

Relevance: 80.00%

Publisher:

Abstract:

Final master's project carried out at the Laboratório Nacional de Engenharia Civil (LNEC), for the degree of Master in Civil Engineering awarded by the Instituto Superior de Engenharia de Lisboa, under the cooperation protocol between ISEL and LNEC.