933 results for Viscous Dampers, Five Step Method, Equivalent Static Analysis Procedure, Yielding Frames, Passive Energy Dissipation Systems


Relevance:

100.00%

Publisher:

Abstract:

This work considers the static calculation of a program's average-case running time. The number of systems that currently tackle this research problem is quite small owing to the difficulties inherent in average-case analysis. While each of these systems makes a pertinent contribution, and each is discussed individually in this work, only one of them forms the basis of this research. That particular system is known as MOQA. The MOQA system consists of the MOQA language and the MOQA static analysis tool. Its technique for statically determining average-case behaviour centres on maintaining strict control over both the data structure type and the labeling distribution. This research develops and evaluates the MOQA language implementation and adds to the functions already available in the language. Furthermore, the theory behind MOQA is generalised, and the range of data structures for which the MOQA static analysis tool can determine average-case behaviour is increased. Some of the MOQA applications and extensions suggested in other works are also examined here. For example, the accuracy of classifying the MOQA language as reversible is investigated, along with the feasibility of incorporating duplicate labels into the MOQA theory. Finally, the analyses carried out during this research reveal some of MOQA's strengths and weaknesses. This thesis aims to be pragmatic in evaluating the current MOQA theory, the advancements set forth in the following work, and the benefits of MOQA when compared to similar systems. Succinctly, this work's significant expansion of the MOQA theory is accompanied by a realistic assessment of MOQA's accomplishments and a serious deliberation of the opportunities available to MOQA in the future.
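
MOQA itself is not specified in this abstract, but the kind of result a static average-case analysis produces can be illustrated with a small, hedged sketch: the exact expected number of comparisons of linear search under a uniform distribution over the target position, computed symbolically rather than measured. The function name is illustrative, not part of MOQA.

```python
from fractions import Fraction

def expected_linear_search_comparisons(n: int) -> Fraction:
    """Expected comparisons of linear search over n items, assuming the target
    is equally likely to be at any of the n positions (a uniform 'labeling
    distribution', in the spirit of average-case analysis)."""
    # Position i (1-based) costs i comparisons and occurs with probability 1/n.
    return sum(Fraction(i, n) for i in range(1, n + 1))  # = (n + 1) / 2

if __name__ == "__main__":
    for n in (1, 2, 10, 100):
        print(n, expected_linear_search_comparisons(n))
```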

Relevance:

100.00%

Publisher:

Abstract:

This paper gives a detailed account of the content analysis method developed at Queen's University Belfast to measure critical thinking during group learning, as used in our controlled comparisons between learning in face-to-face seminars and in computer conference seminars. From Garrison's five stages of critical thinking and Henri's cognitive skills needed in computer-mediated communication (CMC), we have developed two research instruments: a student questionnaire and this content analysis method. The content analysis relies on identifying, within transcripts, examples of indicators of obviously critical and obviously uncritical thinking, from which several critical thinking ratios can be calculated.
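
The exact ratio definitions are given in the paper, not here; as a hedged illustration of the general idea, one plausible ratio is simply the share of "obviously critical" indicators among all coded indicators in a transcript.

```python
def critical_thinking_ratio(critical: int, uncritical: int) -> float:
    """Illustrative ratio: proportion of 'obviously critical' indicators among
    all coded indicators. An assumed form, not necessarily the exact ratio
    defined by the Queen's University Belfast method."""
    total = critical + uncritical
    if total == 0:
        raise ValueError("no indicators coded")
    return critical / total

# Example: 34 critical and 16 uncritical indicators coded in one transcript.
print(f"{critical_thinking_ratio(34, 16):.2f}")  # 0.68
```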

Relevance:

100.00%

Publisher:

Abstract:

A method extending narrative analysis with grounded theory analysis is proposed to bridge the gap between breadth and depth in IS narrative research. The purpose of the method is not to develop a theory but to make narrative analysis more accessible, transparent and accountable; and the resultant narrative more contextually grounded. The method is aimed particularly at inexperienced narrative researchers who currently lack guidance through the complexity of narrative analysis, but may also benefit experienced narrative researchers who may not be familiar with the applicability of grounded theory tools and techniques in this area.

Relevance:

100.00%

Publisher:

Abstract:

PURPOSE. To examine internal consistency, refine the response scale, and obtain a linear scoring system for the visual function instrument, the Daily Living Tasks Dependent on Vision (DLTV). METHODS. Data were available from 186 participants with a clinical diagnosis of age-related macular degeneration (AMD) who completed the 22-item DLTV (DLTV-22) using a four-point ordinal response scale. An independent group of 386 participants with AMD was administered a reduced, 11-item version of the DLTV (DLTV-11) with a five-point response scale. Rasch analysis was performed on both datasets and used to generate item statistics for measure order, response odds ratios per item and per person, and infit and outfit mean square statistics. The Rasch output from the DLTV-22 was examined to identify redundant items and to assess factorial validity and person and item measure separation reliabilities. RESULTS. The average rating for the DLTV-22 changed monotonically with the magnitude of the latent person trait. The expected and observed average measures were extremely close, with step calibrations evenly separated for the four-point ordinal scale. For the DLTV-11, the step calibrations were not as evenly separated, suggesting that the five-point scale should be reduced to either a four- or a three-point scale. Five items in the DLTV-22 were removed, and all 17 remaining items had good infit and outfit mean squares. Principal component analysis (PCA) of the residuals from the Rasch analysis identified two domains containing 7 and 10 items, respectively. The domains had high person separation reliabilities (0.86 and 0.77 for domains 1 and 2, respectively) and item measure reliabilities (0.99 and 0.98 for domains 1 and 2, respectively). CONCLUSIONS. With its improved internal consistency, the established accuracy and precision of its rating scale, and a valid domain structure, we believe the DLTV constitutes a useful instrument for assessing visual function in older adults with age-related macular degeneration.
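
As a hedged sketch of the kind of model typically fitted to such ordinal rating scales (not the authors' actual software), the Andrich rating scale form of the Rasch model gives the probability of each response category from a person measure, an item difficulty, and shared category thresholds. All numeric values below are illustrative.

```python
import numpy as np

def rating_scale_probs(theta, item_difficulty, thresholds):
    """Andrich rating-scale model: probabilities of the k+1 ordered categories
    given person measure theta, item difficulty, and k category thresholds
    (all in logits). Illustrative parameter values only."""
    steps = theta - item_difficulty - np.asarray(thresholds, dtype=float)
    log_num = np.concatenate(([0.0], np.cumsum(steps)))  # category log-numerators
    num = np.exp(log_num - log_num.max())                # numerically stabilised
    return num / num.sum()

# A person 1 logit above an item of difficulty 0, four-point scale (3 thresholds).
print(rating_scale_probs(theta=1.0, item_difficulty=0.0,
                         thresholds=[-1.5, 0.0, 1.5]))
```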

Relevance:

100.00%

Publisher:

Abstract:

Despite ethical and technical concerns, the in vivo method, more commonly referred to as the mouse bioassay (MBA), is employed globally as a reference method for phycotoxin analysis in shellfish. This is particularly the case for paralytic shellfish poisoning (PSP) and emerging toxin monitoring. A high-performance liquid chromatography method with fluorescence detection (HPLC-FLD) has been developed for PSP toxin analysis, but owing to difficulties and limitations in the method, this procedure has not been fully implemented as a replacement. Detection of the diarrhetic shellfish poisoning (DSP) toxins has moved towards LC-mass spectrometry (MS) analysis, whereas analysis of the amnesic shellfish poisoning (ASP) toxin domoic acid is performed by HPLC. Although alternative methods of detection to the MBA have been described, each procedure is specific to a particular toxin and its analogues, and each group of toxins requires separate analysis using different extraction procedures and analytical equipment. In addition, when replacing the MBA, consideration must be given to the detection of unregulated and emerging toxins. The ideal scenario for the monitoring of phycotoxins in shellfish and seafood would be to evolve towards multiple toxin detection on a single bioanalytical sensing platform, i.e. 'an artificial mouse'. Immunologically based techniques, and in particular surface plasmon resonance (SPR) technology, have been shown to be highly promising bioanalytical tools offering rapid, real-time detection and requiring minimal quantities of toxin standards. A Biacore Q and a prototype multiplex SPR biosensor were evaluated for their fitness for purpose for the simultaneous detection of key regulated phycotoxin groups and the emerging toxin palytoxin. The prototype, deemed more applicable because of its separate flow channels, achieved calibration-curve detection limits (IC20) in shellfish of 4,000, 36, 144 and 46 μg/kg of mussel for domoic acid, okadaic acid, saxitoxin and palytoxin, respectively. A one-step extraction procedure demonstrated recoveries greater than 80 % for all toxins. For validation of the method at the 95 % confidence limit, the decision limits (CCα) determined from an extracted matrix curve were calculated to be 450, 36 and 24 μg/kg, and the detection capability (CCβ) as a screening method was ≤10 mg/kg, ≤160 μg/kg and ≤400 μg/kg, for domoic acid, okadaic acid and saxitoxin, respectively.
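
The IC20 detection limits quoted above come from inhibition calibration curves. A hedged sketch of how such a value might be derived, assuming a standard four-parameter logistic fit to normalised SPR responses; the concentrations and responses below are invented for illustration and are not the authors' data.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, top, bottom, ic50, hill):
    """Four-parameter logistic inhibition curve (response vs concentration)."""
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill)

# Hypothetical calibration data: toxin concentration vs normalised response (%).
conc = np.array([1, 3, 10, 30, 100, 300, 1000], dtype=float)
resp = np.array([98, 95, 84, 62, 35, 15, 6], dtype=float)

params, _ = curve_fit(four_pl, conc, resp, p0=[100, 0, 50, 1])
top, bottom, ic50, hill = params

# IC20: concentration giving 20 % inhibition (80 % of the dynamic range remaining).
target = bottom + 0.8 * (top - bottom)
ic20 = ic50 * ((top - bottom) / (target - bottom) - 1.0) ** (1.0 / hill)
print(f"IC50 ~ {ic50:.1f}, IC20 ~ {ic20:.1f} (same units as conc)")
```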

Relevance:

100.00%

Publisher:

Abstract:

The term 'fatigue loads' on the Oyster Oscillating Wave Surge Converter (OWSC) is used to describe hydrostatic loads due to water surface elevation with quasi-static changes of state. A procedure to implement hydrostatic pressure distributions in finite element analysis of the structure is therefore desired. Currently available experimental methods enable measurement of the time-variant water surface elevation at discrete locations either on or around the body of the scale model during tank tests. This paper discusses the development of a finite element analysis procedure to implement time-variant, spatially distributed hydrostatic pressure derived from discretely measured water surface elevation. The developed method can process input data of differing temporal and spatial resolution and approximates the elevation over the flap faces with user-defined properties. The structural loads, namely the forces and moments on the body, can then be investigated by post-processing the numerical results. The method can also process surface elevation or hydrostatic pressure data from computational fluid dynamics simulations and can thus be seen as a first step towards a fluid-structure interaction model.
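
A minimal sketch of the core mapping described here, assuming the flap face has been discretised into nodes and the measured surface elevation has already been interpolated to each node's horizontal position; the names, density and node coordinates are illustrative, not the authors' implementation.

```python
import numpy as np

RHO = 1025.0   # assumed sea-water density, kg/m^3
G = 9.81       # gravitational acceleration, m/s^2

def hydrostatic_nodal_pressure(node_z, eta_at_node):
    """Quasi-static hydrostatic pressure p = rho * g * (eta - z) at each
    structural node, clamped to zero above the instantaneous free surface.
    node_z: vertical node coordinates (m); eta_at_node: interpolated water
    surface elevation above each node's horizontal position (m)."""
    head = np.asarray(eta_at_node) - np.asarray(node_z)
    return RHO * G * np.clip(head, 0.0, None)

# Example: five nodes on a flap face at one time instant.
z = np.array([-3.0, -2.0, -1.0, 0.0, 1.0])
eta = np.array([0.4, 0.4, 0.5, 0.5, 0.5])
print(hydrostatic_nodal_pressure(z, eta))  # Pa; zero for the node above the surface
```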

Relevance:

100.00%

Publisher:

Abstract:

In this paper, our previous work on Principal Component Analysis (PCA) based fault detection is extended to the dynamic monitoring and detection of loss-of-main in power systems using wide-area synchrophasor measurements. In the previous work, a static PCA model was built and verified to be capable of detecting and extracting system faulty events; however, the false alarm rate was high. To address this problem, this paper uses the well-known 'time lag shift' method to include the dynamic behaviour of the system in the PCA model, based on synchronized measurements from Phasor Measurement Units (PMUs); the result is referred to as Dynamic Principal Component Analysis (DPCA). Compared with the static PCA approach, as well as with the traditional passive mechanisms of loss-of-main detection, the proposed DPCA procedure describes how the synchrophasors are linearly auto- and cross-correlated, based on a singular value decomposition of the augmented, time-lagged synchrophasor matrix. As in the static PCA method, two statistics, namely T2 and Q, are calculated together with their confidence limits to form intuitive charts for engineers or operators to monitor the loss-of-main situation in real time. The effectiveness of the proposed methodology is evaluated on the loss-of-main monitoring of a real system, using historic data recorded from PMUs installed at several locations in the UK/Ireland power system.
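
A hedged sketch of the DPCA machinery described above: build a time-lagged (augmented) matrix from synchrophasor measurements, perform SVD/PCA on training data, and compute the T2 and Q (squared prediction error) statistics for new samples. The dimensions, lag count, number of retained components and the synthetic data are illustrative only, not the paper's configuration.

```python
import numpy as np

def time_lagged(X, lags):
    """Augment an (n_samples x n_vars) matrix with `lags` time-shifted copies."""
    n, _ = X.shape
    cols = [X[lags - k: n - k] for k in range(lags + 1)]
    return np.hstack(cols)

def fit_dpca(X_train, lags=2, n_pc=5):
    Xa = time_lagged(X_train, lags)
    mean, std = Xa.mean(axis=0), Xa.std(axis=0) + 1e-12
    Z = (Xa - mean) / std
    _, s, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:n_pc].T                           # retained loadings
    var = (s[:n_pc] ** 2) / (len(Z) - 1)      # retained component variances
    return dict(mean=mean, std=std, P=P, var=var, lags=lags)

def t2_q(model, X_new):
    Za = (time_lagged(X_new, model["lags"]) - model["mean"]) / model["std"]
    T = Za @ model["P"]                        # scores
    t2 = np.sum(T ** 2 / model["var"], axis=1)
    resid = Za - T @ model["P"].T
    q = np.sum(resid ** 2, axis=1)             # squared prediction error
    return t2, q

# Synthetic example: 8 channels of normal training data vs a step change.
rng = np.random.default_rng(0)
train = rng.normal(size=(500, 8))
test = rng.normal(size=(50, 8)); test[25:] += 3.0   # simulated abnormal event
model = fit_dpca(train)
t2, q = t2_q(model, test)
print(t2.max(), q.max())
```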

Relevance:

100.00%

Publisher:

Abstract:

The battle to mitigate Android malware has become more critical with the emergence of new strains incorporating increasingly sophisticated evasion techniques, in turn necessitating more advanced detection capabilities. Hence, in this paper we propose and evaluate a machine learning approach based on eigenspace analysis for Android malware detection, using features derived from static analysis characterization of Android applications. Empirical evaluation with a dataset of real malware and benign samples shows that a detection rate of over 96% with a very low false positive rate is achievable using the proposed method.
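
The paper's precise eigenspace formulation is not reproduced in this abstract; as a hedged sketch of the general shape of such a pipeline, static-analysis feature vectors could be projected into a lower-dimensional eigenspace via PCA and then classified. The feature vectors below are synthetic stand-ins and the parameters are illustrative.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for static-analysis features (e.g. permission and
# API-call indicators); real features would come from characterized APKs.
rng = np.random.default_rng(1)
benign = rng.normal(0.0, 1.0, size=(300, 40))
malware = rng.normal(0.8, 1.0, size=(300, 40))
X = np.vstack([benign, malware])
y = np.array([0] * 300 + [1] * 300)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = make_pipeline(StandardScaler(), PCA(n_components=10), SVC())
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```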

Relevance:

100.00%

Publisher:

Abstract:

The mechanical response of a cell to an external force makes it possible to infer its structure and function. Optical tweezers are a particularly attractive approach for the sophisticated, non-invasive manipulation and biophysical characterization of cells. This thesis explores the use of three commonly used types of optical tweezers: 1) static, 2) time-sharing and 3) oscillating. A code based on the three-dimensional finite element method (3DFEM) allows us to model these three types of optical trapping in order to extract cellular mechanical properties from experiments. Combining optical tweezers with cell mechanics requires interdisciplinary skills. A review of experimental approaches to optical trapping and single-cell testing is presented, together with the theoretical foundations linking the optical radiative force to the mechanical response of the cell. For the first time, a tailored 3DFEM simulation including light scattering and the radiative stress distribution makes it possible to predict the deformation of a biconcave cell, analogous to a red blood cell, in a static dual trap. At equilibrium, the final deformation is found to be governed by the spacing between the two laser beams: the cell can be stretched or even compressed. Time-sharing is the technique that maintains several trapping sites simultaneously from the same laser beam. Our quantitative analysis shows that, even when oscillating, the optical force and the deformation are ever-present throughout the cell: the viscoelastic deformation and the energy dissipation are analysed. Another model cell, a cuboid rod, is studied, allowing us to elucidate new properties regarding the symmetry of the mechanical response. Finally, the analysis of the time-resolved deformation in a static or time-sharing trap shows that the deformation depends simultaneously on the viscoelasticity, the external force and the three-dimensional shape of the cell. The oscillating-tweezers technique, however, exhibits a time lag between force and deformation that is independent of the 3D shape; this approach would give direct access to the complex viscoelastic tensor of the cell.
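
A hedged sketch of the final point: estimating the time (phase) lag between a sinusoidal trapping force and the resulting deformation from sampled signals, here via the Fourier component at the driving frequency. The signals, frequency and sampling step are synthetic and illustrative; this is not the thesis' 3DFEM workflow.

```python
import numpy as np

def phase_lag(force, deformation, dt, drive_freq):
    """Phase of the deformation relative to the force at the driving
    frequency, estimated from the discrete Fourier transform."""
    freqs = np.fft.rfftfreq(len(force), dt)
    k = np.argmin(np.abs(freqs - drive_freq))          # bin of the drive frequency
    F, D = np.fft.rfft(force)[k], np.fft.rfft(deformation)[k]
    return np.angle(D / F)   # radians; negative means the deformation lags the force

# Synthetic 10 Hz oscillating trap: deformation lags the force by 30 degrees.
dt, f0 = 1e-3, 10.0
t = np.arange(0, 2.0, dt)
force = np.sin(2 * np.pi * f0 * t)
deformation = 0.5 * np.sin(2 * np.pi * f0 * t - np.deg2rad(30))
print(np.rad2deg(phase_lag(force, deformation, dt, f0)))  # ~ -30
```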

Relevance:

100.00%

Publisher:

Abstract:

The current industry trend is towards using Commercial Off-The-Shelf (COTS) multicores for developing real-time embedded systems, as opposed to using custom-made hardware. In typical implementations of such COTS-based multicores, multiple cores access the main memory via a shared bus. This often leads to contention on the shared channel, which increases the response times of the tasks. Analyzing this increased response time while accounting for contention on the shared bus is challenging on COTS-based systems, mainly because bus arbitration protocols are often undocumented and the exact instants at which the shared bus is accessed by tasks are not explicitly controlled by the operating system scheduler; they are instead a result of cache misses. This paper makes three contributions towards analyzing tasks scheduled on COTS-based multicores. Firstly, we describe a method to model the memory access patterns of a task. Secondly, we apply this model to analyze the worst-case response time for a set of tasks. Thirdly, although the parameters of the request profile can be obtained by static analysis, we provide an alternative method to obtain them experimentally using performance monitoring counters (PMCs). We also compare our work against an existing approach and show that our approach outperforms it by providing a tighter upper bound on the number of bus requests generated by a task.
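
A hedged sketch of how a per-task bus-request bound can enter a classic response-time recurrence. This is not the paper's exact analysis; it simply assumes that each bus request of the task and of its preempting higher-priority tasks can be stalled by at most one bus slot of contending traffic. The parameter names (C, T, M, bus_slot) and the example numbers are illustrative.

```python
import math

def response_time(task, hp_tasks, bus_slot, max_iter=1000):
    """Fixed-point iteration R = C + bus delay + higher-priority interference.
    task/hp_tasks: dicts with worst-case execution time 'C', period 'T' and a
    bound 'M' on bus (memory) requests. `bus_slot` is the assumed worst-case
    stall per request caused by contending cores. Illustrative model only."""
    C, T = task["C"], task["T"]
    R = C
    for _ in range(max_iter):
        requests = task["M"] + sum(math.ceil(R / h["T"]) * h["M"] for h in hp_tasks)
        interference = sum(math.ceil(R / h["T"]) * h["C"] for h in hp_tasks)
        R_new = C + requests * bus_slot + interference
        if R_new == R:
            return R if R <= T else None       # None: deadline (= period) missed
        R = R_new
    return None

# Example task set (times in microseconds), assumed bus stall of 1 us per request.
hp = [{"C": 200, "T": 1000, "M": 40}, {"C": 500, "T": 5000, "M": 120}]
lo = {"C": 1500, "T": 10000, "M": 300}
print(response_time(lo, hp, bus_slot=1))
```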

Relevance:

100.00%

Publisher:

Abstract:

The total antioxidant capacity (TAC) of 28 flavoured water samples was assessed by the ferric reducing antioxidant potential (FRAP), oxygen radical absorbance capacity (ORAC), trolox equivalent antioxidant capacity (TEAC) and total reactive antioxidant potential (TRAP) methods. The flavoured waters had higher antioxidant activity than the corresponding natural waters, a difference attributed to the added flavours, juice and vitamins. Generally, higher TAC values were obtained for lemon waters and lower values for guava and raspberry flavoured waters. The lowest and highest TAC values were obtained by the TRAP and ORAC methods, respectively. Statistical analysis suggested that vitamins and flavours increased the antioxidant content of the commercial waters.

Relevance:

100.00%

Publisher:

Abstract:

The objective of this study is to show that bone strains due to dynamic mechanical loading during physical activity can be analysed using the flexible multibody simulation approach. Strains within the bone tissue play a major role in bone (re)modeling. Previous studies have shown that dynamic loading appears to be more important for bone (re)modeling than static loading. The finite element method has previously been used to assess bone strains. However, it may be limited to static analysis because of the expensive computation required for dynamic analysis, especially for a biomechanical system consisting of several bodies. Further, in vivo implementation of strain gauges on bone surfaces has previously been used to quantify the mechanical loading environment of the skeleton. However, in vivo strain measurement requires invasive methodology, which is challenging and limited to certain regions of superficial bones only, such as the anterior surface of the tibia. In this study, an alternative numerical approach to analysing in vivo strains, based on the flexible multibody simulation approach, is proposed. To investigate the reliability of the proposed approach, three three-dimensional musculoskeletal models, in which the right tibia is assumed to be flexible, are used as demonstration examples. The models are employed in a forward dynamics simulation to predict the tibial strains during level walking. The flexible tibial model is developed using the actual geometry of the subject's tibia, obtained from three-dimensional reconstruction of magnetic resonance images. Inverse dynamics simulation based on motion capture data obtained from walking at a constant velocity is used to calculate the desired contraction trajectory for each muscle. In the forward dynamics simulation, a proportional-derivative servo controller is used to calculate each muscle force required to reproduce the motion, based on the desired muscle contraction trajectory obtained from the inverse dynamics simulation. Experimental measurements are used to verify the models and to check their accuracy in replicating the realistic mechanical loading environment measured in the walking test. The strains predicted by the models are consistent with in vivo strain measurements reported in the literature. In conclusion, the non-invasive flexible multibody simulation approach may be used as a surrogate for experimental bone strain measurement, and thus be of use in detailed strain estimation of bones in different applications. Consequently, the information obtained from the present approach might be useful in clinical applications, including optimizing implant design and devising exercises to prevent bone fragility, accelerate fracture healing and reduce osteoporotic bone loss.
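
The forward-dynamics step uses a proportional-derivative servo on each muscle's contraction trajectory. A minimal sketch of that control law follows, with made-up gains and a made-up desired trajectory; these are not the authors' model parameters.

```python
def pd_muscle_force(desired, desired_rate, actual, actual_rate, kp, kd):
    """Proportional-derivative servo: force command driving the actual muscle
    contraction (e.g. normalised length) towards the desired trajectory from
    the inverse-dynamics step. Illustrative gains only."""
    return kp * (desired - actual) + kd * (desired_rate - actual_rate)

# Hypothetical single-muscle example at one simulation step.
desired, desired_rate = 0.95, -0.10     # normalised length and its rate
actual, actual_rate = 0.97, -0.05
print(pd_muscle_force(desired, desired_rate, actual, actual_rate,
                      kp=2000.0, kd=50.0))
```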

Relevance:

100.00%

Publisher:

Abstract:

We apply a recently designed structural analysis method, Minimal-Flow-Analysis, to the 1990 Senegalese input-output matrix, disaggregated into formal and informal activities, in order to depict the direct and indirect production linkages between activities.
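
Minimal-Flow-Analysis itself is not specified in this abstract. As a hedged illustration of what "direct and indirect production linkages" mean in an input-output setting, the Leontief inverse turns a matrix of direct technical coefficients into total (direct plus indirect) requirements; the toy three-sector matrix below is invented and is not the Senegalese data.

```python
import numpy as np

# Toy direct-coefficient matrix A: A[i, j] = input from sector i needed per
# unit of output of sector j (purely illustrative).
A = np.array([
    [0.10, 0.20, 0.05],
    [0.15, 0.05, 0.10],
    [0.05, 0.10, 0.15],
])

# Leontief inverse (I - A)^-1: total output required per unit of final demand,
# capturing direct and all indirect linkages between activities.
L = np.linalg.inv(np.eye(3) - A)
print(np.round(L, 3))
```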

Relevance:

100.00%

Publisher:

Abstract:

It is widely known that a significant portion of the bits in a program's variables are useless or even unused during execution. Bit-width analysis aims to find the minimum number of bits needed for each variable in the program, ensuring correct execution while saving resources. In this paper, we propose a static analysis method for bit-widths in general applications, which approximates conservatively at compile time and is independent of runtime conditions. While most related work focuses on integer applications, our method is also tailored to and applicable to floating-point variables and, together with precision analysis, could be extended to transform floating-point numbers into fixed-point numbers. We use more precise representations for the data value ranges of both scalar and array variables, and element-level analysis is carried out for arrays. We also suggest an alternative to the standard fixed-point iterations in bi-directional range analysis. These techniques are implemented on the Trimaran compiler infrastructure and tested on a set of benchmarks to demonstrate the results.
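
A hedged sketch of the basic idea behind forward, range-based bit-width inference for integers: propagate conservative value ranges through operations and derive the minimum number of two's-complement bits each result needs. The interval rules below are a toy subset, not the paper's full bidirectional analysis.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Range:
    lo: int
    hi: int

    def bits(self) -> int:
        """Minimum bits for a two's-complement value in [lo, hi]."""
        need = 1
        while not (-(1 << (need - 1)) <= self.lo
                   and self.hi <= (1 << (need - 1)) - 1):
            need += 1
        return need

def add(a: Range, b: Range) -> Range:
    return Range(a.lo + b.lo, a.hi + b.hi)

def mul(a: Range, b: Range) -> Range:
    prods = [a.lo * b.lo, a.lo * b.hi, a.hi * b.lo, a.hi * b.hi]
    return Range(min(prods), max(prods))

# Example: x in [0, 100], y in [-8, 7]; infer conservative widths for x + y and x * y.
x, y = Range(0, 100), Range(-8, 7)
print(add(x, y).bits(), mul(x, y).bits())
```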