980 results for DYNAMIC TEST
Abstract:
The dynamic interaction of vehicles and bridges induces live loads in bridges that are greater than the vehicle's static weight. To limit this dynamic effect, the Iowa Department of Transportation (DOT) currently requires that permitted trucks slow to five miles per hour and span the roadway centerline when crossing bridges. However, this practice has other negative consequences, such as the potential for crashes, impracticality for bridges with high traffic volumes, and higher fuel consumption. The main objective of this work was to provide information and guidance on allowable speeds for permitted vehicles and loads on bridges. A field test program was implemented on five bridges (i.e., two steel girder bridges, two pre-stressed concrete girder bridges, and one concrete slab bridge) to investigate the dynamic response of bridges under vehicle loading. The important factors taken into account during the field tests included vehicle speed, entrance conditions, vehicle characteristics (i.e., empty dump truck, full dump truck, and semi-truck), and bridge geometric characteristics (i.e., long span and short span). Three entrance conditions were used: as-is, plus Level 1 and Level 2, which simulated rough entrance conditions with a fabricated ramp placed 10 feet from the joint between the bridge end and approach slab and directly next to the joint, respectively. The researchers analyzed the field data to derive dynamic impact factors (DIFs) for all gauges installed on each bridge under the different loading scenarios.
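The abstract does not state how the DIFs were computed. A common convention in bridge load testing relates the peak gauge response of a test-speed pass to that of a crawl-speed (quasi-static) pass of the same truck; the sketch below assumes that convention, and the function name and strain records are illustrative only.

```python
# Hedged sketch, not the report's actual procedure: a common convention defines
#   DIF = R_dyn / R_stat - 1,
# where R_dyn is the peak gauge response during a test-speed pass and R_stat
# the peak during a crawl-speed (quasi-static) pass of the same truck.

def dynamic_impact_factor(dynamic_strains, static_strains):
    """DIF from two strain time histories of the same gauge and truck."""
    r_dyn = max(abs(s) for s in dynamic_strains)    # peak dynamic response
    r_stat = max(abs(s) for s in static_strains)    # peak quasi-static response
    return r_dyn / r_stat - 1.0

# Made-up microstrain records for one gauge:
dif = dynamic_impact_factor([10, 42, 55, 30], [12, 40, 44, 28])   # → 0.25
```

A DIF of 0.25 would mean the moving truck produced a 25% larger peak response than its static weight alone.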
Abstract:
Introduction: Judo is a sport that involves a wide variety of movements, actions and physical aptitudes, including postural control, balance, flexibility and strength. Among the body regions most affected in judo practice, the knee has one of the highest injury incidences. The aim of this study was to evaluate the effects of applying Dynamic Tape (DT), a biomechanical tape, on quadriceps function in male judo athletes with non-specific knee pain, in terms of balance, strength, flexibility and pain. Methodology: The sample consisted of 37 individuals, with participants tested first without Dynamic Tape (SDT) and subsequently with Dynamic Tape (CDT). The tests applied were the Standing Stork Test (SST), the Y Balance Test (YBT), the Four Square Step Test (FSST), the Single Leg Hop Test (SLHT), the lower-limb flexion test (TFMI) and the lower-limb extension test (TEMI), with the numeric pain rating scale (END) administered at the end of all tests. Results: No significant differences were observed for the SST (p=0.6794); however, the YBT, SLHT, TFMI, TEMI and END (p<0.0001), as well as the FSST (p=0.0026), showed statistically significant differences between the CDT and SDT conditions, with DT application producing positive effects on athlete performance. Conclusion: DT application was not able to significantly improve static balance, but it did influence semi-dynamic and dynamic balance, flexibility and pain.
Abstract:
A rapidly changing business environment has forced most small and medium-sized enterprises with international ambitions to reconsider their sources of competitive advantage. To survive in the face of a changing business environment, firms should utilize their dynamic organizational capabilities as well as their internationalization capabilities. Firms develop a competitive advantage if they can exploit their unique organizational competences in a new or foreign market, and also if they can acquire new capabilities as a result of engaging in foreign markets. Capabilities acquired in foreign locations enhance the existing capability portfolio of a firm that seeks to internationalize. The study combined the research streams of SME organizational dynamic capability and internationalization capability to build a complete picture of existing knowledge. An intensive case study was used to empirically test the theoretical framework of the study and was compared with the literature on various organizational capability factors and internationalization capabilities. Sormay Oy was selected because it is a successful medium-sized manufacturing company operating in Finland with a high international profile. In addition, its rate of sales growth is sufficient to warrant international engagement in matters such as acquisitions, joint ventures and partnerships. The key findings of the study suggest that medium-sized manufacturing firms have a set of core competences arising from their organizational capabilities, identified here as employee know-how and relationships with stakeholders, which aid the firm in its quest for competitive advantage, ensure production flexibility and secure the benefits present in a network.
In addition, internationalization capabilities were identified under both the RAT test and the CAT test. The primary findings suggest that the firms that outperform their competitors are those that produce products meeting specific customer and country requirements, that foresee the pitfalls of imitation by foreign local companies and by members of a particular network entered through joint ventures, acquisitions or partnerships, and that are able to acquire new capabilities in foreign markets and successfully use them to enhance or renew their capability portfolio for competitive advantage. An additional significant finding under internationalization capabilities was that Sormay Oy was able to develop a new market space for its products despite the difficult institutional environment in Russia.
Abstract:
With the increasing complexity of today's software, the software development process is becoming highly time and resource consuming. The increasing number of software configurations, input parameters, usage scenarios, supporting platforms, external dependencies, and versions plays an important role in expanding the costs of maintaining and repairing unforeseeable software faults. To repair software faults, developers spend considerable time identifying the scenarios leading to those faults and root-causing the problems. While software debugging remains largely manual, software testing and verification have been extensively automated. The goal of this research is to improve the software development process in general, and the software debugging process in particular, by devising techniques and methods for automated software debugging that leverage the advances in automatic test case generation and replay. In this research, novel algorithms are devised to discover faulty execution paths in programs by utilizing already existing software test cases, which can be either automatically or manually generated. The execution traces, or alternatively, the sequence covers of the failing test cases are extracted. Afterwards, commonalities between these test case sequence covers are extracted, processed, analyzed, and then presented to the developers in the form of subsequences that may be causing the fault. The hypothesis is that code sequences shared between a number of test cases failing for the same reason resemble the faulty execution path; hence, the search space for the faulty execution path can be narrowed down by using a large number of test cases. To achieve this goal, an efficient algorithm is implemented for finding common subsequences among a set of code sequence covers.
Optimization techniques are devised to generate shorter and more logical sequence covers, and to select subsequences with high likelihood of containing the root cause among the set of all possible common subsequences. A hybrid static/dynamic analysis approach is designed to trace back the common subsequences from the end to the root cause. A debugging tool is created to enable developers to use the approach, and integrate it with an existing Integrated Development Environment. The tool is also integrated with the environment's program editors so that developers can benefit from both the tool suggestions, and their source code counterparts. Finally, a comparison between the developed approach and the state-of-the-art techniques shows that developers need only to inspect a small number of lines in order to find the root cause of the fault. Furthermore, experimental evaluation shows that the algorithm optimizations lead to better results in terms of both the algorithm running time and the output subsequence length.
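The core step described above, extracting commonalities among the sequence covers of failing test cases, can be illustrated with a classic longest-common-subsequence routine. This is only a sketch of the idea, not the thesis's optimized algorithm; note that folding a pairwise LCS over many covers is a heuristic, since the true multi-sequence LCS is not in general obtained pairwise. The basic-block ids are made up.

```python
from functools import reduce

def lcs(a, b):
    """Longest common subsequence of two sequences, via standard DP."""
    n, m = len(a), len(b)
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(n):
        for j in range(m):
            dp[i + 1][j + 1] = dp[i][j] + 1 if a[i] == b[j] \
                else max(dp[i][j + 1], dp[i + 1][j])
    out, i, j = [], n, m          # backtrack to recover one LCS
    while i and j:
        if a[i - 1] == b[j - 1]:
            out.append(a[i - 1]); i -= 1; j -= 1
        elif dp[i - 1][j] >= dp[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return out[::-1]

# Covers are sequences of (hypothetical) basic-block ids hit by failing tests.
covers = [[1, 2, 3, 4, 5], [1, 3, 4, 6], [2, 3, 4, 5]]
common = reduce(lcs, covers)      # candidate faulty subsequence: [3, 4]
```

The resulting common subsequence is what would be presented to the developer as a candidate faulty execution path.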
Abstract:
The Graphical User Interface (GUI) is an integral component of contemporary computer software. A stable and reliable GUI is necessary for correct functioning of software applications. Comprehensive verification of the GUI is a routine part of most software development life-cycles. The input space of a GUI is typically large, making exhaustive verification difficult. GUI defects are often revealed by exercising parts of the GUI that interact with each other. It is challenging for a verification method to drive the GUI into states that might contain defects. In recent years, model-based methods that target specific GUI interactions have been developed. These methods create a formal model of the GUI's input space from the specification of the GUI, visible GUI behaviors and static analysis of the GUI's program-code. GUIs are typically dynamic: their user-visible state is guided by the underlying program-code and dynamic program-state. This research extends existing model-based GUI testing techniques by modelling interactions between the visible GUI of a GUI-based software and its underlying program-code. The new model is able to, efficiently and effectively, test the GUI in ways that were not possible using existing methods. The thesis is this: long, useful GUI testcases can be created by examining the interactions between the GUI of a GUI-based application and its program-code. To explore this thesis, a model-based GUI testing approach is formulated and evaluated. In this approach, program-code level interactions between GUI event handlers are examined, modelled and deployed for constructing long GUI testcases. These testcases are able to drive the GUI into states that were not possible using existing models. Implementation and evaluation have been conducted using GUITAR, a fully-automated, open-source GUI testing framework.
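The idea of modelling program-code interactions between event handlers can be sketched as follows. This is a toy illustration, not the thesis's actual model: the events and the read/write sets of their handlers are made up, and the rule "handler A writes a field that handler B reads" stands in for whatever code-level interaction the real analysis extracts.

```python
# Toy event-handler model: event -> (fields written, fields read).
handlers = {
    "SetText": ({"buffer"}, set()),
    "Copy":    ({"clipboard"}, {"buffer"}),
    "Paste":   ({"buffer"}, {"clipboard"}),
    "Save":    (set(), {"buffer"}),
}

def interaction_edges(handlers):
    """Edge (e1, e2) if e1's handler writes a field e2's handler reads."""
    edges = set()
    for e1, (writes, _) in handlers.items():
        for e2, (_, reads) in handlers.items():
            if e1 != e2 and writes & reads:   # def-use pair across handlers
                edges.add((e1, e2))
    return edges

def chain(start, edges, length):
    """Greedily extend a testcase along interaction edges."""
    case = [start]
    while len(case) < length:
        nxt = [b for (a, b) in sorted(edges) if a == case[-1]]
        if not nxt:
            break
        case.append(nxt[0])
    return case

edges = interaction_edges(handlers)
testcase = chain("SetText", edges, 5)   # a long testcase of interacting events
```

Chaining only events whose handlers actually interact is what lets such testcases grow long while still driving the GUI toward defect-prone states.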
Abstract:
Prior research shows that both cognitive ability (Schmidt & Hunter, 1998) and personality measures (Poropat, 2009; Hough & Furnham, 2003) are valid predictors of job performance. The dynamic nature of the relationships between cognitive ability and personality measures with performance over time spent on the job is less understood and thus this paper explores their relationships. Although there is much research to suggest that the predictive relationship between cognitive ability and performance decreases over years of tenure (e.g., Hulin, Henry, & Noon, 1990), other research suggests that the relationship between cognitive ability and performance will increase over time (Kolz, McFarland, & Silverman, 1988). In regard to personality, this study provides a critical test of two competing theories. The first position holds that the validity of personality degrades over time. Support for this position comes from the “ubiquitous” nature of the simplex pattern in individual differences (Humphreys, 1985). It follows that personality validities should perform like cognitive ability in this respect, and thus decline over time. In contrast to this viewpoint, the alternative position contends that the predictive relationship between personality variables and performance increases over time, with the correlation becoming larger in magnitude and more positive in direction over years of tenure. The results of this study support the latter position; personality validities predicted long term performance outcomes.
Abstract:
People go through their lives making all kinds of decisions, and some of these decisions affect their demand for transportation, for example, their choices of where to live and where to work, how and when to travel and which route to take. Transport-related choices are typically time dependent and characterized by a large number of alternatives that can be spatially correlated. This thesis deals with models that can be used to analyze and predict discrete choices in large-scale networks. The proposed models and methods are highly relevant for, but not limited to, transport applications. We model decisions as sequences of choices within the dynamic discrete choice framework, also known as parametric Markov decision processes. Such models are known to be difficult to estimate and to apply for prediction because dynamic programming problems need to be solved in order to compute choice probabilities. In this thesis we show that it is possible to exploit the network structure and the flexibility of dynamic programming so that the dynamic discrete choice modeling approach is not only useful for modeling time-dependent choices, but also makes it easier to model large-scale static choices. The thesis consists of seven articles containing a number of models and methods for estimating, applying and testing large-scale discrete choice models. In the following we group the contributions under three themes: route choice modeling, large-scale multivariate extreme value (MEV) model estimation and nonlinear optimization algorithms. Five articles are related to route choice modeling. We propose different dynamic discrete choice models that allow paths to be correlated, based on the MEV and mixed logit models. The resulting route choice models become expensive to estimate, and we deal with this challenge by proposing innovative methods that reduce the estimation cost.
For example, we propose a decomposition method that not only opens up the possibility of mixing, but also speeds up the estimation for simple logit models, which also has implications for traffic simulation. Moreover, we compare the utility maximization and regret minimization decision rules, and we propose a misspecification test for logit-based route choice models. The second theme is related to the estimation of static discrete choice models with large choice sets. We establish that a class of MEV models can be reformulated as dynamic discrete choice models on the networks of correlation structures. These dynamic models can then be estimated quickly using dynamic programming techniques and an efficient nonlinear optimization algorithm. Finally, the third theme focuses on structured quasi-Newton techniques for estimating discrete choice models by maximum likelihood. We examine and adapt switching methods that can be easily integrated into standard optimization algorithms (line search and trust region) to accelerate the estimation process. The proposed dynamic discrete choice models and estimation methods can be used in various discrete choice applications. In the area of big data analytics, models that can deal with large choice sets and sequential choices are important. Our research can therefore be of interest in various demand analysis applications (predictive analytics) or can be integrated with optimization models (prescriptive analytics). Furthermore, our studies indicate the potential of dynamic programming techniques in this context, even for static models, which opens up a variety of future research directions.
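The link between dynamic discrete choice and dynamic programming described above can be illustrated with a minimal recursive-logit-style computation on an acyclic network. This is a generic textbook sketch, not the thesis's estimator: the network, arc costs, and node names are made up, and a scale parameter of one is assumed.

```python
import math

# On an acyclic network, the expected maximum utility ("value") at node k is
#   V(k) = log( sum over arcs (k, j) of exp(-c(k, j) + V(j)) ),  V(dest) = 0,
# and arc choice probabilities take a logit form.
arcs = {  # node -> list of (next_node, cost); a small DAG toward "D"
    "A": [("B", 1.0), ("C", 2.0)],
    "B": [("D", 1.0)],
    "C": [("D", 0.5)],
    "D": [],
}

def values(arcs, dest, order):
    """Backward induction in reverse topological order (dest first)."""
    V = {dest: 0.0}
    for k in order:
        if k == dest:
            continue
        V[k] = math.log(sum(math.exp(-c + V[j]) for j, c in arcs[k]))
    return V

def choice_probs(arcs, V, k):
    """Logit probability of each outgoing arc at node k."""
    return {j: math.exp(-c + V[j] - V[k]) for j, c in arcs[k]}

V = values(arcs, "D", ["D", "B", "C", "A"])
P = choice_probs(arcs, V, "A")   # probabilities of leaving A via B or C
```

Because the value function is computed once by dynamic programming, path probabilities never require enumerating the (potentially huge) set of alternative routes, which is the point the thesis exploits at scale.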
Abstract:
The aim of this thesis is to test the ability of several correlative models, such as the Alpert correlations (1972, re-examined in 2011), the investigation of Heskestad and Delichatsios (1978), and the correlations produced by Cooper (1982), to define both the dynamic and thermal characteristics of a fire-induced ceiling-jet flow. The flow occurs when the fire plume impinges on the ceiling and develops radially from the fire axis. Both temperature and velocity predictions are decisive for sprinkler positioning, fire-alarm and detector (heat, smoke) positions and activation times, and back-layering predictions. These correlative models are compared with the numerical simulation software CFAST for temperature and velocity near the ceiling, and the results are also compared with a Computational Fluid Dynamics (CFD) analysis using ANSYS FLUENT.
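The Alpert (1972) ceiling-jet correlations referenced above have a widely cited closed form (SI units: heat release rate Q in kW, ceiling height H and radial distance r in m). The sketch below assumes the commonly quoted coefficients and region boundaries; the thesis may use refined values from the 2011 re-examination.

```python
import math

def ceiling_jet_temp(Q, H, r, T_amb=20.0):
    """Alpert (1972) maximum ceiling-jet temperature [deg C]."""
    if r / H <= 0.18:                              # turning region near the axis
        dT = 16.9 * Q ** (2 / 3) / H ** (5 / 3)
    else:                                          # radially developing jet
        dT = 5.38 * (Q / r) ** (2 / 3) / H
    return T_amb + dT

def ceiling_jet_velocity(Q, H, r):
    """Alpert (1972) maximum ceiling-jet velocity [m/s]."""
    if r / H <= 0.15:
        return 0.96 * (Q / H) ** (1 / 3)
    return 0.195 * Q ** (1 / 3) * H ** (1 / 2) / r ** (5 / 6)

# Example: 1 MW fire under a 5 m ceiling, probed near the axis and at r = 3 m.
T_near = ceiling_jet_temp(1000.0, 5.0, 0.5)        # ~136 deg C
T_far = ceiling_jet_temp(1000.0, 5.0, 3.0)         # cooler farther out
```

Predictions like these drive sprinkler and detector activation estimates, which is why the thesis benchmarks them against CFAST and FLUENT.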
Abstract:
Dissertation for the Integrated Master's degree in Veterinary Medicine
Abstract:
The goal of this project is to learn the necessary steps to create a finite element model that can accurately predict the dynamic response of a Kohler Engines Heavy Duty Air Cleaner (HDAC). This air cleaner is composed of three glass-reinforced plastic components and two air filters. Several uncertainties arose in the finite element (FE) model due to the HDAC's component material properties and assembly conditions. To help understand and mitigate these uncertainties, analytical and experimental modal models were created concurrently to perform a model correlation and calibration. Over the course of the project, simple and practical methods were found for future FE model creation. Similarly, an experimental method for the optimal acquisition of experimental modal data was established. After the model correlation and calibration was performed, a validation experiment was used to confirm the FE model's predictive capabilities.
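Correlation between analytical and experimental modal models is commonly quantified with the Modal Assurance Criterion (MAC); the abstract does not name the metric used, so treating MAC as the correlation measure here is an assumption, and the mode-shape vectors below are made up.

```python
def mac(phi_a, phi_e):
    """Modal Assurance Criterion between two real mode-shape vectors.

    MAC = |phi_a . phi_e|^2 / ((phi_a . phi_a) * (phi_e . phi_e));
    1.0 means perfectly correlated shapes, 0.0 means orthogonal shapes.
    """
    num = sum(a * e for a, e in zip(phi_a, phi_e)) ** 2
    den = sum(a * a for a in phi_a) * sum(e * e for e in phi_e)
    return num / den

# Illustrative FE vs. test mode shapes at three measurement points:
m = mac([0.32, 0.71, 1.00], [0.30, 0.69, 0.98])   # close to 1: good correlation
```

In a correlation study, a MAC matrix over all FE/test mode pairs is typically inspected; off-diagonal values near zero and diagonal values near one indicate a well-calibrated model.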
Abstract:
Traditional decision making research has often focused on one's ability to choose from a set of prefixed options, ignoring the process by which decision makers generate courses of action (i.e., options) in situ (Klein, 1993). In complex and dynamic domains, this option generation process is particularly critical to understanding how successful decisions are made (Zsambok & Klein, 1997). When generating response options for oneself to pursue (i.e., during the intervention phase of decision making), previous research has supported quick and intuitive heuristics, such as the Take-The-First heuristic (TTF; Johnson & Raab, 2003). When generating predictive options for others in the environment (i.e., during the assessment phase of decision making), previous research has supported the situational-model-building process described by Long Term Working Memory theory (LTWM; see Ward, Ericsson, & Williams, 2013). In the first three experiments, the claims of TTF and LTWM are tested during assessment- and intervention-phase tasks in soccer. To test what other environmental constraints may dictate the use of these cognitive mechanisms, the claims of these models are also tested in the presence and absence of time pressure. In addition to understanding the option generation process, it is important that researchers in complex and dynamic domains also develop tools that can be used by 'real-world' professionals. For this reason, three more experiments were conducted to evaluate the effectiveness of a new online assessment of perceptual-cognitive skill in soccer. This test differentiated between skill groups, predicted performance on a previously established test, and predicted option generation behavior. The test also outperformed domain-general cognitive tests, but not a domain-specific knowledge test, when predicting skill group membership. Implications for theory and training, and future directions for the development of applied tools, are discussed.
Abstract:
Although DMTA is nowadays one of the most widely used techniques to characterize the thermo-mechanical behaviour of polymers, it is only effective for small-amplitude oscillatory tests and is limited to single-frequency analysis (the linear regime). In this thesis work, a Fourier-transform-based experimental system has proven to give insight into structural and chemical changes in specimens during large-amplitude oscillatory tests by exploiting multi-frequency spectral analysis, making it a more sensitive tool than the classical linear approach. The test campaign focused on three test typologies: strain sweep tests, damage investigation, and temperature sweep tests.
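The multi-frequency analysis mentioned above rests on a standard observation: under large-amplitude sinusoidal strain, a nonlinear material response generates odd higher harmonics in the stress signal, and the third-harmonic-to-fundamental intensity ratio I3/I1 is a common nonlinearity measure. The sketch below uses a synthetic stress signal, not data from the thesis.

```python
import cmath
import math

def dft_mag(x, k):
    """Magnitude of the k-th DFT bin of samples x (normalized by len(x))."""
    n = len(x)
    return abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                   for t in range(n))) / n

n = 256
t = [2 * math.pi * i / n for i in range(n)]               # one fundamental period
# Synthetic stress: fundamental plus a 10% third harmonic (nonlinear response).
stress = [math.sin(w) + 0.1 * math.sin(3 * w) for w in t]

i1 = dft_mag(stress, 1)    # fundamental intensity
i3 = dft_mag(stress, 3)    # third-harmonic intensity
ratio = i3 / i1            # I3/I1, about 0.1 for this synthetic signal
```

In a linear (small-amplitude) test the ratio would sit near zero; its growth with strain amplitude is what flags structural and chemical changes in the specimen.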
Abstract:
The following text is a business plan for SurfScholar, an advising service to attract international students to pursue entire academic degrees in Portugal's most dynamic Surf-City: Lisbon. SurfScholar is presented as a viable business concept with environmental, economic and socio-cultural sustainability as guiding principles. For a variety of reasons, Lisbon is becoming an increasingly popular destination for international student mobility. Additionally, Lisbon and its surrounding coastal areas have been experiencing a recent boom in surf-related tourism. The text goes into detail about how SurfScholar combines educational tourism and surf tourism by promoting Lisbon as the perfect destination to be both an international student and a surf tourist. To test market interest in this concept, a simple website was created with a call-to-action. With minimal marketing, SurfScholar received a robust amount of interest from people around the world. SurfScholar's mission is to be at the forefront of linking educational tourism and surf tourism and to explore Portugal's potential as the premier global destination for this new niche segment of tourism. SurfScholar's business plan is formatted in accordance with the United States Agency for International Development's Sustainable Tourism Enterprise Development: A Business Planning Approach (Humke & Hilbrunner, n.d.).
Abstract:
Braille is a communication tool in decline: in America by 80% since 1950, and in the UK to the extent that only 1% of blind people are now thought to read Braille.[1,2] There are a variety of causal factors, including the phasing out of Braille instruction due to the educational mainstreaming of blind children and the resistance to learning Braille by those who lose sight later in life.[3] Braille is a writing system of raised dots that allows blind people to read and write tactilely. Each Braille character comprises a cell of six potentially raised dots, two dots across and three dots down. It is designed only to communicate the message and does not convey the tonality provided by visual fonts. However, in his book Design Meets Disability, Graham Pullin observes that "Braille is interesting and beautiful, as abstract visual and tactile decoration, intriguing and indecipherable to the nonreader" and continues, "…braille could be decorative for sighted people."[4] I assert that the increasing abandonment of Braille frees it from its restrictive constraints, opening it to exploration and experimentation, and that this may result in Braille becoming a dynamic expression for the sighted, as well as the partially sighted and blind. Printmaking is well suited to this exploration. Printmaking processes and techniques can result in prints aesthetically compelling to both the senses of sight and touch. Established approaches, such as flocking, varnishes, puff-ink, embossing and die cutting, combined with experiments in new techniques in laser cutting and 3D printing, create visually and texturally vibrant prints. In this paper I detail my systematic investigation of sensually expressive printmaking, concentrating on the issues surrounding Braille as a printmaking design element and paying particular attention to the approaches and techniques used not only to produce its visual style but also to keep it integrally tactile.
Abstract:
Safe collaboration between a robot and a human operator is a critical requirement for deploying a robotic system in a manufacturing and testing environment. In this dissertation, the safety requirement is developed and implemented for the navigation system of mobile manipulators. A methodology for human-robot co-existence through 3D scene analysis is also investigated. The proposed approach exploits advances in computing capability by relying on graphics processing units (GPUs) for volumetric predictive human-robot contact checking. Apart from guaranteeing operator safety, human-robot collaboration is also fundamental when cooperative activities are required, as on an appliance test-automation floor. To achieve this, a generalized hierarchical task controller scheme for collision avoidance is developed. This allows the robotic arm to safely approach and inspect the interior of the appliance without collision during the testing procedure. The unpredictable presence of operators also forms a fast-changing dynamic obstacle, requiring a quick reaction from the robot. In this respect, a GPU-accelerated distance field is computed to speed up the reaction time for avoiding collisions between the human operator and the robot. Automated appliance testing also involves robotized laundry loading and unloading during life-cycle testing. This task involves laundry detection, grasp pose estimation, and manipulation in a container, inside the drum, and during recovery grasping. Wrinkle and blob detection algorithms for grasp pose estimation are developed, and grasp poses are calculated along the wrinkles and blobs to perform the grasping task efficiently. By ranking the estimated laundry grasp poses according to a predefined cost function, the robotic arm attempts grasp poses that are more comfortable from the robot's kinematic side as well as collision-free on the appliance side. This is achieved through appliance detection, full-model registration, and collision-free trajectory execution using online collision avoidance.
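The grasp-pose ranking step can be sketched as a simple cost-sorted selection. The cost terms, weights, and pose fields below are illustrative only, not the dissertation's actual cost function; the intent is just to show how collision-free, kinematically comfortable poses are preferred.

```python
# Hypothetical grasp-pose ranking: lower cost = more comfortable, reachable
# grasp; colliding candidates are pushed to the end with an infinite cost.

def grasp_cost(pose, w_reach=1.0, w_joint=0.5):
    """Predefined cost of a candidate grasp pose (fields are illustrative)."""
    if pose["in_collision"]:
        return float("inf")       # collision-checked poses are never attempted
    return (w_reach * pose["reach_dist"]
            + w_joint * pose["joint_limit_penalty"])

def rank_grasps(poses):
    """Order candidates so the arm attempts the cheapest grasp first."""
    return sorted(poses, key=grasp_cost)

candidates = [
    {"id": 1, "reach_dist": 0.40, "joint_limit_penalty": 0.2, "in_collision": False},
    {"id": 2, "reach_dist": 0.25, "joint_limit_penalty": 0.1, "in_collision": False},
    {"id": 3, "reach_dist": 0.10, "joint_limit_penalty": 0.0, "in_collision": True},
]
best = rank_grasps(candidates)[0]   # cheapest collision-free grasp
```

In practice the collision flag would come from the GPU-accelerated distance field and the reachability terms from the arm's kinematic model; here both are stubbed with made-up numbers.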