917 results for High dynamic vehicles


Relevance:

30.00%

Publisher:

Abstract:

This thesis makes use of the unique reregulation of the pharmaceutical monopoly in Sweden to critically examine intra-industry firm heterogeneity. It contributes to existing divestiture research by studying the dynamics between reconfigurations of value constellations and their effects on the value creation of divested pharmacies. Because the findings showed that the predominant theory of intra-industry firm heterogeneity could not explain firm performance, the value constellation concept was applied, as it captured the phenomenon. A patterned finding showed how reconfigurations of value constellations in a reregulated market characterized by strict rules, regulations, and high competition did not generate additional value for firms in the short term. My study unveils that value creation is hampered in situations where rules and regulations significantly affect firms' ability to reconfigure their value constellations. The key practical implication is an alternative perspective on fundamental aspects of the reregulation and on how policy-makers may impede firm performance and the intended creation of new value, not only for firms but for society as a whole.

Relevance:

30.00%

Publisher:

Abstract:

[EN] Polycyclic aromatic hydrocarbons (PAHs) pose a potential risk to human health and to marine biota in general, which makes their monitoring necessary. A miniaturized extraction system capable of extracting PAHs from seawater was developed and optimized, with the objective of implementing it in an oceanographic buoy in the future. An analytical method based on high-performance liquid chromatography was optimized for the determination of the PAHs recovered by the extraction system. The analytical method was validated and applied to real samples from different points around Gran Canaria. The method is sensitive enough to detect and quantify concentrations below those established in the legislation. At some of the sampling sites certain compounds exceeded the legislative limits, while others complied with them.

Relevance:

30.00%

Publisher:

Abstract:

This study had three objectives: (1) to develop a comprehensive truck simulation that executes rapidly, has a modular program construction to allow variation of vehicle characteristics, and is able to realistically predict vehicle motion and the tire-road surface interaction forces; (2) to develop a model of doweled portland cement concrete pavement that can be used to determine slab deflection and stress at predetermined nodes, and that allows for the variation of traditional thickness design factors; and (3) to implement these two models on a workstation with suitable menu-driven modules so that both existing and proposed pavements can be evaluated with respect to design life, given specific characteristics of the heavy vehicles that will be using the facility. This report summarizes the work performed during the first year of the study. Briefly, the following has been accomplished: a two-dimensional model of a typical 3-S2 tractor-trailer combination was created; a finite element structural analysis program, ANSYS, was used to model the pavement; and computer runs have been performed varying the parameters defining both vehicle and road elements. The resulting time-specific displacements for each node are plotted, and the displacement basin is generated for defined vehicles. Relative damage to the pavement can then be estimated. A damage function resulting from load replications must be assumed that will be reflected in further pavement deterioration. Comparison with actual damage on Interstate 80 will eventually allow verification of these procedures.
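The abstract leaves the damage function open; a conventional assumption in pavement engineering is a fourth-power law, in which relative damage grows with roughly the fourth power of axle load. A minimal sketch of such an estimate (the 80 kN reference axle, the exponent, and the axle loads are all illustrative assumptions, not values from the study):

```python
# Sketch of a relative pavement damage estimate using a fourth-power law.
# The 80 kN reference axle and the 4.0 exponent are conventional pavement
# engineering assumptions, not values taken from this study.
REFERENCE_AXLE_KN = 80.0
DAMAGE_EXPONENT = 4.0

def relative_damage(axle_loads_kn, replications):
    """Damage accumulated by repeated passes, in equivalent standard axle loads."""
    per_pass = sum((load / REFERENCE_AXLE_KN) ** DAMAGE_EXPONENT
                   for load in axle_loads_kn)
    return per_pass * replications

# Illustrative 3-S2 combination: one steer axle plus two tandem axle pairs.
print(relative_damage([53.0, 76.0, 76.0, 76.0, 76.0], replications=1_000_000))
```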

Relevance:

30.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance:

30.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance:

30.00%

Publisher:

Abstract:

In the casting of reactive metals, such as titanium alloys, contamination can be prevented if there is no contact between the hot liquid metal and the solid crucible. This can be achieved by containing the liquid metal by means of a high-frequency AC magnetic field. A water-cooled current-carrying coil surrounding the metal can then provide the required Lorentz forces, and at the same time the current induced in the metal can provide the heating required to melt it. This 'attractive' processing solution, however, has many problems, the most serious being the control and containment of the liquid metal envelope, which requires a balance of the gravity and induced inertia forces on the one side, and the containing Lorentz and surface tension forces on the other. Modelling this process requires a fully coupled dynamic solution accounting for the flow fields, the magnetic field, and the heat transfer/melting process. A simplified solution has been published previously, providing quasi-static solutions only by taking just the irrotational 'magnetic pressure' term of the Lorentz force into account. The authors remedy this deficiency by modelling the full problem using CFD techniques. The salient features of these techniques are included in this paper, as space allows.
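For context, the decomposition referred to above follows from Ampère's law, J = ∇×B/μ₀: the Lorentz force splits into an irrotational 'magnetic pressure' gradient (the only part retained in the earlier quasi-static solutions) and a rotational part that drives stirring in the melt. In standard form (stated here for reference, not quoted from the paper):

\[
\mathbf{f} \;=\; \mathbf{J}\times\mathbf{B}
\;=\; -\,\nabla\!\left(\frac{B^{2}}{2\mu_{0}}\right)
\;+\; \frac{1}{\mu_{0}}\,(\mathbf{B}\cdot\nabla)\,\mathbf{B}
\]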

Relevance:

30.00%

Publisher:

Abstract:

A large class of computational problems is characterised by frequent synchronisation and computational requirements which change as a function of time. When such a problem is solved on a message passing multiprocessor machine [5], the combination of these characteristics leads to system performance which deteriorates in time. As the communication performance of parallel hardware steadily improves, load balance becomes a dominant factor in obtaining high parallel efficiency. Performance can be improved by periodic redistribution of computational load; however, redistribution can sometimes be very costly. We study the issue of deciding when to invoke a global load re-balancing mechanism. Such a decision policy must actively weigh the costs of remapping against the performance benefits, and should be general enough to apply automatically to a wide range of computations. This paper discusses a generic strategy for Dynamic Load Balancing (DLB) in unstructured mesh computational mechanics applications. The strategy is intended to handle varying levels of load change throughout the run. The major issues involved in a generic dynamic load balancing scheme are investigated, together with techniques to automate the implementation of a dynamic load balancing mechanism within the Computer Aided Parallelisation Tools (CAPTools) environment, a semi-automatic tool for the parallelisation of mesh-based FORTRAN codes.
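One simple form such a decision policy can take (a sketch; the cost model, names, and numbers are assumptions, not the paper's strategy) is to re-balance when the time lost to imbalance since the last remap exceeds the one-off cost of remapping:

```python
# Hypothetical remap-decision policy: invoke the global re-balancer when the
# time lost to load imbalance since the last remap outweighs the remap cost.
def should_remap(step_profiles, remap_cost):
    """step_profiles: list of (max_proc_time, mean_proc_time) per iteration,
    gathered since the last remap. A perfectly balanced step would take the
    mean time, so the per-step loss is max - mean."""
    lost = sum(t_max - t_mean for t_max, t_mean in step_profiles)
    return lost > remap_cost

# Example: four iterations at 1.2 s on the slowest processor vs 1.0 s mean,
# against an assumed 0.5 s remap cost -> 0.8 s lost > 0.5 s, so remap.
print(should_remap([(1.2, 1.0)] * 4, remap_cost=0.5))
```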

Relevance:

30.00%

Publisher:

Abstract:

In many areas of simulation, a crucial component for efficient numerical computations is the use of solution-driven adaptive features: locally adapted meshing or re-meshing, and dynamically changing computational tasks. The full advantages of high performance computing (HPC) technology will thus only be exploited when efficient parallel adaptive solvers can be realised. The resulting requirement for HPC software is dynamic load balancing, which for many mesh-based applications means dynamic mesh re-partitioning. The DRAMA project has been initiated to address this issue, with a particular focus on the requirements of industrial Finite Element codes, though codes using Finite Volume formulations will also be able to make use of the project results.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a new dynamic load balancing technique for structured mesh computational mechanics codes in which the processor partition range limits of just one of the partitioned dimensions are non-coincidental, as opposed to using coincidental limits in all of the partitioned dimensions. The partition range limits are 'staggered', allowing greater flexibility in obtaining a balanced load distribution than when the limits are changed 'globally', as the load increase/decrease on one processor no longer restricts the load decrease/increase on a neighbouring processor. The automatic implementation of this 'staggered' load balancing strategy within an existing parallel code is presented in this paper, along with some preliminary results.
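To illustrate the idea (a sketch under assumed data structures, not the paper's implementation), staggered limits let each processor-column strip choose its own row break points from its load profile, rather than forcing one global set of row limits across all strips:

```python
import numpy as np

def staggered_row_limits(load, col_strips, proc_rows):
    """load: 2-D array of per-cell work. col_strips: list of column slices,
    one per processor column. Returns, for each strip, its own row break
    points balancing that strip's load across proc_rows partitions; with
    'global' (coincidental) limits every strip would share one set."""
    all_limits = []
    for cols in col_strips:
        strip_load = load[:, cols].sum(axis=1)          # load per mesh row
        cum = np.cumsum(strip_load)
        targets = cum[-1] * np.arange(1, proc_rows) / proc_rows
        all_limits.append(np.searchsorted(cum, targets).tolist())
    return all_limits

# 2x2 processor grid over a 100x100 mesh with a load hot spot on the left:
load = np.ones((100, 100)); load[:30, :50] = 5.0
print(staggered_row_limits(load, [slice(0, 50), slice(50, 100)], proc_rows=2))
# -> [[21], [49]]: the left strip's row limit shifts up towards the hot spot.
```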

Relevance:

30.00%

Publisher:

Abstract:

A rapidly changing business environment has led most small and medium-sized enterprises with international ambitions to reconsider their sources of competitive advantage. To survive in the face of a changing business environment, firms should utilize their dynamic organizational capabilities as well as their internationalization capabilities. Firms develop a competitive advantage if they can exploit their unique organizational competences in a new or foreign market, and also if they can acquire new capabilities as a result of engaging in foreign markets. The capabilities acquired in foreign locations enhance the existing capability portfolio of a firm with a desire to internationalize. The study combined the research streams of SME organizational dynamic capability and internationalization capability to build a complete picture of the existing knowledge. An intensive case study was used to empirically test the theoretical framework of the study and to compare it with the literature on various organizational capability factors and internationalization capabilities. Sormay Oy was selected because it is a successful medium-sized company, operating in the Finnish manufacturing industry, with a high international profile. In addition, its rate of growth in sales is sufficient to warrant international engagement in matters such as acquisitions, joint ventures, and partnerships. The key findings of the study suggest that medium-sized manufacturing firms have a set of core competences arising from their organizational capabilities, identified here as employee know-how and relationships with stakeholders, which aid the firm in its quest to attain competitive advantage, ensure production flexibility, and gain the benefits present in a network. In addition, internationalization capabilities were identified under both the RAT test and the CAT test. The primary findings suggest that the firms that outperform their competitors are those that produce products meeting specific customer and country requirements; that foresee the pitfalls of imitation by foreign local companies and by members of a particular network joined through joint ventures, acquisitions, or partnerships; and that are capable of acquiring new capabilities in foreign markets and successfully using these acquired capabilities to enhance or renew their capability portfolio for competitive advantage. A further significant finding under internationalization capabilities was that Sormay Oy was able to develop a new market space for its products despite the difficult institutional environment present in Russia.

Relevance:

30.00%

Publisher:

Abstract:

In today's fast-paced and interconnected digital world, the data generated by an increasing number of applications is being modeled as dynamic graphs. The graph structure encodes relationships among data items, while the structural changes to the graphs as well as the continuous stream of information produced by the entities in these graphs make them dynamic in nature. Examples include social networks where users post status updates, images, videos, etc.; phone call networks where nodes may send text messages or place phone calls; road traffic networks where the traffic behavior of the road segments changes constantly, and so on. There is tremendous value in storing, managing, and analyzing such dynamic graphs and deriving meaningful insights in real-time. However, a majority of the work in graph analytics assumes a static setting, and there is a lack of systematic study of the various dynamic scenarios, the complexity they impose on the analysis tasks, and the challenges in building efficient systems that can support such tasks at a large scale. In this dissertation, I design a unified streaming graph data management framework, and develop prototype systems to support increasingly complex tasks on dynamic graphs. In the first part, I focus on the management and querying of distributed graph data. I develop a hybrid replication policy that monitors the read-write frequencies of the nodes to decide dynamically what data to replicate, and whether to do eager or lazy replication in order to minimize network communication and support low-latency querying. In the second part, I study parallel execution of continuous neighborhood-driven aggregates, where each node aggregates the information generated in its neighborhoods. I build my system around the notion of an aggregation overlay graph, a pre-compiled data structure that enables sharing of partial aggregates across different queries, and also allows partial pre-computation of the aggregates to minimize the query latencies and increase throughput. Finally, I extend the framework to support continuous detection and analysis of activity-based subgraphs, where subgraphs could be specified using both graph structure as well as activity conditions on the nodes. The query specification tasks in my system are expressed using a set of active structural primitives, which allows the query evaluator to use a set of novel optimization techniques, thereby achieving high throughput. Overall, in this dissertation, I define and investigate a set of novel tasks on dynamic graphs, design scalable optimization techniques, build prototype systems, and show the effectiveness of the proposed techniques through extensive evaluation using large-scale real and synthetic datasets.
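A rough sketch of such a frequency-driven policy is given below; the function name, thresholds, and decision rule are illustrative assumptions rather than the dissertation's actual policy:

```python
# Hypothetical read/write-frequency-driven replication policy: replicate
# hot-read nodes, and push updates eagerly only when reads dominate writes.
def replication_decision(reads_per_sec, writes_per_sec,
                         replicate_threshold=10.0, eager_ratio=5.0):
    """Returns (replicate?, mode). Thresholds are illustrative assumptions."""
    if reads_per_sec < replicate_threshold:
        return False, None                    # cold node: read remotely
    if reads_per_sec >= eager_ratio * max(writes_per_sec, 1e-9):
        return True, "eager"                  # reads dominate: push updates
    return True, "lazy"                       # writes frequent: pull on read

print(replication_decision(reads_per_sec=50.0, writes_per_sec=2.0))   # eager
print(replication_decision(reads_per_sec=50.0, writes_per_sec=40.0))  # lazy
```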

Relevance:

30.00%

Publisher:

Abstract:

Power system engineers face a double challenge: to operate electric power systems within narrow stability and security margins, and to maintain high reliability. There is an acute need to better understand the dynamic nature of power systems in order to be prepared for critical situations as they arise. Innovative measurement tools, such as phasor measurement units, can capture not only the slow variation of the voltages and currents but also the underlying oscillations in a power system. Such dynamic data accessibility provides both a strong motivation and a useful tool for exploring dynamic-data-driven applications in power systems. To fulfill this goal, this dissertation focuses on three areas: developing accurate dynamic load models and updating variable parameters based on the measurement data; applying advanced nonlinear filtering concepts and technologies to real-time identification of power system models; and addressing computational issues by implementing the balanced truncation method. By obtaining more realistic system models, together with timely updated parameters and consideration of stochastic influences, we can form an accurate portrait of the ongoing phenomena in an electrical power system, and hence further improve state estimation, stability analysis, and real-time operation.
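As a point of reference for the last item, a textbook square-root balanced truncation of a stable, minimal linear model (A, B, C) can be sketched as follows; this is a standard construction under those assumptions, not the dissertation's implementation:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

def balanced_truncation(A, B, C, r):
    """Square-root balanced truncation of a stable, minimal system (A, B, C),
    keeping the r states with the largest Hankel singular values."""
    P = solve_continuous_lyapunov(A, -B @ B.T)    # controllability Gramian
    Q = solve_continuous_lyapunov(A.T, -C.T @ C)  # observability Gramian
    Lp = cholesky(P, lower=True)
    Lq = cholesky(Q, lower=True)
    U, s, Vt = svd(Lq.T @ Lp)                     # s: Hankel singular values
    S_r = np.diag(s[:r] ** -0.5)
    T = Lp @ Vt.T[:, :r] @ S_r                    # right projection
    Ti = S_r @ U[:, :r].T @ Lq.T                  # left projection (Ti @ T = I)
    return Ti @ A @ T, Ti @ B, C @ T, s           # reduced (A, B, C) + HSVs
```

The trailing Hankel singular values also bound the truncation error, which is what makes the method attractive for reducing large dynamic models before real-time use.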

Relevance:

30.00%

Publisher:

Abstract:

Permeability of a rock is a dynamic property that varies spatially and temporally. Fractures provide the most efficient channels for fluid flow and thus directly contribute to the permeability of the system. Fractures usually form as a result of a combination of tectonic stresses, gravity (i.e. lithostatic pressure) and fluid pressures. High pressure gradients alone can cause fracturing, a process termed hydrofracturing, which can determine caprock (seal) stability or reservoir integrity. Fluids also transport mass and heat, and are responsible for the formation of veins by precipitating minerals within open fractures. Veining (healing) thus directly influences the rock's permeability. Upon deformation these closed fractures (veins) can refracture, and the cycle starts again. This fracturing-healing-refracturing cycle is a fundamental part of studying the deformation dynamics and permeability evolution of rock systems. This is generally accompanied by fracture network characterization focusing on network topology, which determines network connectivity. Fracture characterization makes it possible to acquire quantitative and qualitative data on fractures and forms an important part of reservoir modeling. This thesis highlights the importance of fracture healing and of the veins' mechanical properties for the deformation dynamics. It shows that permeability varies spatially and temporally, and that healed systems (veined rocks) should not be treated as fractured systems (rocks without veins). Field observations also demonstrate the influence of contrasting mechanical properties, in addition to the complexities of vein microstructures that can form in low-porosity and low-permeability layered sequences. The thesis also presents graph theory as a characterization method for obtaining statistical measures on evolving network connectivity. It also proposes which measures a good reservoir should exhibit to have potentially large permeability and robustness against healing. The results presented in the thesis can have applications for hydrocarbon and geothermal reservoir exploration, the mining industry, underground waste disposal, CO2 injection and groundwater modeling.
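As an illustration of the graph-theoretic characterization (the network below is invented for the example), encoding fracture intersections as nodes and open segments as edges makes connectivity measures, and the effect of healing, directly computable:

```python
import networkx as nx

# Hypothetical fracture network: nodes are fracture intersections, edges are
# open fracture segments; "healing" a vein removes its edge from the graph.
G = nx.Graph()
G.add_edges_from([(0, 1), (1, 2), (2, 3), (1, 4), (4, 5), (6, 7)])

print(nx.number_connected_components(G))              # 2: one isolated segment
print(len(max(nx.connected_components(G), key=len)))  # size of largest cluster
G.remove_edge(1, 2)                                   # a vein heals (seals)
print(nx.number_connected_components(G))              # connectivity drops to 3
```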

Relevance:

30.00%

Publisher:

Abstract:

With the increasing complexity of today's software, the software development process is becoming highly time- and resource-consuming. The increasing number of software configurations, input parameters, usage scenarios, supporting platforms, external dependencies, and versions plays an important role in expanding the costs of maintaining and repairing unforeseeable software faults. To repair software faults, developers spend considerable time identifying the scenarios leading to those faults and root-causing the problems. While software debugging remains largely manual, this is not the case with software testing and verification. The goal of this research is to improve the software development process in general, and the software debugging process in particular, by devising techniques and methods for automated software debugging which leverage the advances in automatic test case generation and replay. In this research, novel algorithms are devised to discover faulty execution paths in programs by utilizing already existing software test cases, which can be either automatically or manually generated. The execution traces, or alternatively the sequence covers, of the failing test cases are extracted. Afterwards, commonalities between these test case sequence covers are extracted, processed, analyzed, and then presented to the developers in the form of subsequences that may be causing the fault. The hypothesis is that code sequences shared between a number of test cases failing for the same reason resemble the faulty execution path, and hence the search space for the faulty execution path can be narrowed down by using a large number of test cases. To achieve this goal, an efficient algorithm is implemented for finding common subsequences among a set of code sequence covers. Optimization techniques are devised to generate shorter and more logical sequence covers, and to select subsequences with a high likelihood of containing the root cause from the set of all possible common subsequences. A hybrid static/dynamic analysis approach is designed to trace the common subsequences back from the end to the root cause. A debugging tool is created to enable developers to use the approach and to integrate it with an existing Integrated Development Environment. The tool is also integrated with the environment's program editors so that developers can benefit from both the tool's suggestions and their source code counterparts. Finally, a comparison between the developed approach and the state-of-the-art techniques shows that developers need only inspect a small number of lines in order to find the root cause of the fault. Furthermore, experimental evaluation shows that the algorithm optimizations lead to better results in terms of both the algorithm running time and the output subsequence length.
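As a minimal illustration of the common-subsequence step (the function and inputs below are hypothetical, and the dissertation's algorithm is more efficient than this textbook recursion), the shared subsequence of two failing covers narrows the candidate faulty path:

```python
from functools import lru_cache

# Sketch of the core step: the longest common subsequence of two failing
# test cases' sequence covers (each a list of executed code-block IDs).
# Shared subsequences are candidate fragments of the faulty execution path.
def common_subsequence(cover_a, cover_b):
    @lru_cache(maxsize=None)
    def lcs(i, j):
        if i == len(cover_a) or j == len(cover_b):
            return ()
        if cover_a[i] == cover_b[j]:
            return (cover_a[i],) + lcs(i + 1, j + 1)
        left, right = lcs(i + 1, j), lcs(i, j + 1)
        return left if len(left) >= len(right) else right
    return list(lcs(0, 0))

# Two failing runs that diverge but share a suspicious core:
print(common_subsequence(["init", "parse", "f3", "write"],
                         ["init", "f3", "log", "write"]))
# -> ['init', 'f3', 'write']
```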

Relevance:

30.00%

Publisher:

Abstract:

The aim of my Ph.D. thesis is to generalize a method for targeted anti-cancer drug delivery. Hydrophilic polymer-drug conjugates involve complicated synthesis; drug-encapsulated polymeric nanoparticles limit the loading capability of payloads. This thesis introduces the concept of nanoconjugates to overcome these difficulties in synthesis and formulation. Drugs with a hydroxyl group are able to initiate polyester synthesis in a regio- and chemoselective way, with the mediation of a ligand-tunable zinc catalyst. Herein, three anti-cancer drugs are presented to demonstrate the high efficiency and selectivity of the method (Chapters 2-4). The obtained particles are stable in salt solution, releasing drugs over weeks in a controlled manner. With the conjugation of an aptamer, the particles are capable of targeting prostate cancer cells in vitro. These results open the gateway to evaluating the in vivo efficacy of nanoconjugates for targeted cancer therapy (Chapter 5). A mechanistic study of the polymerization led to the discovery of chemoselective synthesis of prodrugs with acrylate functional groups. Functional copolymer-drug conjugates will expand the scope of nanoconjugates (Chapter 6). A liposome-aptamer targeted drug delivery vehicle is also studied to achieve reversible cell-specific delivery of non-hydroxyl drugs, e.g. cisplatin (Chapter 7). New monomers and polymerization mechanisms are explored for polyesters in order to synthesize nanoconjugates with a variety of properties (Chapter 8). Initial efforts to apply this type of prodrug will focus on the preparation of hydrogels for stem cell research (Chapter 9).