972 results for "Load tests"


Relevance: 20.00%

Publisher:

Abstract:

In reinforcing rock and soil masses, engineers must consider how to obtain the best reinforcing effect at the lowest cost; this is, in essence, the goal of reinforcement design. Achieving it requires studying not only the reinforcing materials and structures but also several important geological factors, such as the structure and properties of the rock and soil mass. This paper studies and discusses how to improve reinforcement effectiveness by applying engineering geomechanical principles to the reinforcement of engineering soil and rock masses. The author examines the theory, technology, and practice of geotechnical reinforcement based on engineering geomechanics, taking as examples the soil treatment at Zhengzhou Airport, the analysis of reinforcement of the slope on the left bank of the Wuqiangxi Hydropower Station, and the reinforcement design of the No. 102 Landslide and a unique sand-slide slope on the Sichuan-Tibet Highway. For convenience of discussion, the paper comprises two parts. In the first part (chapters one through five), approaching the research from the viewpoint of soil mass engineering geomechanics, the author mainly discusses the reinforcement of soft ground by dynamic consolidation and its application. In the second part (chapters six through eleven), new technologies for rock slope reinforcement and their application are discussed. The author finds that adopting the principles and methods of rock mass engineering geomechanics not only yields a better reinforcing effect but also leads to new reinforcing technologies. Zhengzhou Airport is an important airport in the Central Plains. It lies on Yellow River alluvial deposits, and the stratum structure is complex and heterogeneous.
The airport covers a very large area, which can easily lead to differential settlement, damage to the airport, and aircraft accidents; since there was no comparable experience in treating such a foundation, foundation treatment became a principal problem. During treatment, dynamic compaction was adopted after comparison with other methods using the theory of synthetic integration. Dynamic compaction is an important foundation-consolidation method and was used successfully at Zhengzhou Airport. For a fill foundation, controlling the fill thickness so that the treatment meets the design requirements, and finding the optimum fill thickness, is a difficult problem. To address it, the author proposed a calculation method for evaluating the fill thickness that accounts both for the self-settlement of the fill and for the settlement of the ground surface under the applied load, ensuring that the settlement occurring during the service period satisfies the design requirements. The method proved correct when used to choose a reasonable dynamic compaction energy for foundation treatment. To examine the effect of dynamic compaction, many monitoring methods were adopted, including static loading tests, modulus-of-resilience tests, deep pore-pressure tests, static cone penetration tests, and measurement of pore-volume variation. Through these tests, the author summarized the law of accumulation and dissipation of pore pressure in Yellow River alluvial deposits under dynamic compaction, gave a correct account of the property changes of silt and clay under dynamic compaction, determined the bearing capacity of the treated foundation, and evaluated the reinforcing effect of dynamic consolidation from the microscopic variation of soil particles and the density parameter of the soil mass.
It can be concluded that the compactness of the soil is proportional to the dynamic compaction energy. This conclusion provides a reference for research on the "Problem of Soil Structure, the Central Problem of Soil Mechanics in the 21st Century". Strengthening rock masses is also important for water conservancy and electric power engineering. Slip-resistance piles and anchoring adits filled with reinforced concrete are commonly used in engineering practice to strengthen rock masses and are very important, but they have deficiencies: the weakest section cannot be specifically targeted, monitoring is inconvenient, and the pile and adit diameters are very large. Exploiting the superior tensile resistance of prestressed structures, the author and his supervisor, Professor Yang Zhifa, invented the prestressed slip-resistance pile and the prestressed anchoring adit filled with reinforced concrete (this invention is to be published). These inventions overcome the disadvantages of conventional slip-resistance piles and reinforced-concrete anchoring adits and simultaneously serve the functions of engineering prospecting, strengthening, drainage, and monitoring; they therefore strengthen more effectively, are more convenient to monitor, and are more economical than traditional methods. Drainage is an important factor in treating rock masses and slopes. Because traditional drainage holes often clog and cause incidents, Professor Yang Zhifa invented a method and device for guided seepage by fiber bundle, which works well in combination with the prestressed slip-resistance pile and the prestressed anchoring adit.
In this paper, the author uses the reinforced-concrete anchoring adits that strengthen the left bank at Wuqiangxi to simulate the strengthening effect of consolidation by prestressed slip-resistance piles, and uses the No. 102 Landslide along the Sichuan-Tibet Highway to simulate the application of slip-resistance piles and the new drainage technology. The author also proposes a treatment method for flowing sand on the Sichuan-Tibet Highway, which will benefit the study of strengthening similar works. Based on the author's theoretical study and engineering practice, the paper contains five novelties: 1. Summarizing the law of pore-water pressure accumulation and dissipation in Yellow River alluvial and diluvial soil under dynamic consolidation, which has instructive significance for future construction under analogous engineering geological conditions and had not been researched before. 2. Putting forward the concept of a microscopic density D based on microstructural study of soil samples. Comparing the D values of Zhengzhou Airport's ground soil before and after dynamic consolidation shows that D is an appropriate measure of the reinforcing effect, providing a more convenient evaluation method for engineering practice. 3. Based on in-depth research into soil mass engineering geology, engineering rock and soil science, and soil mechanics, together with extensive field experiments, improving the consolidation method for airport construction: instead of the conventional sequence of dynamically compacting the original ground surface first and then filling and dynamically consolidating or compacting layer by layer, dynamic consolidation is performed after filling to the full planned depth. The result not only complies with the specifications but also reduced the soil-treatment investment by 10 million RMB. 4. Proposing a method for calculating the height of the filled soil by estimating the potential displacement of the original ground surface and the fill under the possible load, selecting the appropriate compaction energy, and determining the virtual height of the fill; the method has proved effective and scientific. 5. Following the thought of the Engineering Geomechanics Meta-Synthetic methodology (EGMS), obtaining two patented inventions (at the stage of publication, with Professor Yang Zhi-fa, the cooperative tutor, and others) that integrate multiple functions, engineering geological investigation, reinforcement, drainage, and strength remediation, in one body, from the viewpoint of the breakage mechanism of rock slopes.
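The abstract names a fill-height calculation (novelty 4) but gives no formulas. The following is only a toy illustration of the idea of a "virtual" as-placed fill height that compensates for settlement; the fixed-point formulation and the linear settlement law in the usage example are invented for illustration, not taken from the thesis:

```python
def virtual_fill_height(design_height, settlement_fn, tol=1e-6, max_iter=100):
    """Toy fixed-point solve for the as-placed fill height H such that
    H - settlement_fn(H) equals the required design height after consolidation.
    settlement_fn lumps the fill's self-settlement and the ground-surface
    settlement under the applied load (both named in the abstract)."""
    H = design_height
    for _ in range(max_iter):
        new_H = design_height + settlement_fn(H)
        if abs(new_H - H) < tol:
            return new_H
        H = new_H
    return H

# Hypothetical settlement law: 5% self-compression of the fill plus 0.3 m
# of ground-surface settlement under load.
H = virtual_fill_height(5.0, lambda H: 0.05 * H + 0.3)
```

With this assumed law, placing H of fill leaves H - (0.05 H + 0.3) of finished height, so the loop converges to H = 5.3 / 0.95, about 5.58 m of fill for a 5.0 m design height.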


We introduce and explore an approach to estimating statistical significance of classification accuracy, which is particularly useful in scientific applications of machine learning where high dimensionality of the data and the small number of training examples render most standard convergence bounds too loose to yield a meaningful guarantee of the generalization ability of the classifier. Instead, we estimate statistical significance of the observed classification accuracy, or the likelihood of observing such accuracy by chance due to spurious correlations of the high-dimensional data patterns with the class labels in the given training set. We adopt permutation testing, a non-parametric technique previously developed in classical statistics for hypothesis testing in the generative setting (i.e., comparing two probability distributions). We demonstrate the method on real examples from neuroimaging studies and DNA microarray analysis and suggest a theoretical analysis of the procedure that relates the asymptotic behavior of the test to the existing convergence bounds.
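The permutation-testing procedure described above can be sketched as follows. The nearest-centroid classifier and leave-one-out scoring are illustrative stand-ins (the actual studies use their own classifiers); only the permute-and-retrain loop reflects the method:

```python
import numpy as np

def nearest_centroid_accuracy(X, y):
    """Leave-one-out accuracy of a toy nearest-centroid classifier."""
    correct = 0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        Xtr, ytr = X[mask], y[mask]
        centroids = {c: Xtr[ytr == c].mean(axis=0) for c in np.unique(ytr)}
        pred = min(centroids, key=lambda c: np.linalg.norm(X[i] - centroids[c]))
        correct += pred == y[i]
    return correct / len(y)

def permutation_p_value(accuracy_fn, X, y, n_perm=200, seed=0):
    """Estimate the likelihood of the observed accuracy arising by chance:
    re-run the classifier on randomly permuted labels and count how often
    the permuted accuracy matches or exceeds the observed one."""
    rng = np.random.default_rng(seed)
    observed = accuracy_fn(X, y)
    null = [accuracy_fn(X, rng.permutation(y)) for _ in range(n_perm)]
    p = (1 + sum(a >= observed for a in null)) / (1 + n_perm)
    return observed, p
```

On data with a real class structure, the observed accuracy is far above the null distribution and the estimated p-value is small; on label-shuffled data it is not, regardless of dimensionality.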


This thesis describes two programs for generating tests for digital circuits that exploit several kinds of expert knowledge not used by previous approaches. First, many test generation problems can be solved efficiently using operation relations, a novel representation of circuit behavior that connects internal component operations with directly executable circuit operations. Operation relations can be computed efficiently by searching traces of simulated circuit behavior. Second, experts write test programs rather than test vectors because programs are more readable and compact. Test programs can be constructed automatically by merging program fragments using expert-supplied goal-refinement rules and domain-independent planning techniques.


In practice, piles are most often modelled as "Beams on a Non-Linear Winkler Foundation" (also known as the "p-y spring" approach), where the soil is idealised as p-y springs. These p-y springs are obtained through a semi-empirical approach using element test results for the soil. For liquefied soil, a reduction factor (this is often termed the p-multiplier approach) is applied to a standard p-y curve for the non-liquefied condition to obtain the p-y curve for the liquefied condition. This paper presents a methodology for obtaining p-y curves for liquefied soil based on element testing of liquefied soil, considering physically plausible mechanisms. Validation of the proposed p-y curves is carried out through back analysis of physical model tests.
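The conventional p-multiplier approach that the paper improves upon can be sketched as follows. The bilinear p-y shape, the parameter values, and the function names are illustrative assumptions, not the paper's model:

```python
def py_curve_nonliquefied(y, p_ult=100.0, k=2000.0):
    """Illustrative bilinear p-y curve: soil resistance p (kN/m) rises
    linearly with pile deflection y (m) at stiffness k, capped at p_ult."""
    return min(k * y, p_ult)

def py_curve_liquefied(y, p_multiplier=0.1, **kwargs):
    """p-multiplier approach: scale the whole non-liquefied curve down
    by a constant factor to represent liquefied soil."""
    return p_multiplier * py_curve_nonliquefied(y, **kwargs)
```

The paper's point is that a constant scale factor ignores the distinct mechanics of liquefied soil; its proposed curves come instead from element tests on liquefied soil itself.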


Speculative service implies that a client's request for a document is serviced by sending, in addition to the document requested, a number of other documents (or pointers thereto) that the server speculates will be requested by the client in the near future. This speculation is based on statistical information that the server maintains for each document it serves. The notion of speculative service is analogous to prefetching, which is used to improve cache performance in distributed/parallel shared memory systems, except that servers (not clients) control when and what to prefetch. Using trace simulations based on the logs of our departmental HTTP server http://cs-www.bu.edu, we show that both server load and service time could be reduced considerably if speculative service is used. This is above and beyond what is currently achievable using client-side caching [3] and server-side dissemination [2]. We identify a number of parameters that could be used to fine-tune the level of speculation performed by the server.
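The per-document statistics a speculating server might maintain can be sketched roughly as follows. This toy API is invented for illustration; the abstract does not specify the exact form of the statistics or the speculation policy:

```python
from collections import Counter, defaultdict

class SpeculativeServer:
    """Toy sketch: record which document each client requests after each
    document, and serve the top-k most likely follow-ups speculatively."""
    def __init__(self, k=2):
        self.k = k
        self.follows = defaultdict(Counter)  # doc -> counts of next-requested docs
        self.last = {}                       # client -> that client's previous request

    def request(self, client, doc):
        prev = self.last.get(client)
        if prev is not None:
            self.follows[prev][doc] += 1     # update transition statistics
        self.last[client] = doc
        speculative = [d for d, _ in self.follows[doc].most_common(self.k)]
        return [doc] + speculative           # requested doc plus speculations
```

Once one client's trace shows that B tends to follow A, later requests for A are answered with B attached, trading bandwidth for fewer round trips, which is the load/service-time trade-off the paper tunes.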


Load balancing is often used to ensure that the nodes in a distributed system are equally loaded. In this paper, we show that for real-time systems, load balancing is not desirable. In particular, we propose a new load-profiling strategy that allows the nodes of a distributed system to be unequally loaded. Using load profiling, the system attempts to distribute the load amongst its nodes so as to maximize the chances of finding a node that can satisfy the computational needs of incoming real-time tasks. To that end, we describe and evaluate a distributed load-profiling protocol for dynamically scheduling time-constrained tasks in a loosely coupled distributed environment. When a task is submitted to a node, the scheduling software tries to schedule the task locally so as to meet its deadline; if that is not feasible, it tries to locate another node where this could be done with a high probability of success, while attempting to maintain an overall load profile for the system. Nodes in the system inform each other about their state using a combination of multicasting and gossiping. The performance of the proposed protocol is evaluated via simulation and contrasted with other dynamic scheduling protocols for real-time distributed systems. Based on our findings, we argue that keeping a diverse availability profile and using passive bidding (through gossiping) are both advantageous to distributed scheduling for real-time systems.
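The contrast between load balancing and load profiling can be illustrated with a toy admission model. Best-fit selection here is only one possible profiling rule, chosen for illustration; the paper's protocol additionally involves deadlines, multicasting, and gossiping:

```python
def least_loaded(loads, demand, capacity=1.0):
    """Load balancing: send the task to the least-loaded node, if it fits."""
    i = min(range(len(loads)), key=lambda j: loads[j])
    return i if loads[i] + demand <= capacity else None

def best_fit(loads, demand, capacity=1.0):
    """Toy load profiling: among nodes that can fit the task, pick the one
    left with the least spare capacity, deliberately keeping some nodes
    lightly loaded so that future demanding tasks can still be placed."""
    feasible = [j for j in range(len(loads)) if loads[j] + demand <= capacity]
    if not feasible:
        return None
    return min(feasible, key=lambda j: capacity - loads[j] - demand)
```

With loads [0.5, 0.2] and a task of demand 0.4, balancing picks the lighter node and leaves both nodes moderately loaded, while profiling packs the task onto the heavier node and preserves a node with 0.8 spare capacity for a later large real-time task.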


High-speed networks, such as ATM networks, are expected to support diverse Quality of Service (QoS) constraints, including real-time QoS guarantees. Real-time QoS is required by many applications, such as those involving voice and video communication. To support such services, routing algorithms that allow applications to reserve the needed bandwidth over a Virtual Circuit (VC) have been proposed. Commonly, these bandwidth-reservation algorithms assign VCs to routes using the least-loaded concept, and thus result in balancing the load over the set of all candidate routes. In this paper, we show that for such reservation-based protocols, which allow the exclusive use of a preset fraction of a resource's bandwidth for an extended period of time, load balancing is not desirable, as it results in resource fragmentation, which adversely affects the likelihood of accepting new reservations. In particular, we show that load-balancing VC routing algorithms are not appropriate when the main objective of the routing protocol is to increase the probability of finding routes that satisfy incoming VC requests, as opposed to equalizing the bandwidth utilization along the various routes. We present an on-line VC routing scheme based on the concept of "load profiling", which allows a distribution of "available" bandwidth across a set of candidate routes to match the characteristics of incoming VC QoS requests. We show the effectiveness of our load-profiling approach when compared to traditional load-balancing and load-packing VC routing schemes.


To support the diverse Quality of Service (QoS) requirements of real-time (e.g. audio/video) applications in integrated services networks, several routing algorithms have been proposed that allow for the reservation of the needed bandwidth over a Virtual Circuit (VC) established on one of several candidate routes. Traditionally, such routing is done using the least-loaded concept, and thus results in balancing the load across the set of candidate routes. In a recent study, we established the inadequacy of this load-balancing practice and proposed the use of load profiling as an alternative. Load-profiling techniques allow the distribution of "available" bandwidth across a set of candidate routes to match the characteristics of incoming VC QoS requests. In this paper we thoroughly characterize the performance of VC routing using load profiling and contrast it to routing using load balancing and load packing. We do so both analytically and via extensive simulations of multi-class traffic routing in Virtual Path (VP) based networks. Our findings confirm that for routing guaranteed-bandwidth flows in VP networks, load balancing is not desirable, as it results in VP bandwidth fragmentation, which adversely affects the likelihood of accepting new VC requests. This fragmentation is more pronounced when the granularity of VC requests is large; typically, this occurs when a common VC is established to carry the aggregate traffic flow of many high-bandwidth real-time sources. For VP-based networks, our simulation results show that our load-profiling VC routing scheme performs as well as or better than traditional load-balancing VC routing in terms of revenue under both skewed and uniform workloads. Furthermore, load-profiling routing improves routing fairness by proactively increasing the chances of admitting high-bandwidth connections.


We consider the problem of task assignment in a distributed system (such as a distributed Web server) in which task sizes are drawn from a heavy-tailed distribution. Many task assignment algorithms are based on the heuristic that balancing the load at the server hosts will result in optimal performance. We show this conventional wisdom is less true when the task size distribution is heavy-tailed (as is the case for Web file sizes). We introduce a new task assignment policy, called Size Interval Task Assignment with Variable Load (SITA-V). SITA-V purposely operates the server hosts at different loads, and directs smaller tasks to the lighter-loaded hosts. The result is that SITA-V provably decreases the mean task slowdown by significant factors (up to 1000 or more) where the more heavy-tailed the workload, the greater the improvement factor. We evaluate the tradeoff between improvement in slowdown and increase in waiting time in a system using SITA-V, and show conditions under which SITA-V represents a particularly appealing policy. We conclude with a discussion of the use of SITA-V in a distributed Web server, and show that it is attractive because it has a simple implementation which requires no communication from the server hosts back to the task router.
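The core of SITA-V, routing tasks by size interval so that smaller tasks land on deliberately lighter-loaded hosts, can be sketched as follows. The derivation of the cutoffs (which the paper ties to the heavy-tailed workload distribution) is omitted; here they are supplied by hand:

```python
import bisect

class SitaV:
    """Toy sketch of Size Interval Task Assignment with Variable Load:
    ascending size cutoffs partition incoming tasks among hosts, so host 0
    sees only the smallest tasks and stays lightly loaded."""
    def __init__(self, cutoffs):
        self.cutoffs = cutoffs  # len(cutoffs) + 1 hosts

    def assign(self, task_size):
        """Return the index of the host responsible for this task size."""
        return bisect.bisect_right(self.cutoffs, task_size)
```

Because assignment depends only on the task's size, the router needs no feedback from the hosts, which matches the paper's observation that the policy requires no communication from the server hosts back to the task router.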


Distributed hash tables have recently become a useful building block for a variety of distributed applications. However, current schemes based upon consistent hashing require both considerable implementation complexity and substantial storage overhead to achieve desired load balancing goals. We argue in this paper that these goals can be achieved more simply and more cost-effectively. First, we suggest the direct application of the "power of two choices" paradigm, whereby an item is stored at the less loaded of two (or more) random alternatives. We then consider how associating a small constant number of hash values with a key can naturally be extended to support other load balancing methods, including load-stealing or load-shedding schemes, as well as providing natural fault-tolerance mechanisms.
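The "power of two choices" insertion rule can be sketched as follows. Bins stand in for DHT nodes, and the salted-hash scheme is an illustrative choice; the point is only that each key has d candidate locations and is stored at the least loaded:

```python
import hashlib

def _hash(key, salt, n_bins):
    """Deterministic hash of (salt, key) into one of n_bins bins."""
    digest = hashlib.sha256(f"{salt}:{key}".encode()).hexdigest()
    return int(digest, 16) % n_bins

class TwoChoiceTable:
    """Each key hashes to d candidate bins; it is stored in the least loaded
    one. Lookups probe the same d candidates, so no extra state is needed."""
    def __init__(self, n_bins, d=2):
        self.bins = [[] for _ in range(n_bins)]
        self.d = d

    def insert(self, key):
        candidates = [_hash(key, salt, len(self.bins)) for salt in range(self.d)]
        target = min(candidates, key=lambda b: len(self.bins[b]))
        self.bins[target].append(key)
        return target

    def lookup(self, key):
        for salt in range(self.d):
            b = _hash(key, salt, len(self.bins))
            if key in self.bins[b]:
                return b
        return None
```

The classic result is that moving from one choice to two drops the maximum load from roughly log n / log log n to O(log log n), which is why such a small constant number of hash values per key suffices.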


In this paper, we propose and evaluate an implementation of a prototype scalable web server. The prototype consists of a load-balanced cluster of hosts that collectively accept and service TCP connections. The host IP addresses are advertised using the Round Robin DNS technique, allowing any host to receive requests from any client. Once a client attempts to establish a TCP connection with one of the hosts, a decision is made as to whether or not the connection should be redirected to a different host, namely the host with the lowest number of established connections. We use the low-overhead Distributed Packet Rewriting (DPR) technique to redirect TCP connections. In our prototype, each host keeps information about connections in hash tables and linked lists. Every time a packet arrives, it is examined to determine whether it should be redirected. Load information is maintained using periodic broadcasts amongst the cluster hosts.
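The redirection decision itself can be sketched as a toy stand-in (DPR rewrites packets at the IP level and tracks per-connection state, none of which this sketch models; the function and its inputs are invented for illustration):

```python
def redirect_target(local_host, established):
    """Given per-host counts of established connections (as broadcast
    periodically in the prototype), decide where a new connection arriving
    at local_host should go: the host with the fewest connections, or
    None if it should be served locally."""
    target = min(established, key=established.get)
    return None if target == local_host else target
```

For example, with counts {'a': 5, 'b': 2, 'c': 7}, a connection arriving at 'a' is redirected to 'b', while one arriving at 'b' is served locally.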


This paper shows how a minimal neural network model of the cerebellum may be embedded within a sensory-neuro-muscular control system that mimics known anatomy and physiology. With this embedding, cerebellar learning promotes load compensation while also allowing both coactivation and reciprocal inhibition of sets of antagonist muscles. In particular, we show how synaptic long-term depression guided by feedback from muscle stretch receptors can lead to trans-cerebellar gain changes that are load-compensating. It is argued that the same processes help to adaptively discover multi-joint synergies. Simulations of rapid single-joint rotations under load illustrate design feasibility and stability.


Lidar is an optical remote sensing instrument that can measure atmospheric parameters. A Raman lidar instrument (UCLID) was established at University College Cork to contribute to the European lidar network, EARLINET. System performance tests were carried out to ensure strict data quality assurance for submission to the EARLINET database; procedures include overlap correction, the telecover test, the Rayleigh test, and the zero-bin test. Raman backscatter coefficients, extinction coefficients, and lidar ratios were measured from April 2010 to May 2011 and from February 2012 to June 2012. Statistical analysis of the profiles over these periods provided new information about typical atmospheric scenarios over Southern Ireland in terms of aerosol load in the lower troposphere, planetary boundary layer (PBL) height, aerosol optical depth (AOD) at 532 nm, and lidar ratio values. The arithmetic average of the PBL height was found to be 608 ± 138 m with a median of 615 m, while the average AOD at 532 nm was 0.119 ± 0.023 for clean marine air masses and 0.170 ± 0.036 for polluted air masses. The lidar ratio showed a seasonal dependence, with lower values in winter and autumn (20 ± 5 sr) and higher values in spring and summer (30 ± 12 sr). Volcanic particles from the eruption of the volcano Eyjafjallajökull in Iceland were detected between 21 April and 7 May 2010. The backscatter coefficient of the ash layer varied between 2.5 Mm-1 sr-1 and 3.5 Mm-1 sr-1, and the AOD at 532 nm was estimated to be between 0.090 and 0.215. Several aerosol loads due to Saharan dust particles were detected in spring 2011 and 2012. The lidar ratios of the dust layers were determined to be between 45 and 77 sr, and the AOD at 532 nm during the dust events ranged from 0.84 to 0.494.


BACKGROUND: Serologic methods have been used widely to test for celiac disease and have gained importance in diagnostic definition and in new epidemiologic findings. However, there is no standardization, and there are no reference protocols and materials. METHODS: The European Working Group on Serological Screening for Celiac Disease has defined robust noncommercial test protocols for immunoglobulin (Ig)G and IgA gliadin antibodies and for IgA autoantibodies against endomysium and tissue transglutaminase. Standard curves were linear in the decisive range, and intra-assay coefficients of variation were less than 5% to 10%. Calibration was performed with a group reference serum, and joint cutoff limits were used. Seven laboratories took part in the final collaborative study on 252 randomized sera classified by histology (103 pediatric and adult patients with active celiac disease, 89 disease control subjects, and 60 blood donors). RESULTS: IgA autoantibodies against endomysium and tissue transglutaminase showed superior sensitivity (90% and 93%, respectively) and specificity (99% and 95%, respectively) over IgA and IgG gliadin antibodies. Tissue transglutaminase antibody testing showed superior receiver operating characteristic performance compared with gliadin antibodies. The kappa values for interlaboratory reproducibility showed superiority for IgA endomysium antibodies (0.93) in comparison with tissue transglutaminase antibodies (0.83) and gliadin antibodies (0.82 for IgG, 0.62 for IgA). CONCLUSIONS: Basic criteria of standardization and quality assessment must be fulfilled by any test protocol proposed for serologic investigation of celiac disease. The working group has produced robust test protocols and reference materials available for standardization to further improve the reliability of serologic testing for celiac disease.