Abstract:
Based on the theory of LURR (load-unload response ratio) and its recent development, the spatial and temporal variation of Y/Yc (the ratio of the LURR value to its critical value) in the Southern California region from 1980 through March 2001 was studied. Building on a previous study of the fault system and stress field in Southern California, we zoned the region into 11 parts, in each of which the stress field is nearly uniform. With a time window of one year, a time step of three months, a circular spatial window of 100 km radius, and a spatial step of 0.25° in both latitude and longitude, snapshots of the evolution of Y/Yc were obtained. The scanning results show that clear Y/Yc anomalies occurred before five of the six strong earthquakes considered (magnitude 6.5 or greater). The critical regions of Y/Yc lie near the epicenters of the strong earthquakes, and the anomalies appear months to years before the events. The tendency of earthquake occurrence in the California region is briefly discussed on the basis of this examination of Y/Yc.
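The space-time scanning described above (one-year time window advanced in three-month steps; 100 km circular spatial window advanced in 0.25° steps) can be sketched generically. The sketch below is illustrative only, assuming a catalog of (time, latitude, longitude) events and a haversine distance; it enumerates the windows but leaves the LURR statistic itself to the caller, which receives the event subset for each window.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def sliding_scan(events, t0, t1, lats, lons,
                 time_window=1.0, time_step=0.25, radius_km=100.0):
    """Yield (t, lat, lon, subset) for each space-time window.

    events: iterable of (time_in_years, lat, lon) tuples.
    """
    t = t0
    while t + time_window <= t1:
        for lat in lats:
            for lon in lons:
                subset = [e for e in events
                          if t <= e[0] < t + time_window
                          and haversine_km(lat, lon, e[1], e[2]) <= radius_km]
                yield t, lat, lon, subset
        t += time_step
```

In the study's setting one would replace the yielded subset with the Y/Yc value computed from the loaded and unloaded events in that window.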
Abstract:
Executive Summary: This study describes the socio-economic characteristics of the U.S. Caribbean trap fishery, which encompasses the Commonwealth of Puerto Rico and the Territory of the U.S. Virgin Islands. In-person interviews were administered to one hundred randomly selected trap fishermen, constituting nearly 25% of the estimated population. The sample was stratified by geographic area and trap tier; the number of traps owned or fished to qualify for a given tier varied by island. In Puerto Rico, tier I consisted of fishermen with 1-40 fish traps, tier II of fishermen with 41-100 fish traps, and tier III of fishermen with more than 100 fish traps. In St. Thomas and St. John, tier I comprised fishermen with 1-50 fish traps, tier II fishermen with 51-150 fish traps, and tier III fishermen with more than 150 fish traps. Lastly, in St. Croix, tier I comprised fishermen with fewer than 20 fish traps and tier II fishermen with 20 or more. The survey elicited information on household demographics, annual catch and revenue, trap usage, capital investment in vessels and equipment, fixed and variable costs, behavioral response to a hypothetical trap reduction program, and the spatial distribution of traps. The study found that 79% of the sampled population was 40 years or older. The typical Crucian trap fisherman was older than his Puerto Rican, St. Thomian, and St. Johnian counterparts: Crucian fishermen averaged 57 years of age, Puerto Rican fishermen 51 years, and St. Thomian and St. Johnian fishermen 48 years. As a group, St. Thomian and St. Johnian fishermen had 25 years of fishing experience, and Puerto Rican and Crucian fishermen had 30 and 29 years, respectively. Overall, 90% of the households had at least one dependent.
The average number of dependents was similar across islands, ranging from 2.8 in the district of St. Thomas and St. John to 3.4 in the district of St. Croix. The percentage utilization of catch for personal or family use was relatively low, ranging from 2.5% in St. Croix to 3.8% in St. Thomas and St. John. About 47% of the respondents had a high school degree. The majority of the respondents were highly dependent on commercial fishing for their household income. In St. Croix, commercial fishing made up 83% of the fishermen’s total household income, whereas in St. Thomas and St. John and Puerto Rico it contributed 74% and 68%, respectively. The contribution of fish traps to commercial fishing income ranged from 51% in the lowest trap tier in St. Thomas and St. John to 99% in the highest trap tier in St. Croix. On an island basis, the contribution of fish traps to fishing income was 75% in St. Croix, 61% in St. Thomas and St. John, and 59% in Puerto Rico. The value of fully rigged vessels ranged from $400 to $250,000. Over half of the fleet was worth $10,000 or less. The St. Thomas and St. John fleet reported the highest mean value, averaging $58,518. The Crucian and Puerto Rican fleets were considerably less valuable, averaging $19,831 and $8,652, respectively. The length of the vessels ranged from 14 to 40 feet. Fifty-nine percent of the sampled vessels were at least 23 feet in length. The average length of the St. Thomas and St. John fleet was 28 feet, whereas the fleets based in St. Croix and Puerto Rico averaged 21 feet. Engine propulsion ranged from 8 to 400 horsepower (hp). The mean engine power was 208 hp in St. Thomas and St. John, 108 hp in St. Croix, and 77 hp in Puerto Rico. Mechanical trap haulers and depth recorders were the most commonly used on-board equipment. About 55% of the sampled population reported owning mechanical trap haulers. In St. Thomas and St.
John, 100% of the respondents had trap haulers, compared to 52% in Puerto Rico and 20% in St. Croix. Forty-seven percent of the fishermen surveyed reported having depth recorders. Depth recorders were most common in the St. Thomas and St. John fleet (80%) and least common in the Puerto Rican fleet (37%). Emergency position indicating radio beacons (EPIRBs) and radar were rare throughout the fish trap fleet: only 8% of the respondents had EPIRBs and only 1% had radar. Interviewees stated that they fished between 1 and 350 fish traps. Puerto Rican respondents fished on average 39 fish traps, whereas St. Thomian and St. Johnian and Crucian respondents fished 94 and 27 fish traps, respectively. On average, Puerto Rican respondents fished 11 lobster traps, and St. Thomian and St. Johnian respondents fished 46 lobster traps. None of the Crucian respondents fished lobster traps. The number of fish traps built or purchased ranged between 0 and 175, and the number of lobster traps built or bought ranged between 0 and 200. Puerto Rican fishermen on average built or purchased 30 fish traps and 14 lobster traps, and St. Thomian and St. Johnian fishermen built or bought 30 fish traps and 11 lobster traps. Crucian fishermen built or bought 25 fish traps and no lobster traps. As a group, fish traps had an average life of 1.3 to 5 years, and lobster traps lasted slightly longer, between 1.5 and 6 years. The study found that the chevron or arrowhead style was the most common trap design. Puerto Rican fishermen owned an average of 20 arrowhead traps. St. Thomian and St. Johnian and Crucian fishermen owned an average of 44 and 15 arrowhead fish traps, respectively. The second most popular trap design was the square trap style. Puerto Rican fishermen had an average of 9 square traps, whereas St. Thomian and St. Johnian fishermen had 33 traps and Crucian fishermen had 2 traps. Antillean Z- (or S-) traps, rectangular traps, and star traps were also used.
Although Z- (or S-) traps are considered the most productive trap design, fishermen prefer the smaller arrowhead and square traps because they are easier and less expensive to build, and larger numbers of them can be safely deployed. The cost of a fish trap, complete with rope and buoys, varied significantly due to the wide range of construction materials used. On average, arrowhead traps commanded $94 in Puerto Rico, $251 in St. Thomas and St. John, and $119 in St. Croix. The number of trips per week ranged between 1 and 6; however, 72% of the respondents stated that they took two trips per week. On average, Puerto Rican fishermen took 2.1 trips per week, St. Thomian and St. Johnian fishermen took 1.4 trips per week, and Crucian fishermen took 2.5 trips per week. Most fishing trips started at dawn and finished early in the afternoon. Over 82% of the trips lasted 8 hours or less. On average, Puerto Rican fishermen hauled 27 fish traps per trip, whereas St. Thomian and St. Johnian fishermen and Crucian fishermen hauled 68 and 26 fish traps per trip, respectively. The number of traps per string and the soak time varied considerably across islands. In St. Croix, 84% of the respondents had a single trap per line, whereas in St. Thomas and St. John only 10% of the respondents did. Approximately 43% of Puerto Rican fishermen used a single trap per line. St. Thomian and St. Johnian fishermen soaked their traps for 6.9 days, while Puerto Rican and Crucian fishermen soaked their traps for 5.7 and 3.6 days, respectively. The heterogeneity of the industry was also evidenced by the various economic surpluses generated. The survey showed that higher gross revenues did not necessarily translate into higher net revenues. Our analysis also showed that, on average, vessels in the trap fishery were able to cover their cash outlays, resulting in positive vessel income (i.e., financial profits).
In Puerto Rico, annual financial profits ranged from $4,760 in the lowest trap tier to $32,467 in the highest tier, whereas in St. Thomas and St. John annual financial profits ranged from $3,744 in the lowest tier to $13,652 in the highest tier. In St. Croix, annual financial profits ranged between $9,229 and $15,781. The survey also showed that economic profits varied significantly across tiers. Economic profits measure residual income after deducting the remuneration required to keep the various factors of production in their existing employment. In Puerto Rico, annual economic profits ranged from -$9,339 in the lowest trap tier to $8,711 in the highest trap tier. In St. Thomas and St. John, annual economic profits ranged from -$7,920 in the highest tier to -$18,486 in the second highest tier. In St. Croix, annual economic profits ranged between -$7,453 and $10,674. The presence of positive financial profits and negative economic profits suggests that higher economic returns could be earned, from a societal perspective, by redirecting some of these scarce capital and human resources elsewhere in the economy. Furthermore, the presence of negative economic earnings is evidence that the fishery is overcapitalized and that steps need to be taken to ensure the long-run economic viability of the industry. The presence of positive financial returns provides managers with a window of opportunity to adopt policies that will strengthen the biological and economic performance of the fishery while minimizing any adverse impacts on local fishing communities. Finally, the document concludes by detailing how the costs and earnings information could be used to develop economic models that evaluate management proposals. (PDF contains 147 pages)
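The island-specific tier boundaries quoted in the summary amount to a simple classification rule. A minimal sketch under that reading (the function name and tier labels are mine):

```python
def trap_tier(island, fish_traps):
    """Assign a trap tier from island and number of fish traps,
    following the tier boundaries described in the survey."""
    if island == "Puerto Rico":
        if fish_traps <= 40:
            return "I"
        if fish_traps <= 100:
            return "II"
        return "III"
    if island == "St. Thomas/St. John":
        if fish_traps <= 50:
            return "I"
        if fish_traps <= 150:
            return "II"
        return "III"
    if island == "St. Croix":
        # Only two tiers on St. Croix: below 20 traps, or 20 and above.
        return "I" if fish_traps < 20 else "II"
    raise ValueError(f"unknown island: {island}")
```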
Abstract:
A primary objective of the Common Fisheries Policy of the European Union is the reduction of discards and unwanted by-catches. In principle this could be achieved if the catching methods were optimised accordingly. High numbers of undersized flatfish are still caught in bottom trawls. Although EU regulations make the use of the BACOMA codend mandatory in the Baltic Sea cod fishery, cod are able to escape through the square-mesh escape window of the BACOMA net, whereas flatfish remain in the cod-end. Gear experiments have been carried out with the aim of separating the cod from the flatfish fraction as early as the rear belly of the net, making use of the natural behaviour of the fish, i.e. their preferred swimming distance from the bottom of the net in the funnel. Cod have a natural tendency to keep a relatively great distance from the bottom, whereas flatfish tend to stay close to it. It was attempted to separate the two fractions by splitting the funnel into an upper and a lower part with a horizontal panel. This was tested for two different nets: a cod trawl, to separate cod from flatfish, and an eel trawl, to separate cod and flatfish from eel. Cod and flatfish separate best at a panel distance of 50 cm from the bottom: 74% of the cod were found in the upper section, whereas 75% of the flounder were in the lower section. A separation of eel from cod was, however, not possible, since eel tend to rise to the upper part of the net together with the cod.
Abstract:
To improve the cod stocks in the Baltic Sea, a number of regulations have recently been established by the International Baltic Sea Fisheries Commission (IBSFC) and the European Commission. According to these, fishermen are obliged to use nets with escape windows (BACOMA nets) with an escape-window mesh size of 120 mm until the end of September 2003. These nets, however, retain only fish much larger than the legal minimum landing size would allow, and due to the present stock structure only few such large fish exist. As a consequence, fishermen use a legal alternative net: a conventional trawl with a cod-end of 130 mm diamond-shaped meshes (IBSFC rules of 1 April 2002), to be increased to 140 mm on 1 September 2003, according to the same IBSFC rule. Due to legal alterations of the net by the fishermen (e.g. the use of extra-stiff net material), these nets have acquired extremely low selective properties, i.e. they catch very small fish and produce great amounts of discards. Due to the increase of the minimum landing size for Baltic cod from 35 to 38 cm, the amount of discards has increased further since the beginning of 2003. Experiments have now been carried out with the BACOMA net on German and Swedish commercial and research vessels, since arguments had been brought forward that the BACOMA net was not yet sufficiently tested on commercial vessels. The results of all experiments conducted so far are compiled and evaluated here. As a result of the Swedish, Danish and German initiative and research, the European Commission reacted in June 2003 and rejected the increase of the diamond-meshed non-BACOMA cod-end from 130 mm to 140 mm in September 2003. To protect the cod stocks in the Baltic Sea more effectively, the use of traditional diamond-meshed cod-ends without an escape window is prohibited in community waters without derogation, effective 1 September 2003.
To enable more effective and simplified control of the bottom trawl fishery in the Baltic Sea, the principle of a "One-Net-Rule" is enforced. This is going to be the BACOMA net, with the meshes of the escape window being 110 mm for the time being. The description of the BACOMA net as given in IBSFC rule no. 10 (revision of the 28th session, Berlin 2002) concentrates on the cod-end and the escape window, but only to a lesser extent on the design and mesh composition of the remaining parts of the net, such as the belly and funnel, and many details. Thus, the present description is not complete and leaves, according to fishermen, ample opportunity for manipulation. An initiative has been started in Germany, with joint effort from scientists and the fishery, to describe the entire net more fully and to produce a proposal for a more comprehensive description, leaving less room for manipulation. Such a proposal is given here and shall be seen as a starting point for discussion and development towards an internationally uniform net agreed amongst the fishery, scientists and politicians. The Baltic Sea fishery is invited to comment on this proposal, and recommendations for further improvement and specification are welcomed. Once the design is agreed by the Baltic Fishermen Association, it shall be proposed to the IBSFC and the European Commission via that association.
Abstract:
The thermal fluctuation approach is widely used to monitor the association kinetics of surface-bound receptor-ligand interactions. Various protocols, such as sliding standard deviation (SD) analysis (SSA) and Page's test analysis (PTA), have been used to estimate two-dimensional (2D) kinetic rates from the time course of the displacement of a molecular carrier. In the current work, we compared the estimates from both SSA and a modified PTA using measured data from an optical trap assay and simulated data from a random number generator. Our results indicate that both SSA and PTA are reliable for estimating 2D kinetic rates. Parametric analysis also demonstrated that the estimates are sensitive to parameters such as the sampling rate, sliding window size, and threshold. These results further the understanding of how to quantify the biophysics of receptor-ligand interactions.
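Sliding SD analysis can be illustrated in a few lines: compute the standard deviation over a moving window of the displacement trace, and flag windows where the fluctuation drops (a bond tethers the carrier and suppresses its thermal motion). The window size, threshold, and event rule here are illustrative assumptions, not the specific protocols compared in the paper.

```python
import statistics

def sliding_sd(trace, window):
    """Standard deviation over a sliding window of the displacement trace."""
    return [statistics.pstdev(trace[i:i + window])
            for i in range(len(trace) - window + 1)]

def binding_events(trace, window, threshold):
    """Flag window start indices whose SD drops below threshold,
    a crude proxy for the carrier being tethered by a bond."""
    return [i for i, sd in enumerate(sliding_sd(trace, window)) if sd < threshold]
```

In practice the dwell times between flagged intervals feed the estimation of the 2D on- and off-rates, which is where the sensitivity to sampling rate, window size, and threshold arises.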
Abstract:
Gold Coast Water is responsible for the management of the water and wastewater assets of the City of the Gold Coast on Australia’s east coast. Treated wastewater is released at the Gold Coast Seaway on an outgoing tide so that the plume is dispersed before the tide changes and re-enters the Broadwater estuary. Rapid population growth over the past decade has placed increasing demands on the receiving waters for the release of the City’s effluent. The Seaway SmartRelease Project is designed to optimise the release of effluent from the City’s main wastewater treatment plant (WWTP) in order to minimise the impact on estuarine water quality and maximise the cost efficiency of pumping. To this end, an optimisation study involving water quality monitoring, numerical modelling and a web-based decision support system was conducted. An intensive monitoring campaign provided information on water levels, currents, winds, waves, nutrients and bacterial levels within the Broadwater. These data were then used to calibrate and verify numerical models using the MIKE by DHI suite of software. The decision support system collects continually measured data such as water levels, interacts with the WWTP SCADA system, runs the models in forecast mode and provides the optimal time window in which to release the required amount of effluent from the WWTP. The City’s increasing population means that the length of time available for releasing the water with minimal impact may be exceeded within 5 years. Optimising the release of the treated water through monitoring, modelling and a decision support system has been an effective way of demonstrating the limited environmental impact of the expected short-term increase in effluent disposal. (PDF contains 5 pages)
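The scheduling step of such a decision support system can be caricatured as a search over a forecast water-level series: find start times from which a release of the required duration completes while the tide is still falling. This is a deliberately simplified stand-in for the model-driven optimisation described above; the sampled series and the strictly-falling-level criterion are assumptions of the sketch.

```python
def release_windows(levels, duration):
    """Return start indices of windows of `duration` samples during which
    the forecast water level is strictly falling (an outgoing tide), so a
    release begun there finishes before the tide turns."""
    starts = []
    for i in range(len(levels) - duration):
        if all(levels[j + 1] < levels[j] for j in range(i, i + duration)):
            starts.append(i)
    return starts
```

A real scheduler would rank the candidate windows by modelled plume dispersion and pumping cost rather than accept any falling-tide interval.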
Abstract:
This thesis presents a biologically plausible model of an attentional mechanism for forming position- and scale-invariant representations of objects in the visual world. The model relies on a set of control neurons to dynamically modify the synaptic strengths of intra-cortical connections so that information from a windowed region of primary visual cortex (V1) is selectively routed to higher cortical areas. Local spatial relationships (i.e., topography) within the attentional window are preserved as information is routed through the cortex, thus enabling attended objects to be represented in higher cortical areas within an object-centered reference frame that is position and scale invariant. The representation in V1 is modeled as a multiscale stack of sample nodes with progressively lower resolution at higher eccentricities. Large changes in the size of the attentional window are accomplished by switching between different levels of the multiscale stack, while positional shifts and small changes in scale are accomplished by translating and rescaling the window within a single level of the stack. The control signals for setting the position and size of the attentional window are hypothesized to originate from neurons in the pulvinar and in the deep layers of visual cortex. The dynamics of these control neurons are governed by simple differential equations that can be realized by neurobiologically plausible circuits. In pre-attentive mode, the control neurons receive their input from a low-level "saliency map" representing potentially interesting regions of a scene. During the pattern recognition phase, control neurons are driven by the interaction between top-down (memory) and bottom-up (retinal input) sources. The model respects key neurophysiological, neuroanatomical, and psychophysical data relating to attention, and it makes a variety of experimentally testable predictions.
Abstract:
A theoretical investigation is carried out into the effect of spontaneously generated coherence on the Kerr nonlinearity of general three-level systems of Lambda, ladder, and V-shape types. It is found, with spontaneously generated coherence present, that the Kerr nonlinearity can be clearly enhanced. In the Lambda- and ladder-type systems, the maximal Kerr nonlinearity increases and at the same time enters the electromagnetically induced transparency window as the spontaneously generated coherence intensifies. As for the V-type system, the absorption property is significantly modified and therefore enhanced Kerr nonlinearity without absorption occurs for certain probe detunings. We attribute the enhancement of Kerr nonlinearity mainly to the presence of an extra atomic coherence induced by the spontaneously generated coherence.
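For context, statements about the "Kerr nonlinearity" in such systems refer to the third-order term of the probe susceptibility. In the standard textbook expansion (a general relation, not a result specific to this paper), the polarization and refractive index seen by a probe field $E_p$ are

```latex
P \;=\; \epsilon_0\left(\chi^{(1)} + 3\chi^{(3)}\,|E_p|^2\right)E_p,
\qquad
n \;\simeq\; 1 \;+\; \tfrac{1}{2}\operatorname{Re}\chi^{(1)}
           \;+\; \tfrac{3}{2}\operatorname{Re}\chi^{(3)}\,|E_p|^2 .
```

An enhanced $\operatorname{Re}\chi^{(3)}$ inside the transparency window, where $\operatorname{Im}\chi^{(1)} \approx 0$, therefore gives a large intensity-dependent index with little absorption, which is the regime the abstract describes.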
Abstract:
We analyse a four-wave mixing (FWM) scheme in a five-level atomic system in which double-dark resonances are present. It is found that the enhancement of FWM in both electromagnetically induced transparency (EIT) windows can be obtained even without the condition of multiphoton resonance. Moreover, the conversion efficiency of FWM in one EIT window can be much larger than that in the other due to the presence of interacting dark resonances.
Abstract:
This thesis explores the problem of mobile robot navigation in dense human crowds. We begin by considering a fundamental impediment to classical motion planning algorithms called the freezing robot problem: once the environment surpasses a certain level of complexity, the planner decides that all forward paths are unsafe, and the robot freezes in place (or performs unnecessary maneuvers) to avoid collisions. Since a feasible path typically exists, this behavior is suboptimal. Existing approaches have focused on reducing predictive uncertainty by employing higher fidelity individual dynamics models or heuristically limiting the individual predictive covariance to prevent overcautious navigation. We demonstrate that both the individual prediction and the individual predictive uncertainty have little to do with this undesirable navigation behavior. Additionally, we provide evidence that dynamic agents are able to navigate in dense crowds by engaging in joint collision avoidance, cooperatively making room to create feasible trajectories. We accordingly develop interacting Gaussian processes, a prediction density that captures cooperative collision avoidance, and a "multiple goal" extension that models the goal driven nature of human decision making. Navigation naturally emerges as a statistic of this distribution.
Most importantly, we empirically validate our models in the Chandler dining hall at Caltech during peak hours, and in the process carry out the first extensive quantitative study of robot navigation in dense human crowds (collecting data on 488 runs). The multiple-goal interacting Gaussian processes algorithm performs comparably with human teleoperators at crowd densities nearing 1 person/m², while a state-of-the-art noncooperative planner exhibits unsafe behavior more than 3 times as often as the multiple-goal extension, and twice as often as the basic interacting Gaussian process approach. Furthermore, a reactive planner based on the widely used dynamic window approach proves insufficient for crowd densities above 0.55 people/m². For inclusive validation purposes, we also show that either our noncooperative planner or our reactive planner captures the salient characteristics of nearly any existing dynamic navigation algorithm. Based on these experimental results and theoretical observations, we conclude that a cooperation model is critical for safe and efficient robot navigation in dense human crowds.
Finally, we produce a large database of ground truth pedestrian crowd data. We make this ground truth database publicly available for further scientific study of crowd prediction models, learning from demonstration algorithms, and human robot interaction models in general.
Abstract:
A five-level tripod scheme is proposed for obtaining a high-efficiency four-wave-mixing (FWM) process. The existence of double-dark resonances leads to a strong modification of the absorption and dispersion properties seen by a pump wave at two transparency windows. We show that both windows can be used to open the four-wave-mixing channel and produce efficient mixing waves. In particular, higher FWM efficiency is always produced at the transparency window corresponding to the relatively weak coupling field. By manipulating the intensities of the two coupling fields, the conversion efficiency of the FWM can be controlled.
Abstract:
The work presented in this thesis revolves around erasure correction coding, as applied to distributed data storage and real-time streaming communications.
First, we examine the problem of allocating a given storage budget over a set of nodes for maximum reliability. The objective is to find an allocation of the budget that maximizes the probability of successful recovery by a data collector accessing a random subset of the nodes. This optimization problem is challenging in general because of its combinatorial nature, despite its simple formulation. We study several variations of the problem, assuming different allocation models and access models, and determine the optimal allocation and the optimal symmetric allocation (in which all nonempty nodes store the same amount of data) for a variety of cases. Although the optimal allocation can have nonintuitive structure and can be difficult to find in general, our results suggest that, as a simple heuristic, reliable storage can be achieved by spreading the budget maximally over all nodes when the budget is large, and spreading it minimally over a few nodes when it is small. Coding would therefore be beneficial in the former case, while uncoded replication would suffice in the latter case.
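For one common access model, the success probability of a symmetric allocation has a closed form: spread a total budget T (in units of the original data size) evenly over m nodes, let the collector access each node independently with probability p, and assume an ideal erasure code so that recovery succeeds when the accessed shares sum to at least 1. The sketch below covers only this special case, not the general allocation problem studied in the thesis:

```python
import math

def symmetric_recovery_prob(budget, m, p):
    """Probability of successful recovery for a symmetric allocation:
    `budget` (in units of the original data size) is spread evenly over
    m nodes; each node is accessed independently with probability p;
    recovery succeeds when the accessed shares sum to at least 1
    (ideal erasure code over the shares)."""
    need = math.ceil(m / budget)          # smallest k with k * (budget/m) >= 1
    return sum(math.comb(m, k) * p**k * (1 - p)**(m - k)
               for k in range(need, m + 1))
```

Comparing, say, `symmetric_recovery_prob(2, 2, 0.5)` with `symmetric_recovery_prob(2, 4, 0.5)` shows how spreading the same budget over more or fewer nodes changes reliability, which is exactly the trade-off the heuristic above describes.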
Second, we study how distributed storage allocations affect the recovery delay in a mobile setting. Specifically, two recovery delay optimization problems are considered for a network of mobile storage nodes: the maximization of the probability of successful recovery by a given deadline, and the minimization of the expected recovery delay. We show that the first problem is closely related to the earlier allocation problem, and solve the second problem completely for the case of symmetric allocations. It turns out that the optimal allocations for the two problems can be quite different. In a simulation study, we evaluated the performance of a simple data dissemination and storage protocol for mobile delay-tolerant networks, and observed that the choice of allocation can have a significant impact on the recovery delay under a variety of scenarios.
Third, we consider a real-time streaming system where messages created at regular time intervals at a source are encoded for transmission to a receiver over a packet erasure link; the receiver must subsequently decode each message within a given delay from its creation time. For erasure models containing a limited number of erasures per coding window, per sliding window, and containing erasure bursts whose maximum length is sufficiently short or long, we show that a time-invariant intrasession code asymptotically achieves the maximum message size among all codes that allow decoding under all admissible erasure patterns. For the bursty erasure model, we also show that diagonally interleaved codes derived from specific systematic block codes are asymptotically optimal over all codes in certain cases. We also study an i.i.d. erasure model in which each transmitted packet is erased independently with the same probability; the objective is to maximize the decoding probability for a given message size. We derive an upper bound on the decoding probability for any time-invariant code, and show that the gap between this bound and the performance of a family of time-invariant intrasession codes is small when the message size and packet erasure probability are small. In a simulation study, these codes performed well against a family of random time-invariant convolutional codes under a number of scenarios.
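The sliding-window erasure model mentioned above is easy to make concrete: an erasure pattern is admissible when every window of w consecutive packets contains at most z erasures. A minimal admissibility checker (an illustration with names of my own choosing, not the thesis's code):

```python
def admissible(erasures, w, z):
    """True if every length-w window of the 0/1 erasure sequence
    contains at most z erasures (the sliding-window erasure model)."""
    return all(sum(erasures[i:i + w]) <= z
               for i in range(len(erasures) - w + 1))
```

A code for this model must allow every message to be decoded within its deadline under every pattern that this predicate accepts.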
Finally, we consider the joint problems of routing and caching for named data networking. We propose a backpressure-based policy that employs virtual interest packets to make routing and caching decisions. In a packet-level simulation, the proposed policy outperformed a basic protocol that combines shortest-path routing with least-recently-used (LRU) cache replacement.
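LRU cache replacement, the baseline combined with shortest-path routing above, is the generic rule that evicts the entry accessed longest ago. A minimal sketch of that rule (the generic policy, not the paper's packet-level simulator):

```python
from collections import OrderedDict

class LRUCache:
    """Least-recently-used cache: when an insertion exceeds capacity,
    evict the entry that was accessed longest ago."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()

    def get(self, key):
        if key not in self.store:
            return None
        self.store.move_to_end(key)          # mark as most recently used
        return self.store[key]

    def put(self, key, value):
        if key in self.store:
            self.store.move_to_end(key)
        self.store[key] = value
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)   # evict least recently used
```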
Abstract:
Recent observations of the temperature anisotropies of the cosmic microwave background (CMB) favor an inflationary paradigm in which the scale factor of the universe inflated by many orders of magnitude at some very early time. Such a scenario would produce the observed large-scale isotropy and homogeneity of the universe, as well as the scale-invariant perturbations responsible for the observed (10 parts per million) anisotropies in the CMB. An inflationary epoch is also theorized to produce a background of gravitational waves (or tensor perturbations), the effects of which can be observed in the polarization of the CMB. The E-mode (or parity-even) polarization of the CMB, which is produced by scalar perturbations, has now been measured with high significance. By contrast, the B-mode (or parity-odd) polarization, which is sourced by tensor perturbations, has yet to be observed. A detection of the B-mode polarization of the CMB would provide strong evidence for an inflationary epoch early in the universe’s history.
In this work, we explore experimental techniques and analysis methods used to probe the B-mode polarization of the CMB. These experimental techniques were used to build the Bicep2 telescope, which was deployed to the South Pole in 2009. After three years of observations, Bicep2 has acquired one of the deepest observations of the degree-scale polarization of the CMB to date. Similarly, this work describes analysis methods developed for the Bicep1 three-year data analysis, which includes the full data set acquired by Bicep1. This analysis has produced the tightest constraint on the B-mode polarization of the CMB to date, corresponding to a tensor-to-scalar ratio estimate of r = 0.04 ± 0.32, or a Bayesian 95% credible interval of r < 0.70. These analysis methods, in addition to producing this new constraint, are directly applicable to future analyses of Bicep2 data. Taken together, the experimental techniques and analysis methods described herein promise to open a new observational window into the inflationary epoch and the initial conditions of our universe.
Abstract:
The study of the strength of a material is relevant to a variety of applications, including automobile collisions, armor penetration and inertial confinement fusion. Although the dynamic behavior of materials at high pressures and strain rates has been studied extensively using plate impact experiments, such experiments provide measurements in one direction only, and material behavior that depends on strength goes unmeasured. The research in this study proposes two novel configurations to mitigate this problem.
The first configuration introduced is the oblique wedge experiment, which comprises a driver material, an angled target of interest and a backing material used to measure in-situ velocities. Upon impact, a shock wave is generated in the driver material. As the shock encounters the angled target, it is reflected back into the driver and transmitted into the target. Due to the angle of obliquity of the incident wave, a transverse wave is generated that subjects the target to shear while it is being compressed by the initial longitudinal shock, such that the material does not slip. Using numerical simulations, this study shows that a variety of oblique wedge configurations can be used to study the shear response of materials, and that this can be extended to strength measurement as well. Experiments were performed on an oblique wedge setup with a copper impactor, polymethylmethacrylate driver, aluminum 6061-T6 target, and a lithium fluoride window. Particle velocities were measured using laser interferometry, and the results agree well with the simulations.
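To first order, the partition of the incident shock at the driver-target interface is governed by the acoustic impedance mismatch. The textbook linear-elastic coefficients below illustrate this split; they are an idealization for intuition, not the nonlinear shock treatment used in the simulations:

```python
def interface_coefficients(rho1, c1, rho2, c2):
    """Stress reflection and transmission coefficients for a wave at
    normal incidence crossing from medium 1 into medium 2, in the
    linear-elastic limit. Z = rho * c is the acoustic impedance."""
    z1, z2 = rho1 * c1, rho2 * c2
    r = (z2 - z1) / (z2 + z1)   # reflected / incident stress amplitude
    t = 2 * z2 / (z2 + z1)      # transmitted / incident stress amplitude
    return r, t
```

For a low-impedance driver such as PMMA against a higher-impedance aluminum target, z2 > z1, so part of the wave is reflected back into the driver while an amplified stress is transmitted into the target, consistent with the reflection and transmission described above.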
The second novel configuration is the y-cut quartz sandwich design, which uses the anisotropic properties of y-cut quartz to generate a shear wave that is transmitted into a thin sample. By using an anvil material to back the thin sample, particle velocities measured at the rear surface of the backing plate can be implemented to calculate the shear stress in the material and subsequently the strength. Numerical simulations were conducted to show that this configuration has the ability to measure the strength for a variety of materials.
Abstract:
The giant enhancement of Kerr nonlinearity in a four-level tripod-type system is investigated theoretically. By tuning the Rabi frequency of the coherent control field, and owing to the double dark resonances, giant enhancement of the Kerr nonlinearity can be achieved within the right transparency window. The influence of Doppler broadening is also discussed.