977 results for Mindlin Pseudospectral Plate Element, Chebyshev Polynomial, Integration Scheme
Abstract:
Moldova’s progress in its negotiations on an Association Agreement with the European Union, with a Deep and Comprehensive Free Trade Area (DCFTA) as its key element, has become a source of tension between Chisinau and the breakaway Republic of Transnistria. An almost certain refusal by Transnistria to join the DCFTA will deprive the region of the benefits it currently enjoys under the EU Autonomous Trade Preferences (ATP), worsening its already precarious economic situation. It is to be expected that the issue will become an additional source of tension between the two sides of the Transnistrian conflict, and might also have a negative impact on the EU–Russia relationship. The signing of the Association Agreement, which is scheduled for the autumn of 2013, will be an important step towards Moldova’s integration with the EU. Both sides assign great importance to the speediest possible finalisation of the Agreement, and so far the negotiations have been described as progressing very smoothly. Transnistria’s highly sceptical attitude towards its possible accession to the DCFTA, however, is consistent with the interests of its main ally, Moscow. It is highly probable that Russia intends to thwart Moldova’s EU association process. Moscow’s objective seems to be to draw Moldova permanently into its own sphere of influence, and therefore it perceives Chisinau’s movement towards the EU as a transgression against its geopolitical interests. Consequently, in order to hinder this process, Russia may instrumentally exploit its extensive influence over Transnistria to provoke a crisis between Tiraspol and Chisinau. An apparent increase in Russian presence in the region over the last few months (including tighter control over Transnistria’s KGB and the Ministry of Information) may suggest that the Kremlin is preparing to implement such a scenario.
Abstract:
This package includes various Mata functions. kern(): various kernel functions; kint(): kernel integral functions; kdel0(): canonical bandwidth of kernel; quantile(): quantile function; median(): median; iqrange(): inter-quartile range; ecdf(): cumulative distribution function; relrank(): grade transformation; ranks(): ranks/cumulative frequencies; freq(): compute frequency counts; histogram(): produce histogram data; mgof(): multinomial goodness-of-fit tests; collapse(): summary statistics by subgroups; _collapse(): summary statistics by subgroups; gini(): Gini coefficient; sample(): draw random sample; srswr(): SRS with replacement; srswor(): SRS without replacement; upswr(): UPS with replacement; upswor(): UPS without replacement; bs(): bootstrap estimation; bs2(): bootstrap estimation; bs_report(): report bootstrap results; jk(): jackknife estimation; jk_report(): report jackknife results; subset(): obtain subsets, one at a time; composition(): obtain compositions, one by one; ncompositions(): determine number of compositions; partition(): obtain partitions, one at a time; npartitions(): determine number of partitions; rsubset(): draw random subset; rcomposition(): draw random composition; colvar(): variance, by column; meancolvar(): mean and variance, by column; variance0(): population variance; meanvariance0(): mean and population variance; mse(): mean squared error; colmse(): mean squared error, by column; sse(): sum of squared errors; colsse(): sum of squared errors, by column; benford(): Benford distribution; cauchy(): cumulative Cauchy-Lorentz dist.; cauchyden(): Cauchy-Lorentz density; cauchytail(): reverse cumulative Cauchy-Lorentz; invcauchy(): inverse cumulative Cauchy-Lorentz; rbinomial(): generate binomial random numbers; cebinomial(): cond. expect. of binomial r.v.; root(): Brent's univariate zero finder; nrroot(): Newton-Raphson zero finder; finvert(): univariate function inverter; integrate_sr(): univariate function integration (Simpson's rule); integrate_38(): univariate function integration (Simpson's 3/8 rule); ipolate(): linear interpolation; polint(): polynomial inter-/extrapolation; plot(): Draw twoway plot; _plot(): Draw twoway plot; panels(): identify nested panel structure; _panels(): identify panel sizes; npanels(): identify number of panels; nunique(): count number of distinct values; nuniqrows(): count number of unique rows; isconstant(): whether matrix is constant; nobs(): number of observations; colrunsum(): running sum of each column; linbin(): linear binning; fastlinbin(): fast linear binning; exactbin(): exact binning; makegrid(): equally spaced grid points; cut(): categorize data vector; posof(): find element in vector; which(): positions of nonzero elements; locate(): search an ordered vector; hunt(): consecutive search; cond(): matrix conditional operator; expand(): duplicate single rows/columns; _expand(): duplicate rows/columns in place; repeat(): duplicate contents as a whole; _repeat(): duplicate contents in place; unorder2(): stable version of unorder(); jumble2(): stable version of jumble(); _jumble2(): stable version of _jumble(); pieces(): break string into pieces; npieces(): count number of pieces; _npieces(): count number of pieces; invtokens(): reverse of tokens(); realofstr(): convert string into real; strexpand(): expand string argument; matlist(): display a (real) matrix; insheet(): read spreadsheet file; infile(): read free-format file; outsheet(): write spreadsheet file; callf(): pass optional args to function; callf_setup(): setup for mm_callf().
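As a rough illustration of the numerical-integration technique named above (composite Simpson's rule, as used by integrate_sr()), a minimal Python sketch follows; this is not the package's Mata code, and the function and parameter names are illustrative only.

# Illustrative Python sketch (not the package's Mata code): composite Simpson's rule,
# the technique behind integrate_sr() as described above.
def simpson(f, a, b, n=100):
    """Approximate the integral of f over [a, b] using n subintervals (n is made even)."""
    if n % 2:               # Simpson's rule needs an even number of panels
        n += 1
    h = (b - a) / n
    total = f(a) + f(b)
    for i in range(1, n):
        total += (4 if i % 2 else 2) * f(a + i * h)
    return total * h / 3

# Example: the integral of x^2 over [0, 1] is exactly 1/3.
print(simpson(lambda x: x * x, 0.0, 1.0))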
Abstract:
New Sr-, Nd- and Pb-isotopic and trace element data are presented on basalts from the Sulu and Celebes Basins, and the submerged Cagayan Ridge Arc (Western Pacific), recently sampled during Ocean Drilling Program Leg 124. Drilling has shown that the Sulu Basin developed about 18 Ma ago as a backarc basin, associated with the now submerged Cagayan Ridge Arc, whereas the Celebes Basin was generated about 43 Ma ago, contemporaneous with a general plate reorganisation in the Western Pacific, subsequently developing as an open ocean receiving pelagic sediments until the middle Miocene. In both basins, a late middle Miocene collision phase and the onset of volcanic activity on adjacent arcs in the late Miocene are recorded. Covariations between 87Sr/86Sr and 143Nd/144Nd show that the seafloor basalts from both the Sulu and Celebes Basins are isotopically similar to depleted Indian mid-ocean ridge basalts (MORB), and distinct from East Pacific Rise MORB, defining a single negative correlation. The Cagayan Arc volcanics are different, in that they have distinctly lower epsilon-Nd(T) for a given epsilon-Sr(T), compared to Sulu and Celebes basalts. In the 207Pb/204Pb and 208Pb/204Pb versus 206Pb/204Pb diagrams, the Celebes, Sulu and Cagayan rocks all plot distinctly above the Northern Hemisphere Reference Line, with high Delta 7/4 Pb (5.3-9.3) and Delta 8/4 Pb (46.3-68.1) values. They define a single trend of radiogenic lead enrichment from Celebes through Sulu to Cagayan Ridge, within the Indian Ocean MORB data field. The data suggest that the overall chemical and isotopic features of the Sulu, Cagayan and Celebes rocks may be explained by partial melting of a depleted asthenospheric N-MORB-type ("normal") mantle source with isotopic characteristics similar to those of the Indian Ocean MORB source. This asthenospheric source was slightly heterogeneous, giving rise to the Sr-Nd isotopic differences between the Celebes and Sulu basalts, and the Cagayan Ridge volcanics. In addition, a probably slab-derived component enriched in LILE and LREE is required to generate the elemental characteristics and low epsilon-Nd(T) of the Cagayan Ridge island arc tholeiitic and calc-alkaline lavas, and to contribute to a small extent to the backarc basalts of the Sulu Sea. The results of this study confirm and extend the widespread Indian Ocean MORB signature in the Western Pacific region. This signature could have been inherited by the Indian Ocean mantle itself during the rupture of Gondwanaland, when fragments of this mantle could have migrated towards the present position of the Celebes, Sulu and Cagayan sources.
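For reference, the epsilon and delta notations used above follow the standard definitions; the Northern Hemisphere Reference Line coefficients quoted below are the commonly cited Hart (1984) values and are assumed here rather than taken from this study:

\varepsilon_{\mathrm{Nd}}(T) = \left[\frac{(^{143}\mathrm{Nd}/^{144}\mathrm{Nd})_{\mathrm{sample}}(T)}{(^{143}\mathrm{Nd}/^{144}\mathrm{Nd})_{\mathrm{CHUR}}(T)} - 1\right] \times 10^{4}, \qquad
\Delta 7/4\,\mathrm{Pb} = \left[\left(\tfrac{^{207}\mathrm{Pb}}{^{204}\mathrm{Pb}}\right)_{\mathrm{sample}} - \left(\tfrac{^{207}\mathrm{Pb}}{^{204}\mathrm{Pb}}\right)_{\mathrm{NHRL}}\right] \times 100,

with the NHRL given by 207Pb/204Pb = 0.1084 (206Pb/204Pb) + 13.491 and 208Pb/204Pb = 1.209 (206Pb/204Pb) + 15.627; Delta 8/4 Pb is defined analogously from the 208Pb/204Pb values.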
Abstract:
National Highway Traffic Safety Administration, Washington, D.C.
Abstract:
National Highway Traffic Safety Administration, Washington, D.C.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-03
Abstract:
Distortional buckling, unlike the usual lateral-torsional buckling in which the cross-section remains rigid in its own plane, involves distortion of the web of the cross-section. This type of buckling typically occurs in beams with a slender web and stocky flanges. Most of the published studies assume the web to deform with a cubic shape function. As this assumption may limit the accuracy of the results, a fifth-order polynomial is chosen here for the web displacements. The general line-type finite element model used here has two nodes and a maximum of twelve degrees of freedom per node. The model not only can predict the correct coupled mode but is also capable of handling local buckling of the web.
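As a minimal numerical sketch of working with a fifth-order web-displacement field (illustrative only: the six conditions used below, displacement, rotation and curvature prescribed at the two ends of the web, are an assumed choice and not the twelve-DOF element formulation of this study):

import numpy as np

# Quintic web displacement w(s) = c0 + c1*s + ... + c5*s^5 over the web depth 0 <= s <= h.
# The six coefficients are fixed here by prescribing w, w' and w'' at both web ends
# (an illustrative set of conditions, not the element of the study above).
def quintic_coeffs(h, end0, end1):
    rows, rhs = [], []
    for s, (w, slope, curv) in ((0.0, end0), (h, end1)):
        rows.append([s**p for p in range(6)])                                          # w(s)
        rows.append([p * s**(p - 1) if p >= 1 else 0.0 for p in range(6)])             # w'(s)
        rows.append([p * (p - 1) * s**(p - 2) if p >= 2 else 0.0 for p in range(6)])   # w''(s)
        rhs += [w, slope, curv]
    return np.linalg.solve(np.array(rows), np.array(rhs))

c = quintic_coeffs(1.0, (0.0, 0.1, 0.0), (0.2, -0.1, 0.0))
print(np.polyval(c[::-1], 0.5))   # web displacement at mid-depth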
Abstract:
This article presents an array antenna with beam-steering capability in azimuth over a wide frequency band using real-valued weighting coefficients that can be realized in practice by amplifiers or attenuators. The described beamforming scheme relies on a 2D (instead of 1D) array structure in order to ensure that there are enough degrees of freedom to realize a given radiation pattern in both the angular and frequency domains. In the presented approach, weights are determined using an inverse discrete Fourier transform (IDFT) technique, neglecting the mutual coupling between array elements. Because of the presence of mutual coupling, the actual array produces a radiation pattern with increased side-lobe levels. In order to counter this effect, the design aims to realize the initial radiation pattern with a lower side-lobe level. This strategy is demonstrated in the design example of a 4 × 4 element array. © 2005 Wiley Periodicals, Inc.
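A toy sketch of the IDFT weight-synthesis step described above (illustrative only: a 4 × 4 array with uniform element spacing is assumed, mutual coupling is neglected as in the first design step, and the target pattern samples are chosen conjugate-symmetric so that the weights come out real-valued):

import numpy as np

N = 4                                              # 4 x 4 element array, as in the design example
# Toy desired-pattern samples on the N x N DFT grid of (u, v) space; conjugate symmetry
# makes the synthesized weights real, so they can be realized by amplifiers/attenuators.
desired = np.zeros((N, N), dtype=complex)
desired[0, 0] = 1.0
desired[1, 0] = 0.5
desired[N - 1, 0] = 0.5

weights = np.fft.ifft2(desired).real               # IDFT synthesis, mutual coupling neglected
assert np.allclose(np.fft.fft2(weights), desired)  # pattern samples are reproduced on the grid

fine = np.abs(np.fft.fft2(weights, s=(64, 64)))    # finer grid to inspect side-lobe behaviour
print(np.round(weights, 4))
print("pattern peak:", round(fine.max(), 3))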
Abstract:
The degree to which Southern Hemisphere climatic changes during the end of the last glacial period and early Holocene (30-8 ka) were influenced or initiated by events occurring in the high latitudes of the Northern Hemisphere is a complex issue. There is conflicting evidence for the degree of hemispheric 'teleconnection' and an unresolved debate as to the principal forcing mechanism(s). The available hypotheses are difficult to test robustly, however, because the few detailed palaeoclimatic records in the Southern Hemisphere are widely dispersed and lack duplication. Here we present climatic and environmental reconstructions from across Australia, a key region of the Southern Hemisphere because of the range of environments it covers and the potentially important role regional atmospheric and oceanic controls play in global climate change. We identify a general scheme of events for the end of the last glacial period and early Holocene, but a detailed reconstruction proved problematic. Significant progress in climate quantification and geochronological control is now urgently required to robustly investigate change through this period. Copyright © 2006 John Wiley & Sons, Ltd.
Abstract:
In mantle convection models it has become common to make use of a modified (pressure-sensitive, Boussinesq) von Mises yield criterion to limit the maximum stress the lithosphere can support. This approach allows the viscous, cool thermal boundary layer to deform in a relatively plate-like mode even in a fully Eulerian representation. In large-scale models with embedded continental crust, where the mobile boundary layer represents the oceanic lithosphere, the von Mises yield criterion for the oceans ensures that the continents experience a realistic broad-scale stress regime. In detailed models of crustal deformation it is, however, more appropriate to choose a Mohr-Coulomb yield criterion, based upon the idea that frictional slip occurs on whichever one of many randomly oriented planes happens to be favorably oriented with respect to the stress field. As coupled crust/mantle models become more sophisticated it is important to be able to use whichever failure model is appropriate to a given part of the system. We have therefore developed a way to represent Mohr-Coulomb failure within a code which is suited to mantle convection problems coupled to large-scale crustal deformation. Our approach uses an orthotropic viscous rheology (a different viscosity for pure shear than for simple shear) to define a preferred plane for slip to occur given the local stress field. The simple-shear viscosity and the deformation can then be iterated to ensure that the yield criterion is always satisfied. We again assume the Boussinesq approximation, neglecting any effect of dilatancy on the stress field. An additional criterion is required to ensure that deformation occurs along the plane aligned with the maximum shear strain-rate rather than the perpendicular plane, which is formally equivalent in any symmetric formulation. It is also important to allow strain-weakening of the material: the material should remember both the accumulated failure history and the direction of failure. We have included this capacity in a Lagrangian-Integration-point finite element code and will show a number of examples of extension and compression of a crustal block with a Mohr-Coulomb failure criterion, and comparisons between mantle convection models using the von Mises versus the Mohr-Coulomb yield criteria. The formulation itself is general and applies to 2D and 3D problems, although it is somewhat more complicated to identify the slip plane in 3D.
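As a minimal sketch of the yield-limiting step described above (illustrative only: the cohesion, friction and stress values are placeholders, and a real convection code applies this per integration point together with the orthotropic slip-plane treatment and iteration):

import math

# Effective viscosity capped by a yield stress: eta_eff = min(eta, tau_yield / (2 * eII)),
# with eII the second invariant of the strain-rate tensor. Parameter values are placeholders.

def eta_von_mises(eta, eII, p, c0=1.0e7, mu=0.1):
    # Pressure-sensitive von Mises limit: a single yield stress, no preferred slip plane.
    tau_y = c0 + mu * p
    return min(eta, tau_y / (2.0 * eII))

def eta_mohr_coulomb(eta, eII, p, cohesion=1.0e7, phi=math.radians(30.0)):
    # Mohr-Coulomb limit written in terms of the mean stress p; in the orthotropic
    # formulation above only the simple-shear viscosity on the favoured plane is reduced.
    tau_y = cohesion * math.cos(phi) + p * math.sin(phi)
    return min(eta, tau_y / (2.0 * eII))

# Example with lithospheric-scale numbers (Pa s, 1/s, Pa).
print(eta_von_mises(1e23, 1e-14, 3e8), eta_mohr_coulomb(1e23, 1e-14, 3e8))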
Abstract:
This thesis concerns mixed flows (which are characterized by the simultaneous occurrence of free-surface and pressurized flow in sewers, tunnels, culverts or under bridges), and contributes to the improvement of the existing numerical tools for modelling these phenomena. The classic Preissmann slot approach is selected due to its simplicity and capability of predicting results comparable to those of a more recent and complex two-equation model, as shown here with reference to a laboratory test case. In order to enhance the computational efficiency, a local time stepping strategy is implemented in a shock-capturing Godunov-type finite volume numerical scheme for the integration of the de Saint-Venant equations. The results of different numerical tests show that local time stepping reduces run time significantly (between −29% and −85% CPU time for the test cases considered) compared to the conventional global time stepping, especially when only a small region of the flow field is surcharged, while solution accuracy and mass conservation are not impaired. The second part of this thesis is devoted to the modelling of the hydraulic effects of potentially pressurized structures, such as bridges and culverts, inserted in open channel domains. To this aim, a two-dimensional mixed flow model is developed first. The classic conservative formulation of the 2D shallow water equations for free-surface flow is adapted by assuming that two fictitious vertical slots, intersecting at right angles, are added on the ceiling of each integration element. Numerical results show that this schematization is suitable for the prediction of 2D flooding phenomena in which the pressurization of crossing structures can be expected. Given that the Preissmann model does not allow for the possibility of bridge overtopping, a one-dimensional model is also presented in this thesis to handle this particular condition. The flows below and above the deck are considered as parallel, and linked to the upstream and downstream reaches of the channel by introducing suitable internal boundary conditions. The comparison with experimental data and with the results of HEC-RAS simulations shows that the proposed model can be a useful and effective tool for predicting overtopping and backwater effects induced by the presence of bridges and culverts.
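A minimal sketch of the Preissmann slot geometry used in the first part of the thesis (illustrative only: a rectangular closed conduit is assumed and the target pressure-wave celerity is a placeholder value):

G = 9.81  # gravitational acceleration (m/s^2)

def slot_geometry(h, width, height, celerity=100.0):
    """Top width and wetted area for a rectangular conduit with a Preissmann slot.

    Below the crown the flow is free-surface. Above it, a narrow fictitious slot of
    width T_s = g * A_full / a^2 keeps the free-surface equations valid while the
    gravity-wave speed sqrt(g * A / T) approximates the pressure-wave celerity a.
    """
    if h <= height:                        # free-surface flow
        return width, width * h
    a_full = width * height
    t_slot = G * a_full / celerity**2      # slot width tuned to the target celerity
    return t_slot, a_full + t_slot * (h - height)

for depth in (0.5, 1.0, 1.5):              # conduit crown at 1.0 m in this toy example
    print(depth, slot_geometry(depth, width=2.0, height=1.0))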
Abstract:
Numerical techniques have been finding increasing use in all aspects of fracture mechanics, and often provide the only means for analyzing fracture problems. The work presented here is concerned with the application of the finite element method to cracked structures. The present work was directed towards the establishment of a comprehensive two-dimensional, linear elastic, finite element fracture analysis package. Significant progress has been made to this end, and features which can now be studied include multi-crack tip mixed-mode problems involving partial crack closure. The crack tip core element was refined and special local crack tip elements were employed to reduce the element density in the neighbourhood of the core region. The work builds upon experience gained by previous research workers and, as part of the general development, the program was modified to incorporate the eight-node isoparametric quadrilateral element. Also, a more flexible solving routine was developed, which provided a very compact method of solving large sets of simultaneous equations stored in a segmented form. To complement the finite element analysis programs, an automatic mesh generation program has been developed, which enables complex problems, involving fine element detail, to be investigated with a minimum of input data. The scheme has proven to be versatile and reasonably easy to implement. Numerous examples are given to demonstrate the accuracy and flexibility of the finite element technique.
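For reference, a minimal sketch of the standard eight-node (serendipity) isoparametric quadrilateral shape functions mentioned above; this is the textbook element definition, not the thesis code:

import numpy as np

# Natural coordinates of the eight nodes: four corners, then four mid-side nodes.
NODES = [(-1, -1), (1, -1), (1, 1), (-1, 1), (0, -1), (1, 0), (0, 1), (-1, 0)]

def shape_functions(xi, eta):
    """Serendipity shape functions of the eight-node isoparametric quadrilateral."""
    N = np.empty(8)
    for i, (xi_i, eta_i) in enumerate(NODES):
        if xi_i == 0:                                    # mid-side nodes on xi = 0
            N[i] = 0.5 * (1 - xi**2) * (1 + eta * eta_i)
        elif eta_i == 0:                                 # mid-side nodes on eta = 0
            N[i] = 0.5 * (1 + xi * xi_i) * (1 - eta**2)
        else:                                            # corner nodes
            N[i] = 0.25 * (1 + xi * xi_i) * (1 + eta * eta_i) * (xi * xi_i + eta * eta_i - 1)
    return N

# Partition of unity at an arbitrary point in the parent element.
print(shape_functions(0.3, -0.6).sum())                  # -> 1.0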
Abstract:
The research comprises a suite of studies that examines and develops the Lead Authority Partnership Scheme (LAPS) as a central intervention strategy for health and safety by local authority (LA) enforcers. Partnership working is a regulatory concept that has become more popular in recent years, but little research has been conducted to investigate, explore and evaluate its practical application. The study reviewed two contrasting approaches to partnership working between LAs and businesses, both of which were intended to secure improvements in the consistency of enforcement by the regulators and in the health and safety management systems of the participating businesses. The first was a well-established and highly prescriptive approach that required a substantial resource commitment on the part of the LA responsible for conducting a safety management review (SMR) of the business. As a result of his evaluation of the existing ‘full SMR’ scheme, the author developed a second, more flexible approach to partnership working. The research framework was based upon a primarily qualitative methodology intended to investigate and explore the impact of the new flexible arrangements for partnership working. The findings from this study of the flexible development of the scheme were compared and contrasted with those from studies of the established ‘full SMR’ scheme. A substantial degree of triangulation was applied in an attempt to strengthen validity and broaden applicability of the research findings. Key informant interviews, participant observation, document/archive reviews, questionnaires and surveys all had their particular part to play in the overall study. The findings from this research revealed that LAPS failed to deliver consistency of LA enforcement across multiple-outlet businesses and the LA-enforced business sectors. Improvement was, however, apparent in the safety management systems of the businesses participating in LAPS. Trust between LA inspector and safety professional was key to the success of the partnerships, as was the commitment of these key individuals. Competition for precious LA resources, the priority afforded to food safety over health and safety, the perceived high resource demands of LAPS, and the structure and culture of LAs were identified as significant barriers to LA participation. Flexible approaches, whilst addressing the resource issues, introduced some fresh concerns relating to credibility and delivery. Over and above the stated aims of the scheme, LAs and businesses had their own reasons for participation, notably the personal development of individuals and kudos for the organisation. The research has explored the wider implications for partnership working, with the overall conclusion that it is most appropriately seen as a strategic-level element within a broader structured intervention strategy.
Abstract:
The visual system pools information from local samples to calculate textural properties. We used a novel stimulus to investigate how signals are combined to improve estimates of global orientation. Stimuli were 29 × 29 element arrays of 4 c/deg log Gabors, spaced 1° apart. A proportion of these elements had a coherent orientation (horizontal/vertical), with the remainder assigned random orientations. The observer's task was to identify the global orientation. The spatial configuration of the signal was modulated by a checkerboard pattern of square checks containing potential signal elements. The other locations contained either randomly oriented elements ("noise check") or were blank ("blank check"). The distribution of signal elements was manipulated by varying the size and location of the checks within a fixed-diameter stimulus. An ideal detector would only pool responses from potential signal elements. Humans did this for medium check sizes, and for large check sizes when a signal was presented in the fovea. For small check sizes, however, the pooling occurred indiscriminately over relevant and irrelevant locations. For these check sizes, thresholds for the noise check and blank check conditions were similar, suggesting that the limiting noise is not induced by the response to the noise elements. The results are described by a model that filters the stimulus at the potential target orientations and then combines the signals over space in two stages. The first is a mandatory integration of local signals over a fixed area, limited by internal noise at each location. The second is a task-dependent combination of the outputs from the first stage. © 2014 ARVO.
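A rough simulation sketch of the pooling comparison described above (illustrative only: each element's response is reduced to a signed horizontal-versus-vertical evidence value, and the checkerboard layout is simplified to a flat split between signal-check and noise-check locations):

import numpy as np

rng = np.random.default_rng(0)

def trial(coherence, n_signal_loc=420, n_noise_loc=421, ideal=True):
    """One trial: decide whether the coherent elements are horizontal or vertical.

    Each element contributes cos(2*theta) as evidence (+1 horizontal, -1 vertical,
    about 0 on average for randomly oriented elements).
    """
    target = rng.choice([0.0, np.pi / 2])
    n_coherent = round(coherence * n_signal_loc)
    thetas = np.concatenate([
        np.full(n_coherent, target),                          # coherent elements in signal checks
        rng.uniform(0, np.pi, n_signal_loc - n_coherent),     # noise elements in signal checks
        rng.uniform(0, np.pi, n_noise_loc),                   # elements in noise checks
    ])
    evidence = np.cos(2 * thetas)
    pooled = evidence[:n_signal_loc] if ideal else evidence   # ideal vs indiscriminate pooling
    response = 0.0 if pooled.sum() > 0 else np.pi / 2
    return response == target

for coherence in (0.05, 0.1, 0.2):
    pc_ideal = np.mean([trial(coherence, ideal=True) for _ in range(500)])
    pc_all = np.mean([trial(coherence, ideal=False) for _ in range(500)])
    print(coherence, round(pc_ideal, 2), round(pc_all, 2))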
Abstract:
The focus of our work is the verification of tight functional properties of numerical programs, such as showing that a floating-point implementation of Riemann integration computes a close approximation of the exact integral. Programmers and engineers writing such programs will benefit from verification tools that support an expressive specification language and that are highly automated. Our work provides a new method for verification of numerical software, supporting a substantially more expressive language for specifications than other publicly available automated tools. The additional expressivity in the specification language is provided by two constructs. First, the specification can feature inclusions between interval arithmetic expressions. Second, the integral operator from classical analysis can be used in the specifications, where the integration bounds can be arbitrary expressions over real variables. To support our claim of expressivity, we outline the verification of four example programs, including the integration example mentioned earlier. A key component of our method is an algorithm for proving numerical theorems. This algorithm is based on automatic polynomial approximation of non-linear real and real-interval functions defined by expressions. The PolyPaver tool is our implementation of the algorithm and its source code is publicly available. In this paper we report on experiments using PolyPaver that indicate that the additional expressivity does not come at a performance cost when comparing with other publicly available state-of-the-art provers. We also include a scalability study that explores the limits of PolyPaver in proving tight functional specifications of progressively larger randomly generated programs. © 2014 Springer International Publishing Switzerland.
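A minimal sketch of the style of reasoning involved (illustrative only, and not PolyPaver's algorithm or specification language): a toy interval type and a rigorous enclosure of a Riemann integral obtained by evaluating the integrand over subintervals. Outward rounding, which a real verifier would need, is omitted for brevity.

from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __mul__(self, other):
        p = (self.lo * other.lo, self.lo * other.hi, self.hi * other.lo, self.hi * other.hi)
        return Interval(min(p), max(p))

def enclose_integral(f, a, b, n=1000):
    """Enclose the integral of f over [a, b]: sum of range enclosures times subinterval widths."""
    width = (b - a) / n
    lo = hi = 0.0
    for i in range(n):
        box = Interval(a + i * width, a + (i + 1) * width)
        y = f(box)                       # interval evaluation encloses the true range of f on box
        lo += y.lo * width
        hi += y.hi * width
    return lo, hi

# Example: the integral of x*x over [0, 1] is exactly 1/3; the enclosure must contain it.
print(enclose_integral(lambda x: x * x, 0.0, 1.0))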