725 results for Euler-Bernoulli
Abstract:
A mass-balance model for Lake Superior was applied to polychlorinated biphenyls (PCBs), polybrominated diphenyl ethers (PBDEs), and mercury to determine the major routes of entry and the major mechanisms of loss from this ecosystem, as well as the time required for each contaminant class to approach steady state. A two-box model (water column, surface sediments) incorporating seasonally adjusted environmental parameters was used. Both numerical (forward Euler) and analytical solutions were employed and compared. For validation, the model was compared with current and historical concentrations and fluxes in the lake and sediments. Results for PCBs were similar to prior work showing that air-water exchange is the most rapid input and loss process. The model indicates that mercury behaves similarly to a moderately chlorinated PCB, with air-water exchange being a relatively rapid input and loss process. Modeled accumulation fluxes of PBDEs in sediments agreed with measured values reported in the literature. Wet deposition rates were about three times greater than dry particulate deposition rates for PBDEs. Gas deposition was an important process for tri- and tetra-BDEs (BDEs 28 and 47), but not for higher-brominated BDEs. Sediment burial was the dominant loss mechanism for most of the PBDE congeners, while volatilization was still significant for tri- and tetra-BDEs. Because volatilization is a relatively rapid loss process for both mercury and the most abundant PCBs (tri- through penta-), the model predicts that similar times (from 2 to 10 yr) are required for these compounds to approach steady state in the lake. The model predicts that if inputs of Hg(II) to the lake decrease in the future, then concentrations of mercury in the lake will decrease at a rate similar to the historical decline in PCB concentrations following the ban on production and most uses in the U.S.
In contrast, PBDEs are likely to respond more slowly if atmospheric concentrations are reduced in the future, because loss by volatilization is a much slower process for PBDEs, leading to lower overall loss rates for PBDEs in comparison to PCBs and mercury. Uncertainties in the chemical degradation rates and partitioning constants of PBDEs are the largest source of uncertainty in the modeled times to steady state for this class of chemicals. The modeled organic PBT loading rates are sensitive to uncertainties in scavenging efficiencies by rain and snow, dry deposition velocity, watershed runoff concentrations, and uncertainties in air-water exchange such as the effect of atmospheric stability.
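The two-box (water column, surface sediments) mass balance integrated with a forward Euler scheme, as described above, can be sketched as follows. All rate constants, the loading value, and the function name are illustrative assumptions, not the study's calibrated parameters:

```python
# Illustrative two-box contaminant mass balance (water column + surface
# sediments) integrated with a forward Euler scheme. Rate constants and
# loading are hypothetical placeholders, not the study's values.

def forward_euler_two_box(c_w, c_s, years, dt=0.01,
                          input_w=100.0,   # external loading to water (kg/yr)
                          k_loss_w=0.5,    # volatilization + outflow (1/yr)
                          k_settle=0.2,    # water -> sediment transfer (1/yr)
                          k_resusp=0.05,   # sediment -> water transfer (1/yr)
                          k_burial=0.1):   # permanent burial loss (1/yr)
    """Step the water (c_w) and sediment (c_s) inventories forward in time."""
    steps = int(years / dt)
    for _ in range(steps):
        dc_w = input_w - (k_loss_w + k_settle) * c_w + k_resusp * c_s
        dc_s = k_settle * c_w - (k_resusp + k_burial) * c_s
        c_w += dt * dc_w
        c_s += dt * dc_s
    return c_w, c_s

# Integrate toward steady state from empty initial inventories.
w10, s10 = forward_euler_two_box(0.0, 0.0, years=10)
w50, s50 = forward_euler_two_box(0.0, 0.0, years=50)
```

The time to approach steady state falls out of the slowest eigenvalue of the exchange matrix; with faster loss rates (rapid volatilization, as for PCBs and mercury) the approach is correspondingly quicker.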
Abstract:
Light-frame wood buildings are widely built in the United States (U.S.). Natural hazards cause huge losses to light-frame wood construction. This study proposes methodologies and a framework to evaluate the performance and risk of light-frame wood construction. Performance-based engineering (PBE) aims to ensure that a building achieves the desired performance objectives when subjected to hazard loads. In this study, the collapse risk of a typical one-story light-frame wood building is determined using the Incremental Dynamic Analysis method. The collapse risks of buildings at four sites in the Eastern, Western, and Central regions of the U.S. are evaluated. Various sources of uncertainty are considered in the collapse risk assessment so that the influence of uncertainties on the collapse risk of light-frame wood construction is evaluated. The collapse risks of the same building subjected to maximum considered earthquakes in different seismic zones are found to be non-uniform. In certain areas of the U.S., snow accumulation is significant, causing huge economic losses and threatening life safety. Limited study has been performed to investigate the snow hazard when combined with a seismic hazard. A Filtered Poisson Process (FPP) model is developed in this study, overcoming the shortcomings of the typically used Bernoulli model. The FPP model is validated by comparing the simulation results to weather records obtained from the National Climatic Data Center. The FPP model is applied in the proposed framework to assess the risk of a light-frame wood building subjected to combined snow and earthquake loads. The snow accumulation has a significant influence on the seismic losses of the building. The Bernoulli snow model underestimates the seismic loss of buildings in areas with snow accumulation. An object-oriented framework is proposed in this study to perform risk assessment for light-frame wood construction.
For homeowners and stakeholders, risk expressed in terms of economic losses is much easier to understand than engineering parameters (e.g., inter-story drift). The proposed framework is used in two applications. One is to assess the loss of the building subjected to mainshock-aftershock sequences. Aftershock and downtime costs are found to be important factors in the assessment of seismic losses. The framework is also applied to a wood building in the state of Washington to assess the loss of the building subjected to combined earthquake and snow loads. The proposed framework proves to be an appropriate tool for risk assessment of buildings subjected to multiple hazards. Limitations and future work are also identified.
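A filtered Poisson process of the general kind named above can be illustrated with a toy snow-depth simulation: snowfall events arrive as a Poisson process, each deposits a random depth, and the deposited depth decays over time. The response function, parameter values, and function name below are our own illustrative assumptions, not the thesis's calibrated model:

```python
import math
import random

def simulate_fpp_snow(days, rate=0.2, mean_fall=5.0, melt=0.05, seed=0):
    """Toy filtered Poisson process for daily snow depth.

    Snowfall events arrive as a Poisson process (`rate` events/day); each
    deposits an exponentially distributed depth (mean `mean_fall`, in cm)
    that decays exponentially at `melt` per day (a crude melting filter).
    Returns a list of daily depths. All parameters are illustrative.
    """
    rng = random.Random(seed)
    events = []  # (arrival_day, deposited_depth)
    t = 0.0
    while True:
        t += rng.expovariate(rate)      # exponential inter-arrival times
        if t > days:
            break
        events.append((t, rng.expovariate(1.0 / mean_fall)))
    return [sum(d * math.exp(-melt * (day - a))
                for a, d in events if a <= day)
            for day in range(days)]
```

Unlike a Bernoulli (snow present/absent) model, the filtered process carries accumulation across events, which is what allows snow load to coincide with an earthquake in the combined-hazard assessment.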
Abstract:
Switching-mode power supplies (SMPS) are subject to low power factor and high harmonic distortion. Active power-factor correction (APFC) is a technique to improve the power factor and to reduce the harmonic distortion of SMPSs. However, this technique results in a double-frequency output voltage variation, which can be reduced by using a large output capacitance. Using large capacitors increases the cost and size of the converter. Furthermore, the capacitors are subject to frequent failures, mainly caused by evaporation of the electrolytic solution, which reduces the converter's reliability. This thesis presents an optimal control method for the input current of a boost converter to reduce the size of the output capacitor. The optimum current waveform as a function of a weighting factor is found by using the Euler-Lagrange equation. A set of simulations is performed to determine the ideal weighting factor that gives the lowest possible output voltage variation while the converter still meets the IEC 61000-3-2 Class A harmonics requirements with a power factor of 0.8 or higher. The proposed method is verified experimentally: a boost converter is designed and run at different power levels (100 W, 200 W, and 400 W). The desired output voltage ripple is 10 V peak to peak for an output voltage of 200 Vdc, which corresponds to a ±2.5% output voltage ripple. The experimental and simulation results are in close agreement. A significant reduction in capacitor size, as high as 50%, is accomplished by using the proposed method.
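The variational step mentioned above has the standard form; the specific integrand is the thesis's design choice, so only the general statement is shown here:

```latex
% Minimize a cost functional over candidate input-current waveforms i(t),
%   J[i] = \int_0^T L\bigl(t, i(t), \dot{i}(t)\bigr)\,dt,
% where L trades off output-voltage ripple against input-current
% distortion via a weighting factor. A stationary waveform i^*(t)
% satisfies the Euler-Lagrange equation
\frac{\partial L}{\partial i} \;-\; \frac{d}{dt}\,\frac{\partial L}{\partial \dot{i}} \;=\; 0 .
```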
Abstract:
The St. Petersburg Paradox was first presented by Nicholas Bernoulli in 1713. It concerns a gambling game whose mathematical expected payoff is infinite, yet no reasonable person would pay more than $25 to play it. Over its history, a number of ideas from different fields have been developed to resolve this paradox, and this report focuses mainly on its mathematical perspective. Different ideas and papers are reviewed, including both classical ones of the 18th and 19th centuries and some of the latest developments. Each model is evaluated by simulation using Mathematica.
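The report's simulations use Mathematica; as an illustration only, the underlying game (flip a fair coin until heads; the payoff doubles with every tail) can be sketched in a few lines of Python. The function names and sample size are ours:

```python
import random

def st_petersburg_payoff(rng=random):
    """Play one round: flip a fair coin until heads; payoff doubles per tail."""
    payoff = 2                     # heads on the first flip pays $2
    while rng.random() < 0.5:      # tails with probability 1/2: keep flipping
        payoff *= 2
    return payoff

def mean_payoff(n_games, seed=0):
    """Empirical mean payoff over n_games independent rounds."""
    rng = random.Random(seed)
    return sum(st_petersburg_payoff(rng) for _ in range(n_games)) / n_games
```

Although the expectation is infinite, the empirical mean over n games grows only on the order of log2(n), which is one intuition for why no one would pay a large entry fee.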
Abstract:
Automated order picking of limp, non-rigid parts has always posed a particular problem for handling technology, and a wide variety of special-purpose machine solutions have been devised for it. Some solutions also exist for picking articles packaged in film, but these are limited in their applicability to goods packaged in bags. A novel method developed at the Fraunhofer-IML promises a remedy. The following article presents this method in detail and outlines the past and future fields of investigation that have been, or will be, addressed in the course of its development, in particular the dimensioning of the equipment and auxiliary devices.
Abstract:
This manuscript details a technique for estimating gesture accuracy within the context of motion-based health video games using the Microsoft Kinect. We created a physical therapy game that requires players to imitate clinically significant reference gestures. Player performance is represented by the degree of similarity between the performed and reference gestures and is quantified by collecting the Euler angles of the player's gestures, converting them to a three-dimensional vector, and computing the magnitude of the difference between the vectors. Lower difference values represent greater gestural correspondence and therefore greater player performance. A group of thirty-one subjects was tested. Subjects achieved gestural correspondence sufficient to complete the game's objectives while also improving their ability to perform reference gestures accurately.
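A minimal sketch of the comparison described above, assuming per-frame (roll, pitch, yaw) Euler-angle triples in degrees; the function name and units are illustrative assumptions, not the manuscript's implementation:

```python
import math

def gesture_error(performed, reference):
    """Mean Euclidean distance between corresponding Euler-angle triples.

    `performed` and `reference` are equal-length sequences of
    (roll, pitch, yaw) tuples in degrees; lower values indicate
    closer correspondence to the reference gesture.
    """
    assert len(performed) == len(reference), "gestures must be time-aligned"
    total = 0.0
    for p, r in zip(performed, reference):
        # magnitude of the difference between the two 3-D angle vectors
        total += math.sqrt(sum((a - b) ** 2 for a, b in zip(p, r)))
    return total / len(performed)
```

A practical system would also need to time-align the two gestures (e.g., by resampling or dynamic time warping) before comparing frames, which this sketch omits.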
Abstract:
In this study, we investigated the scaling relations between trabecular bone volume fraction (BV/TV) and parameters of the trabecular microstructure at different skeletal sites. Cylindrical bone samples with a diameter of 8 mm were harvested from different skeletal sites of 154 human donors in vitro: 87 from the distal radius, 59/69 from the thoracic/lumbar spine, 51 from the femoral neck, and 83 from the greater trochanter. μCT images were obtained with an isotropic spatial resolution of 26 μm. BV/TV and trabecular microstructure parameters (TbN, TbTh, TbSp, scaling indices (mean <·> and standard deviation σ of α and αz), and Minkowski Functionals (Surface, Curvature, Euler)) were computed for each sample. The regression coefficient β was determined for each skeletal site as the slope of a linear fit in the double-logarithmic representations of the correlations of BV/TV versus the respective microstructure parameter. Statistically significant correlation coefficients ranging from r = 0.36 to r = 0.97 were observed for BV/TV versus the microstructure parameters, except for Curvature and Euler. The regression coefficients β were 0.19 to 0.23 (TbN), 0.21 to 0.30 (TbTh), −0.28 to −0.24 (TbSp), 0.58 to 0.71 (Surface), 0.12 to 0.16 (<α>), 0.07 to 0.11 (<αz>), −0.44 to −0.30 (σ(α)), and −0.39 to −0.14 (σ(αz)) at the different skeletal sites. The 95% confidence intervals of β overlapped for almost all microstructure parameters at the different skeletal sites. The scaling relations were independent of vertebral fracture status and similar for subjects aged 60–69, 70–79, and >79 years. In conclusion, the bone volume fraction–microstructure scaling relations showed a rather universal character.
Abstract:
Antegrade nailing of proximal humeral fractures using a straight nail can damage the bony insertion of the supraspinatus tendon and may lead to varus failure of the construct. In order to establish the ideal anatomical landmarks for insertion of the nail and their clinical relevance, we analysed CT scans of bilateral proximal humeri in 200 patients without humeral fractures (mean age 45.1 years; sd 19.6; range 18 to 97). The entry point of the nail was defined by the point of intersection of the anteroposterior and lateral vertical axes with the cortex of the humeral head. The critical point was defined as the intersection of the sagittal axis with the medial limit of the insertion of the supraspinatus tendon on the greater tuberosity. The region of interest, i.e. the biggest entry hole that would not encroach on the insertion of the supraspinatus tendon, was calculated setting a 3 mm minimal distance from the critical point. This identified that 38.5% of the humeral heads were categorised as 'critical types', due to morphology in which the predicted offset of the entry point would encroach on the insertion of the supraspinatus tendon, potentially damaging the tendon and reducing the stability of fixation. We therefore emphasise the need for fastidious pre-operative planning to minimise this risk.
Abstract:
Voting power is commonly measured using a probability. But what kind of probability is this? Is it a degree of belief or an objective chance or some other sort of probability? The aim of this paper is to answer this question. The answer depends on the use to which a measure of voting power is put. Some objectivist interpretations of probabilities are appropriate when we employ such a measure for descriptive purposes. By contrast, when voting power is used to normatively assess voting rules, the probabilities are best understood as classical probabilities, which count possibilities. This is so because, from a normative stance, voting power is most plausibly taken to concern rights and thus possibilities. The classical interpretation also underwrites the use of the Bernoulli model upon which the Penrose/Banzhaf measure is based.
Abstract:
Let Y be a stochastic process on [0,1] satisfying dY(t) = n^{1/2} f(t) dt + dW(t), where n ≥ 1 is a given scale parameter ('sample size'), W is standard Brownian motion, and f is an unknown function. Utilizing suitable multiscale tests, we construct confidence bands for f with guaranteed given coverage probability, assuming that f is isotonic or convex. These confidence bands are computationally feasible and shown to be asymptotically sharp optimal in an appropriate sense.
Abstract:
Differential equations are equations that involve an unknown function and its derivatives. Euler's method is an efficient way to yield fairly accurate approximations of the actual solutions. By refining such methods, one can find ways to produce good approximations of the exact solutions of parabolic partial differential equations and nonlinear parabolic differential equations.
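As a minimal sketch of Euler's method for an ordinary differential equation (our own example, not taken from the report; the function name and the test problem y' = y are illustrative):

```python
def euler_method(f, t0, y0, t_end, n_steps):
    """Approximate y(t_end) for y'(t) = f(t, y), y(t0) = y0,
    using n_steps forward Euler steps of size h = (t_end - t0) / n_steps."""
    h = (t_end - t0) / n_steps
    t, y = t0, y0
    for _ in range(n_steps):
        y += h * f(t, y)   # follow the tangent line over one step
        t += h
    return y

# Test problem y' = y, y(0) = 1, whose exact solution is y(1) = e.
approx = euler_method(lambda t, y: y, 0.0, 1.0, 1.0, 1000)
```

Halving the step size roughly halves the error, reflecting the method's first-order accuracy; this is the kind of refinement the passage alludes to.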