983 results for Reasonable profits
Abstract:
The Missouri River floods of 2011 will go down in history as the longest duration flooding event this state has seen to date. The combination of above normal snowfall in the upper Missouri River basin followed by the equivalent of nearly one year’s worth of rainfall in May created an above normal runoff situation which filled the Missouri River and the six main reservoirs within the basin. Compounding this problem were colder than normal temperatures, which kept much of the snowpack in the upper basin on the ground longer into the spring, setting the stage for this historic event. The U.S. Army Corps of Engineers (USACE) began increasing the outflow at Gavin’s Point, near Yankton, South Dakota, in May. On June 14, 2011, the outflow reached a record rate of over 160,000 cubic feet per second (cfs), over twice the previous record outflow set in 1997. This increased output from Gavin’s Point caused the Missouri River to flow out of its banks, covering over 283,000 acres of land in Iowa, forcing hundreds of evacuations, damaging 255,000 acres of cropland and significantly impacting the levee system on the Missouri River basin. Over the course of the summer, approximately 64 miles of primary roads closed due to Missouri River flooding, including 54 miles of Interstate Highway. Many county secondary roads were closed by high water or overburdened due to the numerous detours and road closures in this area. As the Missouri River levels began to increase, municipalities and counties aided by State and Federal agencies began preparing for a sustained flood event. Citizens, businesses, state agencies, local governments and non‐profits made substantial preparations, in some cases expending millions of dollars on emergency protective measures to protect their facilities from the impending flood. Levee monitors detected weak spots in the levee system in all affected counties, with several identified as at-risk levees that could potentially fail.
Of particular concern were the 28 miles of levees protecting Council Bluffs. Based on this concern, Council Bluffs prepared an evacuation plan for the approximately 30,000 residents who resided in the protected area. On May 25, 2011, Governor Branstad directed the execution of the Iowa Emergency Response Plan in accordance with Section 401 of the Stafford Act. On May 31, 2011, HSEMD Administrator, Brigadier General J. Derek Hill, formally requested that the USACE provide technical assistance and advanced measures for the communities along the Missouri River basin. On June 2, 2011, Governor Branstad issued a State of Iowa Proclamation of Disaster Emergency for Fremont, Harrison, Mills, Monona, Pottawattamie, and Woodbury counties. The length of this flood event created a unique set of challenges for Federal, State and local entities. In many cases, these organizations were conducting response and recovery operations simultaneously. Due to the length of the event, the State Emergency Operations Center and the local Emergency Operations Centers remained open for an extended period of time, putting additional strain on many organizations and resources. In response to this disaster, Governor Branstad created the Missouri River Recovery Coordination Task Force to oversee the State’s recovery efforts. The Governor announced the creation of this Task Force on October 17, 2011 and appointed Brigadier General J. Derek Hill, HSEMD Administrator, as the chairman. This Task Force would be a temporary group of State agency representatives and interested stakeholders brought together to support the recovery efforts of the Iowa communities impacted by the Missouri River Flood. Collectively, this group would analyze and share damage assessment data, coordinate assistance across various stakeholders, monitor progress, capture best practices and identify lessons learned.
Abstract:
The performance of density-functional theory to solve the exact, nonrelativistic, many-electron problem for magnetic systems has been explored in a new implementation imposing space and spin symmetry constraints, as in ab initio wave function theory. Calculations on selected systems representative of organic diradicals, molecular magnets and antiferromagnetic solids carried out with and without these constraints lead to contradictory results, which provide numerical illustration of this usually overlooked problem. It is concluded that the present exchange-correlation functionals provide reasonable numerical results although for the wrong physical reasons, thus evidencing the need for a continued search for more accurate expressions.
Abstract:
The emergence of chirality in enantioselective autocatalysis for compounds unable to transform according to the Frank-like reaction network is discussed with respect to the controversial limited enantioselectivity (LES) model composed of coupled enantioselective and non-enantioselective autocatalyses. The LES model cannot lead to spontaneous mirror symmetry breaking (SMSB) either in closed systems with a homogeneous temperature distribution or in closed systems with a stationary non-uniform temperature distribution. However, simulations of chemical kinetics in a two-compartment model demonstrate that SMSB may occur if both autocatalytic reactions are spatially separated at different temperatures in different compartments but coupled under the action of a continuous internal flow. In such conditions, the system can evolve, for certain reaction and system parameters, toward a chiral stationary state; that is, the system is able to reach a bifurcation point leading to SMSB. Numerical simulations in which reasonable chemical parameters have been used suggest that an adequate scenario for such a SMSB would be that of abyssal hydrothermal vents, by virtue of the typical temperature gradients found there and the role of inorganic solids mediating chemical reactions in an enzyme-like role. Key Words: Homochirality, Prebiotic chemistry.
Abstract:
Most large firms are organized as hierarchical structures of tiers with different wage levels. Traditionally, the existence of this kind of organization has been attributed to the separation of productive from managerial or supervisory tasks and to differences in the skills of the workers. However, many firms now employ workers with similar skills, so the hierarchical structure can instead be related to an incentive scheme that ensures workers supply effort. The model we present investigates how firm owners should determine the optimal wage distribution in order to maximize profits.
Abstract:
Summary: Total hip prosthesis, a reasonable treatment for dogs with hip problems
Abstract:
Research in autophagy continues to accelerate,(1) and as a result many new scientists are entering the field. Accordingly, it is important to establish a standard set of criteria for monitoring macroautophagy in different organisms. Recent reviews have described the range of assays that have been used for this purpose.(2,3) There are many useful and convenient methods that can be used to monitor macroautophagy in yeast, but relatively few in other model systems, and there is much confusion regarding acceptable methods to measure macroautophagy in higher eukaryotes. A key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers of autophagosomes versus those that measure flux through the autophagy pathway; thus, a block in macroautophagy that results in autophagosome accumulation needs to be differentiated from fully functional autophagy that includes delivery to, and degradation within, lysosomes (in most higher eukaryotes) or the vacuole (in plants and fungi). Here, we present a set of guidelines for the selection and interpretation of the methods that can be used by investigators who are attempting to examine macroautophagy and related processes, as well as by reviewers who need to provide realistic and reasonable critiques of papers that investigate these processes. This set of guidelines is not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to verify an autophagic response.
Abstract:
Any transportation infrastructure system is inherently concerned with durability and performance issues. The proportioning and uniformity control of concrete mixtures are critical factors that directly affect the longevity and performance of portland cement concrete pavement systems. At present, the only means available to monitor mix proportions of any given batch is to track batch tickets created at the batch plant. However, this does not take into account potential errors in loading materials into storage silos, calibration errors, and addition of water after dispatch. Therefore, there is a need for a rapid, cost-effective, and reliable field test that estimates the proportions of as-delivered concrete mixtures. In addition, performance-based specifications will be more easily implemented if there is a way to readily demonstrate whether any given batch is similar to the proportions already accepted based on laboratory performance testing. The goal of the present research project is to investigate the potential use of a portable x-ray fluorescence (XRF) technique to assess the proportions of concrete mixtures as they are delivered. Tests were conducted on the raw materials, paste and mortar samples using a portable XRF device. There is a reasonable correlation between the actual and calculated mix proportions of the paste samples, but data on mortar samples were less reliable.
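The proportion-estimation step described above amounts to a small linear inverse problem: each ingredient contributes its characteristic oxide signature to the bulk XRF reading in proportion to its mass fraction, so measured oxide contents can be inverted for the mix proportions. A minimal sketch of that idea, using purely illustrative oxide values that are not taken from the study:

```python
import numpy as np

# Hypothetical oxide signatures (mass fractions of CaO, SiO2, Fe2O3)
# for two ingredients. These numbers are illustrative only.
cement = np.array([0.63, 0.20, 0.03])
fly_ash = np.array([0.05, 0.52, 0.08])
A = np.column_stack([cement, fly_ash])  # columns = ingredient signatures

# "Measured" bulk oxide content of a paste blended 80% cement, 20% fly ash
true_w = np.array([0.8, 0.2])
measured = A @ true_w

# Recover the mix proportions by least squares on the overdetermined system
w, *_ = np.linalg.lstsq(A, measured, rcond=None)
print(np.round(w, 3))  # recovers [0.8, 0.2] exactly in this noise-free case
```

In practice the measured vector carries instrument noise, so the least-squares fit returns an estimate rather than the exact batch proportions, which is consistent with the correlation (rather than perfect agreement) the study reports.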
Abstract:
Debris flows are among the most dangerous processes in mountainous areas due to their rapid rate of movement and long runout zone. Sudden and rather unexpected impacts not only damage buildings and infrastructure but also threaten human lives. Medium- to regional-scale susceptibility analyses allow the identification of the most endangered areas and suggest where further detailed studies have to be carried out. Since data availability for larger regions is mostly the key limiting factor, empirical models with low data requirements are suitable for first overviews. In this study a susceptibility analysis was carried out for the Barcelonnette Basin, situated in the southern French Alps. By means of a methodology based on empirical rules for source identification and the empirical angle-of-reach concept for the 2-D runout computation, a worst-case scenario was first modelled. In a second step, scenarios for high, medium and low frequency events were developed. A comparison with the footprints of a few mapped events indicates reasonable results but suggests a high dependency on the quality of the digital elevation model. This fact emphasises the need for a careful interpretation of the results while remaining conscious of the inherent assumptions of the model used and the quality of the input data.
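The angle-of-reach (Fahrböschung) concept used for the runout computation reduces, in one dimension, to a single trigonometric relation: the flow is assumed to stop where a line drawn from the source at the reach angle intersects the terrain, so the horizontal travel distance L follows from the drop height H via tan(α) = H/L. A minimal sketch with illustrative numbers that are not from the Barcelonnette study:

```python
import math

def runout_length(drop_height_m, reach_angle_deg):
    """Empirical angle-of-reach estimate of horizontal runout.

    The debris flow is assumed to stop where the line from the source,
    inclined at the reach angle, meets flat ground, so L = H / tan(angle).
    """
    return drop_height_m / math.tan(math.radians(reach_angle_deg))

# Illustrative case: a 500 m drop with an 11-degree angle of reach
L = runout_length(500.0, 11.0)
print(round(L))  # about 2572 m of horizontal runout
```

Smaller reach angles (typical of highly mobile flows) lengthen the runout rapidly, which is one reason the result is so sensitive to the quality of the digital elevation model used to derive drop heights.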
Abstract:
The objective of this research is to determine whether the nationally calibrated performance models used in the Mechanistic-Empirical Pavement Design Guide (MEPDG) provide a reasonable prediction of actual field performance, and if the desired accuracy or correspondence exists between predicted and monitored performance for Iowa conditions. A comprehensive literature review was conducted to identify the MEPDG input parameters and the MEPDG verification/calibration process. Sensitivities of MEPDG input parameters to predictions were studied using different versions of the MEPDG software. Based on literature review and sensitivity analysis, a detailed verification procedure was developed. A total of sixteen different types of pavement sections across Iowa, not used for national calibration in NCHRP 1-47A, were selected. A database of MEPDG inputs and the actual pavement performance measures for the selected pavement sites were prepared for verification. The accuracy of the MEPDG performance models for Iowa conditions was statistically evaluated. The verification testing showed promising results in terms of MEPDG’s performance prediction accuracy for Iowa conditions. Recalibrating the MEPDG performance models for Iowa conditions is recommended to improve the accuracy of predictions.
Abstract:
In 2011 several articles seemed significant for the practice of general medicine. Diagnosis of hypertension needs several measurements and may need 24-hour ambulatory blood pressure monitoring. Glycosylated hemoglobin is a reliable tool to diagnose diabetes mellitus. The ABCD2 score combined with neurological imaging helps the triage of transient ischemic attacks. Pulmonary embolism can be treated on an outpatient basis in low-risk patients. A gluten-free diet may be tried in irritable bowel syndrome. Nitrofurantoin is a reasonable alternative for simple urinary tract infection in women, but antibiotics are not needed after drainage of an uncomplicated skin abscess. Subclinical thyroid dysfunction is a risk factor for osteoporosis in older men. Sequential use of MMSE and ACE scores is a promising approach to assess medical decision-making capacity.
Abstract:
As the list of states adopting the HWTD continues to grow, there is a need to evaluate how results are utilized. AASHTO T 324 does not standardize the analysis and reporting of test results. Furthermore, processing and reporting of the results among manufacturers is not uniform. This is partly due to the variation among agency reporting requirements. Some include only the midpoint rut depth, while others include the average across the entire length of the wheel track. To eliminate bias in reporting, statistical analysis was performed on over 150 test runs on gyratory specimens. Measurement location was found to be a significant source of variation in the HWTD, likely due to the nonuniform wheel speed across the specimen, the geometry of the specimen, and the air void profile. Eliminating this source of bias when reporting results is feasible, though it depends upon the average rut depth at the final pass. When reporting rut depth at the final pass, it is suggested that, for poor-performing samples, measurement locations near the interface of the adjoining gyratory specimens be averaged; this is necessary because the wheel lips on the mold there. For all other samples it is reasonable to eliminate only the 3 locations furthest from the gear house. For multi‐wheel units, wheel side was also found to be significant for poor- and good-performing samples. After eliminating the suggested measurements from the analysis, the wheel was no longer a significant source of variation.
Abstract:
Application of semi-distributed hydrological models to large, heterogeneous watersheds poses several problems. On one hand, the spatial and temporal variability in catchment features should be adequately represented in the model parameterization, while keeping model complexity at an acceptable level so as to take advantage of state-of-the-art calibration techniques. On the other hand, model complexity increases uncertainty in adjusted model parameter values, thereby increasing uncertainty in the water routing across the watershed. This is critical for water quality applications, where not only streamflow but also a reliable estimation of the surface versus subsurface contributions to the runoff is needed. In this study, we show how a regularized inversion procedure combined with a multiobjective function calibration strategy successfully solves the parameterization of a complex application of a water quality-oriented hydrological model. The final values of several optimized parameters showed significant and consistent differences across geological and landscape features. Although the number of optimized parameters was significantly increased by the spatial and temporal discretization of adjustable parameters, the uncertainty in water routing results remained at reasonable values. In addition, a stepwise numerical analysis showed that the effects on calibration performance due to the inclusion of different data types in the objective function could be inextricably linked. Thus, caution should be taken when adding or removing data from an aggregated objective function.
Abstract:
BACKGROUND: In the presence of pigmented iris lesions evocative of malignant melanoma and implying oncological treatment, a prior biopsy to exclude a benign lesion may seem a reasonable approach. After examining patient files, the utility of such a diagnostic approach was explored. MATERIAL AND METHODS: Retrospective, consecutive histopathologic case series of 10 pigmented iris tumor specimens excised since 1993. The histopathologic diagnosis was compared with the final diagnosis and outcome in the patient's medical chart. RESULTS: Five biopsies showed only nevus cells, whereas subsequent clinical data or histopathologic examinations were compatible with the diagnosis of malignant melanoma. One biopsy contained insufficient sample tissue. Four biopsies confirmed the clinical suspicion of iris melanoma. CONCLUSION: In the current case series, 6 out of 10 biopsies provided a falsely reassuring negative or an inconclusive result. Modern management techniques such as ultrasound biomicroscopy and proton therapy of the whole anterior segment have equally diminished the indications for a biopsy. In cases clinically evocative of iris melanoma, a biopsy has only a relative value.
Abstract:
Despite the heavy burden of tobacco-related problems in alcohol-dependent patients, little effort has been directed toward reducing the prevalence of smoking in these patients. It seems reasonable to develop nicotine addiction treatments for alcohol-dependent patients based on the smoker's stage of change. To assess the stage of change for tobacco consumption and possible quitting barriers in alcohol-dependent patients, 88 consecutively admitted inpatients of a Swiss university-affiliated alcohol withdrawal clinic were interviewed with a semistructured schedule. More than half of the alcohol-dependent smokers (50.7%) considered the possibility of smoking cessation or had already decided to stop, although the majority (83.1%) were highly dependent smokers. Positive reinforcers were factors influencing motivation both to stop smoking as well as to continue smoking, whereas negative reinforcers had no influence. As recovering alcoholic patients are often interested in smoking cessation and the introduction of nicotine treatment interventions has been shown not to jeopardize the outcome of alcohol treatment, alcohol treatment programs should include counseling for smoking cessation. Education and training for staff is essential, as their beliefs and habits remain an important barrier.