960 results for "due process"


Relevance: 30.00%

Abstract:

The synchronization of dynamic multileaf collimator (DMLC) response with respiratory motion is critical to ensure the accuracy of DMLC-based four dimensional (4D) radiation delivery. In practice, however, a finite time delay (response time) between the acquisition of tumor position and multileaf collimator response necessitates predictive models of respiratory tumor motion to synchronize radiation delivery. Predicting a complex process such as respiratory motion introduces geometric errors, which have been reported in several publications. However, the dosimetric effect of such errors on 4D radiation delivery has not yet been investigated. Thus, our aim in this work was to quantify the dosimetric effects of geometric error due to prediction under several different conditions. Conformal and intensity modulated radiation therapy (IMRT) plans for a lung patient were generated for anterior-posterior/posterior-anterior (AP/PA) beam arrangements at 6 and 18 MV energies to provide planned dose distributions. Respiratory motion data were obtained from 60 diaphragm-motion fluoroscopy recordings from five patients. A linear adaptive filter was employed to predict the tumor position. The geometric error of prediction was defined as the absolute difference between predicted and actual positions at each diaphragm position. Distributions of the geometric error of prediction were obtained for all of the respiratory motion data. Planned dose distributions were then convolved with the distributions of the geometric error of prediction to obtain convolved dose distributions. The dosimetric effect of such geometric errors was determined as a function of several variables: response time (0-0.6 s), beam energy (6/18 MV), treatment delivery (3D/4D), treatment type (conformal/IMRT), beam direction (AP/PA), and breathing training type (free breathing/audio instruction/visual feedback). Dose-difference and distance-to-agreement analyses were employed to quantify the results. Based on our data, the dosimetric impact of prediction (a) increased with response time, (b) was larger for 3D radiation therapy than for 4D radiation therapy, (c) was relatively insensitive to changes in beam energy and beam direction, (d) was greater for IMRT distributions than for conformal distributions, (e) was smaller than the dosimetric impact of latency, and (f) was greatest for respiratory motion with audio instructions, followed by visual feedback and free breathing. Geometric errors of prediction that occur during 4D radiation delivery introduce dosimetric errors that depend on several factors, such as response time, treatment-delivery type, and beam energy. Even for a relatively small response time of 0.6 s, dosimetric errors due to prediction could approach the delivery errors incurred when respiratory motion is not accounted for at all. To reduce the dosimetric impact, better predictive models and/or shorter response times are required.
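To make the convolution step concrete, the following minimal 1-D sketch blurs an idealized planned dose profile with a distribution of geometric prediction errors and reports a simple dose-difference metric. The dose profile, error spread, and grid are illustrative assumptions, not data from the study.

```python
import numpy as np

# Minimal 1-D sketch of the convolution method: a planned dose profile is
# blurred by the probability distribution of geometric prediction errors.
# All numbers below are illustrative, not values from the study.

x = np.arange(-30.0, 30.0, 0.5)          # position along one axis (mm)

# Idealized planned dose profile: ~100% dose over a 40 mm field with a
# linear penumbra (a simple trapezoid stands in for the planned distribution).
half_width, penumbra = 20.0, 3.0
planned = 100.0 * np.clip((half_width + penumbra - np.abs(x)) / (2 * penumbra), 0.0, 1.0)

# Hypothetical distribution of geometric prediction errors (mm); its spread
# would grow with the response time of the prediction/DMLC system.
sigma = 2.0                              # assumed standard deviation (mm)
error_pdf = np.exp(-0.5 * (x / sigma) ** 2)
error_pdf /= error_pdf.sum()             # normalize to unit probability

# Convolve the planned dose with the error distribution to estimate the dose
# delivered in the presence of prediction errors.
convolved = np.convolve(planned, error_pdf, mode="same")

# Simple dose-difference metric: maximum deviation as a percentage of the
# maximum planned dose.
max_diff = np.abs(convolved - planned).max() / planned.max() * 100.0
print(f"maximum dose difference: {max_diff:.1f}% of the maximum planned dose")
```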

Relevance: 30.00%

Abstract:

Light-frame wood buildings are widely built in the United States (U.S.). Natural hazards cause huge losses to light-frame wood construction. This study proposes methodologies and a framework to evaluate the performance and risk of light-frame wood construction. Performance-based engineering (PBE) aims to ensure that a building achieves the desired performance objectives when subjected to hazard loads. In this study, the collapse risk of a typical one-story light-frame wood building is determined using the Incremental Dynamic Analysis method. The collapse risks of buildings at four sites in the Eastern, Western, and Central regions of the U.S. are evaluated. Various sources of uncertainty are considered in the collapse risk assessment so that the influence of uncertainties on the collapse risk of light-frame wood construction is evaluated. The collapse risks of the same building subjected to maximum considered earthquakes at different seismic zones are found to be non-uniform. In certain areas of the U.S., snow accumulation is significant, causes huge economic losses, and threatens life safety. Limited study has been performed to investigate the snow hazard when combined with a seismic hazard. A Filtered Poisson Process (FPP) model is developed in this study, overcoming the shortcomings of the typically used Bernoulli model. The FPP model is validated by comparing the simulation results to weather records obtained from the National Climatic Data Center. The FPP model is applied in the proposed framework to assess the risk of a light-frame wood building subjected to combined snow and earthquake loads. The snow accumulation has a significant influence on the seismic losses of the building. The Bernoulli snow model underestimates the seismic loss of buildings in areas with snow accumulation. An object-oriented framework is proposed in this study to perform risk assessment for light-frame wood construction. For homeowners and stakeholders, risk expressed in terms of economic losses is much easier to understand than engineering parameters (e.g., inter-story drift). The proposed framework is used in two applications. One is to assess the loss of the building subjected to mainshock-aftershock sequences. Aftershock and downtime costs are found to be important factors in the assessment of seismic losses. The framework is also applied to a wood building in the state of Washington to assess the loss of the building subjected to combined earthquake and snow loads. The proposed framework is shown to be an appropriate tool for risk assessment of buildings subjected to multiple hazards. Limitations and future work are also identified.
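The filtered Poisson process idea can be sketched as follows: snowfall events arrive as a Poisson process and each deposited load decays over time, so the ground snow load at any moment is a filtered sum of past events. The arrival rate, load distribution, and decay rate below are assumed for illustration and are not the parameters calibrated in the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Sketch of a filtered Poisson process (FPP) for ground snow load:
# snowfall events arrive as a Poisson process, each deposits a random
# load that then decays (melt/compaction) exponentially.  All parameters
# are assumed values, not those validated against NCDC weather records.

days = 180                       # one winter season
arrival_rate = 0.15              # expected snowfall events per day
mean_event_load = 0.4            # mean load added per event (kPa)
decay_rate = 0.03                # per-day exponential decay of the load

load = np.zeros(days)
current = 0.0
for day in range(days):
    n_events = rng.poisson(arrival_rate)                    # events today
    current += rng.exponential(mean_event_load, n_events).sum()
    current *= np.exp(-decay_rate)                          # filter: decay
    load[day] = current

print(f"seasonal maximum snow load: {load.max():.2f} kPa")
print(f"fraction of days with snow on the ground: {(load > 0.01).mean():.2f}")
```

Unlike a Bernoulli on/off snow indicator, this process carries memory of past snowfall, which is why it can represent load accumulation coinciding with an earthquake.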

Relevance: 30.00%

Abstract:

Biofuels are alternative fuels that have the promise of reducing reliance on imported fossil fuels and decreasing emissions of greenhouse gases from energy consumption. This thesis analyzes the environmental impacts, focusing on the greenhouse gas (GHG) emissions, associated with the production and delivery of biofuel using the new Integrated Hydropyrolysis and Hydroconversion (IH2) process. The IH2 process is an innovative process for the conversion of woody biomass into hydrocarbon liquid transportation fuels in the range of gasoline and diesel. A cradle-to-grave life cycle assessment (LCA) was used to calculate the greenhouse gas emissions associated with diverse feedstock production systems and delivery to the IH2 facility, plus producing and using these new renewable liquid fuels. The biomass feedstocks analyzed include algae (microalgae), bagasse from sugar cane-producing locations such as Brazil or the extreme southern US, corn stover from Midwest US locations, and forest feedstocks from a northern Wisconsin location. Life cycle GHG emission savings of 58%–98% were calculated for IH2 gasoline and diesel production and combustion in vehicles, compared with fossil fuels. The range of savings is due to different biomass feedstocks and transportation modes and distances. Different scenarios were evaluated to understand the uncertainties in certain input data to the LCA model, particularly in the feedstock production, IH2 biofuel production, and transportation sections.
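The reported percentage savings come from comparing life-cycle emissions of the biofuel pathway with a fossil baseline. The sketch below shows that arithmetic with hypothetical emission factors; none of the numbers are results from the IH2 LCA.

```python
# Sketch of how a life-cycle GHG saving percentage is computed relative to a
# fossil baseline.  The emission factors below (g CO2-eq per MJ of fuel) are
# hypothetical placeholders, not results from the IH2 study.

fossil_baseline = 92.0          # assumed fossil gasoline/diesel baseline
biofuel_pathways = {
    "forest residues, short haul": 5.0,
    "corn stover, rail transport": 25.0,
    "microalgae, energy-intensive cultivation": 38.0,
}

for pathway, emissions in biofuel_pathways.items():
    saving = (fossil_baseline - emissions) / fossil_baseline * 100.0
    print(f"{pathway}: {saving:.0f}% GHG saving vs. fossil baseline")
```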

Relevance: 30.00%

Abstract:

As the demand for miniature products and components continues to increase, the need for manufacturing processes to provide these products and components has also increased. To meet this need, successful macroscale processes are being scaled down and applied at the microscale. Unfortunately, many challenges have been experienced when directly scaling down macro processes. Initially, frictional effects were believed to be the largest challenge encountered. However, recent studies have found that the greatest challenge encountered has been with size effects. Size effect is a broad term that largely refers to the thickness of the material being formed and how this thickness directly affects the product dimensions and manufacturability. At the microscale, the thickness becomes critical due to the reduced number of grains. When surface contact between the forming tools and the material blanks occurs at the macroscale, there is enough material (hundreds of layers of material grains) across the blank thickness to compensate for material flow and the effect of grain orientation. At the microscale, there may be fewer than 10 grains across the blank thickness. With a decreased number of grains across the thickness, the influence of grain size, shape, and orientation is significant. Any material defects (either naturally occurring or introduced during material preparation) play a significant role in altering the forming potential. To date, various micro metal forming and micro materials testing equipment setups have been constructed at the Michigan Tech lab. Initially, the research focus was to create a micro deep drawing setup to potentially build micro sensor encapsulation housings. The research focus then shifted to micro metal materials testing equipment setups. These include the construction and testing of the following setups: a micro mechanical bulge test, a micro sheet tension test (testing micro tensile bars), a micro strain analysis (with the use of optical lithography and chemical etching), and a micro sheet hydroforming bulge test. Recently, the focus has shifted to the study of a micro tube hydroforming process. The intent is to target fuel cell, medical, and sensor encapsulation applications. While the tube hydroforming process is widely understood at the macroscale, the microscale process offers some significant challenges in terms of size effects. Current work is being conducted on applying direct current to enhance micro tube hydroforming formability. Initial work on adding direct current to various metal forming operations has shown some phenomenal results. The focus of current research is to determine the validity of this process.
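A quick way to see why the grain count matters is to compare blank thickness with grain size; the sketch below does that for a few assumed thicknesses and an assumed grain size, which are illustrative examples rather than measurements from the lab.

```python
# Minimal illustration of the "grains across the thickness" size effect
# described above.  Thicknesses and the grain size are assumed examples.

blank_thicknesses_um = [2000, 500, 100, 50]   # macro sheet down to micro foil
grain_size_um = 25                            # assumed average grain diameter

for thickness in blank_thicknesses_um:
    grains = thickness / grain_size_um
    regime = ("macro: enough grains for polycrystal averaging"
              if grains >= 20
              else "micro: individual grains dominate the response")
    print(f"{thickness:>5} um blank, {grain_size_um} um grains -> "
          f"{grains:.0f} grains across the thickness ({regime})")
```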

Relevance: 30.00%

Abstract:

This work presents a 1-D process scale model used to investigate the chemical dynamics and temporal variability of nitrogen oxides (NOx) and ozone (O3) within and above the snowpack at Summit, Greenland for March-May 2009, and estimates the surface exchange of NOx between the snowpack and the surface layer in April-May 2009. The model assumes that the surfaces of snowflakes have a Liquid-Like Layer (LLL) where aqueous chemistry occurs and interacts with the interstitial air of the snowpack. Model parameters and initialization are physically and chemically representative of the snowpack at Summit, Greenland, and model results are compared to measurements of NOx and O3 collected by our group at Summit, Greenland from 2008-2010. The model, paired with measurements, confirmed the main hypothesis in the literature that photolysis of nitrate on the surface of snowflakes is responsible for nitrogen dioxide (NO2) production in the top ~50 cm of the snowpack at solar noon for the March-May periods in 2009. Nighttime peaks of NO2 in the snowpack for April and May were reproduced with aqueous formation of peroxynitric acid (HNO4) in the top ~50 cm of the snowpack, with subsequent mass transfer to the gas phase, decomposition to form NO2 at nighttime, and transport of the NO2 to depths of 2 meters. Modeled production of HNO4 was hindered in March 2009 due to the low production of its precursor, the hydroperoxy radical, resulting in underestimation of nighttime NO2 in the snowpack for March 2009. The aqueous reaction of O3 with formic acid was the major sink of O3 in the snowpack for March-May 2009. Nitrogen monoxide (NO) production in the top ~50 cm of the snowpack is related to the photolysis of NO2, which leads to an underestimation of NO in May 2009. Modeled surface exchange of NOx in April and May is on the order of 10^11 molecules m^-2 s^-1. Removing downward fluxes of NO and NO2 from the measured fluxes resulted in agreement between measured NOx fluxes and modeled surface exchange in April, and an order-of-magnitude deviation in May. Modeled transport of NOx above the snowpack in May shows an order of magnitude increase in NOx fluxes in the first 50 cm of the snowpack and is attributed to the production of NO2 during the day from the thermal decomposition and photolysis of peroxynitric acid, with minor contributions of NO from HONO photolysis in the early morning.
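The daytime NO2 source identified above, nitrate photolysis in the LLL, can be sketched as a first-order loss process with a diurnally varying photolysis rate coefficient. The rate coefficient, its diurnal shape, and the nitrate concentration below are assumed for illustration, not parameters of the Summit model.

```python
import numpy as np

# Sketch of first-order nitrate photolysis in the liquid-like layer as a
# daytime source of NO2 in near-surface snow.  All values are illustrative.

hours = np.arange(0.0, 24.0, 0.25)
dt = 0.25 * 3600.0                          # time step in seconds

# crude diurnal cycle of the NO3- photolysis rate coefficient j (s^-1),
# peaking at solar noon and zero at night
j_noon = 3.0e-7
j = j_noon * np.clip(np.cos((hours - 12.0) / 12.0 * np.pi), 0.0, None)

initial_nitrate = 1.0e-6                    # assumed NO3- in the LLL (mol L^-1)
nitrate = initial_nitrate
no2_source = np.zeros_like(hours)           # NO2 produced per step (mol L^-1)

for i, rate in enumerate(j):
    produced = rate * nitrate * dt          # first-order photolysis this step
    nitrate -= produced
    no2_source[i] = produced

print(f"peak NO2 production at hour {hours[no2_source.argmax()]:.1f} (solar noon)")
print(f"fraction of nitrate photolyzed in one day: "
      f"{(initial_nitrate - nitrate) / initial_nitrate:.2%}")
```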

Relevance: 30.00%

Abstract:

In the context of expensive numerical experiments, a promising solution for alleviating the computational costs consists of using partially converged simulations instead of exact solutions. The gain in computational time is at the price of precision in the response. This work addresses the issue of fitting a Gaussian process model to partially converged simulation data for further use in prediction. The main challenge consists of the adequate approximation of the error due to partial convergence, which is correlated in both design variables and time directions. Here, we propose fitting a Gaussian process in the joint space of design parameters and computational time. The model is constructed by building a nonstationary covariance kernel that reflects accurately the actual structure of the error. Practical solutions are proposed for solving parameter estimation issues associated with the proposed model. The method is applied to a computational fluid dynamics test case and shows significant improvement in prediction compared to a classical kriging model.
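One simple way to realize a covariance that is nonstationary in computational time is to model the response as a converged component (a function of the design variable only) plus a partial-convergence error whose variance decays with time. The sketch below assumes that structure; it is an illustration of the idea, not the kernel or estimation procedure developed in the paper, and all functions and hyperparameters are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Sketch: Gaussian process in the joint space (design variable x, time t),
# with covariance = "converged response in x" + "partial-convergence error
# whose variance decays with t".  Kernel form and parameters are assumptions.

def kernel(A, B, len_x=0.3, var_f=1.0, var_e=0.5, decay=0.2):
    dx = A[:, None, 0] - B[None, :, 0]
    rbf_x = np.exp(-0.5 * (dx / len_x) ** 2)
    k_converged = var_f * rbf_x                     # independent of t
    k_error = var_e * np.exp(-decay * (A[:, None, 1] + B[None, :, 1])) * rbf_x
    return k_converged + k_error                    # nonstationary in t

# Synthetic "simulator": the response drifts toward sin(2*pi*x) as t grows.
def partially_converged(x, t):
    return np.sin(2 * np.pi * x) + 0.5 * np.exp(-0.2 * t) * np.cos(2 * np.pi * x)

x_tr = rng.uniform(0.0, 1.0, 25)
t_tr = rng.uniform(1.0, 40.0, 25)
X_tr = np.column_stack([x_tr, t_tr])
y_tr = partially_converged(x_tr, t_tr)

K = kernel(X_tr, X_tr) + 1e-6 * np.eye(len(X_tr))   # jitter for stability
alpha = np.linalg.solve(K, y_tr)

# Predict the fully converged response by evaluating at a large time budget.
x_te = np.linspace(0.1, 0.9, 5)
X_te = np.column_stack([x_te, np.full_like(x_te, 200.0)])
y_hat = kernel(X_te, X_tr) @ alpha

for xv, yv in zip(x_te, y_hat):
    print(f"x = {xv:.2f}: GP estimate of converged response {yv:+.3f} "
          f"(exact sin = {np.sin(2 * np.pi * xv):+.3f})")
```

Because the error component vanishes at large t, predicting at a large time budget amounts to extracting the converged part of the response from cheap, partially converged runs.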

Relevance: 30.00%

Abstract:

Tropical wetlands are estimated to represent about 50% of the natural wetland methane (CH4) emissions and explain a large fraction of the observed CH4 variability on timescales ranging from glacial–interglacial cycles to the currently observed year-to-year variability. Despite their importance, however, tropical wetlands are poorly represented in global models aiming to predict global CH4 emissions. This publication documents a first step in the development of a process-based model of CH4 emissions from tropical floodplains for global applications. For this purpose, the LPX-Bern Dynamic Global Vegetation Model (LPX hereafter) was slightly modified to represent floodplain hydrology, vegetation and associated CH4 emissions. The extent of tropical floodplains was prescribed using output from the spatially explicit hydrology model PCR-GLOBWB. We introduced new plant functional types (PFTs) that explicitly represent floodplain vegetation. The PFT parameterizations were evaluated against available remote-sensing data sets (GLC2000 land cover and MODIS Net Primary Productivity). Simulated CH4 flux densities were evaluated against field observations and regional flux inventories. Simulated CH4 emissions at Amazon Basin scale were compared to model simulations performed in the WETCHIMP intercomparison project. We found that LPX reproduces the average magnitude of observed net CH4 flux densities for the Amazon Basin. However, the model does not reproduce the variability between sites or between years within a site. Unfortunately, site information is too limited to confirm or disprove some model features. At the Amazon Basin scale, our results underline the large uncertainty in the magnitude of wetland CH4 emissions. Sensitivity analyses gave insights into the main drivers of floodplain CH4 emission and their associated uncertainties. In particular, uncertainties in floodplain extent (i.e., the difference between GLC2000 and PCR-GLOBWB output) modulate the simulated emissions by a factor of about 2. Our best estimates, using PCR-GLOBWB in combination with GLC2000, lead to simulated Amazon-integrated emissions of 44.4 ± 4.8 Tg yr−1. Additionally, the LPX emissions are highly sensitive to vegetation distribution. Two simulations with the same mean PFT cover, but different spatial distributions of grasslands within the basin, modulated emissions by about 20%. Correcting the LPX-simulated NPP using MODIS reduces the Amazon emissions by 11.3%. Finally, due to an intrinsic limitation of LPX in accounting for seasonality in floodplain extent, the model failed to reproduce the full dynamics in CH4 emissions, but we propose solutions to this issue. The interannual variability (IAV) of the emissions increases by 90% if the IAV in floodplain extent is accounted for, but still remains lower than in most of the WETCHIMP models. While our model includes more mechanisms specific to tropical floodplains, we were unable to reduce the uncertainty in the magnitude of wetland CH4 emissions of the Amazon Basin. Our results helped identify and prioritize directions towards more accurate estimates of tropical CH4 emissions, and they stress the need for more research to constrain floodplain CH4 emissions and their temporal variability, even before including other fundamental mechanisms such as floating macrophytes or lateral water fluxes.
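The sensitivity to floodplain extent follows directly from the bookkeeping of such an estimate: basin emissions are flux densities integrated over the flooded area of each grid cell, so rescaling the prescribed extent rescales the total by the same factor. The per-cell values in the sketch below are hypothetical, not LPX or PCR-GLOBWB output.

```python
# Sketch of the bookkeeping behind a basin-scale wetland CH4 estimate:
# emissions = flux density * flooded area, summed over grid cells, so
# uncertainty in floodplain extent propagates directly into the total.
# All cell values are hypothetical.

cells = [
    # (flux density in g CH4 m^-2 yr^-1, flooded area in km^2)
    (20.0, 150_000.0),
    (35.0, 300_000.0),
    (12.0,  90_000.0),
]

def basin_total_tg(cells, extent_scale=1.0):
    """Sum flux * area over cells; extent_scale rescales the flooded area."""
    grams = sum(flux * area * extent_scale * 1.0e6 for flux, area in cells)
    return grams / 1.0e12          # g -> Tg

print(f"baseline estimate: {basin_total_tg(cells):.1f} Tg CH4 yr^-1")
# Halving or doubling the prescribed floodplain extent scales the estimate by
# the same factor -- the kind of sensitivity reported in the abstract.
for scale in (0.5, 2.0):
    print(f"extent x {scale}: {basin_total_tg(cells, scale):.1f} Tg CH4 yr^-1")
```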

Relevance: 30.00%

Abstract:

Lesions of the ventromedial prefrontal cortex can result in deficient decision-making behavior. So far, most experimental results in neuropsychological decision-making research have been obtained with gambling tasks. Due to their high complexity, it is difficult to evaluate the processes underlying the decision-making deficits. The aim of this study was to assess whether patients with ventromedial prefrontal damage, compared with patients with dorsolateral prefrontal damage and healthy controls, show a deficit in an early stage of the decision-making process. Nine patients with ventromedial prefrontal damage, three with dorsolateral prefrontal damage, and eleven healthy controls were tested with a newly developed decision task in which they had to search actively for the information they needed for their decisions. Our results show that patients with ventromedial prefrontal damage, compared with the brain-lesioned dorsolateral prefrontal control group and healthy controls, searched less for information about risk-defusing operators or the consequences of their decisions, indicating impairment already in the early stage of the decision-making process.

Relevance: 30.00%

Abstract:

The main purpose of this study was to evaluate the effect that mechanical stresses acting under the slipping driving wheels of agricultural equipment have on the soil’s pore system and water flow processes (surface runoff generation during extreme events). The field experiment simulated low slip (1%) and high slip (27%) on a clay loam. The stress on the soil surface and the changes in the amounts of water flowing from macropores were simulated using the Tires/tracks And Soil Compaction (TASC) tool and the MACRO model, respectively. Taking a 65 kW tractor on a clay loam as a reference, the results showed that an increase in slip of the rear wheels from 1% to 27% caused the normal stress to increase from 90.6 kPa to 104.4 kPa at the topsoil level, and the maximum shear contact stress to rise drastically from 6.0 kPa to 61.6 kPa. At 27% slip, the topsoil was sheared and displaced over a distance of 0.35 m. Excessive normal and shear stress values at high slip caused severe reductions in the soil’s macroporosity, saturated hydraulic conductivity, and the water quantities flowing from topsoil macropores. Assuming that, under conditions of intense rainfall on sloping land, a loss in vertical water flow would mean an increase in surface runoff, we calculated that a rainfall intensity of 100 mm h^-1 and a rainfall duration of 1 h would increase the runoff coefficient to 0.79 at low slip and to 1.00 at high slip, indicating that 100% of the rainwater would be transformed into surface runoff at high slip. We expect that these effects have a significant impact on soil erosion and floods in steeper terrain (slope > 15°) and across larger surface areas (> 16 m^2) than those included in our study.
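The runoff coefficients quoted above follow from a simple water balance: whatever fraction of the rainfall cannot infiltrate becomes surface runoff. The sketch below uses the abstract's rainfall (100 mm h^-1 for 1 h) together with assumed remaining infiltration capacities chosen to reproduce the reported coefficients; it is an illustration of the calculation, not MACRO output.

```python
# Sketch of the runoff-coefficient calculation: the fraction of rainfall that
# cannot infiltrate becomes surface runoff.  Rainfall values are those given
# in the abstract; the remaining infiltration capacities of the trafficked
# topsoil are assumed for illustration.

rain_intensity = 100.0       # mm h^-1, as in the abstract
duration = 1.0               # h
rainfall = rain_intensity * duration          # total rainfall depth, mm

scenarios = {
    "low slip (1%)":  21.0,  # assumed remaining infiltration capacity, mm h^-1
    "high slip (27%)": 0.0,  # macropore flow essentially shut off
}

for name, infiltration_capacity in scenarios.items():
    infiltrated = min(infiltration_capacity * duration, rainfall)
    runoff = rainfall - infiltrated
    coefficient = runoff / rainfall
    print(f"{name}: runoff coefficient = {coefficient:.2f}")
```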

Relevance: 30.00%

Abstract:

Sound knowledge of the spatial and temporal patterns of rockfalls is fundamental for the management of this very common hazard in mountain environments. Process-based, three-dimensional simulation models are nowadays capable of reproducing the spatial distribution of rockfall occurrences with reasonable accuracy through the simulation of numerous individual trajectories on highly resolved digital terrain models. At the same time, however, simulation models typically fail to quantify the ‘real’ frequency of rockfalls (in terms of return intervals). The analysis of impact scars on trees, in contrast, yields real rockfall frequencies, but trees may not be present at the location of interest and rare trajectories may not necessarily be captured due to the limited age of forest stands. In this article, we demonstrate that the coupling of modeling with tree-ring techniques may overcome the limitations inherent to both approaches. Based on the analysis of 64 cells (40 m × 40 m) of a rockfall slope located above a 1631-m long road section in the Swiss Alps, we present results from 488 rockfalls detected in 1260 trees. We illustrate that tree impact data can not only be used (i) to reconstruct the real frequency of rockfalls for individual cells, but that they also serve (ii) the calibration of the rockfall model Rockyfor3D, as well as (iii) the transformation of simulated trajectories into real frequencies. Calibrated simulation results are in good agreement with real rockfall frequencies and exhibit significant differences in rockfall activity between the cells (zones) along the road section. Real frequencies, expressed as rock passages per meter of road section, also enable quantification and direct comparison of the hazard potential between the zones. The contribution provides an approach for hazard zoning procedures that complements traditional methods with a quantification of rockfall frequencies in terms of return intervals through a systematic inclusion of impact records in trees.
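The coupling can be sketched as a per-cell calibration: the tree-ring record gives an observed rockfall frequency, which is divided by the number of simulated passages to obtain a factor that converts trajectory counts into real frequencies, here also expressed per meter of road. The cell values and observation period below are hypothetical, not the study's data.

```python
# Sketch of turning simulated rockfall trajectories into real frequencies by
# calibrating against tree-ring impact data.  All values are hypothetical.

observation_period_yr = 40.0          # assumed age span covered by the trees

cells = [
    # reconstructed impacts, simulated passages, road length per cell (m)
    {"impacts": 12, "simulated": 4_000, "road_m": 40.0},
    {"impacts":  3, "simulated": 2_500, "road_m": 40.0},
    {"impacts": 25, "simulated": 9_000, "road_m": 40.0},
]

for i, cell in enumerate(cells, start=1):
    # observed frequency from the dendrogeomorphic record
    observed_per_yr = cell["impacts"] / observation_period_yr
    # calibration factor: real events represented by one simulated passage
    events_per_trajectory = observed_per_yr / cell["simulated"]
    # real frequency expressed per meter of road section and per year
    per_m_per_yr = observed_per_yr / cell["road_m"]
    print(f"cell {i}: {observed_per_yr:.2f} rockfalls/yr, "
          f"{per_m_per_yr:.4f} passages per m of road per yr, "
          f"calibration factor {events_per_trajectory:.2e}")
```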

Relevance: 30.00%

Abstract:

Colorectal cancer is the fourth most commonly diagnosed cancer in the United States. Every year, about 147,000 people are diagnosed with colorectal cancer and 56,000 people lose their lives to this disease. Most hereditary nonpolyposis colorectal cancers (HNPCC) and 12% of sporadic colorectal cancers show microsatellite instability. Colorectal cancer is a multistep progressive disease. It starts from a mutation in a normal colorectal cell and grows into a clone of cells that further accumulates mutations and finally develops into a malignant tumor. In terms of molecular evolution, the process of colorectal tumor progression represents the acquisition of sequential mutations. Clinical studies use biomarkers such as microsatellites or single nucleotide polymorphisms (SNPs) to study mutation frequencies in colorectal cancer. Microsatellite data obtained from single genome equivalent PCR or small pool PCR can be used to infer tumor progression. Since tumor progression is similar to population evolution, we used an approach known as the coalescent, which is well established in population genetics, to analyze this type of data. Coalescent theory can infer a sample's evolutionary path through the analysis of microsatellite data. The simulation results indicate that the constant population size pattern and the rapid tumor growth pattern have different genetic polymorphic patterns. The simulation results were compared with experimental data collected from HNPCC patients. The preliminary results show that the mutation rates in 6 HNPCC patients range from 0.001 to 0.01. The patients' polymorphic patterns are similar to the constant population size pattern, which implies that tumor progression proceeds through multilineage persistence instead of clonal sequential evolution. The results should be further verified using a larger dataset.
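The contrast between a constant-size pattern and a rapid-growth pattern can be sketched with a simplified forward-in-time stand-in for the coalescent simulations: microsatellite alleles evolve under a stepwise mutation model in a population of either constant or rapidly growing size, and the resulting polymorphism is summarized. The population sizes, generation count, and mutation rate (chosen within the 0.001-0.01 range quoted above) are illustrative, not the study's parameters.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simplified stand-in for the coalescent machinery: forward simulation of
# microsatellite repeat counts under a stepwise mutation model, comparing a
# constant-size population with a rapidly growing one.  All parameters are
# illustrative.

MU = 0.005            # per-division mutation rate, within the quoted range
GENERATIONS = 200

def evolve(pop_sizes):
    """Evolve microsatellite repeat counts through the given population sizes."""
    alleles = np.full(pop_sizes[0], 20)            # all cells start at 20 repeats
    for n_next in pop_sizes[1:]:
        # offspring drawn with replacement from the current generation (drift)
        alleles = rng.choice(alleles, size=n_next, replace=True)
        # stepwise mutations: +1 or -1 repeat with probability MU
        mutate = rng.random(n_next) < MU
        alleles[mutate] += rng.choice([-1, 1], size=mutate.sum())
    return alleles

constant = evolve([500] * GENERATIONS)
growing = evolve(np.geomspace(10, 5000, GENERATIONS).astype(int))

for name, alleles in [("constant size", constant), ("rapid growth", growing)]:
    print(f"{name}: {np.unique(alleles).size} distinct alleles, "
          f"repeat-number variance {alleles.var():.2f}")
```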

Relevance: 30.00%

Abstract:

During the past two decades in Thailand, non-governmental actors, such as NGOs, intellectuals, and people's organizations, have found widening opportunities to participate in policy formation and in the implementation of local development. The government has facilitated the formation of civil society forums, in the expectation of influencing local-level governance. The last two national five-year development plans were formulated after taking into account the voices of people in the provinces. Even though the amounts may seem small, some state funds are now channeled through non-governmental institutions for policy implementation at the grassroots level. These changes have their origin in a reform of rural development administration in the early 1980s. This reform in due course led to policies that have allowed the participation of non-governmental actors. Meanwhile, rural people have proved their ability to engage in participatory development by forming various local organizations, while NGOs have grown to be proficient facilitators of local development. This paper describes the process whereby three leading actors, namely the government, local people, and the NGOs, have interacted to bring about a more participatory system of local development administration.

Relevance: 30.00%

Abstract:

Under the process of transition toward a market economy, the economic connections of the Russian Far East (RFE) with external regions changed from a division of labor among the regions of the USSR (Russia) to an international division of labor. This happened due to factors including the liberalization of the trade system away from a state monopoly, the presence of rich natural resources and of developed industries related to these resources, the advantage of geographical proximity to Asia-Pacific countries, and the political and economic division of the once unified national economic space during the process of transition. The economic connections of the RFE with external economies changed radically under the transition toward the market economy. First, the value of foreign trade increased dramatically and the importance of foreign trade for the RFE economy increased enormously. Second, however, different territories of the RFE traveled along different trajectories, due to factors involving their industrial structure and geographical conditions. Third, in recent years connections with China, in the areas of both exports and imports, have grown. Fourth, the share of "fuel, mineral resources and metal" within exports increased radically from the end of the 1990s, and the share of "machine, facilities and transportation means" within imports increased from 2002. Under this situation, especially since 2002, there has been a major change in the structure of foreign trade.

Relevance: 30.00%

Abstract:

Historically, the authority to conclude international treaties was exclusively exercised by administrative bodies (or the chief of state). However, recent studies have pointed out that present-day legislative bodies have come to play a more active role through the ratification or review of treaties in European and American countries. Harrington (2005) studied judicial reform in British dominions and criticized the past executive-dominant treaty-making process as a “democratic deficit” due to a fear that under this system the nation might be bound by international agreements for which a consensus had not been obtained. These studies indicated that people's participation in the treaty-making process has increased on a global basis, but neither of them provides sufficient descriptive evidence regarding why and how such procedures were established. The present paper therefore attempts to answer these questions by analyzing the legislative and political process of the treaty-making procedure reform in Thailand’s 2007 constitution as a case study.

Relevance: 30.00%

Abstract:

Usability is the capability of the software product to be understood, learned, used, and attractive to the user when used under specified conditions. Many studies demonstrate the benefits of usability, yet to this day software products continue to exhibit consistently low levels of this quality attribute. Furthermore, poor usability in software systems contributes largely to software failing in actual use. One of the main disciplines involved in usability is Human-Computer Interaction (HCI). Over the past two decades the HCI community has proposed specific features that should be present in applications to improve their usability, yet incorporating them into software continues to be far from trivial for software developers. These difficulties are due to multiple factors, including the high level of abstraction at which these HCI recommendations are made and how far removed they are from actual software implementation. In order to bridge this gap, the Software Engineering community has long proposed software design solutions to help developers include usability features in software; however, the problem remains an open research question. This doctoral thesis addresses the problem of helping software developers include specific usability features in their applications by providing them with structured and tangible guidance in the form of a process, which we have termed the Usability-Oriented Software Development Process. This process is supported by a set of Software Usability Guidelines that help developers incorporate a set of eleven usability features with high impact on software design. After their development, the Usability-Oriented Software Development Process and the Software Usability Guidelines were validated across multiple academic projects and proven to help software developers include such usability features in their software applications. Their use significantly reduced development time and improved the quality of the resulting designs in these projects. Furthermore, in this work we propose a software tool to automate the application of the proposed process. In sum, this work contributes to the integration of the Software Engineering and HCI disciplines by providing a framework that helps software developers create usable applications in an efficient way.
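To illustrate why such features have high impact on software design rather than being cosmetic UI additions, consider undo, used here as an assumed example since the abstract does not list the eleven features: supporting it forces every state-changing operation to be modeled as a reversible command routed through a history, which is an architectural decision. A minimal sketch:

```python
# Illustrative sketch (assumed example): supporting "undo" shapes the design,
# because every state-changing operation must be a reversible command that is
# recorded in a history -- it cannot be bolted onto the UI afterwards.

from dataclasses import dataclass, field
from typing import List

@dataclass
class AddItem:
    items: List[str]
    value: str
    def execute(self) -> None:
        self.items.append(self.value)
    def undo(self) -> None:
        self.items.remove(self.value)

@dataclass
class Editor:
    items: List[str] = field(default_factory=list)
    _history: list = field(default_factory=list)

    def do(self, command) -> None:
        command.execute()
        self._history.append(command)   # every change is recorded

    def undo_last(self) -> None:
        if self._history:
            self._history.pop().undo()

editor = Editor()
editor.do(AddItem(editor.items, "draft section"))
editor.do(AddItem(editor.items, "figure 3"))
editor.undo_last()
print(editor.items)   # ['draft section']
```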