953 results for Nutritional state assessment
Abstract:
The National Curricular Guidelines for nutrition courses call for training directed toward work in the Unified Health System (SUS) and make the completion-of-course work mandatory. The aim of this study was to analyze the training of dieticians in Rio Grande do Norte state, Brazil, for work in the SUS, based on the compulsory scientific production required during their education. This is a bibliometric study with a quantitative approach, performed on the completion-of-course works of five undergraduate nutrition courses in Rio Grande do Norte state in the years 2013/2013. From the reading of the works, the following variables were collected: nature of the institution, administrative category of the institution, location, title of the work, number of authors, format of the work, academic degree of the advisor, type of study, area of interest, setting in which the study was carried out, submission to the research ethics committee, adequacy of the descriptors and, for works belonging to the field of public health, the subfield of public health and the theme. The pedagogical projects of the courses were read and the opportunities to develop research during the undergraduate programme were identified. The Pearson chi-square test was applied to detect significant categories. We analyzed 195 works, most of them from university courses (79.0%) and private institutions (56.4%). A higher frequency of articles (68.2%), developed by a single student (65.6%), supervised by advisors holding a master's degree (57.9%), with a cross-sectional study design (49.2%), conducted in a laboratory (25%) and without submission to the research ethics committee (49.2%) was found. The median adequacy of the descriptors was 50%. As for the area of interest, there was a higher frequency of works in the field of public health (p < 0.001); within this field, the subfield of nutritional epidemiology (63.0%) (p < 0.001) and the theme of nutritional assessment (57.4%) (p < 0.001) stood out. When the works were grouped into the three major areas of dietician practice, carrying out work in the field of public health was significantly associated with public institutions (p < 0.05). The presence of complementary activities was unanimous in the pedagogical projects of the courses. The results of the study showed some methodological weaknesses in the research approaches, as well as a hegemonically positivist training. Despite the emphasis on public health, little engagement with nutrition policies and programs was noticed in the context of the mandatory research of the Rio Grande do Norte state nutrition courses.
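As a rough illustration of the Pearson chi-square test of association applied in the study above, the sketch below runs such a test in Python with SciPy; the contingency counts are purely hypothetical and only demonstrate the procedure, not the study's data.

```python
# Hypothetical Pearson chi-square test of association between institution
# category (public vs. private) and field of the completion-of-course work
# (public health vs. other fields). Counts are illustrative placeholders.
from scipy.stats import chi2_contingency

observed = [
    [45, 30],   # public institutions:  [public health, other fields]
    [60, 60],   # private institutions: [public health, other fields]
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.3f}, dof = {dof}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Association is significant at alpha = 0.05.")
```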
Abstract:
This thesis investigated the risk of accidental release of hydrocarbons during transportation and storage. Transportation of hydrocarbons from an offshore platform to processing units through subsea pipelines involves a risk of release due to pipeline leakage resulting from corrosion, plastic deformation caused by seabed shakedown, or damage from contact with a drifting iceberg. The environmental impacts of hydrocarbon dispersion can be severe. The overall safety and economic consequences of pipeline leakage in a subsea environment are immense. A large leak can be detected with conventional technology such as radar, intelligent pigging or chemical tracers, but in a remote location such as the subsea or the Arctic, a small chronic leak may go undetected for a long period of time. In the case of storage, an accidental release of hydrocarbon from a storage tank could lead to a pool fire, which could further escalate into domino effects. This chain of accidents may lead to extremely severe consequences. Analysis of past accident scenarios shows that more than half of industrial domino accidents involved fire as the primary event, along with other contributing factors such as wind speed and direction, fuel type and engulfment of the compound. In this thesis, a computational fluid dynamics (CFD) approach is taken to model the subsea pipeline leak and the pool fire from a storage tank. The commercial software package ANSYS FLUENT Workbench 15 is used to model the subsea pipeline leakage. The CFD simulation results for four different fluids showed that the static pressure and pressure gradient along the axial length of the pipeline have a sharp signature variation near the leak orifice under steady-state conditions. A transient simulation is performed to obtain the acoustic signature of the pipe near the leak orifice. The power spectral density (PSD) of the acoustic signal is strong near the leak orifice and dissipates as the distance and orientation from the leak orifice increase. High-pressure fluid flow generates more noise than low-pressure fluid flow. To model the pool fire from the storage tank, ANSYS CFX Workbench 14 is used. The CFD results show that wind speed has a significant effect on the behavior of the pool fire and its domino effects. Radiation contours are also obtained from CFD post-processing and can be applied to risk analysis. The outcome of this study should contribute to a better understanding of the domino effects of pool fires in the complex geometrical settings of process industries. Approaches to reducing and preventing these risks are discussed based on the results obtained from the numerical simulations.
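A minimal sketch of how an acoustic power spectral density of the kind discussed above can be estimated from a pressure time series using Welch's method; the synthetic signal and sampling rate below are assumptions for demonstration, whereas in the thesis the signal comes from transient CFD monitoring points near the leak orifice.

```python
# Illustrative PSD estimate for a leak-noise-like time series (Welch's method).
import numpy as np
from scipy.signal import welch

fs = 10_000.0                       # sampling frequency [Hz] (assumed)
t = np.arange(0, 1.0, 1.0 / fs)     # 1 s record
rng = np.random.default_rng(0)
# Broadband noise plus a tone, standing in for leak-generated acoustic noise.
pressure = 0.5 * np.sin(2 * np.pi * 800 * t) + rng.normal(scale=0.2, size=t.size)

freqs, psd = welch(pressure, fs=fs, nperseg=1024)
print(f"Peak PSD of {psd.max():.3e} Pa^2/Hz near {freqs[np.argmax(psd)]:.0f} Hz")
```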
Abstract:
Acknowledgements: We acknowledge the IAN-AF team (in particular Duarte Torres, Milton Severo and Andreia Oliveira) for the community sampling and their support on dietary assessment methodology and critical discussion during the elaboration of the present protocol. Funding: This project (136SI5) was granted by the Public Health Initiatives Programme (PT06), financed by the EEA Grants Financial Mechanism 2009-2014.
Abstract:
Date of Acceptance: 08/05/2014. Acknowledgements: The authors are indebted to Julia Römer for assisting with editing several hundred references. Helmut Haberl gratefully acknowledges funding by the Austrian Academy of Sciences (Global Change Programme), the Austrian Ministry of Science and Research (BMWF, proVision programme), as well as by the EU-FP7 project VOLANTE. Carmenza Robledo-Abad received financial support from the Swiss State Secretariat for Economic Affairs.
Abstract:
Laser trackers have been widely used in many industries to meet increasingly high accuracy requirements. In laser tracker measurement, it is complex and difficult to perform an accurate error analysis and uncertainty evaluation. This paper first reviews the working principle of single-beam laser trackers and the state of the art of key technologies from both industrial and academic efforts, followed by a comprehensive analysis of uncertainty sources. A generic laser tracker modelling method is formulated and the framework of a virtual laser tracking system (VLS) is proposed. The VLS can be used for measurement planning, measurement accuracy optimization and uncertainty evaluation. The completed virtual laser tracking system should take all the uncertainty sources affecting coordinate measurement into consideration and establish an uncertainty model which will behave in an identical way to the real system. © Springer-Verlag Berlin Heidelberg 2010.
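As a rough illustration of the kind of uncertainty propagation a virtual laser tracker performs, the sketch below pushes assumed range and angle uncertainties of a single-beam tracker through the spherical-to-Cartesian conversion by Monte Carlo sampling; the nominal measurement and the standard deviations are placeholders, not values from the paper.

```python
# Monte Carlo propagation of single-beam laser tracker uncertainties
# (range d, azimuth theta, elevation phi) into Cartesian coordinates.
# All measurement values and standard deviations below are assumptions.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

d, theta, phi = 5.0, np.deg2rad(30.0), np.deg2rad(10.0)   # nominal measurement
sigma_d = 10e-6                                            # assumed range error [m]
sigma_ang = np.deg2rad(0.5 / 3600.0)                       # assumed angle error [rad]

d_s = rng.normal(d, sigma_d, n)
th_s = rng.normal(theta, sigma_ang, n)
ph_s = rng.normal(phi, sigma_ang, n)

# Spherical-to-Cartesian conversion used by a single-beam tracker.
x = d_s * np.cos(ph_s) * np.cos(th_s)
y = d_s * np.cos(ph_s) * np.sin(th_s)
z = d_s * np.sin(ph_s)

pts = np.column_stack([x, y, z])
print("1-sigma coordinate uncertainty [m]:", pts.std(axis=0))
```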
Abstract:
X-ray computed tomography (CT) imaging constitutes one of the most widely used diagnostic tools in radiology today, with nearly 85 million CT examinations performed in the U.S. in 2011. CT imparts a relatively high radiation dose to the patient compared to other x-ray imaging modalities; as a result of this, coupled with its popularity, CT is currently the single largest source of medical radiation exposure to the U.S. population. For this reason, there is a critical need to optimize CT examinations such that the dose is minimized while the quality of the CT images is not degraded. This optimization can be difficult to achieve due to the relationship between dose and image quality. All else held equal, reducing the dose degrades image quality and can impact the diagnostic value of the CT examination.
A recent push from the medical and scientific community towards using lower doses has spawned new dose reduction technologies such as automatic exposure control (i.e., tube current modulation) and iterative reconstruction algorithms. In theory, these technologies could allow for scanning at reduced doses while maintaining the image quality of the exam at an acceptable level. Therefore, there is a scientific need to establish the dose reduction potential of these new technologies in an objective and rigorous manner. Establishing these dose reduction potentials requires precise and clinically relevant metrics of CT image quality, as well as practical and efficient methodologies to measure such metrics on real CT systems. The currently established methodologies for assessing CT image quality are not appropriate to assess modern CT scanners that have implemented those aforementioned dose reduction technologies.
Thus, the purpose of this doctoral project was to develop, assess, and implement new phantoms, image quality metrics, analysis techniques, and modeling tools that are appropriate for image quality assessment of modern clinical CT systems. The project developed image quality assessment methods in the context of three distinct paradigms: (a) uniform phantoms, (b) textured phantoms, and (c) clinical images.
The work in this dissertation used the “task-based” definition of image quality. That is, image quality was broadly defined as the effectiveness by which an image can be used for its intended task. Under this definition, any assessment of image quality requires three components: (1) a well-defined imaging task (e.g., detection of subtle lesions), (2) an “observer” to perform the task (e.g., a radiologist or a detection algorithm), and (3) a way to measure the observer’s performance in completing the task at hand (e.g., detection sensitivity/specificity).
First, this task-based image quality paradigm was implemented using a novel multi-sized phantom platform (with uniform background) developed specifically to assess modern CT systems (Mercury Phantom, v3.0, Duke University). A comprehensive evaluation was performed on a state-of-the-art CT system (SOMATOM Definition Force, Siemens Healthcare) in terms of noise, resolution, and detectability as a function of patient size, dose, tube energy (i.e., kVp), automatic exposure control, and reconstruction algorithm (i.e., Filtered Back-Projection (FBP) vs. Advanced Modeled Iterative Reconstruction (ADMIRE)). A mathematical observer model (i.e., computer detection algorithm) was implemented and used as the basis of image quality comparisons. It was found that image quality increased with increasing dose and decreasing phantom size. The CT system exhibited nonlinear noise and resolution properties, especially at very low doses, large phantom sizes, and for low-contrast objects. Objective image quality metrics generally increased with increasing dose and ADMIRE strength, and with decreasing phantom size. The ADMIRE algorithm could offer comparable image quality at reduced doses or improved image quality at the same dose (increase in detectability index by up to 163% depending on iterative strength). The use of automatic exposure control resulted in more consistent image quality with changing phantom size.
Based on those results, the dose reduction potential of ADMIRE was further assessed specifically for the task of detecting small (<=6 mm) low-contrast (<=20 HU) lesions. A new low-contrast detectability phantom (with uniform background) was designed and fabricated using a multi-material 3D printer. The phantom was imaged at multiple dose levels and images were reconstructed with FBP and ADMIRE. Human perception experiments were performed to measure the detection accuracy from FBP and ADMIRE images. It was found that ADMIRE had equivalent performance to FBP at 56% less dose.
Using the same image data as the previous study, a number of different mathematical observer models were implemented to assess which models would result in image quality metrics that best correlated with human detection performance. The models included naïve simple metrics of image quality such as contrast-to-noise ratio (CNR) and more sophisticated observer models such as the non-prewhitening matched filter observer model family and the channelized Hotelling observer model family. It was found that non-prewhitening matched filter observers and the channelized Hotelling observers both correlated strongly with human performance. Conversely, CNR was found to not correlate strongly with human performance, especially when comparing different reconstruction algorithms.
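A minimal sketch, under simplified white-noise assumptions, of the two ends of the observer-model spectrum mentioned above: a naive CNR metric and a non-prewhitening (NPW) matched-filter detectability index computed from repeated signal-present and signal-absent images. All image data here are simulated placeholders, not the study's data.

```python
# Simplified comparison of CNR and a non-prewhitening matched-filter observer
# on simulated signal-present / signal-absent ROIs (white noise assumed).
import numpy as np

rng = np.random.default_rng(1)
n, trials = 64, 500
yy, xx = np.mgrid[:n, :n]
signal = 10.0 * (((xx - n / 2) ** 2 + (yy - n / 2) ** 2) < 8 ** 2)  # 10 HU disc
sigma = 20.0                                                         # noise SD [HU]

present = signal + rng.normal(0, sigma, (trials, n, n))
absent = rng.normal(0, sigma, (trials, n, n))

# Naive contrast-to-noise ratio.
cnr = signal.max() / sigma

# NPW observer: template = expected signal; decision variable = template . image.
template = signal.ravel()
t_p = present.reshape(trials, -1) @ template
t_a = absent.reshape(trials, -1) @ template
d_prime = (t_p.mean() - t_a.mean()) / np.sqrt(0.5 * (t_p.var() + t_a.var()))

print(f"CNR = {cnr:.2f}, NPW detectability index d' = {d_prime:.2f}")
```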
The uniform background phantoms used in the previous studies provided a good first-order approximation of image quality. However, due to their simplicity and due to the complexity of iterative reconstruction algorithms, it is possible that such phantoms are not fully adequate to assess the clinical impact of iterative algorithms, because patient images obviously do not have smooth uniform backgrounds. To test this hypothesis, two textured phantoms (classified as gross texture and fine texture) and a uniform phantom of similar size were built and imaged on a SOMATOM Flash scanner (Siemens Healthcare). Images were reconstructed using FBP and Sinogram Affirmed Iterative Reconstruction (SAFIRE). Using an image subtraction technique, quantum noise was measured in all images of each phantom. It was found that in FBP, the noise was independent of the background (textured vs. uniform). However, for SAFIRE, noise increased by up to 44% in the textured phantoms compared to the uniform phantom. As a result, the noise reduction from SAFIRE was found to be up to 66% in the uniform phantom but as low as 29% in the textured phantoms. Based on this result, it was clear that further investigation was needed to understand the impact that background texture has on image quality when iterative reconstruction algorithms are used.
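A minimal sketch of the image-subtraction approach to isolating quantum noise: two repeated scans of the same phantom share the deterministic background texture, so their difference contains only noise, and dividing by sqrt(2) restores the single-image noise level. The two "scans" below are simulated stand-ins.

```python
# Quantum noise via image subtraction of two repeated acquisitions.
import numpy as np

rng = np.random.default_rng(7)
background = rng.uniform(0, 30, (256, 256))             # stand-in for phantom texture
scan_a = background + rng.normal(0, 12.0, (256, 256))   # repeat 1 (12 HU noise)
scan_b = background + rng.normal(0, 12.0, (256, 256))   # repeat 2

noise_image = (scan_a - scan_b) / np.sqrt(2.0)          # rescale: difference doubles variance
print(f"Estimated quantum noise: {noise_image.std():.1f} HU (true: 12.0 HU)")
```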
To further investigate this phenomenon with more realistic textures, two anthropomorphic textured phantoms were designed to mimic lung vasculature and fatty soft tissue texture. The phantoms (along with a corresponding uniform phantom) were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Scans were repeated a total of 50 times in order to obtain ensemble statistics of the noise. A novel method of estimating the noise power spectrum (NPS) from irregularly shaped ROIs was developed. It was found that SAFIRE images had highly locally non-stationary noise patterns, with pixels near edges having higher noise than pixels in more uniform regions. Compared to FBP, SAFIRE images had 60% less noise on average in uniform regions; for edge pixels, noise was between 20% higher and 40% lower. The noise texture (i.e., NPS) was also highly dependent on the background texture for SAFIRE. Therefore, it was concluded that quantum noise properties in uniform phantoms are not representative of those in patients for iterative reconstruction algorithms, and that texture should be considered when assessing the image quality of iterative algorithms.
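For reference, a common form of the ensemble NPS estimate that the ROI-based method above generalizes, written for N_roi rectangular ROIs of size N_x by N_y with pixel spacing Δx, Δy (the irregular-ROI estimator in the dissertation modifies this for non-rectangular regions):

$$
\mathrm{NPS}(f_x, f_y) \;=\; \frac{\Delta x\,\Delta y}{N_x N_y}\;\frac{1}{N_{\mathrm{roi}}} \sum_{i=1}^{N_{\mathrm{roi}}} \Bigl|\, \mathrm{DFT}_{2D}\!\left[ I_i(x,y) - \bar{I}_i \right] \Bigr|^2
$$

Here I_i is the i-th noise ROI (e.g., from subtracted repeat scans) and I-bar_i its mean (or a low-order fit) removed before the 2D discrete Fourier transform.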
To move beyond assessing noise properties in textured phantoms toward assessing detectability, a series of new phantoms was designed specifically to measure low-contrast detectability in the presence of background texture. The textures used were optimized to match the texture in the liver regions of actual patient CT images using a genetic algorithm. The so-called “Clustered Lumpy Background” texture synthesis framework was used to generate the modeled texture. Three textured phantoms and a corresponding uniform phantom were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Images were reconstructed with FBP and SAFIRE and analyzed using a multi-slice channelized Hotelling observer to measure detectability and the dose reduction potential of SAFIRE based on the uniform and textured phantoms. It was found that at the same dose, the improvement in detectability from SAFIRE (compared to FBP) was higher when measured in a uniform phantom compared to textured phantoms.
The final trajectory of this project aimed at developing methods to mathematically model lesions, as a means to help assess image quality directly from patient images. The mathematical modeling framework is first presented. The models describe a lesion’s morphology in terms of size, shape, contrast, and edge profile as an analytical equation. The models can be voxelized and inserted into patient images to create so-called “hybrid” images. These hybrid images can then be used to assess detectability or estimability with the advantage that the ground truth of the lesion morphology and location is known exactly. Based on this framework, a series of liver lesions, lung nodules, and kidney stones were modeled based on images of real lesions. The lesion models were virtually inserted into patient images to create a database of hybrid images to go along with the original database of real lesion images. ROI images from each database were assessed by radiologists in a blinded fashion to determine the realism of the hybrid images. It was found that the radiologists could not readily distinguish between real and virtual lesion images (area under the ROC curve was 0.55). This study provided evidence that the proposed mathematical lesion modeling framework could produce reasonably realistic lesion images.
Based on that result, two studies were conducted that demonstrated the utility of the lesion models. The first study used the modeling framework as a measurement tool to determine how dose and reconstruction algorithm affected the quantitative analysis of liver lesions, lung nodules, and renal stones in terms of their size, shape, attenuation, edge profile, and texture features. The same database of real lesion images used in the previous study was used for this study. That database contained images of the same patients at two dose levels (50% and 100%) along with three reconstruction algorithms from a GE 750HD CT system (GE Healthcare). The algorithms in question were FBP, Adaptive Statistical Iterative Reconstruction (ASiR), and Model-Based Iterative Reconstruction (MBIR). A total of 23 quantitative features were extracted from the lesions under each condition. It was found that both dose and reconstruction algorithm had a statistically significant effect on the feature measurements. In particular, radiation dose affected five, three, and four of the 23 features (related to lesion size, conspicuity, and pixel-value distribution) for liver lesions, lung nodules, and renal stones, respectively. MBIR significantly affected 9, 11, and 15 of the 23 features (including size, attenuation, and texture features) for liver lesions, lung nodules, and renal stones, respectively. Lesion texture was not significantly affected by radiation dose.
The second study demonstrating the utility of the lesion modeling framework focused on assessing detectability of very low-contrast liver lesions in abdominal imaging. Specifically, detectability was assessed as a function of dose and reconstruction algorithm. As part of a parallel clinical trial, images from 21 patients were collected at 6 dose levels per patient on a SOMATOM Flash scanner. Subtle liver lesion models (contrast = -15 HU) were inserted into the raw projection data from the patient scans. The projections were then reconstructed with FBP and SAFIRE (strength 5). Lesion-free images were also reconstructed. Noise, contrast, CNR, and the detectability index of an observer model (non-prewhitening matched filter) were assessed. It was found that, compared to FBP, SAFIRE reduced noise by 52%, reduced contrast by 12%, increased CNR by 87%, and increased the detectability index by 65%. Further, a 2AFC human perception experiment was performed to assess the dose reduction potential of SAFIRE, which was found to be 22% compared to the standard-of-care dose.
In conclusion, this dissertation provides to the scientific community a series of new methodologies, phantoms, analysis techniques, and modeling tools that can be used to rigorously assess image quality from modern CT systems. Specifically, methods to properly evaluate iterative reconstruction have been developed and are expected to aid in the safe clinical implementation of dose reduction technologies.
Abstract:
Computed tomography (CT) is a valuable technology to the healthcare enterprise as evidenced by the more than 70 million CT exams performed every year. As a result, CT has become the largest contributor to population doses amongst all medical imaging modalities that utilize man-made ionizing radiation. Acknowledging the fact that ionizing radiation poses a health risk, there exists the need to strike a balance between diagnostic benefit and radiation dose. Thus, to ensure that CT scanners are optimally used in the clinic, an understanding and characterization of image quality and radiation dose are essential.
The state of the art in both image quality characterization and radiation dose estimation in CT depends on phantom-based measurements reflective of systems and protocols. For image quality characterization, measurements are performed on inserts embedded in static phantoms and the results are ascribed to clinical CT images. However, the key objective for image quality assessment should be its quantification in clinical images; that is the only characterization of image quality that clinically matters, as it is most directly related to the actual quality of clinical images. Moreover, for dose estimation, phantom-based dose metrics, such as the CT dose index (CTDI) and size-specific dose estimates (SSDE), are measured by the scanner and referenced as indicators of radiation exposure. However, CTDI and SSDE are surrogates for dose, rather than dose per se.
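For context, the relationship between the phantom-based metrics named above is commonly written as below, where the size-dependent conversion factor f is tabulated (e.g., in AAPM Report 204) against an effective patient diameter; the exact coefficients are not reproduced here.

$$
\mathrm{SSDE} \;=\; f\!\left(D_{\mathrm{eff}}\right)\cdot \mathrm{CTDI_{vol}}, \qquad D_{\mathrm{eff}} \;=\; \sqrt{D_{\mathrm{AP}}\cdot D_{\mathrm{LAT}}}
$$

Here D_AP and D_LAT are the anteroposterior and lateral patient dimensions, which illustrates the sense in which SSDE remains a quantity derived from phantom-based CTDI_vol rather than a direct measurement of patient dose.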
Currently there are several software packages that track the CTDI and SSDE associated with individual CT examinations. This is primarily the result of two causes. The first is due to bureaucracies and governments pressuring clinics and hospitals to monitor the radiation exposure to individuals in our society. The second is due to the personal concerns of patients who are curious about the health risks associated with the ionizing radiation exposure they receive as a result of their diagnostic procedures.
An idea that resonates with clinical imaging physicists is that patients come to the clinic to acquire quality images so they can receive a proper diagnosis, not to be exposed to ionizing radiation. Thus, while it is important to monitor the dose to patients undergoing CT examinations, it is equally, if not more important to monitor the image quality of the clinical images generated by the CT scanners throughout the hospital.
The purposes of the work presented in this thesis are threefold: (1) to develop and validate a fully automated technique to measure spatial resolution in clinical CT images, (2) to develop and validate a fully automated technique to measure image contrast in clinical CT images, and (3) to develop a fully automated technique to estimate radiation dose (not surrogates for dose) from a variety of clinical CT protocols.
Abstract:
This thesis focuses on tectonic geomorphology and the response of the Ken River catchment to postulated tectonic forcing along a NE-striking monocline fold in the Panna region, Madhya Pradesh, India. Peninsular India is underlain by three northeast-trending paleotopographic ridges of Precambrian Indian basement, bounded by crustal-scale faults. Of particular interest is the Pokhara lineament, a crustal scale fault that defines the eastern edge of the Faizabad ridge, a paleotopographic high cored by the Archean Bundelkhand craton. The Pokhara lineament coincides with the monocline structure developed in the Proterozoic Vindhyan Supergroup rocks along the Bundelkhand cratonic margin. A peculiar, deeply incised meander-like feature, preserved along the Ken River where it flows through the monocline, may be intimately related to the tectonic regime of this system. This thesis examines 41 longitudinal stream profiles across the length of the monocline structure to identify any tectonic signals generated from recent surface uplift above the Pokhara lineament. It also investigates the evolution of the Ken River catchment in response to the generation of the monocline fold. Digital Elevation Models (DEM) from Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) were used to delineate a series of tributary watersheds and extract individual stream profiles which were imported into MATLAB for analysis. Regression limits were chosen to define distinct channel segments, and knickpoints were defined at breaks between channel segments where there was a discrete change in the steepness of the channel profile. The longitudinal channel profiles exhibit the characteristics of a fluvial system in transient state. There is a significant downstream increase in normalized steepness index in the channel profiles, as well as a general increase in concavity downstream, with some channels exhibiting convex, over-steepened segments. Normalized steepness indices and uppermost knickpoint elevations are on average much higher in streams along the southwest segment of the monocline compared to streams along the northeast segment. Most channel profiles have two to three knickpoints, predominantly exhibiting slope-break morphology. These data have important implications for recent surface uplift above the Pokhara lineament. Furthermore, geomorphic features preserved along the Ken River suggest that it is an antecedent river. The incised meander-like feature appears to be the abandoned river valley of a former Ken River course that was captured during the evolution of the landscape by what is the present day Ken River.
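For reference, the normalized steepness index analyzed above is conventionally defined through the slope-area scaling of channel profiles, using a fixed reference concavity so that steepness values are comparable between streams:

$$
S \;=\; k_s\,A^{-\theta}, \qquad k_{sn} \;=\; S\,A^{\theta_{\mathrm{ref}}}
$$

Here S is the local channel slope, A the upstream drainage area, θ the concavity index of the regressed segment, θ_ref a fixed reference concavity, and k_sn the normalized steepness index; knickpoints correspond to discrete breaks between segments with distinct k_sn.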
Abstract:
This thesis explores the effects of rehabilitation on the structural performance of corrugated steel culverts. A full-scale laboratory experiment investigated the effects of grouted slip-liners on the performance of two buried circular corrugated steel culverts. One culvert was slip-lined and grouted using low strength grout, while the other was slip-lined and grouted using high strength grout. The performances of the culverts were measured before and after rehabilitation under service loads using single wheel pair loading at 0.45 m of cover. Then, the rehabilitated culverts were loaded to their ultimate limit states. Results showed that the low and high strength grouted slip-liners provided strength well beyond requirements, with the low strength specimen failing at a load 2.4 times the fully factored service load, while the high strength specimen did not reach an ultimate limit state before bearing failure of the soil stopped testing. Results also showed that the low strength specimen behaved rigidly under service loads and flexibly under higher loads, while the high strength specimen behaved rigidly under all loads. A second full-scale experiment investigated the effect of a paved invert rehabilitation procedure on the performance of a deteriorated horizontal ellipse culvert. The performance of the culvert before and after rehabilitation was examined under service loads using tandem axle loading at 0.45 m of cover. The rehabilitated culvert was then loaded up to its ultimate limit state. The culvert failed due to the formation of a plastic hinge at the west shoulder, while the paved invert cracked at the invert. Results showed that the rehabilitation increased the structural performance of the culvert, increasing the system stiffness and reducing average strains and local bending at critical locations in the culvert under service loads. A sustainability rating tool specifically for the evaluation of deteriorated culvert replacement or rehabilitation projects was also developed. A module for an existing tool, called GoldSET, was created and tested using two case studies, each comparing the replacement of a culvert using a traditional open-cut method with two trenchless rehabilitation techniques. In each case, the analyses showed that the trenchless techniques were the better alternatives in terms of sustainability.
Abstract:
Currently, there is increasing use of nanomaterials in the food industry thanks to the many advantages they offer, which make the products that contain them more competitive in the market. Their physicochemical properties often differ from those of the bulk materials, which requires a specialized risk assessment. This should cover the risks to the health of workers and consumers as well as possible environmental risks. Risk assessment methods must be kept up to date as the use of nanomaterials becomes more widespread, especially now that they are making their way into consumer products. Today there is no specific legislation for nanomaterials, but there are several European provisions and regulations that include them. This review gives an overview of the risk assessment and the current legislation regarding the use of nanotechnology in the food industry.
Abstract:
Steady-state computational fluid dynamics (CFD) simulations are an essential tool in the design process of centrifugal compressors. Whilst global parameters, such as pressure ratio and efficiency, can be predicted with reasonable accuracy, the accurate prediction of detailed compressor flow fields is a much more significant challenge. Much of the inaccuracy is associated with the incorrect selection of turbulence model. The need for a quick turnaround in simulations during the design optimisation process also demands that the turbulence model selected be robust and numerically stable, with short simulation times.
In order to assess the accuracy of a number of turbulence model predictions, the current study used an exemplar open CFD test case, the centrifugal compressor ‘Radiver’, to compare the results of three eddy viscosity models and two Reynolds stress type models. The turbulence models investigated in this study were (i) the Spalart-Allmaras (SA) model, (ii) the Shear Stress Transport (SST) model, (iii) a modification to the SST model denoted the SST curvature correction (SST-CC), (iv) the Reynolds stress model of Speziale, Sarkar and Gatski (RSM-SSG), and (v) the turbulence-frequency-formulated Reynolds stress model (RSM-ω). Each was found to be in good agreement with the experiments (below 2% discrepancy) with respect to total-to-total parameters at three different operating conditions. However, for the off-design conditions, local flow field differences were observed between the models, with the SA model showing particularly poor prediction of local flow structures. The SST-CC model showed better prediction of curved rotating flows in the impeller, while the RSM-ω model was better for the wake and separated flow in the diffuser. The SST model showed reasonably stable, robust and time-efficient capability to predict global and local flow features.
Abstract:
The Highway Safety Manual (HSM) is the compilation of national safety research that provides quantitative methods for analyzing highway safety. The HSM presents crash modification functions related to freeway work zone characteristics such as work zone duration and length. These crash modification functions were based on freeway work zones with high traffic volumes in California. When the HSM-referenced model was calibrated for Missouri, the calibration factor was 3.78, which is far from ideal since it is significantly larger than 1. Therefore, new models were developed in this study using Missouri data to capture geographical, driver behavior, and other factors specific to the Midwest. In addition, new models were developed for expressway and rural two-lane work zones, which have rarely been studied in the literature. A large sample of 20,837 freeway, 8,993 expressway, and 64,476 rural two-lane work zones in Missouri was analyzed to derive 15 work zone crash prediction models. The most appropriate samples of 1,546 freeway, 1,189 expressway, and 6,095 rural two-lane work zones longer than 0.1 mile and with a duration greater than 10 days were used to build eight, four, and three models, respectively. A challenging question for practitioners is always how to use crash prediction models to make the best estimate of the work zone crash count. To address this problem, a user-friendly software tool was developed in a spreadsheet format to predict work zone crashes based on work zone characteristics. This software selects the best model, estimates the work zone crashes by severity, and converts them to monetary values using standard crash estimates. This study also included a survey of departments of transportation (DOTs), Federal Highway Administration (FHWA) representatives, and contractors to assess the current state of the practice regarding work zone safety. The survey results indicate that many agencies assess work zone safety informally using engineering judgment. Respondents indicated that they would like a tool that could help them balance work zone safety across projects by looking at crashes and user costs.
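A minimal sketch of the HSM-style calibration mentioned above: the calibration factor is the ratio of total observed to total predicted crashes over a local sample of sites, so a value such as 3.78 indicates the base model substantially under-predicts local crash frequency. The counts in the example are hypothetical.

```python
# HSM-style calibration factor: ratio of observed to predicted crashes over
# a sample of local sites. Values below are hypothetical, for illustration only.
observed_crashes = [3, 0, 5, 2, 7, 1]                 # observed counts at sample work zones
predicted_crashes = [0.9, 0.4, 1.2, 0.6, 1.3, 0.4]    # base-model predictions

calibration_factor = sum(observed_crashes) / sum(predicted_crashes)
print(f"Calibration factor C = {calibration_factor:.2f}")
# A C far above 1 means the base model under-predicts local crash frequency
# and motivates fitting region-specific models, as done in this study.
```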
Abstract:
The forces surrounding the emerging economies of the underdeveloped world, especially Africa, have practically stifled economic progress, growth, development and sustainability. This economic condition brings to the fore the massive onslaught of rural and urban poverty with which the African continent has grappled from the post-World War II era to date. The economic misfortunes and incidence of mass poverty in Africa, and in Nigeria in particular, are used as the point of departure in this study. The paper underscores the ideological and philosophical undertones of international capital, manifesting in the form of colonialism and imperialism, as a major factor in the historical process of underdevelopment and mass poverty in the peripheral states of Africa, Asia and Latin America. Of particular interest in this study are the activities of the domestic bourgeois elite class, who have displayed a marked lack of the much-needed vision and an abject lack of desire to draw up workable plans to redeem the battered image of African and Nigerian economic fortunes. This state of affairs has practically engendered economic underdevelopment, misery and disturbing levels of poverty in the nation-state system. The paper concludes with the way forward towards realizing the Vision 20-20-20 objectives in the 21st century and beyond.
Abstract:
This report is the product of a statewide needs assessment/community input process. It is a follow-up to the State Plan for Substance Abuse Prevention developed in 1998.