932 results for Modified Direct Analysis Method


Relevance:

100.00%

Publisher:

Abstract:

Field material testing provides firsthand information on pavement conditions, which is most helpful in evaluating performance and identifying preventive maintenance or overlay strategies. The high variability of field asphalt concrete due to construction raises the demand for test accuracy. Accordingly, the objective of this study is to propose a reliable and repeatable methodology to evaluate the fracture properties of field-aged asphalt concrete using the overlay test (OT). The OT is selected because of its efficiency and feasibility for asphalt field cores with diverse dimensions. The fracture properties refer to the Paris' law parameters based on the pseudo J-integral (A and n), because of the sound physical significance of the pseudo J-integral with respect to characterizing the cracking process. In order to determine A and n, a two-step OT protocol is designed to characterize the undamaged and damaged behaviors of asphalt field cores. To ensure the accuracy of the determined undamaged and fracture properties, a new analysis method is then developed for data processing, which combines finite element simulations with mechanical analysis of the viscoelastic force equilibrium and the evolution of pseudo displacement work in the OT specimen. Finally, theoretical equations are derived to calculate A and n directly from the OT data, and the accuracy of the determined fracture properties is verified. The proposed methodology is applied to a total of 27 asphalt field cores obtained from a field project in Texas, including the control Hot Mix Asphalt (HMA) and two types of warm mix asphalt (WMA). The results demonstrate a high linear correlation between n and −log A for all the tested field cores. Investigations of the effect of field aging on the fracture properties confirm that n is a good indicator for quantifying the cracking resistance of asphalt concrete. It is also indicated that summer climatic conditions clearly accelerate the rate of aging. The impact of the WMA technologies on the fracture properties of asphalt concrete is visualized by comparing the n-values: the Evotherm WMA technology slightly improves the cracking resistance, while the foaming WMA technology provides fracture properties comparable to those of the HMA. After 15 months of field aging, the cracking resistance does not differ significantly between the HMA and the WMAs, which is confirmed by observations of field distresses.
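
As a hedged illustration of how Paris' law parameters of this kind could be obtained from crack-growth data, the short Python sketch below fits A and n by linear regression in log-log space; the measurements, units, and variable names are hypothetical, and this is not the study's viscoelastic force-equilibrium derivation.

```python
# Illustrative sketch (not the study's actual procedure): estimating Paris' law
# parameters A and n from overlay-test-style data, assuming crack growth per
# cycle dc/dN and a pseudo J-integral value J_R are available for each cycle.
import numpy as np

# Hypothetical measurements: pseudo J-integral and crack growth rate per cycle
J_R = np.array([0.8, 1.2, 1.9, 3.1, 4.8])                   # pseudo J-integral (assumed units)
dc_dN = np.array([2.1e-4, 4.0e-4, 9.5e-4, 2.3e-3, 5.2e-3])  # crack growth per load cycle

# Paris' law: dc/dN = A * (J_R)^n  =>  log(dc/dN) = log(A) + n * log(J_R)
n, logA = np.polyfit(np.log(J_R), np.log(dc_dN), 1)
A = np.exp(logA)
print(f"n = {n:.3f}, A = {A:.3e}, -log10(A) = {-np.log10(A):.3f}")
```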

Relevance:

100.00%

Publisher:

Abstract:

A methodology for formally modeling and analyzing the software architecture of mobile agent systems provides a solid basis for developing high-quality mobile agent systems, and the methodology is helpful for studying other distributed and concurrent systems as well. However, providing such a methodology is a challenge because of agent mobility in mobile agent systems. The methodology was defined from two essential parts of software architecture: a formalism to define the architectural models and an analysis method to formally verify system properties. The formalism is two-layer Predicate/Transition (PrT) nets extended with dynamic channels, and the analysis method is a hierarchical approach to verify models on different levels. The two-layer modeling formalism smoothly transforms physical models of mobile agent systems into their architectural models. Dynamic channels facilitate synchronous communication between nets, and they naturally capture the dynamic architecture configuration and agent mobility of mobile agent systems. Component properties are verified based on transformed individual components, system properties are checked in a simplified system model, and interaction properties are analyzed on models composed from the involved nets. Based on the formalism and the analysis method, this researcher formally modeled and analyzed a software architecture of mobile agent systems, and designed an architectural model of a medical information processing system based on mobile agents. The model checking tool SPIN was used to verify system properties such as reachability, concurrency, and safety of the medical information processing system. From the successful modeling and analysis of the software architecture of mobile agent systems, the conclusion is that PrT nets extended with channels are a powerful tool for modeling mobile agent systems, and the hierarchical analysis method provides a rigorous foundation for the modeling tool. The hierarchical analysis method not only reduces the complexity of the analysis but also expands the application scope of model checking techniques. The results of formally modeling and analyzing the software architecture of the medical information processing system show that model checking is an effective and efficient way to verify software architecture. Moreover, this system demonstrates the high flexibility, efficiency, and low cost of mobile agent technologies.
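
For readers unfamiliar with Petri-net semantics, the minimal Python sketch below plays the token game of an ordinary place/transition net on a hypothetical agent-migration step. It is only a generic illustration of the firing rule that two-layer PrT nets with dynamic channels build upon, not the dissertation's formalism.

```python
# Minimal sketch of an ordinary place/transition Petri net firing rule. The
# places, transition, and marking below are hypothetical.
marking = {"agent_at_host_A": 1, "channel_free": 1, "agent_at_host_B": 0}

# Each transition consumes one token from each input place and produces one
# token in each output place.
transitions = {
    "migrate_A_to_B": {"in": ["agent_at_host_A", "channel_free"],
                       "out": ["agent_at_host_B", "channel_free"]},
}

def enabled(t):
    return all(marking[p] > 0 for p in transitions[t]["in"])

def fire(t):
    assert enabled(t), f"transition {t} is not enabled"
    for p in transitions[t]["in"]:
        marking[p] -= 1
    for p in transitions[t]["out"]:
        marking[p] += 1

fire("migrate_A_to_B")
print(marking)  # {'agent_at_host_A': 0, 'channel_free': 1, 'agent_at_host_B': 1}
```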

Relevance:

100.00%

Publisher:

Abstract:

Despite widespread recognition of the problem of adolescent alcohol and other drug (AOD) abuse, research on its most common treatment modality, group work, is lacking. This research gap is alarming given that outcomes range from positive to potentially iatrogenic. This study sought to identify change mechanisms and/or treatment factors that are observable within group treatment sessions and that may predict AOD use outcomes. This NIH-funded study (F31 DA 020233-01A1) evaluated 108 10- to 19-year-olds and the 19 school-based treatment groups to which they had previously been assigned (R01 AA10246; PI: Wagner). Associations between motivational interviewing (MI) based change talk variables, group leader MI skills, and alcohol and marijuana use outcomes up to 12 months following treatment were evaluated. Treatment session audio recordings and transcripts (1R21AA015679-01; PI: Macgowan) were coded using a new discourse analysis coding scheme for measuring group member change talk (Amrhein, 2003). Therapist MI skills were similarly measured using the Motivational Interviewing Treatment Integrity instrument. Group member responses to commitment predicted group marijuana use at the 1-month follow-up. Also, group leader empathy was significantly associated with group commitment for marijuana use at the middle and ending stages of treatment. Both of the above process measures were applied in a group setting for the first time. Building upon MI and social learning theory principles, group commitment and group member responses to commitment are new observable, in-session process constructs that may predict positive and negative adolescent group treatment outcomes. These constructs, as well as the discourse analysis method and instruments used to measure them, raise many possibilities for future group work process research and practice.

Relevance:

100.00%

Publisher:

Abstract:

Concurrent software executes multiple threads or processes to achieve high performance. However, concurrency results in a huge number of different system behaviors that are difficult to test and verify. The aim of this dissertation is to develop new methods and tools for modeling and analyzing concurrent software systems at the design and code levels. This dissertation consists of several related results. First, a formal model of Mondex, an electronic purse system, is built from user requirements using Petri nets and is formally verified using model checking. Second, Petri net models are automatically mined from the event traces generated by scientific workflows. Third, partial order models are automatically extracted from instrumented concurrent program executions, and potential atomicity violation bugs are automatically verified based on the partial order models using model checking. Our formal specification and verification of Mondex have contributed to the worldwide effort to develop a verified software repository. Our method for automatically mining Petri net models from provenance offers a new approach to building scientific workflows. Our dynamic prediction tool, named McPatom, can predict several known bugs in real-world systems, including one that evades several other existing tools. McPatom is efficient and scalable because it takes advantage of the nature of atomicity violations and considers only a pair of threads and accesses to a single shared variable at a time. However, predictive tools need to consider the trade-offs between precision and coverage. Based on McPatom, this dissertation presents two methods for improving the coverage and precision of atomicity violation predictions: 1) a post-prediction analysis method to increase coverage while ensuring precision; and 2) a follow-up replaying method to further increase coverage. Both methods are implemented in a completely automatic tool.
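
As a rough illustration of the single-variable, two-thread view mentioned above, the Python sketch below flags the classic unserializable (local, remote, local) access patterns on one shared variable. It is a generic textbook check, not McPatom's partial-order prediction algorithm.

```python
# Hedged sketch of the core idea behind single-variable atomicity-violation
# detection: within an atomic region of thread T1, an interleaved remote access
# from thread T2 is flagged if the (local, remote, local) access pattern is
# unserializable. Generic illustration only.
UNSERIALIZABLE = {("R", "W", "R"),  # remote write breaks two local reads
                  ("W", "W", "R"),  # local read sees a remote, not local, write
                  ("R", "W", "W"),  # remote write is lost
                  ("W", "R", "W")}  # remote read sees an intermediate value

def violates_atomicity(local_before, remote, local_after):
    """Each access is 'R' or 'W' on the same shared variable."""
    return (local_before, remote, local_after) in UNSERIALIZABLE

# Example: thread 1 reads a balance, thread 2 writes it, thread 1 reads again
print(violates_atomicity("R", "W", "R"))  # True
```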

Relevance:

100.00%

Publisher:

Abstract:

Airborne Light Detection and Ranging (LIDAR) technology has become the primary method for deriving high-resolution Digital Terrain Models (DTMs), which are essential for studying Earth's surface processes such as flooding and landslides. The critical step in generating a DTM is to separate ground and non-ground measurements in a voluminous LIDAR point dataset using a filter, because the DTM is created by interpolating ground points. As one of the widely used filtering methods, the progressive morphological (PM) filter has the advantages of classifying the LIDAR data at the point level, a linear computational complexity, and preserving the geometric shapes of terrain features. The filter works well in an urban setting with a gentle slope and a mixture of vegetation and buildings. However, the PM filter often removes ground measurements incorrectly in topographically high areas, along with large non-ground objects, because it uses a constant threshold slope, resulting in "cut-off" errors. A novel cluster analysis method was developed in this study and incorporated into the PM filter to prevent the removal of ground measurements at topographic highs. Furthermore, to obtain optimal filtering results for an area with undulating terrain, a trend analysis method was developed to adaptively estimate the slope-related thresholds of the PM filter based on changes in topographic slope and the characteristics of non-terrain objects. The comparison of the PM and generalized adaptive PM (GAPM) filters for selected study areas indicates that the GAPM filter preserves most of the "cut-off" points incorrectly removed by the PM filter. The application of the GAPM filter to seven ISPRS benchmark datasets shows that the GAPM filter reduces the filtering error by 20% on average compared with the method used by the popular commercial software TerraScan. The combination of the cluster method, adaptive trend analysis, and the PM filter allows users without much experience in processing LIDAR data to effectively and efficiently identify ground measurements for complex terrain in a large LIDAR dataset. The GAPM filter is highly automatic and requires little human input; therefore, it can significantly reduce the effort of manually processing voluminous LIDAR measurements.
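
A simplified sketch of one PM-style filtering pass may help make the constant threshold slope concrete. The window progression, cell size, and thresholds below are hypothetical, and the adaptive threshold estimation of the GAPM filter is not reproduced.

```python
# Simplified sketch of progressive morphological (PM) filtering on a gridded
# minimum-elevation surface, using a constant threshold slope. Parameters are
# illustrative only.
import numpy as np
from scipy import ndimage

def pm_filter(grid, cell_size=1.0, slope=0.3, base_thresh=0.5, max_window=16):
    ground = np.ones(grid.shape, dtype=bool)
    surface = grid.copy()
    window = 3
    while window <= max_window:
        opened = ndimage.grey_opening(surface, size=(window, window))
        # Elevation threshold grows with window size for a constant-slope terrain
        height_thresh = base_thresh + slope * (window - 1) * cell_size
        ground &= (surface - opened) <= height_thresh
        surface = opened
        window = 2 * window - 1   # progressively enlarge the window
    return ground  # True where the cell is kept as a ground measurement
```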

Relevance:

100.00%

Publisher:

Abstract:

The presence of men accompanying hospitalized children is still limited, and the relationships established in hospitals give rise to several situations that can influence their experience. The study aimed to analyze the experiences of the father/caregiver during the hospitalization of his child. To develop the research, an exploratory and descriptive study with a qualitative approach was conducted with 11 fathers who accompanied sick children at a paediatric hospital in the metropolitan area of Natal, Rio Grande do Norte, Brazil. As inclusion criteria, the men should be aged 18 years or older, have favorable emotional conditions to answer the questions, and be accompanying a child aged between one and five years in a clinical or surgical unit. Data collection occurred in March and April 2014 using an interview script, after authorization from the Health Department of the State of Rio Grande do Norte and approval by the Universidade Federal do Rio Grande do Norte Research Ethics Committee (Certificate of Presentation for Ethical Consideration No. 22821513.1.0000.5537). The data were treated following the thematic content analysis method proposed by Bardin. From the statements, the following categories emerged: "The presence of the father in the hospitalization of a child" and "Responsibilities and parental attitudes during the hospitalization of a child", which were analyzed and discussed based on the literature on the family during child hospitalization and considerations about child care. It was found that the respondents who experienced the hospitalization of a child were inserted in a context of active participation in tasks and sharing of responsibilities. Thus, the study considered the need to enforce the rights of the father as a family entity in the practice of child care, despite the social and gender issues that are still strongly rooted in contemporary society. Given this, it is necessary that the nursing staff consider the various situations faced by men during infant hospitalization, bringing them closer to the process of caring for the child and minimizing the sequelae stemming from being away from the family nucleus.

Relevance:

100.00%

Publisher:

Abstract:

Rapid developments in industry have contributed to more complex systems that are prone to failure. In applications where the presence of faults may lead to premature failure, fault detection and diagnostics (FDD) tools are often implemented. The goal of this research is to improve the diagnostic ability of existing FDD methods. Kernel Principal Component Analysis (KPCA) has good fault detection capability; however, it can only detect a fault and identify the few variables that contribute most to its occurrence, and is therefore imprecise for diagnosis. Hence, KPCA was used to detect abnormal events, and the most contributing variables were extracted for further analysis in the diagnosis phase. The diagnosis phase was carried out in both a qualitative and a quantitative manner. In the qualitative mode, a network-based causality analysis method was developed to show the causal effects among the most contributing variables in the occurrence of the fault. To obtain a more quantitative diagnosis, a Bayesian network was constructed to analyze the problem from a probabilistic perspective.
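
A hedged sketch of the detection stage is given below: a kernel PCA model is fit on assumed normal-operation data, and new samples are flagged when a simple T²-like score statistic exceeds a percentile threshold. The data, kernel settings, and thresholding rule are illustrative rather than the study's exact procedure.

```python
# Hedged sketch of KPCA-based fault detection. Thresholding via a percentile is
# a simplification of the usual T^2 / SPE control limits.
import numpy as np
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(0)
X_normal = rng.normal(size=(200, 5))                     # hypothetical normal-operation data
X_test = np.vstack([rng.normal(size=(5, 5)),
                    rng.normal(loc=4.0, size=(5, 5))])   # last 5 rows simulate a fault

kpca = KernelPCA(n_components=3, kernel="rbf", gamma=0.1).fit(X_normal)
scores_train = kpca.transform(X_normal)
std = scores_train.std(axis=0)

def t2_like(scores):
    return ((scores / std) ** 2).sum(axis=1)             # Hotelling-T^2-style statistic

limit = np.percentile(t2_like(scores_train), 99)
faulty = t2_like(kpca.transform(X_test)) > limit
print(faulty)                                            # the shifted samples should mostly be flagged
```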

Relevance:

100.00%

Publisher:

Abstract:

Dissertation presented to obtain the master's degree within the scope of the Mestrado em Educação Social e Intervenção Comunitária (Master's in Social Education and Community Intervention) of the Escola Superior de Educação do Instituto Politécnico de Santarém.

Relevance:

100.00%

Publisher:

Abstract:


The goal of modern radiotherapy is to precisely deliver a prescribed radiation dose to delineated target volumes that contain a significant amount of tumor cells while sparing the surrounding healthy tissues and organs. Precise delineation of treatment and avoidance volumes is the key to precision radiation therapy. In recent years, considerable clinical and research efforts have been devoted to integrating MRI into the radiotherapy workflow, motivated by its superior soft tissue contrast and functional imaging capability. Dynamic contrast-enhanced MRI (DCE-MRI) is a noninvasive technique that measures properties of tissue microvasculature. Its sensitivity to radiation-induced vascular pharmacokinetic (PK) changes has been preliminarily demonstrated. In spite of its great potential, two major challenges have limited DCE-MRI's clinical application in radiotherapy assessment: the technical limitations of accurate DCE-MRI imaging implementation and the need for novel DCE-MRI data analysis methods that provide richer functional heterogeneity information.

This study aims at improving current DCE-MRI techniques and developing new DCE-MRI analysis methods for radiotherapy assessment, and it is naturally divided into two parts. The first part focuses on DCE-MRI temporal resolution as one of the key DCE-MRI technical factors, and improvements to DCE-MRI temporal resolution are proposed; the second part explores the potential value of image heterogeneity analysis and multiple-PK-model combination for therapeutic response assessment, and several novel DCE-MRI data analysis methods are developed.

I. Improvement of DCE-MRI temporal resolution. First, the feasibility of improving DCE-MRI temporal resolution via image undersampling was studied. Specifically, a novel MR image iterative reconstruction algorithm was studied for DCE-MRI reconstruction. This algorithm was built on the recently developed compressed sensing (CS) theory. By utilizing a limited k-space acquisition with shorter imaging time, images can be reconstructed in an iterative fashion under the regularization of a newly proposed total generalized variation (TGV) penalty term. In a retrospective study of brain radiosurgery patient DCE-MRI scans under IRB approval, the clinically obtained image data were selected as reference data, and simulated accelerated k-space acquisitions were generated by undersampling the full k-space of the reference images with designed sampling grids. Two undersampling strategies were proposed: 1) a radial multi-ray grid with a special angular distribution was adopted to sample each slice of the full k-space; and 2) a Cartesian random sampling grid series with spatiotemporal constraints from adjacent frames was adopted to sample the dynamic k-space series at a slice location. Two sets of PK parameter maps were generated, one from the undersampled data and one from the fully sampled data. Multiple quantitative measurements and statistical studies were performed to evaluate the accuracy of the PK maps generated from the undersampled data with reference to the PK maps generated from the fully sampled data. Results showed that at a simulated acceleration factor of four, PK maps could be faithfully calculated from the DCE images reconstructed using undersampled data, and no statistically significant differences were found between the regional PK mean values from the undersampled and fully sampled data sets. DCE-MRI acceleration using the investigated image reconstruction method is therefore suggested to be feasible and promising.
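
To make the retrospective undersampling idea concrete, the Python sketch below applies a generic random Cartesian phase-encode mask (with a fully sampled center) to the k-space of a stand-in image. The mask design is illustrative only and does not reproduce the radial multi-ray or spatiotemporally constrained grids described above.

```python
# Illustrative sketch of retrospective Cartesian undersampling of a reference
# image's k-space. The image, mask density, and center width are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
image = rng.normal(size=(128, 128))              # stand-in for a reference DCE frame
kspace = np.fft.fftshift(np.fft.fft2(image))

ny = kspace.shape[0]
mask = rng.random(ny) < 0.20                     # ~20% random phase-encode lines
mask[ny // 2 - 8: ny // 2 + 8] = True            # keep the low-frequency center
undersampled = kspace * mask[:, None]

zero_filled = np.fft.ifft2(np.fft.ifftshift(undersampled)).real
print("fraction of k-space lines kept:", mask.mean())
```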

Second, for high temporal resolution DCE-MRI, a new PK model fitting method was developed to solve for the PK parameters with better calculation accuracy and efficiency. This method is based on a derivative-based reformulation of the commonly used Tofts PK model, which is conventionally presented as an integral expression. The method also includes an advanced Kolmogorov-Zurbenko (KZ) filter to remove potential noise effects in the data and solves for the PK parameters as a linear problem in matrix form. In a computer simulation study, PK parameters representing typical intracranial values were selected as references to simulate DCE-MRI data at different temporal resolutions and different data noise levels. Results showed that at both high temporal resolutions (<1 s) and a clinically feasible temporal resolution (~5 s), this new method was able to calculate the PK parameters more accurately than current calculation methods at clinically relevant noise levels; at high temporal resolutions, the calculation efficiency of this new method was superior to current methods by roughly two orders of magnitude. In a retrospective study of clinical brain DCE-MRI scans, the PK maps derived from the proposed method were comparable with the results from current methods. Based on these results, it can be concluded that this new method can be used for accurate and efficient PK model fitting for high temporal resolution DCE-MRI.
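
A hedged sketch of a linear least-squares Tofts fit is shown below for orientation; it follows the well-known integrated (Murase-type) formulation rather than the derivative-based, KZ-filtered method developed in this work, and the arterial input function and rate constants are synthetic.

```python
# Hedged sketch of a linear least-squares formulation of the standard Tofts
# model, obtained by integrating  dCt/dt = Ktrans*Cp(t) - kep*Ct(t)  to
#   Ct(t) = Ktrans * int(Cp) - kep * int(Ct).
import numpy as np
from scipy.integrate import cumulative_trapezoid

def fit_tofts_linear(t, Cp, Ct):
    int_Cp = cumulative_trapezoid(Cp, t, initial=0.0)
    int_Ct = cumulative_trapezoid(Ct, t, initial=0.0)
    A = np.column_stack([int_Cp, -int_Ct])
    ktrans, kep = np.linalg.lstsq(A, Ct, rcond=None)[0]
    return ktrans, kep

# Synthetic check with known (hypothetical) parameters
t = np.linspace(0, 300, 601)                       # seconds, 0.5 s resolution
Cp = 5.0 * (t / 60.0) * np.exp(-t / 80.0)          # hypothetical arterial input function
ktrans_true, kep_true = 0.25 / 60, 0.60 / 60       # per-second rate constants
dt = t[1] - t[0]
Ct = np.zeros_like(t)
for i in range(1, len(t)):                         # simple forward-Euler forward model
    Ct[i] = Ct[i-1] + dt * (ktrans_true * Cp[i-1] - kep_true * Ct[i-1])
print(fit_tofts_linear(t, Cp, Ct))                 # should be close to the true values
```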

II. Development of DCE-MRI analysis methods for therapeutic response assessment. This part comprises methodological developments in two approaches. The first is to develop model-free analysis methods for evaluating DCE-MRI functional heterogeneity. This approach is inspired by the rationale that radiotherapy-induced functional change could be heterogeneous across the treatment area. The first effort was a translational investigation of classic fractal dimension theory for DCE-MRI therapeutic response assessment. In a small-animal anti-angiogenesis drug therapy experiment, randomly assigned treatment/control groups received multiple-fraction treatments with one pre-treatment and multiple post-treatment high-spatiotemporal-resolution DCE-MRI scans. In the post-treatment scan two weeks after the start of treatment, the investigated Rényi dimensions of the classic PK rate constant map demonstrated significant differences between the treatment and control groups; when the Rényi dimensions were adopted for treatment/control group classification, the achieved accuracy was higher than that obtained using conventional PK parameter statistics. Following this pilot work, two novel texture analysis methods were proposed. First, a new technique called the Gray Level Local Power Matrix (GLLPM) was developed. It is intended to address the lack of temporal information and the poor calculation efficiency of the commonly used Gray Level Co-Occurrence Matrix (GLCOM) techniques. In the same small-animal experiment, the dynamic curves of Haralick texture features derived from the GLLPM had an overall better performance than the corresponding curves derived from current GLCOM techniques in treatment/control separation and classification. The second developed method is dynamic Fractal Signature Dissimilarity (FSD) analysis. Inspired by classic fractal dimension theory, this method quantitatively measures the dynamics of tumor heterogeneity during contrast agent uptake on DCE images. In the small-animal experiment mentioned before, the selected parameters from dynamic FSD analysis showed significant differences between the treatment and control groups as early as after one treatment fraction; in contrast, metrics from conventional PK analysis showed significant differences only after three treatment fractions. When using dynamic FSD parameters, the treatment/control group classification after the first treatment fraction was improved compared with using conventional PK statistics. These results suggest the promising application of this novel method for capturing early therapeutic response.
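
As a small illustration of the fractal-dimension idea underlying the Rényi and fractal-signature analyses, the sketch below estimates a box-counting (capacity) dimension for a hypothetical binary map. It is not the dynamic FSD or GLLPM computation used in the study.

```python
# Minimal box-counting sketch (q = 0 capacity dimension only) on a hypothetical
# thresholded map; the counting-vs-scale regression gives the dimension.
import numpy as np

def box_counting_dimension(binary_map, box_sizes=(2, 4, 8, 16, 32)):
    counts = []
    n = binary_map.shape[0]
    for s in box_sizes:
        occupied = 0
        for i in range(0, n, s):
            for j in range(0, n, s):
                if binary_map[i:i+s, j:j+s].any():
                    occupied += 1
        counts.append(occupied)
    # dimension = -slope of log(count) versus log(box size)
    slope, _ = np.polyfit(np.log(box_sizes), np.log(counts), 1)
    return -slope

rng = np.random.default_rng(2)
mask = rng.random((128, 128)) > 0.5                # hypothetical thresholded PK map
print(box_counting_dimension(mask))                # ~2 for a dense random mask
```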

The second approach to developing novel DCE-MRI methods is to combine PK information from multiple PK models. Currently, the classic Tofts model or its alternative version is widely adopted for DCE-MRI analysis as a gold-standard approach for therapeutic response assessment. Previously, a shutter-speed (SS) model was proposed to incorporate the transcytolemmal water exchange effect into contrast agent concentration quantification. In spite of its richer biological assumptions, its application in therapeutic response assessment has been limited. It is therefore intriguing to combine information from the SS model and the classic Tofts model to explore potential new biological information for treatment assessment. The feasibility of this idea was investigated in the same small-animal experiment. The SS model was compared against the Tofts model for therapeutic response assessment using regional mean values of the PK parameters. Based on the modeled transcytolemmal water exchange rate, a biological subvolume was proposed and automatically identified using histogram analysis. Within the biological subvolume, the PK rate constant derived from the SS model proved superior to the one from the Tofts model in treatment/control separation and classification. Furthermore, novel biomarkers were designed to integrate the PK rate constants from these two models. When evaluated in the biological subvolume, this biomarker was able to reflect significant treatment/control differences in both post-treatment evaluations. These results confirm the potential value of the SS model, as well as its combination with the Tofts model, for therapeutic response assessment.

In summary, this study addressed two problems in the application of DCE-MRI to radiotherapy assessment. In the first part, a method of accelerating DCE-MRI acquisition for better temporal resolution was investigated, and a novel PK model fitting algorithm was proposed for high temporal resolution DCE-MRI. In the second part, two model-free texture analysis methods and a multiple-model analysis method were developed for DCE-MRI therapeutic response assessment. The presented work could benefit future routine clinical application of DCE-MRI in radiotherapy assessment.

Relevance:

100.00%

Publisher:

Abstract:

Far-field stresses are those present in a volume of rock prior to excavations being created. Estimates of the orientation and magnitude of far-field stresses, often used in mine design, are generally obtained by single-point measurements of stress, or large-scale, regional trends. Point measurements can be a poor representation of far-field stresses as a result of excavation-induced stresses and geological structures. For these reasons, far-field stress estimates can be associated with high levels of uncertainty. The purpose of this thesis is to investigate the practical feasibility, applications, and limitations of calibrating far-field stress estimates through tunnel deformation measurements captured using LiDAR imaging. A method that estimates the orientation and magnitude of excavation-induced principal stress changes through back-analysis of deformation measurements from LiDAR-imaged tunnels was developed and tested using synthetic data. If excavation-induced stress change orientations and magnitudes can be accurately estimated, they can be used in the calibration of far-field stress input to numerical models. LiDAR point clouds have been proven to have a number of underground applications; thus, it is desirable to explore their use in numerical model calibration. The back-analysis method is founded on the superposition of stresses and requires a two-dimensional numerical model of the deforming tunnel. Principal stress changes of known orientation and magnitude are applied to the model to create calibration curves. Estimation can then be performed by minimizing the squared differences between the measured tunnel deformations and sets of calibration curve deformations. In addition to the back-analysis estimation method, a procedure consisting of previously existing techniques to measure tunnel deformation using LiDAR imaging was documented. Under ideal conditions, the back-analysis method estimated principal stress change orientations within ±5° and magnitudes within ±2 MPa. Results were comparable for four different tunnel profile shapes. Preliminary testing using plastic deformation, a rough tunnel profile, and profile occlusions suggests that the method can work under more realistic conditions. The results from this thesis set the groundwork for the continued development of a new, inexpensive, and efficient far-field stress estimate calibration method.
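
A hedged sketch of the estimation step is given below: a synthetic library of calibration deformation profiles is searched for the orientation and magnitude that minimize the squared difference to a measured profile. The profile model is purely illustrative and stands in for the two-dimensional numerical model runs described above.

```python
# Hedged sketch of back-analysis by least squares over a calibration library.
# The deformation model, candidate grids, and noise level are hypothetical.
import numpy as np

angles_deg = np.arange(0, 180, 5)            # candidate stress-change orientations
magnitudes = np.arange(1.0, 20.1, 0.5)       # candidate magnitudes, MPa
theta = np.linspace(0, 2 * np.pi, 180)       # positions around the tunnel profile

def calibration_profile(angle_deg, magnitude):
    # Stand-in for a 2D model run: radial deformation around the profile (mm)
    return magnitude * 0.1 * np.cos(2 * (theta - np.radians(angle_deg)))

measured = calibration_profile(35.0, 8.0) + np.random.default_rng(3).normal(0, 0.05, theta.size)

best = min(((np.sum((measured - calibration_profile(a, m)) ** 2), a, m)
            for a in angles_deg for m in magnitudes))
print("estimated orientation and magnitude:", best[1], "deg,", best[2], "MPa")
```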

Relevance:

100.00%

Publisher:

Abstract:

This paper explains how the practice of integrating ecosystem-service thinking (i.e., ecological benefits for human beings) and institutions (i.e., organisations, policy rules) is essential for coastal spatial planning. Adopting an integrated perspective on ecosystem services (ESs) both helps in understanding a wide range of possible services and, at the same time, in attuning institutions to local resource patterns. The objective of this paper is to identify the extent to which ESs are integrated in a specific coastal strategic planning case. A subsequent objective is to understand whether institutions are capable of managing ESs, in terms of uncovering institutional strengths and weaknesses that may exist in taking ESs into account in existing institutional practices. These two questions are addressed through the application of a content analysis method and a multi-level analysis framework to formal institutions. Jiaozhou Bay in China is used as an illustrative case. The results show that some ESs have been implicitly acknowledged, but by no means the whole range. This partial ES implementation could result from any of four institutional weaknesses in the strategic plans of Jiaozhou Bay, namely a dominant market-oriented interest, fragmented institutional structures for managing ESs, limited ES assessment, and a lack of integrated reflection of the social value of ESs in decision-making. Finally, generalizations about multi-level institutional settings for ES integration, such as inter-organisational fragmentation and the limited use of ES assessment in operation, are made together with other international case studies. Meanwhile, the comparison highlights the influence of extensive market-oriented incentives and governments' exclusive responsibilities on ES governance in the Chinese context.

Relevance:

100.00%

Publisher:

Abstract:

This essay explores the issue of methodological perspectivism in Koselleck's thought, taking the concept of temporalization as its cornerstone. The first section links the concepts of temporalization and secularization, first introducing the philosophies of history of the Sattelzeit period. The text then focuses on reconstructing the notion of temporalization based on an emerging tension between language and the reality it describes. The article concludes by bringing up the notion of fictionality as a key element in Koselleck's theory of history, making up for the methodological deficits left by this tension. The unifying thread of this essay is that the theoretical project of a conceptual history is not only an analysis method but, above all, a theory of modernity.

Relevance:

100.00%

Publisher:

Abstract:

The aim of this paper is to explore the role and activities of nurse practitioners (NPs) working in long-term care (LTC) to understand concepts of access to primary care for residents. Utilizing the "FIT" framework developed by Penchansky and Thomas, we used a directed content analysis method to analyze data from a pan-Canadian study of NPs in LTC. Individual and focus group interviews were conducted at four sites in the western, central, and eastern regions of Canada with 143 participants, including NPs, RNs, regulated and unregulated nursing staff, allied health professionals, physicians, administrators and directors, and residents and family members. Participants emphasized how the availability and accessibility of the NP had an impact on access to primary and urgent care for residents. Understanding more about how NPs affect access in Canadian LTC will be valuable for nursing practice and healthcare planning and policy, and may assist other countries in planning for the introduction of NPs in LTC settings to increase access to primary care.

Relevance:

100.00%

Publisher:

Abstract:

This research verifies the influence of the self-efficacy level on the incidence of burnout syndrome among nursing professionals from private hospitals located in the municipality of Natal, State of Rio Grande do Norte. The research was descriptive in nature, and the data analysis method was quantitative, carried out using the SPSS software package, version 17.0. The investigation instrument was the Maslach Burnout Inventory (MBI), and the General Perceived Self-Efficacy Scale (GPSES) was applied to a sample of 230 nursing professionals. The statistical techniques used for data analysis were: frequency analysis, factor analysis, Cronbach's alpha, the Kaiser-Meyer-Olkin (KMO) test, Bartlett's sphericity test, percentage analysis, Spearman rank correlation analysis, and simple regression. The factors obtained from the factor analysis of the MBI were the same dimensions that Maslach initially proposed for the instrument (emotional exhaustion, lack of personal accomplishment, and depersonalization). However, it should be highlighted that the low internal consistency of the depersonalization dimension may stem from people's difficulty (caused by cultural aspects) in admitting this attitude in their work environment. From the GPSES, a single factor was obtained, confirming the unidimensionality reported by the author of the instrument. Regarding the incidence of the syndrome, it was verified that about 50% of the researched sample presented evidence of burnout syndrome. Regarding the self-efficacy level, about 65% of the researched sample presented a low level of self-efficacy, which can be explained by the work characteristics of these professionals. Concerning the influence of self-efficacy on burnout syndrome, it was verified that self-efficacy can be one of the aspects that influences the chronification of occupational stress (burnout), mainly for the personal accomplishment dimension. Therefore, the researched hospital organizations need to reflect on their attitudes toward their professionals, since the numbers showed a dangerous tendency toward a predisposition to burnout syndrome among their staff, which implies not only a significant number of individuals who may present high levels of emotional exhaustion, lack of personal accomplishment, and depersonalization, but also the fact that this group presents a low level of self-efficacy.
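
For reference, the sketch below computes Cronbach's alpha, one of the internal-consistency statistics listed above, from hypothetical Likert-type responses. It is a generic formula illustration, not a reanalysis of the study's data.

```python
# Cronbach's alpha: alpha = k/(k-1) * (1 - sum of item variances / variance of
# the total score). Responses below are simulated, not the study's data.
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) array of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(4)
latent = rng.normal(size=(230, 1))
responses = latent + rng.normal(scale=0.8, size=(230, 10))   # 10 correlated items
print(round(cronbach_alpha(responses), 2))
```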

Relevance:

100.00%

Publisher:

Abstract:

Portland cement concrete (PCC) pavement undergoes repeated environmental load-related deflection resulting from temperature and moisture variations across the pavement depth. This phenomenon, referred to as PCC pavement curling and warping, has been known and studied since the mid-1920s. Slab curvature can be further magnified under repeated traffic loads and may ultimately lead to fatigue failures, including top-down and bottom-up transverse, longitudinal, and corner cracking. It is therefore important to measure the “true” degree of curling and warping in PCC pavements, not only for quality control (QC) and quality assurance (QA) purposes, but also to achieve a better understanding of its relationship to long-term pavement performance. In order to better understand the curling and warping behavior of PCC pavements in Iowa and provide recommendations to mitigate curling and warping deflections, field investigations were performed at six existing sites during the late fall of 2015. These sites included PCC pavements with various ages, slab shapes, mix design aspects, and environmental conditions during construction. A stationary light detection and ranging (LiDAR) device was used to scan the slab surfaces. The degree of curling and warping along the longitudinal, transverse, and diagonal directions was calculated for the selected slabs based on the point clouds acquired using LiDAR. The results and findings are correlated to variations in pavement performance, mix design, pavement design, and construction details at each site. Recommendations regarding how to minimize curling and warping are provided based on a literature review and this field study. Some examples of using point cloud data to build three-dimensional (3D) models of the overall curvature of the slab shape are presented to show the feasibility of using this 3D analysis method for curling and warping analysis.
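
As an illustrative sketch of how a curling/warping indicator might be extracted from a slab's point cloud, the Python snippet below fits a second-order surface by least squares and reports the corner-to-center deflection along a diagonal. The slab dimensions, point density, and curvature are hypothetical, and this is not the study's exact analysis.

```python
# Hedged sketch: fit z = f(x, y) with a bivariate quadratic and report a simple
# diagonal curling indicator. All data below are synthetic.
import numpy as np

rng = np.random.default_rng(5)
x = rng.uniform(0, 4.5, 5000)                  # slab dimensions in metres (assumed)
y = rng.uniform(0, 4.5, 5000)
z = 0.0008 * ((x - 2.25) ** 2 + (y - 2.25) ** 2) + rng.normal(0, 0.0005, x.size)  # upward-curled corners

A = np.column_stack([np.ones_like(x), x, y, x * y, x ** 2, y ** 2])
coef, *_ = np.linalg.lstsq(A, z, rcond=None)

def z_fit(px, py):
    return coef @ np.array([1.0, px, py, px * py, px ** 2, py ** 2])

corner, center = z_fit(0.0, 0.0), z_fit(2.25, 2.25)
print(f"diagonal curl (corner minus center): {1000 * (corner - center):.2f} mm")
```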