Abstract:
The Texas Department of Transportation (TxDOT) is concerned about the widening gap between preservation needs and available funding. Funding levels are not adequate to meet the preservation needs of the roadway network; therefore, projects listed in the 4-Year Pavement Management Plan must be ranked to determine which projects should be funded now and which can be postponed until a later year. Currently, each district uses locally developed methods to prioritize projects. These ranking methods have relied on relatively informal, qualitative assessments based on engineers’ subjective judgment. It is important for TxDOT to have a 4-Year Pavement Management Plan that uses a transparent, rational project ranking process. The objective of this study is to develop a conceptual framework that describes the development of the 4-Year Pavement Management Plan. The framework can be broadly divided into three steps: 1) a network-level project screening process, 2) a project-level project ranking process, and 3) an economic analysis. A rational pavement management procedure and a project ranking method accepted by the districts and the TxDOT administration will maximize efficiency in budget allocation and will potentially help improve pavement condition. As part of the implementation of the 4-Year Pavement Management Plan, the Network-Level Project Screening (NLPS) tool, including the candidate project identification algorithm and the preliminary project ranking matrix, was developed. The NLPS tool has been used by the Austin District Pavement Engineer (DPE) to evaluate PMIS (Pavement Management Information System) data and to prepare a preliminary list of candidate projects for further evaluation.
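A minimal sketch of the kind of network-level screening and preliminary ranking the NLPS tool performs is shown below. The field names, thresholds, and weights are hypothetical illustrations, not TxDOT's actual PMIS rules or the study's ranking matrix.

```python
# Illustrative network-level screening and preliminary ranking step.
# Field names, thresholds, and weights are hypothetical placeholders.

from dataclasses import dataclass

@dataclass
class Section:
    section_id: str
    condition_score: float   # 1-100, higher is better (PMIS-style)
    distress_score: float    # 1-100, higher is better
    traffic_adt: int         # average daily traffic

def is_candidate(s: Section, condition_threshold: float = 70.0) -> bool:
    """Network-level screening: flag sections whose condition falls below a threshold."""
    return s.condition_score < condition_threshold

def priority_score(s: Section) -> float:
    """Preliminary ranking: weight poor condition more heavily on high-traffic sections."""
    condition_need = (100.0 - s.condition_score) / 100.0
    distress_need = (100.0 - s.distress_score) / 100.0
    traffic_weight = min(s.traffic_adt / 50_000.0, 1.0)
    return 0.5 * condition_need + 0.3 * distress_need + 0.2 * traffic_weight

sections = [
    Section("US183-01", 62.0, 58.0, 42_000),
    Section("FM620-07", 81.0, 77.0, 12_000),
    Section("IH35-12", 55.0, 49.0, 65_000),
]

candidates = sorted((s for s in sections if is_candidate(s)),
                    key=priority_score, reverse=True)
for s in candidates:
    print(f"{s.section_id}: priority {priority_score(s):.2f}")
```

In practice, the screened and ranked list would then feed the project-level ranking and economic analysis steps described above.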
Abstract:
The trucking industry has played a significant role in the economic growth of Texas by transporting and distributing commodities using commercial motor vehicles. The Texas Department of Transportation (TxDOT), however, has recognized that the large number of overweight trucks operating on the state highway system has resulted in the deterioration of pavement condition. In addition, the permit fee for carrying loads above legal limits is much lower than the cost of treating the resulting increase in pavement damage. The primary purpose of the research presented in this paper is to investigate current TxDOT overweight permit structures to support pavement management. The research team analyzed the TxDOT “1547” Over-axle Weight Tolerance Permit structure to support an increase in the fee structure, bringing it more in line with the actual pavement damage. The analysis showed that the revised overweight permit structure could provide an additional $9.3 million annually for pavement maintenance needs by increasing current permit fees. These results were supported by the 2030 Committee for recommendation to the Texas Transportation Commission and consideration by the State Legislature [1]. The research team recommends further research to identify ways of working cooperatively with the trucking industry to develop improved methods for assessing weight-damage relationships and more effective and accurate means of assessing overweight permit fees.
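A sketch of a weight-damage comparison using the widely used AASHTO "fourth-power" rule of thumb, under which relative pavement damage scales roughly with the fourth power of axle load. This is a generic approximation with made-up axle loads, not the fee model analyzed or recommended in the study.

```python
# Relative pavement damage from an overweight axle versus a legal axle,
# using the fourth-power load-equivalency approximation. Axle loads are illustrative.

STANDARD_AXLE_KIPS = 18.0  # standard single-axle load (18,000 lb)

def load_equivalency(axle_load_kips: float) -> float:
    """Approximate equivalent single-axle loads (ESALs) per pass of one axle."""
    return (axle_load_kips / STANDARD_AXLE_KIPS) ** 4

legal_axle = 20.0       # legal axle load in kips (illustrative)
permitted_axle = 22.1   # the same axle under an over-axle tolerance permit (illustrative)

extra_damage_ratio = load_equivalency(permitted_axle) / load_equivalency(legal_axle)
print(f"Each permitted pass causes about {extra_damage_ratio:.2f}x the damage of a legal pass")
```

Because damage grows much faster than load, a flat permit fee that ignores this nonlinearity will recover only a fraction of the added maintenance cost, which is the gap the revised fee structure is intended to narrow.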
Abstract:
This paper presents a road survey conducted as part of a workshop organized by the Texas Department of Transportation (TxDOT) to evaluate and improve the maintenance practices of the Texas highway system. Directors of maintenance from six peer states (California, Kansas, Georgia, Missouri, North Carolina, and Washington) were invited to this 3-day workshop. One of the important parts of this workshop was a Maintenance Test Section Survey (MTSS) to evaluate a number of pre-selected one-mile roadway sections. The workshop schedule allowed half a day for the field survey, during which 34 sections were evaluated. Each evaluator was given a booklet and asked to rate the selected road sections. The goals of the MTSS were to: (1) assess the threshold level at which maintenance activities are required as perceived by the evaluators from the peer states; (2) assess the threshold level at which maintenance activities are required as perceived by evaluators from other TxDOT districts; and (3) perform a pilot evaluation of the MTSS concept. This paper summarizes the information obtained from the survey and discusses the major findings based on a statistical analysis of the data and comments from the survey participants.
Abstract:
To assess and improve its practices, and thus ensure the future excellence of the Texas highway system, the Texas Department of Transportation (TxDOT) sought a forum in which experts from other state Departments of Transportation could evaluate the TxDOT maintenance program and practices based on their expertise. To meet this need, a Peer State Review of TxDOT Maintenance Practices project was organized and conducted by the Center for Transportation Research (CTR) at The University of Texas at Austin. CTR researchers, along with TxDOT staff, conducted a workshop to present TxDOT’s maintenance practices to the visiting peer reviewers and invite their feedback. Directors of maintenance from six states (California, Kansas, Georgia, Missouri, North Carolina, and Washington) participated in the workshop. CTR and TxDOT worked together to design a questionnaire with 15 key questions to capture the peers’ opinions on TxDOT’s maintenance program and practices. This paper compiles and summarizes this information. The review results suggested that TxDOT should adopt a more statewide approach to funding and planning, in addition to funding and planning for each district separately. Additionally, the peers recommended that criteria such as the condition and level of service of the roadways be given greater weight in the funding allocation than lane miles or vehicle miles traveled (VMT). The peer reviewers also determined that TxDOT maintenance employees’ experience and communications were strong assets. Additional strengths included TxDOT’s willingness to invite peer reviews of its practices and to consider opportunities for improvement.
Abstract:
In the era of Web 2.0, huge volumes of consumer reviews are posted to the Internet every day. Manual approaches to detecting and analyzing fake reviews (i.e., spam) are not practical due to the problem of information overload. However, the design and development of automated methods of detecting fake reviews is a challenging research problem. The main reason is that fake reviews are specifically composed to mislead readers, so they may appear the same as legitimate reviews (i.e., ham). As a result, discriminatory features that would enable individual reviews to be classified as spam or ham may not be available. Guided by the design science research methodology, the main contribution of this study is the design and instantiation of novel computational models for detecting fake reviews. In particular, a novel text mining model is developed and integrated into a semantic language model for the detection of untruthful reviews. The models are then evaluated based on a real-world dataset collected from amazon.com. The results of our experiments confirm that the proposed models outperform other well-known baseline models in detecting fake reviews. To the best of our knowledge, the work discussed in this article represents the first successful attempt to apply text mining methods and semantic language models to the detection of fake consumer reviews. A managerial implication of our research is that firms can apply our design artifacts to monitor online consumer reviews to develop effective marketing or product design strategies based on genuine consumer feedback posted to the Internet.
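For context, a minimal supervised baseline of the kind such proposed models are compared against: TF-IDF features with a linear classifier over labeled spam/ham reviews. This is a generic text-mining baseline with made-up example reviews, not the text mining or semantic language models proposed in the study.

```python
# Baseline spam/ham review classification with TF-IDF features and logistic regression.
# The labeled reviews are made-up placeholders for illustration only.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

reviews = [
    "Amazing product, changed my life, buy now!!!",                        # suspected spam
    "Best purchase ever, five stars, everyone must buy",                   # suspected spam
    "Worked fine for a month, then the battery degraded noticeably.",      # ham
    "Decent value; shipping was slow but the item matches the description.",  # ham
]
labels = [1, 1, 0, 0]  # 1 = fake (spam), 0 = genuine (ham)

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(reviews, labels)

print(model.predict(["Incredible!!! Best product ever, must buy now"]))
```

The challenge noted in the abstract is precisely that surface features like these may not discriminate well when fake reviews mimic legitimate ones, which motivates the semantic-level modeling evaluated in the paper.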
Abstract:
Consider the concept combination ‘pet human’. In word association experiments, human subjects produce the associate ‘slave’ in relation to this combination. The striking aspect of this associate is that it is not produced as an associate of ‘pet’ or ‘human’ in isolation. In other words, the associate ‘slave’ seems to be emergent. Such emergent associations sometimes have a creative character, and cognitive science is largely silent about how we produce them. Departing from a dimensional model of human conceptual space, this article explores concept combinations and argues that emergent associations are a result of abductive reasoning within conceptual space, that is, below the symbolic level of cognition. A tensor-based approach is used to model concept combinations, allowing such combinations to be formalized as interacting quantum systems. Free association norm data are used to motivate the underlying basis of the conceptual space. It is shown by analogy how some concept combinations may behave like quantum-entangled (non-separable) particles. Two methods of analysis are presented for empirically validating the presence of non-separable concept combinations in human cognition: one based on quantum theory, and the other based on comparing a joint (true theoretic) probability distribution with a distribution derived under a separability assumption, using a chi-square goodness-of-fit test. Although these methods were inconclusive in relation to an empirical study of bi-ambiguous concept combinations, avenues for further refinement of these methods are identified.
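A sketch of the second kind of analysis, assuming hypothetical association counts: compare observed joint frequencies for a two-word combination against the frequencies expected if the two concepts were separable (independent), using a chi-square goodness-of-fit test. The counts are made-up placeholders, not the study's data.

```python
# Separability check: observed joint association counts versus counts expected
# under independence, assessed with a chi-square goodness-of-fit test.

import numpy as np
from scipy.stats import chisquare

# Observed joint counts (rows: associate produced for word A or not;
# columns: associate produced for word B or not). Placeholder values.
observed = np.array([[30.0, 10.0],
                     [15.0, 45.0]])

# Expected counts under a separability (independence) assumption.
row_totals = observed.sum(axis=1, keepdims=True)
col_totals = observed.sum(axis=0, keepdims=True)
expected = row_totals @ col_totals / observed.sum()

# ddof=2 adjusts the degrees of freedom to 1 for a 2x2 independence comparison.
stat, p_value = chisquare(observed.ravel(), expected.ravel(), ddof=2)
print(f"chi-square = {stat:.2f}, p = {p_value:.4f}")
# A small p-value indicates the joint behaviour is poorly modelled by a separable
# (product) distribution, i.e., the combination behaves "non-separably".
```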
Abstract:
Until recently, standards to guide nursing education and practice in Vietnam were nonexistent. This paper describes the development and implementation of a clinical teaching capacity building project piloted in Hanoi, Vietnam. The project was part of a multi-component capacity building program designed to improve nurse education in Vietnam. The objectives of the project were to develop a collaborative, clinically based teaching model that encourages evidence-based, student-centred clinical learning. The model incorporated strategies to promote the development of nursing practice to meet national competency standards. Thirty nurse teachers from two organisations in Hanoi participated in the program. The participants attended three workshops and completed applied assessments in which they implemented concepts from each workshop. The assessment tasks were planning, implementing and evaluating clinical teaching. On completion of the workshops, twenty participants undertook a study tour in Australia to refine the teaching model and develop an action plan for implementing the model in both organisations, with the aim of disseminating the model across Vietnam. Significant changes attributed to this project have been noted at both the individual and organisational levels. Dissemination of this clinical teaching model has commenced in Ho Chi Minh City, with plans for more in-depth dissemination throughout the country.
Abstract:
The goal of this research project is to develop specific BIM objects for temporary construction activities that fully integrate object design, construction efficiency and safety parameters. Specifically, the project will deliver modularised electronic scaffolding and formwork objects that allow designers to easily incorporate them into BIM models to facilitate smarter and safer infrastructure and building construction. The research first identified a distinct lack of BIM objects for temporary construction works, which results in productivity loss during design and construction and in missed opportunities to consider safety standards and practices in the design of scaffolding and formwork. This is particularly relevant in Australia, given the “harmonisation” of OHS legislation across all states and territories from 1 January 2012, meaning that enhancements to Queensland practices will have direct application across Australia. Thus, in conjunction with government and industry partners in Queensland, Australia, the research team developed a strategic three-phase research methodology: (1) a preliminary review phase on industrial scaffolding and formwork practices and BIM implementation; (2) a BIM object development phase with specific safety and productivity functions; and (3) a Queensland-wide workshop phase for product dissemination and training. This paper discusses background review findings, details of the developed methodology, and expected research outcomes and their contributions to the Australian construction industry.
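As a rough illustration of what a scaffolding object carrying design and safety parameters might look like, the sketch below pairs geometric attributes with simple rule checks. The attribute names, limits, and rules are hypothetical placeholders, not the project's actual BIM object schema or any code-of-practice values.

```python
# Illustrative parametric scaffolding object with embedded safety attributes.
# Attribute names, limits, and rules are hypothetical placeholders.

from dataclasses import dataclass

@dataclass
class ScaffoldBay:
    bay_length_m: float
    bay_width_m: float
    lift_height_m: float
    duty_rating_kpa: float        # design platform load
    has_guardrail: bool
    has_toeboard: bool
    max_platform_load_kpa: float  # rated capacity of the selected components

    def safety_issues(self) -> list[str]:
        """Return rule violations for this bay (illustrative rules only)."""
        issues = []
        if self.duty_rating_kpa > self.max_platform_load_kpa:
            issues.append("design load exceeds platform capacity")
        if self.lift_height_m > 2.0 and not self.has_guardrail:
            issues.append("guardrail required above 2.0 m working height")
        if not self.has_toeboard:
            issues.append("toeboard missing")
        return issues

bay = ScaffoldBay(2.4, 1.2, 2.5, 4.4, has_guardrail=False, has_toeboard=True,
                  max_platform_load_kpa=5.0)
print(bay.safety_issues())
```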
Abstract:
The regulatory pathways involved in maintaining the pluripotency of embryonic stem cells are partially known, whereas the regulatory pathways governing adult stem cells and their "stem-ness" are characterized to an even lesser extent. We therefore screened the transcriptome profiles of 20 osteogenically induced adult human adipose-derived stem cell (ADSC) populations and searched for putative transcription factors that could regulate the osteogenic differentiation of these ADSC. We studied a subgroup of donor samples whose osteogenic response transcriptome differed from that of induced human fetal osteoblasts and from the rest of the induced human ADSC samples. From our statistical analysis, we found activating transcription factor 5 (ATF5) to be significantly and consistently down-regulated in a randomized time-course study of osteogenically differentiated adipose-derived stem cells from human donor samples. Knockdown of ATF5 with siRNA increased sensitivity to osteogenic induction. This evidence suggests a role for ATF5 in the regulation of osteogenic differentiation in adipose-derived stem cells. To our knowledge, this is the first report indicating a novel role of transcription factors in regulating osteogenic differentiation in adult or tissue-specific stem cells.
Abstract:
Enterprise Systems (ES) have emerged as possibly the most important and challenging development in the corporate use of information technology in the last decade. Organizations have invested heavily in these large, integrated application software suites expecting improvements in business processes, management of expenditure, customer service and, more generally, competitiveness and improved access to better information/knowledge (i.e., business intelligence and analytics). Forrester survey data consistently show that investment in ES and enterprise applications in general remains the top IT spending priority, with the ES market estimated at $38 billion and predicted to grow at a steady rate of 6.9%, reaching $50 billion by 2012 (Wang & Hamerman, 2008). Yet organizations have failed to realize all the anticipated benefits. One of the key reasons is the inability of employees to properly utilize the capabilities of enterprise systems to complete their work and extract information critical to decision making. In response, universities (tertiary institutes) have developed academic programs aimed at addressing the skill gaps. In parallel with the proliferation of ES, there has been growing recognition of the importance of teaching Enterprise Systems at tertiary education institutes. Many academic papers have discussed the important role of Enterprise Systems curricula at tertiary education institutes (Ask, 2008; Hawking, 2004; Stewart, 2001), covering the teaching philosophies, teaching approaches and challenges in Enterprise Systems education. Following global trends, tertiary institutes in the Pacific-Asian region commenced introducing Enterprise Systems curricula in the late 1990s with a range of subjects (a subject represents a single unit rather than a collection of units, which we refer to as a course) in faculties/schools/departments of Information Technology, Business and, in some cases, Engineering. Many tertiary institutes commenced their initial subject offerings around four salient concepts of Enterprise Systems: (1) Enterprise Systems implementations, (2) introductions to core modules of Enterprise Systems, (3) application customization using a programming language (e.g., ABAP) and (4) systems administration. While universities have come a long way in developing curricula in the enterprise systems area, many obstacles remain: the high cost of technology, a shortage of qualified faculty to teach, a lack of teaching materials, etc.
Abstract:
Data mining techniques extract repeated and useful patterns from a large data set, which in turn are utilized to predict the outcome of future events. The main purpose of the research presented in this paper is to investigate data mining strategies and develop an efficient framework for multi-attribute project information analysis to predict the performance of construction projects. The research team first reviewed existing data mining algorithms, applied them to systematically analyze a large project data set collected through a survey, and finally proposed a data-mining-based decision support framework for project performance prediction. To evaluate the potential of the framework, a case study was conducted using data collected from 139 capital projects, analyzing the relationship between the use of information technology and project cost performance. The study results showed that the proposed framework has the potential to promote fast, easy-to-use, interpretable, and accurate project data analysis.
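A sketch of the kind of interpretable data-mining step such a framework builds on: fitting a shallow decision tree that relates project attributes, including an IT-use indicator, to cost performance. The features, records, and labels are synthetic placeholders, not the 139-project data set or the framework's actual algorithm.

```python
# Interpretable classifier relating project attributes to cost performance.
# Records and labels are synthetic placeholders for illustration.

from sklearn.tree import DecisionTreeClassifier, export_text

# Columns: [project_size_musd, it_use_level (0-3), schedule_months]
X = [
    [12.0, 3, 10],
    [45.0, 1, 24],
    [8.5,  2, 9],
    [60.0, 0, 30],
    [25.0, 3, 14],
    [33.0, 1, 20],
]
y = [1, 0, 1, 0, 1, 0]  # 1 = met or beat cost target, 0 = cost overrun

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["size_musd", "it_use", "months"]))
print(tree.predict([[20.0, 2, 12]]))  # predict cost performance for a new project
```

The printed tree rules are the "interpretable" output a decision support framework can surface to project managers alongside the prediction itself.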
Abstract:
IT-supported field data management benefits on-site construction management by improving access to information and promoting efficient communication between project team members. However, most on-site safety inspections still rely heavily on subjective judgment and manual reporting processes; thus, observers’ experience often determines the quality of risk identification and control. This study aims to develop a methodology to efficiently retrieve safety-related information so that safety inspectors can easily access the relevant site safety information for safer decision making. The proposed methodology consists of three stages: (1) development of a comprehensive safety database that contains information on risk factors, accident types, impacts of accidents and safety regulations; (2) identification of relationships among different risk factors based on statistical analysis methods; and (3) user-specified information retrieval using data mining techniques for safety management. This paper presents the overall methodology and preliminary results of the first-stage research, conducted with 101 accident investigation reports.
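A minimal sketch consistent with the retrieval stage described above: given the risk factors observed at a site, rank stored accident records by how many risk factors they share. The records, factor names, and scoring rule are made-up placeholders, not the study's safety database or retrieval method.

```python
# Rank stored accident records by overlap with observed site risk factors.
# Records and factor names are made-up placeholders.

accident_db = [
    {"id": "R-014", "type": "fall from height",
     "factors": {"scaffold", "wet surface", "no guardrail"}},
    {"id": "R-027", "type": "struck by object",
     "factors": {"crane", "high wind", "untrained rigger"}},
    {"id": "R-051", "type": "fall from height",
     "factors": {"ladder", "wet surface", "overreaching"}},
]

def retrieve(observed_factors: set[str], top_k: int = 2):
    """Return the top_k records sharing at least one risk factor with the query."""
    scored = [(len(rec["factors"] & observed_factors), rec) for rec in accident_db]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [rec for score, rec in scored[:top_k] if score > 0]

for rec in retrieve({"wet surface", "scaffold"}):
    print(rec["id"], "-", rec["type"])
```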
Abstract:
Interleukin (IL)-18 is a pleiotropic cytokine with functions in immune modulation, angiogenesis and bone metabolism. In this study, the potential of IL-18 as an immunotherapy for prostate cancer (PCa) was examined using the murine model of prostate carcinoma, RM1, and a bone metastatic variant, RM1(BM)/B4H7-luc. RM1 and RM1(BM)/B4H7-luc cells were stably transfected to express bioactive IL-18. These cells were implanted into syngeneic immunocompetent mice, with or without an IL-18-neutralising antibody (αIL-18, SK113AE4). IL-18 significantly inhibited the growth of both subcutaneous and orthotopic RM1 tumors, and the IL-18-neutralising antibody abrogated this tumor growth inhibition. In vivo neutralization of interferon-gamma (IFN-γ) completely eliminated the anti-tumor effects of IL-18, confirming an essential role for IFN-γ as a downstream mediator of the anti-tumor activity of IL-18. Tumors from mice in which IL-18 and/or IFN-γ was neutralized contained significantly fewer CD4+ and CD8+ T cells than those with functional IL-18. The essential role of adaptive immunity was demonstrated by the fact that tumors grew more rapidly in RAG1−/− mice or in mice depleted of CD4+ and/or CD8+ cells than in normal mice. The tumors in RAG1−/− mice were also significantly smaller when IL-18 was present, indicating that innate immune mechanisms are also involved. IL-18 also induced an increase in tumor infiltration by macrophages and neutrophils, but not NK cells. In other experiments, direct injection of recombinant IL-18 into established tumors also inhibited tumor growth, which was associated with an increase in intratumoral macrophages, but not T cells. These results suggest that local IL-18 in the tumor environment can significantly potentiate anti-tumor immunity in the prostate and clearly demonstrate that this effect is mediated by innate and adaptive immune mechanisms.
Abstract:
This study is motivated by, and proceeds from, a central interest in the importance of evaluating IS service quality, and adopts the IS ZOT SERVQUAL instrument (Kettinger & Lee, 2005) as its core theory base. The study conceptualises IS service quality as a multidimensional formative construct and seeks to answer the main research question: “Is the IS service quality construct valid as a 1st-order formative, 2nd-order formative multidimensional construct?” Additionally, with the aim of validating the IS service quality construct within its nomological net, as in prior services marketing work, Satisfaction was hypothesised as its immediate consequence. To test this research question, IS service quality and Satisfaction were operationalised in a quantitative survey instrument. Partial least squares (PLS) analysis of 219 valid responses largely evidenced the validity of IS service quality as a multidimensional formative construct. The nomological validity of the IS service quality construct was also evidenced by demonstrating that 55% of the variance in Satisfaction was explained by the multidimensional formative IS service quality construct.
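A simplified illustration of the general idea of relating a block of service-quality indicator items to a satisfaction measure with partial least squares and reporting explained variance. This sketch uses scikit-learn's PLS regression on random placeholder data; it is not the PLS path-modelling (formative measurement) analysis used in the study, and the sample size is borrowed only as a stand-in.

```python
# Relate hypothetical service-quality indicator items to satisfaction via PLS
# and report the variance explained (R^2). Data are synthetic placeholders.

import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n = 219  # matches the study's sample size; the data themselves are synthetic

# Six hypothetical service-quality indicator items (e.g., reliability, responsiveness, ...)
quality_items = rng.normal(size=(n, 6))
# Satisfaction driven partly by the quality items, plus noise
satisfaction = quality_items @ np.array([0.4, 0.3, 0.2, 0.1, 0.05, 0.05]) \
    + rng.normal(scale=0.6, size=n)

pls = PLSRegression(n_components=2).fit(quality_items, satisfaction)
print(f"Variance in satisfaction explained (R^2): {pls.score(quality_items, satisfaction):.2f}")
```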